Photo credit: #WOCinTech Chat
According to a 2012 research study commissioned by the Pew Research Center, “91% of adults use search engines to find information on the web [and] on any given day, 59% of those using the internet use search engines.” It’s also widely known that software developers use search engines for everything from troubleshooting to finding documentation. (And if you’re really curious, I found a peer-reviewed research paper about this, too.) Developers and general internet users alike often start their research with a query in a search engine bar.
This simple mechanism (to us end users) allows us to retrieve information at a moment’s notice. The complexities of search are abstracted away. We don’t have to think about how a search engine knew exactly what we meant and how it was able to crawl through billions of sites to serve up the most relevant article for our query. On the one hand, we have the algorithms that power search engines. We don’t know exactly how these algorithms work, but we can make educated guesses based on their behavior (and some search engines do share a few of the factors that affect how sites rank). On the other hand, there are numerous professionals trying to figure out how to optimize websites so they land on the first page of search results. This process is called search engine optimization, or SEO.
So how does SEO relate to software developers? You may have a freelancing business or a personal blog, or maybe you work on a site engineering team. In all of these cases, you likely want your site to get indexed in search engines to attract visitors who may not know about you, but would find your content (or business) interesting. Since the vast majority of internet users start their research in search engines, you want your site to be there.
Let’s say you’ve started a personal blog on a domain you own. Even if you’re writing for yourself, chances are you want other people to read your work. You likely share your blog posts on social media or syndicate them to developer blogging platforms like DEV Community. You may have looked up techniques for building an audience or getting more readers on your blog. What if you don’t blog; you screencast on YouTube instead? SEO will benefit you all the same, whether a visitor starts their search in the Google search bar or in YouTube’s. Using SEO helps you send the right signals to your prospective visitors. It’s how you tell them, “Hey, I have the answer to your question!” or “I offer the solution to your problem.”
Search engine optimization (SEO) is the process of improving site content to rank higher in search results. According to a 2017 article from Forbes, only 25% of internet users scroll past the first page of search results. (There’s an industry term for these pages: search engine results pages, or SERPs.) Thus, companies, individuals, and other entities have two options to make it to the first page: paid search advertising or search engine optimization. And people pay lots of money for tools and access to experts in order to rank higher in organic search results. (Website traffic coming from non-ad search results is called organic search. Traffic coming from ads in search results is called paid search.)
But the sobering truth is that no one — not even the most expensive and experienced SEO consultant — can guarantee you’ll end up on the first page, no matter how much you optimize your content for search engines. There are lots of ways people try to “game” search algorithms using SEO even though their content isn’t good or relevant. This is called SEO spam, and it’s lucrative. In other words, you could spend considerable time and money to learn and use the many SEO tools and techniques out there and still not make it to the first page.
When many developers think about site structure or architecture, the first thing that comes to mind may be a web development framework. Developers will consider how the framework structures information, how and where content is stored, and how to make changes and updates to pages on the site.
But content strategists and even search engines see structure differently. Site structure, from this perspective, refers to the way information (webpages, text, images, and other content) is organized. If your website is a house, the site structure is the frame (the wooden or steel beams that are the house’s “skeleton”). The site structure, built from a blueprint, serves as the foundation. It’s what differentiates the rooms of a house and their access points, so house frames also account for doors, windows, walkways, and other elements pertaining to a specific room and the surrounding space.
We create websites for a specific purpose, and people visit our sites to fulfill a specific need. They visit sites to get informed, access information, or change information. Our job is to help people take whatever actions they need on our site by providing them with the right information and pointing them in the right direction.
Search engines work by crawling websites to see how information is organized. Oftentimes, websites will create and submit a sitemap (in XML format) to search engines, which shows the way webpages are categorized. Sites can also provide a “robot exclusion standard” file, also known as a robots.txt file, placed at the site’s root, which tells search engines which pages to crawl and index and which pages to omit from search results. For example, an e-commerce site might want its products, blog, and about pages indexed, but will want to omit the checkout page and the purchase confirmation page. If you have members-only pages, you can omit those from search engines as well by listing them in your robots.txt file.
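To make this concrete, here is a minimal robots.txt sketch for the hypothetical e-commerce site described above. The domain and paths are illustrative, not taken from a real site:

```text
# robots.txt, served from the site root (e.g., https://example.com/robots.txt)

User-agent: *
# Keep transactional and members-only pages out of crawls
Disallow: /checkout/
Disallow: /order-confirmation/
Disallow: /members/

# Tell crawlers where to find the XML sitemap
Sitemap: https://example.com/sitemap.xml
```

Note that `User-agent: *` applies the rules to all crawlers; you can also target specific crawlers by name if you need different rules for each.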
On the site itself, search engines look for information about information (or metadata), like page titles, headers, descriptions, links, and structured data (often expressed as JSON-LD). They will also look for keywords that they will use to index pages in relevant search results. Most of this metadata is available in HTML tags and attributes. (Many site builders and static site generators will simplify this for you, so you don’t actually need to touch HTML in order to have this metadata added.)
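If you do edit the HTML yourself, the metadata described above lives in the page’s `<head>`. Here is a small sketch with made-up titles, descriptions, and author names; the schema.org type shown is one of many you could choose:

```html
<head>
  <!-- Page title shown in browser tabs and search results -->
  <title>Getting Started | Example Docs</title>

  <!-- Description that search engines may use as the result snippet -->
  <meta name="description" content="Install the tool and run your first build in five minutes.">

  <!-- Structured data as JSON-LD, using a schema.org type -->
  <script type="application/ld+json">
  {
    "@context": "https://schema.org",
    "@type": "TechArticle",
    "headline": "Getting Started",
    "author": { "@type": "Person", "name": "Jane Doe" }
  }
  </script>
</head>
```

Schema.org (listed in the tools below) documents the available types and properties if you want to go beyond this basic shape.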
Search engine crawlers are not “smart” in that they cannot infer what something is about by “looking” at a page the way we can. They cannot tell whether the content of a webpage is valuable, relevant, or high-quality. All they can do is determine relevance based on the keywords being used and the relationships between various pages on a site.
While we don’t know everything about how individual search engines work (the Google search algorithm reportedly factors in over 300 elements in search results), we know that search engines also consider page load times and HTML tags in how high a website ranks.
You can decide to “hide” a page from search results, like checkout pages on e-commerce sites, pages that are part of a members-only section, or pages that have been removed from your site.
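One common way to hide an individual page, assuming you can edit that page’s HTML, is a robots meta tag in its `<head>`:

```html
<!-- Ask crawlers not to include this page in search results -->
<meta name="robots" content="noindex">
```

This is a different lever than robots.txt: robots.txt asks crawlers not to fetch a page at all, while `noindex` lets the page be crawled but asks that it be kept out of the index.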
No matter the platform you’re using to host your site, you want it to allow you to do the following:
- Create or generate a sitemap
- Customize your URLs
- Create redirects as necessary
- Add structured data
- Craft descriptions and edit page titles
- Add alt-text to images
- Compress images
- Create 404 pages just in case
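For the first item on that list, a sitemap is just an XML file listing the URLs you want crawled. A minimal sketch, with an illustrative domain and dates, looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2021-01-15</lastmod>
  </url>
  <url>
    <loc>https://example.com/blog/first-post</loc>
    <lastmod>2021-02-03</lastmod>
  </url>
</urlset>
```

Most platforms and static site generators can generate this file for you; you then submit its URL through tools like Google Search Console or Bing Webmaster Tools.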
In conjunction with free SEO tools, being able to edit these aspects of your site gives you more influence over how your content is structured, which affects not just rankings but also your site’s experience.
In 2017, I was responsible for managing Bundler’s content strategy. (Bundler is the dependency manager for the Ruby programming language.) Part of my role was to help make the documentation on the Bundler.io documentation site easier to find for visitors.
I conducted an SEO audit of the Bundler site and competitive analysis of three other open-source software documentation websites to better understand what they were doing well and how we could apply those best practices in our documentation.
The SEO audit was the most enlightening part. I spotted issues with titles, which made it hard for visitors to understand where they were in relation to the rest of the docs site and didn’t clarify what visitors would be able to do or learn. I created a standardized titling scheme, updated titles for the most recent release, and added more copy to the documentation landing page.
In the quarter that followed the audit, we observed the following on our documentation site:
- An 8.38% increase in total page views over the previous period
- A 7.17% increase in total users finding the pages via organic search over the previous period
- A 7.78% increase in average time on page over the previous period
In summary, these were really impressive changes! We were able to accomplish this work by setting aside the necessary time and effort to understand the problem and leverage SEO to help us create clearer messaging and improve organic search results.
There is no shortage of sites, books, and tools that you can use to learn more about SEO. I encourage you to search the official documentation of any framework or platform you use for SEO-related guidance.
The following are all free SEO tools that I have used to improve my own site's SEO. They cover everything from coming up with content ideas, to creating structured JSON for different pages on my site, to monitoring site performance and getting ranked on search engines:
- Ubersuggest: https://neilpatel.com/ubersuggest/
- Answer The Public: https://answerthepublic.com/
- Schema.org (to learn about structured JSON for SEO)
- Technical SEO Markup Generator: https://technicalseo.com/tools/schema-markup-generator/
- WebPageTest: https://www.webpagetest.org/
- Google Search Console: https://search.google.com/search-console/about
- Bing Webmaster Tools: https://www.bing.com/webmasters/about
I'm Stephanie, a Content Strategist and Technical PM. Visit developersguidetocontent.com to learn more about my work!