
Blake Teeple


An Open Letter to Web Developers from an Enterprise SEO

Dear Web Developers,

We need you.

I like to consider myself a Technical SEO (sometimes I get fancy with "SEO Developer"), or the middleman between SEO and web developers. I have built web servers, applications, scripts, and more. I get it. Development is way more technical and "harder" than SEO, but that's not what this post is about. This is a plea: we need you...to respect SEO on the front-end.

You're probably laughing right now, or you think this is some joke or rant, and that's OK. SEO, what a joke - who needs that? The problem is that everyone needs SEO. SEO as a whole is a major influence on how your audience, customers, clients, or whatever else you want to call them, get to you or your company. Without SEO, many major companies would falter, and smaller ones would be practically invisible in any sort of "search".

Lately I have been thinking about how I can approach the software team at my current workplace to institute what we call "best practices" in SEO. These range from minifying files and compressing images to canonical tags and fixing redirect chains... and more. I'm not going to break down each and every bullet point I plan to share with them, but I will stress a few concepts that everyone should start implementing on their servers or in their web applications.

Page Speed Enhancements

These are various enhancements you can make to the server, application, or system to improve your overall Page Speed (which is an official ranking factor with Google).

Minimize HTTP Requests

I believe the title of this one to be pretty self-explanatory, but in case you don't know what an HTTP request is or does: the browser makes an HTTP request to download each part of the webpage (scripts, stylesheets, images, etc.).

These requests are critical to the overall load time of a webpage. HTTP requests typically account for at least 70% of the total load time of a website or page.

If you're not sure how many requests are being made or how long the site takes to fully load (Google checks multiple milestones along the way), open Developer Tools in Chrome (or the equivalent in other browsers) and view the "Network" tab. It will tell you basically everything.

The goal is to reduce the number of requests as much as possible so the site loads as fast as possible when the user hits the page.
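
For example (the file names here are hypothetical), a handful of separate stylesheet and script requests can often be collapsed into one of each:

<!-- Before: four requests just for CSS and JS -->
<link rel="stylesheet" href="/css/reset.css" />
<link rel="stylesheet" href="/css/layout.css" />
<script src="/js/menu.js"></script>
<script src="/js/slider.js"></script>

<!-- After: two requests, one combined file per type -->
<link rel="stylesheet" href="/css/site.css" />
<script src="/js/site.js"></script>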

Minify & Combine Files

This one is actually the hardest for a lot of teams to accomplish, because everyone is constantly working in the files across different environments, and managing that is pretty hectic and annoying. You should already be familiar with the result of "minifying" files from working with any .min files.

The process we have at my current workplace is that when files are pushed to production, they go through an automated, script-driven minification step before going "live". It takes a lot of drive to keep the process going, and it annoys the developers who like to make changes right on live (they shouldn't be doing that anyway).
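
As a minimal sketch of what that step produces (the CSS rule itself is made up):

<!-- Before: readable source stylesheet in development -->
<style>
  .hero {
    margin: 0 auto;
    padding: 16px;
  }
</style>

<!-- After: the same rules as they ship to production -->
<style>.hero{margin:0 auto;padding:16px}</style>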

Defer Unnecessary JavaScript

To defer a script means to delay its execution until the rest of the document has been parsed. The concept is that we load a bunch of JS to run different elements of the page and make it as user-friendly as possible. By deferring the scripts that aren't needed until after the content, or the First Contentful Paint (FCP), you improve the overall user experience and the site speed seen when search engine crawlers like Google hit your website. To make a long story short: they see the content on the page faster, and all the bells and whistles come later.
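
In plain HTML this can be as simple as the defer (or async) attribute - the file names below are hypothetical:

<!-- Downloads in parallel, but only executes after the HTML is parsed -->
<script src="/js/widgets.js" defer></script>

<!-- Downloads and executes whenever it's ready, independent of parsing -->
<script src="/js/non-critical.js" async></script>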

Image Compression

I believe this one to be pretty self-explanatory as well, but I will elaborate. Images should be compressed to the lowest file size possible before being pushed to production, while still keeping the highest quality possible. This improves the time it takes to load each image on the page.
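
Beyond compressing the files themselves, the markup can help too. Here is a minimal sketch (the paths are hypothetical) that serves a smaller modern format with a fallback and lazy-loads images below the fold:

<picture>
  <!-- Smaller modern format for browsers that support it -->
  <source srcset="/img/hero.webp" type="image/webp" />
  <!-- Compressed fallback; width/height prevent layout shift -->
  <img src="/img/hero.jpg" alt="Product hero shot" width="1200" height="630" loading="lazy" />
</picture>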

SEO Enhancements

I could go into many enhancements one could make to a page or application to improve its overall SEO value. However, here are a few quick pointers you can build into your templates to make life easier in the future (especially if you don't have an SEO handy).

Basic SEO Tags

These tags are often overlooked but are completely underrated. They are often the first place a user gets greeted by your company or brand - wouldn't you want to make a good first impression?

I am talking about your title and description tags of course! These are outlined as follows:

<title>BRAND</title>

<meta name="description" content="We are the greatest brand in the world!" />

These two tags should be present on every page and filled out accordingly. Keep in mind this may be your first impression on a user, so try your best to win them over!

Heading Tag Structure

This is probably one of the biggest things developers don't abide by. There is a hierarchical structure you are "suggested" to use when structuring a webpage, and it goes as follows (in pseudo form):

<h1></h1>
    <h2></h2>
        <h3></h3>
        <h3></h3>
    <h2></h2>
        <h3></h3>
            <h4></h4>
    <h2></h2>

I think you get the idea from the code sample above. Is this going to make or break your website or application's SEO? Probably not. But it is good to note that Google LOVES when things are organized and easier to "read".
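
Filled in with made-up content, that hierarchy might look like:

<h1>The Complete Guide to Running Shoes</h1>
  <h2>Road Shoes</h2>
    <h3>Cushioned Trainers</h3>
    <h3>Racing Flats</h3>
  <h2>Trail Shoes</h2>
    <h3>Lugged Outsoles</h3>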

Canonicalization

This is a very simple concept, and yet it is so easily screwed up and/or forgotten about. In my career this is probably the most "missed" element on the websites I have worked on.

The code sample goes as follows:

<link rel="canonical" href="PAGE URL" />

Simple, right? As you can plainly see, the "PAGE URL" part is the URL that is currently being served to the user. Crazy that it's needed... but believe me, it is. What this little tag tells Google is that this is the preferred version of the content, the one we want them to judge and index. Let me take a step back - preferred version? It tells Google how to navigate our site? Can we force Google to do so?

In short, yes we can. You see, many sites out there have what we call "duplicate" content: pages so similar that we deem them the same, even though each serves a different purpose for the user (because at the end of the day, we ultimately care about the user, audience, client, etc.). We may want to show the user different things than we show Google (think personalized pages, geo-based pages, etc.), but we don't want Google to get caught up in the mess, so we point them to a generic page. Hence the canonical tag is born.

The suggested way to handle this is to have each page "self-canonical", meaning you insert the currently served URL into the tag above and build it dynamically. To do anything further, consult your SEO team, as they may have a plan for how it should be implemented.
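
Putting both cases together (the example.com URLs are placeholders): a normal page self-canonicalizes, while a parameterized duplicate points back at the clean version:

<!-- Served at https://example.com/shoes -->
<link rel="canonical" href="https://example.com/shoes" />

<!-- Served at https://example.com/shoes?sessionid=123&sort=price -->
<!-- Same content, so it points Google at the clean version above -->
<link rel="canonical" href="https://example.com/shoes" />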

URL Structure & Crawl Depth

Query strings are the death of SEO. OK, that was a little over the top, but it is somewhat true in some cases. You want to serve Google and other search engines as friendly a URL as possible. The easiest comparison I can make is the syntax of Assembly versus Python. You can't easily read Assembly without prior experience, but with Python you can get a basic idea of what is happening, provided the programmer uses descriptive naming conventions and clean code.

So coming back to URL structure: shouldn't we give the user and Google the Python version of a URL - a user-friendly, readable one? I'm sure you know the answer by now. Keep in mind that the number of directories a file sits in is not the crawl depth. We want as shallow a crawl depth as possible throughout the website (crawl depth being how many clicks it takes Google to reach the page). If a page has a crawl depth of 4+ (it took four or more clicks to get there), it will have almost zero value to Google. Keep this in mind when designing your website's structure.
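
To put (hypothetical) URLs to the Assembly/Python analogy:

The "Assembly" version:  https://example.com/p?id=7382&cat=12&s=a8f3
The "Python" version:    https://example.com/mens-shoes/trail-runners/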

Robots.txt & Google Analytics Tracking Software

Finally, here are a couple of things that often get overlooked when pushing files and have to be fixed after the fact. It's best practice to have a robots.txt file in the root directory of the website for Google to use as a reference for how to crawl it. You can limit where Google can go (keep it away from private files) and let it crawl only the directories you want it in.
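
A minimal robots.txt sketch (the directory names are hypothetical - check with your SEO team before blocking anything):

User-agent: *
Disallow: /private/
Disallow: /tmp/

Sitemap: https://example.com/sitemap.xml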

By making sure you have analytics tracking on the site (even if you don't use Google Analytics), you're allowing the SEO or marketing team to do their job and make better-informed, data-driven decisions for your brand/company.
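
For reference, Google's gtag.js snippet takes roughly this shape; GA_MEASUREMENT_ID is a placeholder for your own property ID, and you should copy the current version from your analytics account:

<!-- Google Analytics (gtag.js) -->
<script async src="https://www.googletagmanager.com/gtag/js?id=GA_MEASUREMENT_ID"></script>
<script>
  window.dataLayer = window.dataLayer || [];
  function gtag(){dataLayer.push(arguments);}
  gtag('js', new Date());
  gtag('config', 'GA_MEASUREMENT_ID');
</script>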


Phew, that was a long one. But are you catching on yet? The whole idea of SEO is to make a website usable and relevant to users AND as easy as possible for search engines to crawl and navigate. It's a pain in the you-know-what to try and master. It's basically a long-term A/B test with some industry-accepted ways of handling the simpler things. I have routinely run into everything above during my time at an agency and while conducting enterprise SEO for major brands.

If you're in need of any more resources to learn please let me know in the comments - I am ALWAYS happy to share knowledge and experience.

Top comments (5)

Heinz Rainer • Edited

Cool. Here's another one. All my sites have been switched to Google AMP, including structured data. There are millions of sites out there which don't conform to Google's speed requirements. AMP solves this issue. Pages load in under 2 sec. without a CDN.

Blake Teeple

Google has publicly stated that there are no true "speed" requirements, but you should optimize to be as fast as possible.

They even stated that their current speed update only targets the slowest of sites, and that speed as a ranking factor is actually very small. The idea behind speed is the user experience. It helps you rank indirectly more than directly.

In all honesty, AMP is fairly bad, and Google doesn't even like it - hence "Web Light" coming out, where Google publicly stated they modify your pages for mobile users as they choose.

AMP also makes it easy to lose users, as they can swipe left/right and be gone. There is no true conversion factor there.

I would suggest leaving AMP to content publications that don't rely on conversions, if anything.

Heinz Rainer

Thanks for your comments. We got lots of advice from a number of sources prior to yours stating the opposite. After reading comments on indexing mobile sites (and desktop sites) on their (Google's) Webmaster blog, I feel our efforts were wasted. It was stated clearly that mobile-friendly desktop sites will be indexed first. Will skip AMP now.

Blake Teeple

Yup, we highly considered it as it was promoted so heavily by Google.

We found zero actual impact from it, and our bounce rate and metrics got so much worse with crap data.

We removed it, and nothing even happened - ranks stayed the same both while we were using it and after it was removed.

Google announcing Web Light pretty much sealed the deal on AMP. They publicly said they make mobile-optimized pages of your site as they see fit, whether it's AMP or not.

Your best bet is to make a mobile-responsive website and not spend time working through stuff like AMP.

At the end of the day, Google will do as Google wants, so make sure you're catering to your customers or users as much as possible and you'll do fine.

eli73

Hi, thanks for your article. I need some information about the Video Carousel on the Google results page. We have a problem on a site with video content pages developed with React. In particular, we find our content in Video Search, but the same content isn't visible in the Video Carousel. Any suggestions? Thanks