5 SEO Guidelines for Web Developers
As a developer, if you've decided to rely on organic search results (as opposed to paid search traffic or display advertising) as the primary driver of traffic to your website, you need to take that into account when coding pages.
SEO is about much more than keywords, synonyms and content marketing - there are a lot of technical aspects going on behind the scenes that help determine where a page ranks in search results.
The first step is to make sure your page is accessible to search engines, and that their robots can see the page content. In Google Search Console, use Fetch as Google in the Crawl section to see how your page appears to search engines. Remember, crawlers can't access iframes and are limited when indexing content in Flash or Silverlight, so if you've got important content, keep it in HTML.
Once your site can be seen, crawled and indexed by search engines, use these guidelines to make sure robots can properly figure out what your pages are about, how they relate to keywords and what sort of user experience they provide.
Write URLs for SEO
Clean URLs
A page's URL is an integral part of its user experience and SEO. In fact, it's the first thing search engine crawlers see and, ideally, it tells them a lot about the page and its content. That means your URLs need to be clean, easy to read, descriptive and free of URL parameters. Take two URLs for example:
https://www.example.com/category/index.jsp?category_id=123&product_id=456&referrer=789
https://www.example.com/category/product-name
The first URL has unnecessary parameters that are likely to confuse robots and people alike, since neither can tell which category or product the page is for. People are much less likely to share or click on this URL, and search engines will have trouble determining its relevance to a keyword. The second URL is far preferable. It's easier to read, tells you what category and product you will find on the page and doesn't contain any confusing parameters or query strings.
You often wind up with URL parameters due either to analytics and tracking programs or to your CMS serving dynamic page elements like filters and sorting. If you're using an advanced CMS, like WordPress, you can rewrite your URLs by changing the permalink settings in the admin menu.
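If you're not using a CMS, you can often achieve the same effect with Apache's mod_rewrite. As a sketch (the path pattern and `product_slug` parameter here are hypothetical, not from any particular application), a rule in your .htaccess could map the clean URL onto the parameterized one behind the scenes:

```apache
# Hypothetical rewrite: visitors and crawlers see /category/product-name,
# while the server internally serves the parameterized JSP page.
RewriteEngine On
RewriteRule ^category/([a-z0-9-]+)/?$ /category/index.jsp?product_slug=$1 [L,QSA]
```

The [L] flag stops further rule processing and [QSA] preserves any existing query string; your application would still need to look the product up by its slug.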
Optimized URLs
The structure and words you use in your URLs are also very important for SEO. The URL's path helps search engines understand the page's relationship and importance to the rest of the site. The words used in the URL tell them how relevant that page is to a particular topic or keyword. A well-optimized URL structure has the following elements:
- Short and descriptive: Ideally your use of keywords will describe the page content. If, for whatever reason, you don't use keywords in your URLs, keep the path as efficient as possible. Use as few words as possible and avoid stop words (the, a, of, on, etc.) altogether.
- Hyphens instead of underscores: When separating words in URLs, always use hyphens. Search engines use hyphens as word separators, so they are able to recognize urls-written-like-this. They don't use underscores to denote anything, so they don't recognize them. That means they'll see urls_written_like_this the same as urlswrittenlikethis. If you write URLs like that, they'll really struggle to interpret them and recognize keywords.
- Keywords used at the beginning: Put your most important keywords at the beginning of the URL. Search engine crawlers assign more value to these words. This is another reason to keep your URLs short: The fewer words in the URL, the more value search engines place on each one. However, it's absolutely vital that you use keywords naturally. Otherwise your page could come across as low quality or spammy. If you're targeting a long-tail keyword, consider removing your category and sub-category names to keep your URLs short.
Optimizing URLs using keywords also makes it more likely that the anchor text for your links will use relevant keywords.
Meta Tags
Your code is important not just because it creates a quality page for users. Search engines also look at meta tags to learn things about your page. Even if you don't write your meta tags yourself (this is often done by marketers), you should still understand how they work for SEO. There are three meta tags that are especially important for SEO:
- Title tag: The title tag is one of the most important on-page SEO signals, and it's perhaps the strongest hint you can give to search engines about your page's topic. Therefore, use your most important target keyword at the beginning of the title so search engines can see if the page is relevant to a given search. A well-optimized title tag is no more than 60 characters, including spaces and punctuation, with 50-60 being the ideal length. If you use more than one keyword or include your brand, separate them using pipes (the | character). Your title tag should look like:
- <title>Primary Keyword | Secondary Keyword | Brand Name</title>
- If you're optimizing for local search, use your target location, business and industry in your title tag as well as your keyword. So your local title tag might be something like <title>Smith & Sons | Construction | Toledo</title>.
- Meta description: Meta descriptions aren't used directly as a ranking signal, but they are still important for SEO. Search engines still sometimes look at them to help determine a page's topic and they're combined with title tags to form your search snippet. Search snippets are the title, link and page description displayed in search results. They essentially work as a free text ad for your page, with keywords matching search queries displayed in bold. Having a clear and accurate meta description will help increase click-through rate (CTR) and decrease bounce rate, both of which look good to search engines and can help improve your rank. If relevant, include words like "cheap", "sale", "free shipping" or "reviews" to attract in-market searchers. The meta description tag looks like this:
- <meta name="description" content="Your short page description, no more than 160 characters." />
- Robots: The robots meta tag is used to tell search engine crawlers if they can or cannot index a page or follow the links on that page. This meta tag will keep search engines from indexing pages they find by following links on other sites, which would not be prevented by your robots.txt file. The robots meta tag looks like this:
- <meta name="robots" content="noindex"/>
- You can also prevent search engines from following the links on your page by adding the "nofollow" value to the content attribute. This would be advisable if your page has a lot of links that you don't really want to pass value or if your page includes several paid links via native marketing. A robots meta tag using "nofollow" would look like this:
- <meta name="robots" content="noindex, nofollow" />
- Note that disallowing pages using the robots.txt file does not negate your need for a robots meta tag. While Google won't crawl these pages, it may still show them in the search results, replacing the meta description with 'A description for this result is not available because of this site's robots.txt'. If you're using the meta robots noindex tag, make sure you don't also disallow the page in your robots.txt file, as this will prevent crawlers from ever seeing the tag.
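To illustrate the difference, here's a minimal robots.txt (the directory name is just an example): it blocks crawling, but not indexing, which is why the meta tag above is still needed.

```text
# robots.txt at https://www.example.com/robots.txt
# Stops compliant crawlers from fetching these URLs, but the URLs can
# still appear in search results if other sites link to them.
User-agent: *
Disallow: /private/
```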
Redirects
Developers need to move content around a site all the time, often hosting it at a new URL and setting up a redirect to send visitors to the new page. Redirects are good for your SEO because search engines prefer a single canonical version of each page. If there are two or more paths to the same content, for example because you've temporarily moved content to a new folder or copied pages to a subdomain, search engines tend to get confused and will treat your pages as duplicate content. Redirecting your old pages to your new ones makes sure that users not only wind up in the right place, but that search engine spiders do too. Without redirects, you risk search engines serving the wrong page in search results and assigning trust and authority to outdated URLs.
One of the biggest benefits of using 301 (permanent) and 302 (temporary) redirects is that they pass full link juice on to the destination page. This allows you to move content without suffering much in terms of ranking and traffic. It's better for users as well because they won't have to deal with dead links and 404 pages.
Note that until relatively recently, it was SEO best practice to use 301 instead of 302 redirects because the latter didn't pass link juice. That's no longer the case. Google treats 302 redirects as if they were 301s and passes full PageRank to the destination page. Also, avoid using more than one redirect in a row. Search engines really don't like redirect chains, and it's really inefficient for your server as well.
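On Apache, both kinds of redirect are one-liners in your .htaccess file (the paths below are hypothetical placeholders):

```apache
# Permanent (301) redirect for a single moved page
Redirect 301 /old-page.html https://www.example.com/new-page

# Pattern-based 301 for a whole folder, preserving the rest of the path
RedirectMatch 301 ^/old-folder/(.*)$ https://www.example.com/new-folder/$1
```

Swap 301 for 302 if the move is genuinely temporary and you plan to bring the old URL back.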
If you're planning a site migration where the URL paths will remain the same, handling the redirects in bulk with .htaccess rewrite rules will save you time.
Schema Markup
Schema.org markup gives meaning to the content on your page in a way that search engines can understand. You can use it on your About Us page to differentiate between your address, opening hours and prices, for example. It's used by Google in its Knowledge Graph and rich snippets, and it's a big boost for your SEO. If you want to see schema markup in action, just search for your favorite recipe. In the search results you'll see the title, URL and meta description just like a normal search snippet. But there will also be a picture of the dish and likely a star rating.
The more easily search engines can understand the content on a page, the easier it is for them to determine what it's about and how it relates to different topics. The better they understand page content, the more likely the page is to rank highly for searches related to your keywords.
Semantic markup also helps assistive applications like screen readers, making for an improved user experience for your site's visitors.
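To give a feel for what this looks like, here is a minimal, hypothetical recipe marked up with schema.org's JSON-LD format, which Google accepts alongside Microdata; the name, image URL and rating values are all invented for illustration.

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Recipe",
  "name": "Classic Pancakes",
  "image": "https://www.example.com/images/pancakes.jpg",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.8",
    "reviewCount": "214"
  }
}
</script>
```

Crawlers read this block to build the picture and star rating you see in a recipe's search snippet.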
Mobile Friendliness
In 2015 Google released its mobile search algorithm update - an event many referred to as "Mobilegeddon" - and introduced a new ranking signal known as "mobile friendliness". Mobile friendliness is measured by several different criteria. If you've already created the mobile version of your site, check it with Google's Mobile-Friendly Test to get an idea how your site is performing and where you should improve to enhance your mobile user experience and SEO.
Find your site's mobile friendliness using WooRank's website audit. It flags issues that could be hurting your site's mobile friendliness such as text size, embedded objects using Flash or suboptimal tap target size.
Mobile Page Speed
Page loading time is a huge part of mobile friendliness. 40% of users will abandon a mobile page if it hasn't loaded within three seconds, while Google expects your page to render above the fold (ATF) content in no more than one second. After the normal process of DNS lookup, TCP handshake and HTTP request and response, you've really only got about 400 milliseconds to load your ATF content. Optimize your mobile page speed by:
- Optimizing image size: Don't use HTML width and height attributes to shrink your images. That only changes how the image is displayed, while the full file is still downloaded. Use an image editor, like Photoshop, to reduce the file size of your images.
- Relying on browser caching: Leverage browser caching to reduce the number of HTTP requests.
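On Apache, a common way to leverage browser caching is the mod_expires module. A sketch of .htaccess rules (the lifetimes are illustrative, tune them to how often your assets actually change):

```apache
<IfModule mod_expires.c>
  ExpiresActive On
  # Returning visitors reuse these cached files instead of re-requesting them
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType image/png "access plus 1 month"
  ExpiresByType text/css "access plus 1 week"
  ExpiresByType application/javascript "access plus 1 week"
</IfModule>
```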
- Minimizing ATF content size: A new TCP connection can't fully utilize the available bandwidth on its first round trip, which means the number of packets it can send is very limited. To render your ATF content within that first round trip, keep it to roughly 14 KB compressed or less. Keep your server software up to date to avoid further limiting the number of packets you can send in the first connection.
- Minifying code: Remove extraneous characters from JavaScript and stylesheets using YUI Compressor or JSMin. Minifying code can improve caching and reduce bandwidth usage.
- Using Google AMP: Google stores pages using Accelerated Mobile Pages markup in a dedicated cache and serves them nearly instantly from there.
If you're struggling with your page speed, use Google's PageSpeed Insights tool. PageSpeed measures the performance of your page as both mobile and desktop user-agents and evaluates the time to render ATF content and the time to render the entire page. Or use browser tools like the Developer Console in Chrome, the Web Console in Firefox or the F12 Developer Tools in Internet Explorer to find bottlenecks and errors on your page.
Mobile Site Structure
You have three options when structuring the mobile version of your website: responsive design, dynamic serving or a mobile subdomain.
- Responsive design: This is Google's recommended way to create a mobile site. It doesn't require any changes to your current code other than setting the viewport meta tag. The viewport tells browsers to display a page based on device screen size. The viewport meta tag for responsive design looks like this:
- <meta name="viewport" content="width=device-width, initial-scale=1.0"/>
- Dynamic serving: This method requires more time and effort than responsive design. It requires detecting the user-agent and serving different HTML to mobile and desktop browsers. Use the Vary: User-Agent HTTP header to tell search engines that you serve different HTML based on user-agent.
- To add vary user-agent in Apache, add this code to your .htaccess:
- Header append Vary User-Agent
- If you use WordPress, add the following code in functions.php:
  function add_vary_header($headers) {
      $headers['Vary'] = 'User-Agent';
      return $headers;
  }
  add_filter('wp_headers', 'add_vary_header');
- To set vary user-agent via PHP, add this code:
- <?php header("Vary: User-Agent, Accept"); ?>
- Mobile subdomain: This will require much more time and effort than the previous two methods, as it requires building an entirely separate mobile website and hosting it on a subdomain, usually mobile.example.com or m.example.com. Googlebot won't be able to tell that these pages are reserved for mobile users, so you'll need to use a rel="alternate" tag on each desktop page and a rel="canonical" tag on the corresponding mobile page to show the relationship between the two versions and avoid duplicate content. This method is complicated and expensive compared to the other two, especially for large sites. It's generally not recommended.
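With separate mobile URLs, Google's documented pattern pairs an alternate link on the desktop page with a canonical link on the mobile page, like this (using the example hostnames above):

```html
<!-- On the desktop page, https://www.example.com/page -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="https://m.example.com/page">

<!-- On the corresponding mobile page, https://m.example.com/page -->
<link rel="canonical" href="https://www.example.com/page">
```

This tells crawlers the two URLs are the same content in different formats, so ranking signals consolidate on the desktop URL.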
Conclusion
At the end of the day, search engines aren't out to get you, no matter how much it might seem like they are. All they want to do is provide their users with the best possible pages based on their search queries, which is really your goal too. And while there's never any guarantee that you'll get the number one ranking, if you optimize your URLs, use redirects intelligently, implement schema markup and make your site user friendly, you'll be well on your way to high rankings and increased search traffic.
Does your site have any elements that could be blocking crawlers from accessing your content? Is it structured for SEO? Audit your site with WooRank to evaluate its performance across more than 70 technical and on-page criteria.
Article updated 18th October 2020