Technical SEO Best Practices
31 Jul 2020 • SEO
Technical SEO is crucial for gaining the edge over your competitors in search rankings. But what exactly is it, and how can it be achieved? Read on for a comprehensive guide to technical SEO best practices, including practical examples you can start implementing on your site today!
What is Technical SEO?
Like all forms of search engine optimisation, technical SEO is all about getting your pages to rank higher in search engine results, thereby increasing exposure and traffic to your website. While content SEO focuses on the keywords used within your page copy, technical SEO focuses on making your pages as fast-loading and easy to crawl as possible. Getting these factors right will ensure optimal use of your crawl budget (the resources a search engine dedicates to crawling your site) and also make for a better user experience.
With that in mind, here are the top five areas in which to focus your efforts to improve technical SEO and increase your rankings. Read on for a detailed breakdown of each, including practical examples.
1. Search Awareness
Forget ranking well in search engines for a moment: what about ranking at all? If you want your pages to even appear in search results, the first thing you need to do is make search providers aware you exist.
XML Sitemap
Generate a sitemap.xml file containing a list of all URLs on your site available for crawling. This is the most efficient way for search engines to understand which pages to crawl and how often to expect the content to change.
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9" xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>https://www.example.com</loc>
    <lastmod>2020-06-24</lastmod>
    <changefreq>daily</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://www.example.com/terms-and-conditions</loc>
    <lastmod>2018-08-12</lastmod>
    <changefreq>yearly</changefreq>
    <priority>0.1</priority>
  </url>
</urlset>
If you’re using a CMS to manage your site, this step can be most easily achieved with an XML sitemap generator plugin such as Yoast SEO for WordPress or the official Sitemap Plugin for Grav.
Lastly, don’t forget to submit the sitemap via your chosen provider’s search management tool such as Google Search Console.
Robots.txt
Use a robots.txt file to indicate to search engines any areas or pages of your site that you do not wish to be crawled, in order to save on crawl budget. It is also good practice to provide a link to your sitemap.xml file within your robots.txt file.
User-agent: *
Disallow: /logs/
Sitemap: https://www.example.com/sitemap.xml
Refer to this Google Help Center article for specifics on syntax and a handy robots.txt testing tool.
2. Document Rendering
You’ve told search engines that you want your pages crawled; now it’s time to ensure they can be crawled. The following tags and attributes are used by web browsers to render your HTML code into a document that can be understood by both search engines and humans. Without these indicators your pages may not be interpreted correctly, making them less likely to rank in search.
Doctype Declaration
Ensure all pages begin with a document type declaration to indicate (you guessed it) the document type.
<!DOCTYPE html>
HTML lang attribute
Include an appropriate lang attribute in your opening html tag to indicate the language of your page’s content.
<html lang="en">
HTML Encoding (Character Set)
Include a meta charset attribute to indicate the HTML encoding (character set) used.
<meta charset="UTF-8">
Alternate and Canonical Links
If you have a legitimate reason to serve multiple versions of the same page with slightly differing content (for example one in British English and one in US English), use alternate and canonical link elements to indicate to search engines which version of the page you want to appear in search results. Neglecting to do this can negatively impact the ranking of both pages due to duplicate content and also means your precious crawl budget is wasted on multiple versions of the same page.
<link rel="alternate" href="https://www.example.com/us" hreflang="en-us" />
<link rel="alternate" href="https://www.example.com/uk" hreflang="en-gb" />
<link rel="canonical" href="https://www.example.com/us" />
4XX and 5XX Errors
A 4XX status code means your page cannot be accessed, usually because the link is incorrect and the URL does not exist. A 5XX status code indicates a server error, meaning the URL points somewhere, but there is a problem in displaying the page. Both are bad news as they stand in the way of users and search engines accessing your content.
Once again, Google Search Console is your best friend here. The Index Coverage report will notify you of any URLs in your sitemap that produce an error when crawled.
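For a quick spot check of an individual URL, you can also inspect its status code from the command line with curl (using one of the example URLs from earlier):
curl -I https://www.example.com/terms-and-conditions
The first line of the response (e.g. HTTP/1.1 200 OK) contains the status code; anything in the 4XX or 5XX range needs your attention.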
Responsive Design
On July 1, 2019, Google enabled mobile-first indexing for all new websites, meaning if your content doesn’t display well on mobile, your rankings are going to suffer. At the very least, include a viewport meta tag to appropriately scale your pages for viewing on mobile devices.
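A minimal viewport meta tag looks like this:
<meta name="viewport" content="width=device-width, initial-scale=1">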
Use Google’s Mobile-Friendly Test to determine whether Google considers your pages to be mobile friendly and also gain tips on how you can make them more so.
3. Content Indexing
Your pages are code-compliant, error-free and ready to be crawled. The next step is to focus on how search engines will categorise and index your pages’ content. The following properties are used by search engines and humans to determine what your pages are all about.
Title Tag
A unique title tag of around 50 to 60 characters containing your most important keywords is essential for your page to rank in search. It is also the most prominent piece of information about your page that appears in search engine results.
<title>Example Pet Co.: Dog Toys and Accessories</title>
Meta Description
The meta description is the blurb that appears beneath the page title in search engine results. It should be between 120 and 160 characters long and unique to each page. A good meta description will not only give users an idea of what your page is about, but will entice them to click through to your page to find out more.
<meta name="description" content="We offer a complete range of dog toys and accessories to keep your pet happy, healthy and entertained." />
Structured Data
Structured data is in essence a code block in your page’s source code containing information about the page, your website, your organisation and a myriad of other things depending on your preferences. It is formatted using the Schema.org data vocabulary and is the most efficient way for search providers to understand the relationships between the pieces of content they find on your page.
Imagine a search engine robot crawling the source code of your page. How will it determine which words are the name of your company versus a company name you mention in a blog post? Of course search robots are incredibly intelligent and can eventually determine this type of information, but why make it harder for them and waste your precious crawl budget in the process?
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Pet Co.",
  "url": "https://www.example.com"
}
</script>
Implementing structured data for each page of your site manually is not for the faint of heart. If you are using a CMS to manage your site, consider using a schema markup generator plugin such as the Aura Plugin for Grav or Yoast SEO for WordPress to do the heavy lifting for you.
4. Performance
Like any business, search engines want to provide the best customer experience possible. A huge part of this is of course matching search queries to relevant results. But it also means connecting users to their desired content as quickly as possible. By providing fast-loading pages you increase your chances of ranking above competitors whose page performance may not be as good. Google PageSpeed Insights considers the following metrics to provide an overall performance score for your pages on both mobile and desktop platforms. As the name implies, it will also provide insights on what you can do to improve your scores.
First Contentful Paint
The time it takes for the browser to display the first text or image on the page. In other words, the amount of time between the user requesting a new page and the first sign from their device that the request is being fulfilled. The shorter the wait, the better the user experience.
Time to Interactive
The amount of time until the page becomes fully interactive. Have you ever navigated to a page and, as the header appears, tried to scroll down only to find you need to wait a few more seconds? This normally means the browser is still busy loading resources that could potentially have been deferred. This can be a frustrating experience, and Google agrees. The shorter the wait time, the better your pages will score here.
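A common remedy is to load non-critical scripts with the defer attribute, which downloads them in parallel but delays their execution until the document has been parsed (the script path here is purely illustrative):
<script src="/js/analytics.js" defer></script>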
Total Blocking Time
The total amount of time user input is blocked between the above two metrics: First Contentful Paint and Time to Interactive. As noted above, the longer a user is blocked from interacting with the page, the poorer the overall user experience.
Speed Index
Here a video of the page loading is captured and analysed frame by frame to determine how quickly the page’s content becomes visually complete. As with all load time metrics, faster is better.
Largest Contentful Paint
The time at which the largest image or text block is rendered. Google says LCP is the most relevant indicator of when the main content of the page is loaded, and therefore a vital metric for measuring perceived load speed, or how fast pages “feel”.
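If your largest element is known ahead of time, say a hero image, one option is to hint the browser to fetch it early with a preload link (the image path is illustrative):
<link rel="preload" as="image" href="/images/hero.jpg">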
Cumulative Layout Shift
A measurement of the movement of page elements as the page loads. I’m sure you’ve had the experience of reading something on a webpage when suddenly the layout shifts and you lose your place. Even more annoying is when you go to click a button and the same thing happens causing you to click on something else. This creates a negative user experience and should therefore be avoided where possible.
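A frequent culprit is images without explicit dimensions: the browser can’t reserve space for them until they load, so the surrounding content jumps. Supplying width and height attributes lets the browser calculate the required space up front (the file name and dimensions are illustrative):
<img src="/images/dog-toys.jpg" width="800" height="600" alt="A selection of dog toys">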
5. Security
Last but by no means least, let’s take a look at security. Search engines consider web security a ranking factor, meaning pages that don’t follow the practices below may rank lower than those that do.
HTTPS Encryption
Installing a security certificate is a simple way to encrypt the data exchanged between your website and the user’s device. Most domain registrars and hosting providers offer a way to bundle in an SSL certificate with your domain name or website hosting. Alternatively, you could install your own certificate for free from Let’s Encrypt, an open certificate authority aiming to create a more secure internet for everyone.
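If you manage your own server, Let’s Encrypt’s recommended client, Certbot, can obtain and install a certificate for you. For an Apache server the command would look something like this (substitute your own domain):
sudo certbot --apache -d www.example.com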
Once installed, use an online tool such as the SSL Server Test from SSL Labs to check that your certificate is valid and up to date.
Permanent Redirect to HTTPS
Configure an automatic 301 (permanent) redirect at the server level to direct any request for an http URL to https. This ensures users and search engines are accessing your pages via a secure connection. The approach will vary depending on your web server, but for Apache your virtual host configuration would look something like:
<VirtualHost *:80>
  ServerName www.example.com
  # Redirect is temporary (302) by default, so specify permanent (301)
  Redirect permanent / https://www.example.com/
</VirtualHost>

<VirtualHost _default_:443>
  ServerName www.example.com
  DocumentRoot /var/www/www.example.com
  SSLEngine On
  # etc...
</VirtualHost>
Avoid Mixed Content
Ensure that any resources referenced within your pages (images, CSS, JavaScript, etc.) are loaded via an https:// URL. Not doing so will produce an insecure content warning in Chrome and other browsers, which is sure to have a negative impact on your users’ confidence in your product.
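As a safety net while you hunt down stray http:// references, you can also instruct supporting browsers to automatically upgrade insecure requests via the upgrade-insecure-requests Content Security Policy directive:
<meta http-equiv="Content-Security-Policy" content="upgrade-insecure-requests">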
Technical SEO, Done.
That brings us to the end of our list of technical SEO best practices. By following this guide, you have ensured your pages will be found, understood and appropriately indexed by search engines. Your users will enjoy fast-loading pages over a secure connection, all of which adds up to a superior user experience and maximises your chance of ranking above your competitors in search.