
How to perform a killer technical SEO audit

Content may be king, but if your website serves it in a way that frustrates your visitors, or structures it in a way that makes it inefficient for search engine bots to crawl and index, you could be losing ranking ground to your competitors. That’s where technical SEO comes in. And since we can’t fix what we don’t know is broken, the first step towards improving technical SEO is a comprehensive website audit.

The purpose of a technical SEO audit is to identify problems and potential areas for improvement that might be keeping your website from reaching its full ranking potential on major search engines, primarily Google.

Scope

First and foremost, there’s no point in optimizing pages that you don’t even want to rank on Google. That’s why we’ll be skipping parts that offer no SEO benefit, such as Terms of Service, Privacy Policy, and areas that require visitors to sign in to view the content.

Furthermore, since this isn’t a content audit, we won’t be analyzing every single page we want to rank, either. For sets of technically identical pages, such as blog posts, category archives, or products, we’ll analyze just one page from each set, to establish guidelines on the factors to consider when creating or optimizing any page of that type.

Areas Covered

We’ll be covering the following key areas in this audit:

  1. Discoverability
  2. Technology Stack
  3. User Experience
  4. Content
  5. Metadata

Let’s get into the nitty-gritty of each.

Note that GSC will refer to Google Search Console, which we’ll be using heavily throughout this audit.

1. Discoverability

In order to rank your website’s content, search engine crawlers must be able to find it. Furthermore, if they find the same content duplicated across multiple URLs, they may not index the version you actually want to rank.

Discoverability depends on several factors.

Domain & Protocol

To avoid duplicate content issues on your site, check whether it’s accessible both with and without the www prefix, and whether it’s served over both http and https. Google treats these domain and protocol variations as distinct websites, so if the other three combinations don’t all redirect to the domain and protocol you’ve chosen as primary, you have a problem.
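
As a quick sketch of how you might automate this check, here’s a Python snippet using the requests library (example.com is a placeholder for your own domain):

```python
# Fetch all four domain/protocol variants and report where each one ends up.
# All four should resolve, via 301 redirects, to the same canonical URL.
import requests

VARIANTS = [
    "http://example.com/",
    "http://www.example.com/",
    "https://example.com/",
    "https://www.example.com/",
]

for url in VARIANTS:
    resp = requests.get(url, allow_redirects=True, timeout=10)
    print(f"{url} -> {resp.url} ({resp.status_code})")
```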

Robots.txt

The robots.txt file tells search engines which parts of your site you want indexed, and which areas they should ignore. It’s up to search engine crawlers to honor it, but Google, Bing and all other reputable ones do.

Check if the robots.txt file is present, and if it properly allows and blocks relevant areas.

Analysis Tool: GSC
Reference: https://developers.google.com/search/reference/robots_txt
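
Besides GSC, you can test specific paths against your robots.txt from the command line. Here’s a minimal sketch using Python’s standard library (the domain and paths are placeholders):

```python
# Test whether robots.txt allows or blocks given paths for Googlebot.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser("https://example.com/robots.txt")
rp.read()

for path in ["/", "/blog/", "/wp-admin/", "/search?q=test"]:
    allowed = rp.can_fetch("Googlebot", f"https://example.com{path}")
    print(f"{path}: {'allowed' if allowed else 'blocked'}")
```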

XML Sitemaps

Just as an HTML site map gives your visitors an overview of the site’s entire structure and pages, XML sitemaps do the same for search engine crawlers.

Check whether all required XML sitemaps have been submitted to Google, and whether all the pages submitted through them have been indexed.

Analysis Tool: GSC
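
To cross-check GSC’s numbers, you can pull a sitemap yourself and count the URLs it declares. A minimal sketch, assuming a standard sitemap at a placeholder URL:

```python
# Fetch an XML sitemap and list the URLs it declares.
import requests
import xml.etree.ElementTree as ET

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

resp = requests.get("https://example.com/sitemap.xml", timeout=10)
root = ET.fromstring(resp.content)
urls = [loc.text for loc in root.findall(".//sm:loc", NS)]
print(f"{len(urls)} URLs declared in the sitemap")
for url in urls[:10]:  # preview the first few
    print(url)
```

Note that a sitemap index file nests further sitemaps instead of page URLs; this sketch only handles a plain urlset.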

Index

GSC lets you view and debug how your website is indexed. Check for indexing errors, and look for unexpected changes in the number of indexed pages, such as a steep decline or a sharp rise.

IP Canonicalization

If your site is accessible directly at its IP address, that too can create duplicate content issues. Check by visiting your website’s IP address: it should redirect to the domain, rather than load the site with the IP still showing in the address bar.
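
Here’s one way to script that check (example.com is a placeholder). Requesting the bare IP over http avoids the certificate mismatch you’d get over https:

```python
# Resolve the domain's IP, request it directly, and look for a 301 redirect
# back to the canonical domain.
import socket
import requests

ip = socket.gethostbyname("example.com")
resp = requests.get(f"http://{ip}/", allow_redirects=False, timeout=10)
print(ip, resp.status_code, resp.headers.get("Location", "(no redirect)"))
```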

Domain Canonicalization

Content accessible at multiple URLs on your website can again create duplicate content issues. Examples are links with query strings such as /store and /store?ref=newsletter, or /cars and /cars?sort=price. Check that all variants of these pages declare the primary page as canonical. You can also spot these while viewing the index in GSC: if you see different query-string variations of the same page in the index, you have a problem!
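
As a sketch, here’s how you might verify the canonical tag on a query-string variant with Python’s requests and BeautifulSoup libraries (the URL is a placeholder):

```python
# Fetch a query-string variant and print the canonical URL it declares.
import requests
from bs4 import BeautifulSoup

resp = requests.get("https://example.com/store?ref=newsletter", timeout=10)
soup = BeautifulSoup(resp.text, "html.parser")
tag = soup.find("link", rel="canonical")
print(tag["href"] if tag else "No canonical tag found!")
# Expected: https://example.com/store
```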

2. Technology Stack

Nobody likes websites with errors, server timeouts, slow pages, spam and malware infections, and security warnings. Search engines are no exception: they can demote your rankings, show warnings to visitors, or even remove your site from their index entirely if their crawlers consistently run into such issues.

DNS

Your DNS server is responsible for resolving your domain name into its IP address. The default DNS servers offered by many web hosts and domain registrars are often slow or overloaded, since they serve far too many requests at once, making them unsuitable for high-traffic websites.

Check whether DNS resolution is quick enough, especially during peak traffic hours. If it’s too slow, consider a third-party DNS service.

Analysis Tool: http://dnscheck.pingdom.com/
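
For a rough local measurement, you can also time a lookup yourself. Keep in mind that your OS and ISP may cache DNS records, so repeated runs can look unrealistically fast:

```python
# Time a single DNS resolution from this machine (example.com is a placeholder).
import socket
import time

start = time.perf_counter()
socket.gethostbyname("example.com")
elapsed_ms = (time.perf_counter() - start) * 1000
print(f"DNS lookup took {elapsed_ms:.1f} ms")
```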

Web Hosting

When it comes to hosting, you really do get what you pay for – no exceptions whatsoever. Check whether the website is hosted on a poor-quality host such as GoDaddy or any EIG-owned host (HostGator, BlueHost, iPage, Site5 and countless more). If it’s any of these, you’ve got a problem, even if everything seems fine so far. Also consider whether you need high-end managed hosting, or a self-managed VPS if you’ve got the technical chops for it.

Web Server

As the software responsible for receiving your visitors’ requests and responding with web pages, your server technology and the way it handles traffic can make a huge difference in availability and reliability. Check what server technology is being used, and whether better options are available (such as Nginx, a far superior alternative to Apache).

Application Server

For dynamic websites, the application software running at the backend is responsible for building pages and handing them over to the web server before they’re served to your audience. Check whether the application server (such as PHP) is on the latest stable version, and whether it’s taking full advantage of its latest performance and security features.
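
A single request can often reveal both the web server and the application server, since many default setups advertise them in response headers. A quick sketch (hardened servers frequently hide these headers, so empty values aren’t necessarily a problem):

```python
# Print what the stack advertises about itself in response headers.
import requests

resp = requests.head("https://example.com/", timeout=10)
for header in ("Server", "X-Powered-By"):
    print(f"{header}: {resp.headers.get(header, '(not disclosed)')}")
```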

Database Server

Application servers store and retrieve data from databases, so having a fast, secure and reliable database server is of utmost importance. Check whether the database server (such as MySQL) is on the latest stable version, has been tuned for performance, and has been load tested for scaling. Also check whether a drop-in alternative is available that might offer security and performance benefits (such as MariaDB or Percona Server in place of MySQL).

Security

Google has started penalizing pages in its search results that accept any user input (including a search box) if they’re not served over SSL. Furthermore, Google Chrome and Mozilla Firefox flag such pages as insecure in the address bar. Apart from that, lax security can result in malware and spam injection, which could make your rankings plummet. Test whether SSL is installed and properly hardened. Also check whether the site is listed on any blacklist due to malware or spam, and run a scan for malware and spam injection as well.

Analysis Tools: https://www.ssllabs.com/ssltest/ (SSL configuration), https://sitecheck.sucuri.net/ (malware and blacklist status)
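
For a quick look at the certificate itself, the Python standard library can show you the issuer, expiry date and negotiated TLS version; it complements, but doesn’t replace, a full scan like SSL Labs:

```python
# Connect over TLS and inspect the live certificate (hostname is a placeholder).
import socket
import ssl

hostname = "example.com"
ctx = ssl.create_default_context()
with socket.create_connection((hostname, 443), timeout=10) as sock:
    with ctx.wrap_socket(sock, server_hostname=hostname) as tls:
        cert = tls.getpeercert()
        print("TLS version:", tls.version())
print("Issuer: ", dict(pair[0] for pair in cert["issuer"]))
print("Expires:", cert["notAfter"])
```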

Cache

Not all user requests need to hit your web server, especially for static resources already cacheable in the browser. Similarly, not every server request needs the application server to compile and run the page’s code, if the exact same page was recently compiled for another visitor. And as you might guess, the application server needn’t query the database server when compiling a page if the exact same query was recently executed. Caching at each of these layers ensures that the resource-intensive components of your stack don’t do redundant work, and can therefore handle much higher traffic loads.

Test if caching has been implemented at browser, web server, application server and database levels. Also check what type of technology is being used for caching, to see if more efficient options are available.

There are several free tools out there to help test browser and web server cache. For information on the application server and database caching being used, you’ll need to contact the developer.
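
Browser-cache headers, at least, are easy to inspect yourself. A sketch that checks an HTML page and a static asset (both URLs are placeholders):

```python
# Print the caching-related response headers for a page and a static asset.
import requests

for url in ("https://example.com/", "https://example.com/assets/style.css"):
    print(url)
    resp = requests.get(url, timeout=10)
    for h in ("Cache-Control", "Expires", "ETag", "X-Cache", "Age"):
        if h in resp.headers:
            print(f"  {h}: {resp.headers[h]}")
```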

3. User Experience

Google can determine whether or not your visitors enjoy browsing your website. It can tell if they engage with its content or close it in frustration, making UX a key ranking factor. This is especially true for mobile visitors on small screens and potentially slow 3G connections. Offering users with disabilities a decent browsing experience also goes a long way.

Performance

How long your site takes to load can mean the difference between user engagement and a bounce. Check the time to first byte (TTFB). Check how long the page takes to load fully. Test multiple key pages to identify any slow ones. Check your scores on the major performance and speed testing websites. Check for broken links, and for HTML, CSS or JavaScript errors.

Analysis Tools: https://developers.google.com/speed/pagespeed/insights/, https://www.webpagetest.org/, https://gtmetrix.com/
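
For a quick TTFB reading without leaving the terminal, requests’ elapsed attribute measures time until the response headers arrive, which is a reasonable proxy (the URL is a placeholder):

```python
# Approximate TTFB and full download time for a single page.
import time
import requests

start = time.perf_counter()
resp = requests.get("https://example.com/", stream=True, timeout=30)
ttfb_ms = resp.elapsed.total_seconds() * 1000  # time until headers arrived
_ = resp.content  # drain the body to measure the full download
total_ms = (time.perf_counter() - start) * 1000
print(f"TTFB: {ttfb_ms:.0f} ms, full response: {total_ms:.0f} ms")
```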

User Interface

Are your users able to easily navigate your website and find the content they need? Check if all key sections are accessible from the home page. Also check if all important content within each section is accessible from the section page. Is there a quick and robust search feature available? Are any plugins (such as Flash or Java) required to view the site or any of its parts?

Mobile Experience

Is the website mobile responsive? Does it load fast enough on slow 3G? Are tap targets large and spaced properly to make them easy to tap and avoid unintentional taps?

Analysis Tool: https://search.google.com/test/mobile-friendly
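
One signal you can script yourself: a responsive page should declare a viewport meta tag, and its absence is a strong hint the site isn’t mobile-friendly. A minimal sketch (the URL is a placeholder):

```python
# Check for the viewport meta tag.
import requests
from bs4 import BeautifulSoup

resp = requests.get("https://example.com/", timeout=10)
soup = BeautifulSoup(resp.text, "html.parser")
viewport = soup.find("meta", attrs={"name": "viewport"})
print(viewport["content"] if viewport else "No viewport meta tag!")
# A typical value: width=device-width, initial-scale=1
```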

Accessibility

Is the website accessible to the visually impaired? Is a printer-friendly version available for those who can’t look at a screen for extended durations?

Analysis Tools: https://wave.webaim.org/, Lighthouse’s accessibility audit in Chrome DevTools

4. Content

Without unique, high-value and engaging content, even a perfectly tuned world-class dedicated server and highly tweaked CMS can’t help you rank. Therefore, laying sound technical foundations and policies for content can go a long way.

Titles

Are titles for all pages with ranking value properly optimized with keywords? Are they overoptimized, which could result in a penalty? Are titles short enough to display in full on search results pages?

URLs

Are URLs SEO-friendly? Do they follow a logical structural pattern according to the site’s content and hierarchy? Are URLs utilizing their full potential in terms of keywords without being overoptimized?

Descriptions

Do all pages with ranking value have meta descriptions either defined explicitly or set to be generated automatically by the CMS from their content? Are the descriptions of adequate length? Do they contain your keywords?
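
Title and description lengths from the last two sections are easy to check in one pass. The ~60 and ~160 character cutoffs below are common rules of thumb, not official Google limits, and the URL is a placeholder:

```python
# Extract a page's title and meta description and flag likely truncation.
import requests
from bs4 import BeautifulSoup

resp = requests.get("https://example.com/some-post/", timeout=10)
soup = BeautifulSoup(resp.text, "html.parser")

title = soup.title.string.strip() if soup.title and soup.title.string else ""
desc_tag = soup.find("meta", attrs={"name": "description"})
desc = desc_tag["content"].strip() if desc_tag and desc_tag.get("content") else ""

print(f"Title ({len(title)} chars): {title}")
print(f"Description ({len(desc)} chars): {desc}")
if not title or len(title) > 60:
    print("Warning: title missing or may get truncated in results")
if not desc or len(desc) > 160:
    print("Warning: description missing or may get truncated in results")
```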

Meta Keywords

After being abused for spam for years, the keywords meta tag is now ignored by most search engines, including Google. Furthermore, if Google detects keyword stuffing in it, it may even penalize you for using it, leaving no real reason to use it outside of very exceptional cases. Check if it’s present, and if it is, find out why.

Headings

Are heading tags present on each page? Do they represent the page’s content and hierarchy accurately? Is heading text relevant to the subject matter? Have headings been optimized for keywords properly? Are there any headings that are better off being regular text or links?
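
To eyeball a page’s heading hierarchy quickly, you can print its outline. A sketch (the URL is a placeholder):

```python
# Print an indented outline of all heading tags on a page.
import requests
from bs4 import BeautifulSoup

resp = requests.get("https://example.com/some-post/", timeout=10)
soup = BeautifulSoup(resp.text, "html.parser")

for tag in soup.find_all(["h1", "h2", "h3", "h4", "h5", "h6"]):
    level = int(tag.name[1])
    print("  " * (level - 1) + f"{tag.name}: {tag.get_text(strip=True)}")
```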

Text

Is the text-to-HTML ratio reasonable? Is the text unique and high-value? Is there too much fluff and filler text?
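
A rough text-to-HTML ratio takes a few lines to compute. There’s no official target figure, but a very low ratio often signals markup-heavy or thin pages (the URL is a placeholder):

```python
# Compute visible-text length as a percentage of raw page size.
import requests
from bs4 import BeautifulSoup

resp = requests.get("https://example.com/some-post/", timeout=10)
soup = BeautifulSoup(resp.text, "html.parser")
for tag in soup(["script", "style", "noscript"]):
    tag.decompose()  # exclude non-visible code from the text count

text_len = len(soup.get_text(separator=" ", strip=True))
print(f"Text-to-HTML ratio: {text_len / len(resp.text) * 100:.1f}%")
```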

Keywords

Has a proper keyword strategy been devised? Are the chosen keywords practical to rank for? Are there any obvious signs of excessive keyword stuffing? Are keywords present in titles, headings, descriptions and content?

Images

Do images have proper ALT text? Are image file names descriptive? Are images relevant and engaging?
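
Missing ALT text is simple to detect programmatically. A sketch (the URL is a placeholder):

```python
# List every image on a page that lacks alt text.
import requests
from bs4 import BeautifulSoup

resp = requests.get("https://example.com/some-post/", timeout=10)
soup = BeautifulSoup(resp.text, "html.parser")

for img in soup.find_all("img"):
    if not img.get("alt", "").strip():
        print("Missing alt:", img.get("src", "(no src)"))
```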

Links

Is link text properly optimized? Is it overoptimized? Are links descriptive enough and easy to identify? Are there any links that need to be set to nofollow? Has any black-hat link building been done?

Blog

Can the website and the business it represents benefit from having a blog to build authority and a loyal audience? Has such a blog been implemented? Is the blog getting regularly updated and utilized to its fullest?

5. Metadata

Having properly defined metadata on your site can increase your chances of appearing in rich snippets and cards on Google, giving you more exposure than regular search results offer. Furthermore, visually appealing previews can improve click-through rates, whether from search results or social media shares. Lastly, for local businesses with a physical presence, properly implemented Local Business data can vastly increase exposure to your target audience.

Structured Data

Does the site offer any content that can benefit from structured data? Has any structured data been implemented? What format has been used for the implementation? Has the website been properly linked to its social media accounts?

Analysis Tool: Google Search Console
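
Structured data is commonly implemented as JSON-LD embedded in a <script type="application/ld+json"> tag. As an illustration only, here’s a minimal Article snippet built with Python’s json module; every field value is a placeholder you’d replace with your page’s real data:

```python
# Generate a minimal JSON-LD Article object (all values are placeholders).
import json

article = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "How to perform a killer technical SEO audit",
    "author": {"@type": "Person", "name": "Jane Doe"},
    "datePublished": "2018-01-01",
    "image": "https://example.com/images/cover.jpg",
}
print(json.dumps(article, indent=2))
```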

Rich Cards

Can the website benefit from any of the available rich card implementations? Have any rich cards been implemented? Have any rich cards started to show up in results?

Analysis Tool: Google Search Console’s Rich Cards tool

AMP

Has an Accelerated Mobile Pages strategy been considered? Have any AMP pages been published? Are AMP pages showing up in search results? Has the traffic seen a rise, a fall or no change as a consequence? How about the revenue and profits? Rise, fall or no change?

Facebook Open Graph

Have Open Graph tags been incorporated? Are sharing previews for key pages showing up as expected on Facebook, complete with titles, images and descriptions?

Analysis Tool: https://developers.facebook.com/tools/debug/
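
Before reaching for the debugger, you can dump a page’s Open Graph tags locally to confirm the basics are in place (the URL is a placeholder):

```python
# Print all Open Graph meta tags found on a page.
import requests
from bs4 import BeautifulSoup

resp = requests.get("https://example.com/some-post/", timeout=10)
soup = BeautifulSoup(resp.text, "html.parser")

for tag in soup.find_all("meta", property=True):
    if tag["property"].startswith("og:"):
        print(f"{tag['property']}: {tag.get('content', '')}")
```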

Twitter Cards

Have Twitter Cards been implemented for key pages? Are the cards showing up in sharing previews for those pages on Twitter?

Analysis Tool: https://cards-dev.twitter.com/validator

Local Business Data

Is the business a local one with one or more physical locations? If yes, has a Google My Business page been set up for it, and have the relevant locations been added?

Reference: https://support.google.com/webmasters/answer/92319?hl=en


There you have it – following these steps should ensure that you’ve got all aspects of technical SEO covered.

UPDATE: My downloadable sample technical SEO audit report is now live. Grab it here!

What do you think of my killer technical SEO audit? Tried it out for your website? Got any questions or feedback? Let me know in the comments below!
