Regular SEO audits are imperative for monitoring the health of a website. Large websites may have tens or even hundreds of staff making regular changes to the code, content and user experience. It’s easy for any contributor to inadvertently cause SEO problems by not adhering to best practice.
In this guide, I’ll walk you through several key stages to the audit process, and outline some of the key tools at your disposal to help you and your team monitor the SEO-health of your sites.
17 key stages to a thorough site audit
I recently published an in-depth audit checklist on my blog, where I identified 17 areas of analysis that ought to be considered when carrying out an audit for a client or an employer. If you are completely new to SEO and you’re interested in learning how to audit a website from scratch, I’d recommend reading through the checkpoints in that post.
With this guide, I’d like to elaborate on each of these key stages so that it is clear how tools can be used to support your recommendations to the development and digital marketing teams.
Stage One – Google Checks
Ensuring that visitor behaviour is being tracked correctly on a website is an essential first step in the audit process, particularly if a site has not been audited for some time. Google Analytics, Google Tag Manager and Google Search Console are the critical triad of tools that you will need to test at this stage. However, even after installing the scripts for these tools, things can go wrong with how the data is collected and processed. So, here’s the first recommended tool to install.
Google Tag Assistant
This extension for Google Chrome is a useful tool for validating Google scripts on your site as well as diagnosing and troubleshooting any issues with the tags or how data is being processed. If you are trying to figure out why analytics data isn’t processing correctly, this tool will help you figure it out.
Stage Two – Market Research Checks
Competitor analysis is an essential component of the SEO audit process. If you don’t understand your top competitors’ activities or their approach to keyword targeting, you will continually scratch your head as to why you are being outranked. Tools such as ahrefs.com and SEMRush are well known in the industry, and they can help you to understand the authority of your domain in comparison to other competitors in your field. They can also help to highlight the tactics competitors might be using to win backlinks, the paid keywords they are bidding on, and a range of other insights.
However, at its most basic, SEO market research must involve keyword research and analysis. This brings us onto my second recommended tool.
Ubersuggest
Ubersuggest is a great tool for carrying out both keyword and market research on the fly. Generally, it is quick to return results and it adds additional layers of information that are unavailable using Google’s Keyword Planner. I like to use this tool at the audit stage to help me get an idea of how realistic a client’s keyword targeting is. The ‘SEO Difficulty’ and ‘Paid Difficulty’ metrics (as shown above) will help you here, together with average backlink count and the likely domain authority score required to rank in the top 10.
If your employer or client is relatively new to the market, doesn’t have very many backlinks and hasn’t invested heavily in great content and user experience, this will help to explain why they have yet to secure a top 10 result for one of their main keyword targets. If all of their keyword targets are highly competitive terms, it would be worth making the recommendation in your audit to look at optimising for less competitive, but high-intent, long-tail phrases.
Optimising for the long-tail is a good way for smaller businesses to start ranking for relevant terms, whilst building their expertise and authority with the aim of ranking for more competitive higher volume phrases, further down the line. A good quality audit should highlight the current health of the domain in the context of its key competitors, then outline a suggested strategy for improving keyword performance.
Stage Three – Crawling, Indexing and Ranking Checks
Google’s ability to crawl and index a website is fundamental to how it then ranks on its search results pages, so this stage of the audit is especially important. Here you will want to check if the site ranks well for brand terms and long-tail phrases. If not, why not? Have certain URLs been excluded from Google’s index and, if so, why? There are numerous tools you can use to help you answer questions like these. I’m going to provide a few tips around my favourite one.
Screaming Frog Spider Tool
Possibly the most well-known web crawler tool on the market (and, in my opinion, still one of the best), Screaming Frog’s Spider Tool enables you to quickly crawl key information on a website and identify any flaws or issues that need to be remedied. Screaming Frog is an advanced tool with immense capabilities, so I’m not going to document every feature of it here. Take a look at their user guide to help get you started. Instead, I will show you a quick tip for troubleshooting indexing issues.
Firstly, fire up Screaming Frog but, before starting the crawl of the site, click on ‘Spider Configuration’. Scroll to the bottom of this panel, click ‘Crawl Linked XML Sitemaps’, then check the box which states ‘Auto Discover XML sitemaps via robots.txt’ (or ‘Crawl these sitemaps’, then add the location of your XML sitemaps).
After clicking ‘OK’, head over to the Crawl Analysis Configuration and make sure the ‘Sitemaps’ box is checked. You might want to uncheck any boxes that are not relevant to your site, such as hreflang. Check the box that says ‘Auto-analyse at End of Crawl’, then hit ‘OK’.
Now you can start the crawl of your website. When it completes, hover over the drop-down button shown in the screengrab below and select ‘Sitemaps’.
From here you will be able to easily troubleshoot crawling and indexing issues relating to URLs on your site. Perhaps the URL that is ranking poorly, or not ranking at all, is not in your XML sitemap? Perhaps it is in the sitemap, but has the ‘noindex’ directive applied to it? Or perhaps it’s in the sitemap but is not internally linked to from anywhere within the website itself, making it an orphan URL?
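If you prefer to script a first pass at these checks, the three scenarios above can be tested programmatically. Below is a minimal sketch using only Python’s standard library; the `diagnose` function, the example URLs and the sample HTML are all hypothetical, and it assumes you have already extracted your sitemap URL list and internal link data (for example, from a crawl export):

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the content of any <meta name="robots"> tags on a page."""
    def __init__(self):
        super().__init__()
        self.robots = []
    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            self.robots.append(a.get("content", "").lower())

def diagnose(url, page_html, sitemap_urls, internally_linked_urls):
    """Return a list of likely indexing problems for a URL."""
    issues = []
    if url not in sitemap_urls:
        issues.append("missing from XML sitemap")
    parser = RobotsMetaParser()
    parser.feed(page_html)
    if any("noindex" in content for content in parser.robots):
        issues.append("noindex directive applied")
    if url not in internally_linked_urls:
        issues.append("orphan URL (no internal links)")
    return issues

# Hypothetical example data
html_doc = '<html><head><meta name="robots" content="noindex,follow"></head></html>'
print(diagnose("https://example.com/page", html_doc,
               {"https://example.com/"}, set()))
```

This won’t replace a full crawl, but it is handy for spot-checking a handful of problem URLs without re-running Screaming Frog.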
This is just one example of the many features that Screaming Frog’s tool provides, to help you diagnose crawling and indexing issues.
Stage Four – Crawler Errors
Larger websites are often prone to high volumes of 404 (‘not found’) errors and internal 301 redirects, which waste crawl budget and impact website performance. Screaming Frog’s Spider Tool can also help you to identify and fix these issues; however, sometimes it is useful to understand how frequently search engines are visiting 404 status URLs and unnecessary redirects. To understand this better, you will need to carry out log file analysis.
Screaming Frog Log File Analyser
Here I should point out that I am not an employee of Screaming Frog nor am I being paid to sponsor their products! I am simply a fan of their tools and their Log File Analyser is probably the simplest but most effective way to get started with log analysis, if you have never tried it before.
To get started, you’ll need to request your log files from your web host or server administrator. Depending on the size of your website and the amount of traffic it receives, these raw files can be huge. So, you may need to perform this analysis on a computer with plenty of processing power.
Within the Log File Analyser, you can then simply import the file to a new project and let Screaming Frog do the rest. Again, you’ll want to check the user guide before you get cracking on this. When the results are in, it is easy to see how much of an issue 4xx and 3xx status URLs are for your web property. Simply click on the ‘Response Codes’ tab then sort by the relevant HTTP status code column. In the example below, you can see that Google is frequently crawling many expired products on an eCommerce site. One possible solution to this might be to return a 410 “gone” status for these URLs, to indicate that they are gone and not coming back.
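If you want a quick first pass before importing everything into the Log File Analyser, a short script can tally how often crawlers are hitting error and redirect URLs. A sketch, assuming your logs are in the common Apache combined format; the regex, the sample lines and the Googlebot IP are illustrative assumptions about your server’s configuration:

```python
import re
from collections import Counter

# Matches the request path and status code in a common/combined format log line
LOG_PATTERN = re.compile(r'"(?:GET|POST|HEAD) (?P<path>\S+) [^"]*" (?P<status>\d{3})')

def error_hits(log_lines):
    """Count hits per (status, path) pair for 3xx/4xx responses."""
    hits = Counter()
    for line in log_lines:
        m = LOG_PATTERN.search(line)
        if m and m.group("status")[0] in ("3", "4"):
            hits[(m.group("status"), m.group("path"))] += 1
    return hits

# Hypothetical sample log lines
sample = [
    '66.249.66.1 - - [01/Jan/2024:00:00:01 +0000] "GET /old-product HTTP/1.1" 404 512 "-" "Googlebot"',
    '66.249.66.1 - - [01/Jan/2024:00:00:02 +0000] "GET /old-product HTTP/1.1" 404 512 "-" "Googlebot"',
    '66.249.66.1 - - [01/Jan/2024:00:00:03 +0000] "GET /sale HTTP/1.1" 301 0 "-" "Googlebot"',
]
for (status, path), count in error_hits(sample).most_common():
    print(status, path, count)
```

In practice you would also filter by user agent (and verify crawler IPs) so that the counts reflect search engine activity rather than all traffic.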
Stage Five – Page Rendering Checks
Google URL Inspection Tool
To use this tool, you’ll need to log into your Google Search Console account. From there it is simply a case of copying and pasting the URL you want to check for rendering issues into the grey bar at the top of the page. Then, once Google has processed the results, head over to the “tested page” section on the right-hand side and click ‘screenshot’ (note, you may need to process a crawl request to see this). Does the screenshot look the same as it does for users of your site? If not, click on the ‘more info’ tab to find out possible reasons why it does not. From here you can troubleshoot issues in the HTTP response, blocked resources and problems relating to CSS & JS.
Stage Six – Essential Technical SEO Checks
Technical SEO is a big subject and it’s not possible to cover every aspect of the auditing process in one post. However, essential checks you will want to carry out include: ensuring that there are no canonicalization issues with the site (for example, both the www and non-www versions of the site being accessible to crawlers); ensuring that canonical tags are configured correctly across the site and that variations of the same page have a canonical tag pointing to the definitive version; and ensuring there are no redirect chains or issues within .htaccess. Screaming Frog, or any good quality crawler tool, will help you to diagnose these issues, but you may need to dig deeper into the cause of the problems. This might involve delving into the .htaccess file to find out what’s going on.
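Redirect chains are also easy to surface with a small script once you have a map of redirect sources and targets (distilled from a crawl export or from your .htaccess rules). A minimal sketch; the `find_redirect_chains` helper and the example URLs are hypothetical:

```python
def find_redirect_chains(redirects, max_hops=10):
    """Given a mapping of source URL -> redirect target, return any
    chains of more than one hop (stopping if a loop is detected)."""
    chains = []
    for start in redirects:
        path = [start]
        current = start
        while current in redirects and len(path) <= max_hops:
            current = redirects[current]
            if current in path:  # redirect loop detected
                path.append(current)
                break
            path.append(current)
        if len(path) > 2:        # start -> A -> B is more than one hop
            chains.append(path)
    return chains

# Hypothetical redirect map: HTTP -> HTTPS -> renamed URL -> final destination
redirects = {
    "http://example.com/a": "https://example.com/a",
    "https://example.com/a": "https://example.com/b",
    "https://example.com/b": "https://example.com/final",
}
for chain in find_redirect_chains(redirects):
    print(" -> ".join(chain))
```

Each chain printed is a candidate for collapsing into a single 301 straight to the final destination.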
Made with Love Htaccess Tester
If you are trying to figure out problems relating to 301 redirects or why rules relating to Expires Headers are not being respected, you might be able to figure it out using htaccess tester. Simply copy and paste the code within your .htaccess file into the tool, along with your site’s URL, and click ‘test’. The tool will then list any issues where rules were not met or where redirects could not be followed.
Stage Seven – Website Performance Checks
Page load times have been a ranking factor in Google’s algorithms for several years now, but speed has taken on increasing importance in the mobile-first era. Google does not want to refer its users to sites that feature unnecessarily heavy files that must be downloaded on access to the page, sapping the end user’s data allowance. In addition, the coding of a site’s pages needs to be configured so that essential content is loaded first. There are many tools you can use to diagnose issues at this stage; one of my favourites is GTmetrix.
This is not just a great tool, it’s a great resource for helping you to understand the technical issues that are causing website performance issues.
Simply enter the URL you want to test then you will see a scorecard like the one above. Underneath the scorecard, you can then investigate the issues identified. If you don’t understand what the issue means, simply click on it then press the ‘what’s this mean?’ button. You can then follow useful links through to pages which explain the issue and how you can go about fixing it.
Stage Eight – Website Security Checks
Having a robustly secure website is an essential requirement for compliance purposes, to help fight cybercrime and to reassure Google and other webmasters that you are safe and worthy of being linked to. A hacked website can have a disastrous impact on organic search visibility because Google and other search engines do not want to refer their users to a web property that is riddled with malware. If you have failed to take the steps to prevent your site from getting hacked, Google may have less confidence in your ability to keep it safe in the future. Therefore, regular website security testing should be part of your audit process.
Pen Test Tools
When working with a new client, I like to run a check on their site using this quick free pen testing tool. It can help to diagnose obvious security flaws that have been overlooked. Here are the results for my own website.
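Alongside a pen test, it is worth confirming that common security response headers are in place. This sketch inspects a dictionary of response headers, such as any HTTP client would return for your homepage; the header list reflects widely recommended practice rather than an exhaustive standard, and the example response is hypothetical:

```python
# Response headers commonly recommended for a baseline security posture
EXPECTED_HEADERS = {
    "Strict-Transport-Security": "enforces HTTPS (HSTS)",
    "X-Content-Type-Options": "prevents MIME-type sniffing",
    "X-Frame-Options": "mitigates clickjacking",
    "Content-Security-Policy": "restricts where resources can load from",
}

def missing_security_headers(response_headers):
    """Return the expected security headers absent from a response."""
    present = {name.lower() for name in response_headers}
    return {name: why for name, why in EXPECTED_HEADERS.items()
            if name.lower() not in present}

# Hypothetical response headers from a request to the site being audited
headers = {"Content-Type": "text/html", "X-Frame-Options": "SAMEORIGIN"}
for name, why in missing_security_headers(headers).items():
    print(f"Missing {name} ({why})")
```

Any headers reported as missing can be flagged to the development team as quick wins in the audit.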
Stage Nine – Site Architecture Checks
The way a website is built, and the way in which pages within a site link to one another, is a strong determinant of how those pages will rank on search results pages. Flat-structured sites with logical URL hierarchies and consistent vertical linking tend to perform better than sites with complex layers of information and pages that are not reachable within a few clicks of the homepage. At this stage of the audit, you will want to refer back to the crawl results from Screaming Frog’s Spider Tool (or your preferred web crawler) to check for issues such as buried URLs or an excessive volume of content located in the root folder.
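“Buried” can be made measurable: a URL’s click depth is simply its shortest path from the homepage through the internal link graph, which a breadth-first search computes. A sketch, assuming you can export an internal link graph from your crawler; the example graph and the depth threshold of three clicks are illustrative:

```python
from collections import deque

def click_depths(link_graph, homepage):
    """Breadth-first search from the homepage over the internal link
    graph, returning each reachable URL's click depth."""
    depths = {homepage: 0}
    queue = deque([homepage])
    while queue:
        url = queue.popleft()
        for target in link_graph.get(url, []):
            if target not in depths:       # first visit = shortest path
                depths[target] = depths[url] + 1
                queue.append(target)
    return depths

# Hypothetical internal link graph from a crawl export
graph = {
    "/": ["/category", "/about"],
    "/category": ["/product-1"],
    "/product-1": ["/deep-page"],
}
depths = click_depths(graph, "/")
buried = sorted(url for url, depth in depths.items() if depth > 3)
print(depths)
```

URLs that never appear in `depths` at all are unreachable from the homepage, which points back to the orphan-URL problem discussed earlier.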
To support your findings in terms of site architecture, the PowerMapper tool is useful and free. You can visualise how the structure of your site looks in a variety of formats including electrum, cloud and tree, as shown in the screengrab below. Simply enter your domain name into the tool and it will output the rest.
Stage Ten – Structured Data Checks
Structured data is code that sits in the background of your website and helps search engines to better understand the meaning and context of your web pages. When applied correctly it can help to improve the visual appearance of your site’s search results, improving the click-through rate and, in turn, organic traffic referral. At this stage of the audit, you will want to ensure that existing structured data has been applied to the site correctly, and you will want to assess whether additional markup could be applied to improve how search engines parse data regarding your web pages.
Structured Data Testing Tool
At this stage of the audit, Google’s Structured Data Testing Tool will be your main tool. Simply fire the tool up, then add in the URL that you want to check for structured data issues. If you spot any errors, these will need to be flagged up in your audit. Errors are important to fix as Google may be less likely to show an enhanced SERP result whilst they are present. For example, if you have marked up your reviews but not specified the best and worst rating fields, you will be unlikely to achieve a star rating associated with your URL in Google’s search results. Ultimately, you want to ensure that there are no errors or warnings across any of your site’s pages, as shown in the example screengrab below.
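The review-markup example above can also be spot-checked in code by pulling JSON-LD blocks out of a page and inspecting them. This is a simplified sketch (real-world markup uses `@graph`, nested objects and other shapes this doesn’t handle, so treat Google’s tool as the source of truth); the helper names and sample markup are hypothetical:

```python
import json
from html.parser import HTMLParser

class JSONLDExtractor(HTMLParser):
    """Collects parsed <script type="application/ld+json"> blocks."""
    def __init__(self):
        super().__init__()
        self.in_jsonld = False
        self.buffer = []
        self.blocks = []
    def handle_starttag(self, tag, attrs):
        if tag == "script" and dict(attrs).get("type") == "application/ld+json":
            self.in_jsonld = True
            self.buffer = []
    def handle_data(self, data):
        if self.in_jsonld:
            self.buffer.append(data)
    def handle_endtag(self, tag):
        if tag == "script" and self.in_jsonld:
            self.blocks.append(json.loads("".join(self.buffer)))
            self.in_jsonld = False

def rating_issues(page_html):
    """Flag top-level aggregateRating objects missing best/worst rating."""
    extractor = JSONLDExtractor()
    extractor.feed(page_html)
    issues = []
    for block in extractor.blocks:
        rating = block.get("aggregateRating", {})
        if rating:
            for field in ("bestRating", "worstRating"):
                if field not in rating:
                    issues.append(f"aggregateRating missing {field}")
    return issues

# Hypothetical product page markup with the incomplete review fields
html_doc = """<script type="application/ld+json">
{"@type": "Product", "name": "Widget",
 "aggregateRating": {"ratingValue": "4.4", "reviewCount": "89"}}
</script>"""
print(rating_issues(html_doc))
```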
However, auditing structured data at scale, for very large websites, will require more advanced assessment tactics. Again, your favourite crawler tool should be able to assist you on this front. Screaming Frog, for example, provides a guide to validating structured data here.
Stage Eleven – Consistency & Compliance Checks
Inconsistent business details within a website and in citations across the web can diminish the trust that Google might place in a web property. Trust signals around a website can be heightened by ensuring that all business information is complete, marked up using structured data and displayed consistently across all web pages. Trust can also be heightened by having uniquely written, GDPR-compliant Terms & Conditions and Cookies pages.
There are many ways to assess whether your site is GDPR compliant, but an excellent tool to use at this stage is Cookiebot. Again, all you need to do is enter the URL and allow the software to go to work.
Stage Twelve – On-Page SEO Checks
Regular audits of on-page SEO are crucial on websites where content is frequently updated or where the structure of the code may have changed for design purposes. It’s also easy for non-technical staff to make incorrect changes to web copy, perhaps by including multiple H1 tags within a single document. On-page SEO is another big topic, so I won’t cover every aspect of it here. Head over to my checklist mentioned earlier, or take a look at this handy infographic by Backlinko.
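The multiple-H1 mistake mentioned above is exactly the kind of thing a small script can catch across a batch of pages. A minimal sketch using Python’s standard library; the `OnPageAuditor` class, the 60-character title guideline and the sample HTML are assumptions for illustration:

```python
from html.parser import HTMLParser

class OnPageAuditor(HTMLParser):
    """Counts H1 tags and captures the <title> text of a page."""
    def __init__(self):
        super().__init__()
        self.h1_count = 0
        self.in_title = False
        self.title = ""
    def handle_starttag(self, tag, attrs):
        if tag == "h1":
            self.h1_count += 1
        elif tag == "title":
            self.in_title = True
    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False
    def handle_data(self, data):
        if self.in_title:
            self.title += data

def on_page_issues(page_html, max_title_length=60):
    """Return basic on-page problems found in a page's HTML."""
    auditor = OnPageAuditor()
    auditor.feed(page_html)
    issues = []
    if auditor.h1_count != 1:
        issues.append(f"expected 1 H1 tag, found {auditor.h1_count}")
    if len(auditor.title) > max_title_length:
        issues.append("title tag exceeds recommended length")
    return issues

# Hypothetical page exhibiting the duplicate-H1 mistake
html_doc = "<title>Home</title><h1>Welcome</h1><h1>Our Products</h1>"
print(on_page_issues(html_doc))
```

Run over a full crawl export, checks like this give you a page-by-page list to hand to content editors.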
There are myriad tools available to help you assess the quality of your on-page SEO, but this one by SEOptimer is free and outputs the results in a useful format. You can see below that my social skills really let me down.
Stage Thirteen – Website Accessibility Checks
If your website fails certain W3C compliance checks, it’s unlikely that this will cause any ranking issues on Google. However, it’s good practice to fix any issues identified so that your site renders properly and is accessible to all users. At this stage of the audit, you will want to test key landing pages and report back any issues to your development team.
W3C Markup Validation Service
These tests can be carried out using the W3C Markup Validation Service. Simply enter the URL into the interface and click ‘check’. You can then document the issues pertaining to that URL which need to be fixed.
Stage Fourteen – User Experience Checks
For websites to perform well on search engine results pages, it is no longer enough to have a technically proficient set-up, good on-page SEO and strong backlinks. Today, sites must demonstrate expertise and a high level of alignment with the intent of the search engine user. Getting the click is one thing, but how users then interact with your site when they land on it is of equal importance. At this stage of the audit, you will want to review any heat maps or visitor recordings that you might have access to via software such as Hotjar. If you don’t have access to anything like that, you will want to suggest to your client or employer that they get it installed. In the meantime, tools such as mobiReady will help you to assess how the content of the site looks for users on different devices.
I like the visual output of this tool as it can quickly help you to identify potential usability issues on different device types. Are there any call to action banners or above-the-fold advertising taking up too much space for users on smaller mobile devices? Does the text display awkwardly on certain types of phones? Is the navigation scheme intuitive for both desktop and mobile users? This tool is handy for visualising and then documenting these types of issues.
Stage Fifteen – International SEO Checks
If your business serves customers in multiple regions around the globe and your website traffic derives from people located in many of these regions, you will need to regularly audit how well your site is set up to target these users. Part of this process will involve serving region- and language-specific versions of pages, marked up using hreflang. Regular auditing of the hreflang code on your site is necessary to ensure that there are no issues such as missing tags or missing x-default attributes. Whilst your preferred crawler tool can help to diagnose these issues, I also like to run checks using a free tool from technicalseo.com.
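The two issues named above (missing tags and missing x-default) can be spot-checked in code for any individual page. A sketch that only inspects `<link rel="alternate" hreflang="...">` tags in the HTML head (hreflang can also be declared in XML sitemaps or HTTP headers, which this ignores); the helper name and example URLs are hypothetical:

```python
from html.parser import HTMLParser

class HreflangParser(HTMLParser):
    """Collects hreflang annotations from <link rel="alternate"> tags."""
    def __init__(self):
        super().__init__()
        self.alternates = {}  # hreflang value -> href
    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "alternate" and "hreflang" in a:
            self.alternates[a["hreflang"]] = a.get("href")

def hreflang_issues(page_url, page_html):
    """Return hreflang problems visible in a single page's HTML."""
    parser = HreflangParser()
    parser.feed(page_html)
    issues = []
    if not parser.alternates:
        return ["no hreflang tags found"]
    if "x-default" not in parser.alternates:
        issues.append("missing x-default attribute")
    if page_url not in parser.alternates.values():
        issues.append("page does not reference itself in its hreflang set")
    return issues

# Hypothetical UK page missing its x-default annotation
html_doc = """
<link rel="alternate" hreflang="en-gb" href="https://example.com/uk/">
<link rel="alternate" hreflang="en-us" href="https://example.com/us/">
"""
print(hreflang_issues("https://example.com/uk/", html_doc))
```

A fuller audit would also confirm that every alternate URL links back (return tags), which is a common failure a crawler will surface.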
Hreflang Tags Testing Tool
This testing tool is particularly beneficial if you are trying to figure out why you are having ranking issues for a certain web page in a certain country. It provides a very clear visual display of the issues pertaining to a specific URL.
Stage Sixteen – Local SEO Checks
In contrast to international SEO, if your business is focused on providing products or services to people within a certain vicinity, you will want to carry out regular local SEO audits. Optimising your website for Google’s local pack requires a different set of tactics and activities. I won’t document all of those here; take a look at Andrew Shotland’s website for further information on this area. However, one important aspect of local SEO management is ensuring that citations of your business are consistent across the web. A regular audit of your citations is, therefore, an important element of on-going local SEO auditing.
Moz Local Check
A handy free tool to use at this stage of your audit is Moz’s Local Check. This will help you to quickly assess if you have the important citations in place for each of your key business locations. Make sure you run this check for every physical shop or office location pertaining to your business.
Stage Seventeen – Off-Page SEO Checks
Last, but certainly not least, a thorough audit of a website’s health is not complete without an analysis of its backlinks. Have you secured any fresh backlinks recently? If so, what is the quality of those links like? Are sites linking to you using natural-looking anchor text, or does it seem as though there has been a concerted effort to build keyword-heavy backlinks in an automated fashion? How do key competitors fare in terms of their backlinks? Do they have links from relevant industry and media sites? These questions and more can be answered via off-page auditing analysis. A great tool for this is ahrefs.com.
Ahrefs.com wins for me, as a backlink analysis tool, based on its reasonable pricing and processing power. It crawls billions of URLs daily and has over 16 trillion links in its index. It is usually quite fast to identify and pick up new links that are pointing to your domain. Keep a regular eye on your site using this tool and look out for any suspicious trends such as a sudden upward spike in referring domains (which could be legitimate or evidence of possible “negative SEO”). Your site audits should include advice for your client or employer on any low-quality links that might need to be disavowed.
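Spotting the kind of suspicious spike described above is straightforward if you export referring-domain counts over time from your backlink tool. A minimal sketch; the weekly figures, the `spike_alerts` helper and the 50% week-on-week threshold are all illustrative assumptions:

```python
def spike_alerts(weekly_referring_domains, threshold=0.5):
    """Flag weeks where the referring-domain count jumped by more than
    the given fraction over the previous week."""
    alerts = []
    for prev, curr in zip(weekly_referring_domains, weekly_referring_domains[1:]):
        prev_count = prev[1]
        growth = (curr[1] - prev_count) / prev_count if prev_count else float("inf")
        if growth > threshold:
            alerts.append((curr[0], round(growth, 2)))
    return alerts

# Hypothetical weekly referring-domain counts exported from a backlink tool
weeks = [("W1", 120), ("W2", 125), ("W3", 131), ("W4", 410)]
print(spike_alerts(weeks))
```

A flagged week isn’t proof of anything on its own; it is simply a prompt to open the new referring domains and judge whether they are legitimate coverage or link spam worth disavowing.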
That concludes my guide to the seventeen stages of SEO auditing and the tools you can use to help improve your reporting. Did I miss any useful tools that you use in your audits? Feel free to get in touch with me with your feedback.