Deciphering SEO Audits: Your Roadmap to the Top of the Search Results

Discover how to actually prioritize all the findings you get from an SEO audit tool.

See your rankings pop by efficiently prioritizing and actioning SEO audit findings.

SEO audits are great, but on their own, as a list of findings, they are not that useful.

Especially when they are not considered within the context of what your website is, who it serves and how it is used.


Furthermore, many SEO tools report some outdated findings and, from my 26+ years of SEO experience, don’t weight issues in the correct priority.

In this article I dig deep into the different findings, explain what they mean, and prioritize them in order of significance and difficulty.

I aim to cut through the noise and give you the real priorities for SEO Audit findings so you can apply them to your audit findings and then action the outcome.

Action is the keyword.

An SEO audit on its own is not that useful if you have no method for prioritizing and then actioning the findings.

If you have an update, edit or comment on any element of this article, please reach out; if you have something valuable to share, I will credit you in this article.

I have grouped the findings together into:

  • Errors
  • Warnings
  • Notices
  • Forget about it

The two main things to consider are:

  1. Severity / Impactfulness
  2. Difficulty

If it’s easy to implement and has a significant impact it should be prioritized higher.

Conversely, if something is very difficult, affects only one page and doesn’t have much impact then that should be at the bottom of the pile.

Also note that some error categories are either the result or the cause of other error categories; e.g., broken internal links and 404 error pages usually go hand in hand.

Let’s dive in!

Errors 

Errors are exactly that – something that is not working as it should.

Finding and fixing errors should be a top priority.

Find and fix errors

Given that site-wide quality signals exist, it makes sense to fix any and all true errors that exist on your site.

SIDE BAR: You can crank out all the new content you like, but if your legacy site sucks, it’s dragging down your best efforts. This is where audits, and more specifically content audits, can really shine.

10/10

Fix the issues in this category first!

4XX status code

Category: Technical SEO

Severity: Major

Difficulty: Easy

Score: 10/10

Details: The 4XX error code series is typically the 404 page not found error, although 403 Forbidden is also a common one. 410 Gone (permanently removed) is less common but is appropriate in some situations.

Any error like a 404 URL not found should be rectified as a high priority.

404 Error - Page not found

Any errors on your site should be high-priority fixes. 404 errors could have links pointing to them that are being wasted; they are also a brick wall to crawlers and waste resources.

There is a caveat to this one: many SEO audit tools report any 400-series response code as a 4XX error, and not all are true errors.

For example, a 401 response means unauthorized – it’s technically an “error” but not really if a followed link goes to a login page, for example. Similarly, a 403 (relatively common) means forbidden. Not necessarily an error.

Client Server 200 vs 404 response code

Many SEO audit tools also trigger a 429 “Too Many Requests” response code. This is due to how the server is set up and the frequency at which the SEO audit tool crawls the site. A bunch of 4XX errors can result from the crawl itself, and these are typically false positives.
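If you want to verify a tool’s 4XX findings yourself, here is a minimal sketch in Python using the requests library (an assumption on my part – any HTTP client will do; the URLs are placeholders). It checks the raw status code of each URL and throttles itself so the check does not generate the 429 false positives described above:

import time
import requests

urls = ["https://example.com/", "https://example.com/old-page"]  # placeholder list from your crawl

for url in urls:
    # some servers mishandle HEAD requests; switch to requests.get if results look odd
    resp = requests.head(url, allow_redirects=False, timeout=10)
    if 400 <= resp.status_code < 500:
        print(url, resp.status_code)
    time.sleep(1)  # throttle so the audit itself doesn't trigger 429 Too Many Requests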




9/10

Broken Internal Links

Category: Linking

Severity: Major

Difficulty: Easy

Score: 9/10

Details: Broken internal links are highly related to 404 errors: when a 404 suddenly occurs (perhaps a page was deleted or renamed), there is likely also a broken link (or several) pointing to that 404 page.

Since these are errors and they are in your full control they should be rectified as a high priority.

Turn broken links into quick wins

Broken internal links happen naturally which is why it is something you should be regularly checking for. 

I like to use a Chrome plugin called “Check My Links” to highlight broken links on a page.

I also like to use Kristina Azarenko’s SEO Pro Chrome extension, which also has some useful features for tech SEO audits.
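If you prefer to script this check, here is a rough sketch (Python with the requests and beautifulsoup4 packages – both assumptions; the page URL is a placeholder) that extracts the internal links from a page and reports any that return an error status:

import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin, urlparse

page = "https://example.com/some-page"  # placeholder
soup = BeautifulSoup(requests.get(page, timeout=10).text, "html.parser")

for a in soup.find_all("a", href=True):
    link = urljoin(page, a["href"])  # resolve relative links
    if urlparse(link).netloc != urlparse(page).netloc:
        continue  # internal links only
    status = requests.head(link, allow_redirects=True, timeout=10).status_code
    if status >= 400:
        print(link, status)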




8/10

Broken External Links

Category: Linking

Severity: Medium

Difficulty: Easy

Score: 8/10

Details: Having broken external links is a quality issue; it leads to a poor user experience and affects link flow.

Finding and fixing broken external links is a fairly high priority and significant fix.

Broken Link

I like to use the Check My Links Chrome plugin. You can also inspect or view source and use Ctrl + F (or Cmd + F) to find the broken link.

If there is no obvious good alternative then simply remove the link.




No redirect or canonical to HTTPS homepage from HTTP version

Category: Technical SEO

Severity: Major

Difficulty: Medium

Score: 8/10

Details: All websites should have correct canonicalization in place, where the HTTP version 301 redirects to the HTTPS version, and with the www version or the non-www version redirecting to the other, depending on which way round the preference is. My general preference is to redirect the www version to the non-www version.

So it should go:

http://www. -> https://www. -> https://

If you don’t have this 301 redirect sequence set up and you also do not have the correct canonical URLs in place, you are causing significant canonicalization issues and effectively duplicate content.

There is an argument for the shortest possible redirect path, which would have http://www. redirecting straight to https://. That is also fine – as long as you have an enforced, consistent rule set in place to deal with this issue, it is all good.

Pro Tip: You should have a regular check for these kinds of things, as things can and do change – I have very recent experience of this being necessary, and it likely saved a client significant revenue.
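To see the actual redirect path your homepage takes, a quick sketch like this (Python with requests; the domain is a placeholder) prints every hop – the goal is a short chain of 301s ending in a single 200 on the canonical HTTPS URL:

import requests

resp = requests.get("http://www.example.com/", allow_redirects=True, timeout=10)
for hop in resp.history:  # every intermediate response in the redirect chain
    print(hop.status_code, hop.url, "->", hop.headers.get("Location"))
print(resp.status_code, resp.url)  # should be 200 on your canonical HTTPS version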




Home page does not use HTTPS encryption

Category: Technical

Severity: Major

Difficulty: Easy

Score: 8/10

Details: The HTTPS protocol is almost universally used, however many websites are still accessible via an HTTP (no S) request. It is imperative that you provide an HTTPS version of every URL on your site.

There are services such as Let’s Encrypt that make this process straightforward for you. Also, many hosts provide easy ways to do this. Even if you don’t have those options, ask a friendly web developer to help you out.

http://www (no)
httpS://www (yes)




Redirect Chains and Loops

Category: Linking

Severity: Major

Difficulty: Medium

Score: 8/10

Details: Redirect chains and loops represent a significant problem for crawlability and your ability to get indexed.

You will hear a lot of talk about crawl budget with respect to these types of issues too, however for the most part, crawl budget is not something most website managers need to worry about. It’s only very large sites that need to consider their crawl budget.

SIDE BAR: I once audited a site that had a crawl-delay declaration in robots.txt that meant it would take 127 days for a search engine that respected robots.txt to crawl the site (needless to say, that was not optimal for new content discovery).

Redirect Loop or Chain

A redirect chain is where there is a sequence of several URLs that redirect to get to the final URL. 

Note that Google’s crawler will quit following redirects after 7 hops.

This might seem like a high enough number that you don’t need to worry, but it’s not that uncommon for a site migration and/or site redesign to lead to a few hops, and if certain conditions are present this can multiply the number of redirect hops – you can get to 7 hops quickly.

This is why it’s important for an audit to look at these types of issues, especially as they are not visible to the end user, who likely ends up at the correct place and is unaware that the redirect chain exists.

Redirect loops are where a URL that redirects leads back to another URL in the redirection path. This quickly leads to an infinite loop and will cause Google to quit crawling your site. It can also end up in an internal server 5XX error and/or server resources being consumed, generally leading to a bad experience for both users and search engines.

Neither of these conditions is good, and they should be rectified as a high priority.
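To hunt for chains and loops yourself, a sketch like this (Python with requests; URL is a placeholder; the 7-hop cutoff mirrors the Google behavior mentioned above) follows redirects one hop at a time instead of automatically, so it can report exactly where a chain or loop sits:

import requests
from urllib.parse import urljoin

def trace_redirects(url, max_hops=7):  # Google quits following after roughly this many hops
    seen = []
    while len(seen) <= max_hops:
        if url in seen:
            print("Redirect LOOP:", " -> ".join(seen + [url]))
            return
        seen.append(url)
        resp = requests.get(url, allow_redirects=False, timeout=10)
        if resp.status_code not in (301, 302, 303, 307, 308):
            print(len(seen) - 1, "hop(s):", " -> ".join(seen), "final status", resp.status_code)
            return
        url = urljoin(url, resp.headers["Location"])  # Location may be relative
    print("Chain exceeds", max_hops, "hops:", " -> ".join(seen))

trace_redirects("http://www.example.com/old-page")  # placeholder URL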




7/10

WWW resolve issue

Category: Technical SEO

Severity: Major

Difficulty: Medium

Score: 7/10

Details: The www resolve issue is related to canonicalization. If you don’t have a clear path for where the www version of your site goes, it can cause several issues.

It’s OK for www to redirect to non-www or vice versa; however, there needs to be a clear rule set in place, and it needs to be consistent across your site.

Having the following different versions of your URL all separately available and resolving to a 200 response code will lead to content duplication and canonicalization issues.

  • http://domain.com
  • https://domain.com
  • http://www.domain.com
  • https://www.domain.com

All except one version should return a 301 response code, and all should lead directly to the final 200 response code URL. This is your correct canonical URL. It is important that you set it and that Google also reflects that. You can use Google Search Console (set up all variants as separate properties) to check this.
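A quick way to audit this (a sketch in Python with requests; swap in your own domain) is to request all four variants without following redirects and confirm exactly one returns 200 while the rest 301 straight to it:

import requests

variants = [
    "http://example.com/",
    "https://example.com/",
    "http://www.example.com/",
    "https://www.example.com/",
]

for url in variants:
    resp = requests.get(url, allow_redirects=False, timeout=10)
    print(url, resp.status_code, resp.headers.get("Location", ""))
# expected: one 200 (your canonical) and three 301s pointing directly at it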




5XX status code

Category: Technical

Severity: Medium

Difficulty: Medium

Score: 7/10

Details: The 5XX series error codes are internal server errors. This is usually where a fault has occurred and they should be very rare.

If you find sections of your site that have large amounts of 5XX errors it warrants further investigation. Some malformed URLs or bad parameters can cause this type of issue.

500 Internal Server Error

In some scenarios, a 5XX error code is given but a page still loads!

It’s very important to understand why 5XX errors occur and resolve them.




Mixed content

Category: On-Page SEO

Severity: Medium

Difficulty: Easy

Score: 7/10

Details: I consider mixed content to be a medium severity issue. Some tools play this down; however, given that it is related to security issues and is usually very easy to fix, it’s something that should be in the upper half of your audit prioritization.

Mixed Content

Mixed content is typically where there is an http resource on an https page. Most of the time the fix is as simple as adding the s on the http.

Many people play this one down, but I do consider it up there, as it is a potential security issue and Google prioritizes security quite highly.
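To find offending resources, a small sketch (Python with requests and beautifulsoup4; the page URL is a placeholder) can flag any http:// asset referenced from an HTTPS page:

import requests
from bs4 import BeautifulSoup

page = "https://example.com/"  # placeholder
soup = BeautifulSoup(requests.get(page, timeout=10).text, "html.parser")

for tag in soup.find_all(["img", "script", "link", "iframe", "source", "video", "audio"]):
    url = tag.get("src") or tag.get("href")
    if url and url.startswith("http://"):
        print(tag.name, url)  # insecure resource embedded in a secure page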




Page Couldn’t be Crawled (DNS resolution issues)

Category: Technical SEO

Severity: Medium

Difficulty: Medium

Score: 7/10

Details: DNS resolution issues refer to the inability of a web browser, client or search engine crawler to convert a domain name (like example.com) into the IP address needed to locate and load the website. This is analogous to a telephone book lookup, where you know the name (domain) and need the phone number (IP) to connect.

Crawl Issues

If search engines can’t resolve the DNS of a website, they can’t access, crawl, or index any of its content. It’s likely you have a more fundamental problem if you are regularly getting DNS resolution issues.

Generally with these errors, you need to inspect DNS records, understand how your main domain, subdomains, and individual DNS records are set up, and make sure there are no misconfigurations.

This is not a very common error but it can be significant if a search engine is not able to crawl your page.
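You can sanity-check resolution yourself with nothing but the Python standard library (the hostnames are placeholders):

import socket

for host in ["example.com", "www.example.com"]:  # placeholder hostnames
    try:
        infos = socket.getaddrinfo(host, 443)
        print(host, "->", sorted({info[4][0] for info in infos}))  # resolved IPs
    except socket.gaierror as err:
        print(host, "DNS resolution failed:", err)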




Page Couldn’t be Crawled (incorrect URL formats)

Category: Crawling

Severity: Medium

Difficulty: Medium

Score: 7/10

Details: This is a relatively ambiguous error. The result is clear – a page could not be crawled due to an incorrect URL format – however, what that format is and the error that either causes it or led to it is much less clear.

These types of findings are not common, but if there is a URL format that is preventing crawling that can be troublesome.

This type of error can often be found on AJAX sites that use #! hashbangs in the URLs, which is how AJAX URLs were often structured.

Google deprecated its old AJAX crawling scheme years ago, so it is important to observe an appropriate crawling and URL strategy (clean, crawlable URLs rather than hashbangs) to make sure your AJAX content is accessible to search engines.




Broken Internal Images

Category: Technical SEO

Severity: Minor

Difficulty: Easy

Score: 7/10

Details: This is an error and a quality issue, which is why it belongs in the Error category, even though it has low severity.

It is also possible for a broken image to go unnoticed as oftentimes it doesn’t affect the flow of a page.

Finding and fixing these is still a fairly important item, especially as you can and should have appropriate ALT text associated with the image that helps describe the content and focus of your page further.




Hreflang Issue

Category: Technical SEO, International SEO

Severity: Medium

Difficulty: Medium

Score: 7/10

Details: The “hreflang” attribute is an HTML tag used to indicate the language and regional targeting of a webpage. When implemented correctly, it tells search engines which is the most appropriate version of a page based on a user’s language and location.

Incorrectly implemented “hreflang” tags can lead to search engines serving the wrong language or regional version of a page to users. This is definitely something you want to avoid – especially since, for geographic and language reasons, you usually will not be closely monitoring those versions.

To address this issue, review and validate your “hreflang” implementations using tools or plugins specifically designed for this purpose. Ensure that all references are correctly set up and that there are no discrepancies between the declared language or region and the actual content of the page.

NOTE: A lot of plugins designed for this task do a fairly poor job. Have an SEO pro check it out for you.




Hreflang Conflicts Within Page Source Code

Category: Technical SEO 

Severity:  Medium

Difficulty: Medium

Score: 7/10

Details: The issue “hreflang” conflicts within the page source code refers to discrepancies or contradictions in the way hreflang tags are set up within a single page. These conflicts can cause confusion for search engines, making it difficult for them to understand which language or regional version of a page should be presented to users.

Such conflicts can lead to search engines either ignoring the “hreflang” tags altogether or incorrectly serving pages to international users, impacting the user experience and potential conversions.

To rectify this, conduct a thorough audit of the “hreflang” tags within the source code of the affected pages. Make sure each tag is unique and accurate, remove any contradictory or redundant tags, and ensure all referenced pages are accessible and have corresponding hreflang annotations. 
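As a starting point for such an audit, this sketch (Python with requests and beautifulsoup4; the URL is a placeholder) collects the hreflang annotations on a page and flags any language code declared more than once with different target URLs – the classic in-page conflict:

import requests
from bs4 import BeautifulSoup
from collections import defaultdict

page = "https://example.com/"  # placeholder
soup = BeautifulSoup(requests.get(page, timeout=10).text, "html.parser")

targets = defaultdict(set)
for link in soup.find_all("link", rel="alternate", hreflang=True):
    targets[link["hreflang"].lower()].add(link.get("href"))

for lang, urls in targets.items():
    if len(urls) > 1:
        print("Conflict:", lang, "declared for", urls)  # one language, several targets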




Incorrect Hreflang Links

Category: Technical SEO

Severity:  Medium

Difficulty: Medium

Score: 7/10

Details: Issues with incorrect hreflang links arise when the URLs specified within the hreflang tags do not point to the intended pages. This is often where the page linked or referenced doesn’t have the appropriate language.

Some of the tools that do this automatically do quite a poor job and don’t serve international SEO sites well.

Pay particular attention to Google Search Console and how your pages are being used.

Don’t underestimate the power of Google Analytics with geographic and landing page filters to make sure that the appropriate language and region pages are being used predominantly where you would expect them to be.




Non Secure Pages

Category: Technical SEO

Severity: Medium

Difficulty: Medium

Score: 7/10

Details: It’s 2023 (or whatever year it is now – I mean come on, why don’t I make this dynamic!) – come on!

There really isn’t a good reason for this anymore. Modern secure transmission protocols are actually more efficient in their handshake and load sequence.

Fix it – fix it now!

Non Secure Pages




Issues With Expiring or Expired Certificate

Category: Technical SEO

Severity: Medium

Difficulty: Easy

Score: 7/10

Details: This one is fairly self-explanatory: SSL certificates expire, and if you don’t have a process in place to handle expiring certificates, you will be presenting this situation to search engines.

Many hosts have good automated methods to keep the certificate live and active, but you shouldn’t take it for granted. This is one of those useful checks in an SEO audit that would potentially go unchecked otherwise.

This is a relatively easy thing to resolve.
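If you want a simple watchdog, the Python standard library can pull the expiry date straight off the live certificate (the hostname is a placeholder):

import socket
import ssl
import time

host = "example.com"  # placeholder
ctx = ssl.create_default_context()
with socket.create_connection((host, 443), timeout=10) as sock:
    with ctx.wrap_socket(sock, server_hostname=host) as tls:
        cert = tls.getpeercert()

expires = ssl.cert_time_to_seconds(cert["notAfter"])  # expiry as epoch seconds
print(host, "certificate expires in", int((expires - time.time()) // 86400), "days")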




Issue With Old Security Protocol

Category: Technical SEO

Severity: Medium

Difficulty: Medium

Score: 7/10

Details: Some hosts are not configured with the latest security profiles and/or have open vulnerabilities or old security protocols.

This is a technical SEO / Web Dev action. The server admin or host should be able to address this.

Our SEO audits have a strong security element in them because it matters for the longevity and quality of your web presence. Anything that doesn’t support that end goal is something that can potentially cause you problems down the road and is a quality issue.

This is another one of those items that usually goes unchecked.




Issues With Incorrect Certificate Name

Category: Technical SEO

Severity: Medium

Difficulty: Medium

Score: 7/10

Details: This error is a little ambiguous, although on the face of it it sounds fairly self-explanatory. Sometimes certificates get configured for hosts but don’t actually pertain to or represent the domain and hostname that is being served. 

Or, it is possible that the hostname and domain name are in conflict and one is configured while the other is not. This error can also be where a subdomain is configured but not other subdomains on the TLD and the certificate scope does not fully cover all of the subdomains being served.




Pages with Multiple Canonical URLs

Category: Technical SEO

Severity: Medium

Difficulty: Medium

Score: 7/10

Details: Just no! This should not be happening and you need to fix it. Giving conflicting canonical signals is a surefire way to search failure!

Having multiple canonical URLs on a single page sends mixed signals to search engines about which version of the content is the “preferred” or “authoritative” one. 

This can result in search engines ignoring the canonical tags altogether or indexing the wrong version of the page. It essentially defeats the purpose of having canonical tags in the first place. 

Ensure you audit your pages, identify those with conflicting canonical tags, and correct them to have a singular, accurate canonical URL. Using SEO tools that can scan and validate canonical implementations will greatly assist in resolving these issues efficiently.

NOTE: This can happen due to how some SEO plugins are configured – the default template outputs one canonical and the SEO tool adds another entry.

Also, some sites are configured to display a canonical URL for every single unique page which may not be a good idea.




Pages Have a Meta Refresh Tag

Category: Technical SEO

Severity: Medium

Difficulty: Medium

Score: 7/10

Details: The meta refresh tag is a much less common form of redirect than it used to be, however they still exist. 

Meta refresh basically loads a 200 response code page and then redirects the user to another page. While this usually has the desired redirect effect and looks “almost” the same to a site visitor, it does not instruct search engines to follow the redirect and does not have the same intent or value as a 301 permanent redirect directive.
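Meta refresh redirects are easy to miss by eye; a short sketch like this (Python with requests and beautifulsoup4; the URL is a placeholder) surfaces them so they can be replaced with proper 301s:

import requests
from bs4 import BeautifulSoup

page = "https://example.com/some-page"  # placeholder
soup = BeautifulSoup(requests.get(page, timeout=10).text, "html.parser")

for tag in soup.find_all("meta", attrs={"http-equiv": lambda v: v and v.lower() == "refresh"}):
    print("Meta refresh:", tag.get("content"))  # e.g. "0; url=https://example.com/new-page"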




Subdomains Don’t Support Secure Encryption Algorithms

Category: Technical SEO

Severity: Medium

Difficulty: Medium

Score: 7/10

Details: This is highly related to the certificate and security protocol findings. Secure encryption algorithms are crucial to ensure the safety and integrity of data transmitted between users and the web servers, including on subdomains. When subdomains don’t support the latest or secure encryption methods, they become vulnerable to attacks and data breaches.

It is essential for SEOs to flag these issues even if they primarily fall under the web development domain. An insecure subdomain can deter users due to browser warnings, reduce trust, and consequently affect search performance and user engagement. 

SEO tools often flag these points because they represent clear vulnerabilities and can impact the overall performance and reputation of a site. These issues, while technical, play a significant role in ensuring a website’s credibility and trustworthiness in the eyes of both users and search engines.

As a part of the broader digital team, SEOs should collaborate with developers to ensure that all aspects of the site, including subdomains, adhere to best security practices, supporting overall search success. After all, this emphasizes the WO – Website Optimization that we advocate for.




Links Couldn’t Be Crawled (incorrect URL formats)

Category: Linking

Severity: Medium

Difficulty: Easy

Score: 7/10

Details: This is similar to the “page couldn’t be crawled (incorrect URL formats)” finding. It is likely triggered by some of the links on an AJAX site, although there can be other reasons.

This definitely warrants further investigation as you could have uncovered a major area of the site that is inaccessible. This is much more common on larger sites than many people realize.

Anything on a site that hinders a crawler needs attention; at a minimum you need to understand what the issue is and why it exists. Then you need to come up with a plan to action the issue, or a good reason for not doing so.




Warnings

Warnings are not as serious as errors but they are still indicative of something that’s not right and usually action should be taken.

There are warnings given by some SEO tools that I consider to be errors or notices, and some are not even worth bothering with. More on that later (jump to Forget about it).

6/10

Links on HTTPS Pages Lead to HTTP Pages

Category: Linking

Severity: Medium

Difficulty: Easy

Score: 6/10

Details: This is close to mixed content, except it’s just where you are linking to another page that has an HTTP URL. Since this also creates a potential security issue, it is highly advisable to link to the HTTPS alternative.




Robots.txt file errors

Category: Technical SEO

Severity: Medium

Difficulty: Easy

Score: 6/10

Details: This one is fairly subjective as it depends on what type of errors exist. If a large part of the site has a noindex declaration but should be indexable, that is a big problem. Similarly, if there are certain portions of code and directories that Google is disallowed from crawling, and those directories or files affect how the page is rendered, then a seemingly minor robots.txt error can have a major impact.

Robots.txt
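To test what a given robots.txt actually allows, the Python standard library ships a parser (the URLs are placeholders; Googlebot is used as the example user agent):

from urllib.robotparser import RobotFileParser

rp = RobotFileParser("https://example.com/robots.txt")  # placeholder
rp.read()

for url in ["https://example.com/", "https://example.com/wp-content/some-resource.css"]:
    print(url, "crawlable for Googlebot:", rp.can_fetch("Googlebot", url))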




Issues With Broken Internal JavaScript and CSS files

Category: Technical SEO

Severity: Medium

Difficulty: Medium

Score: 6/10

Details: The significance of this issue greatly depends on what the issues are specifically.

Minor issues with some JavaScript and CSS files may be insignificant, whereas some larger issues could cause the page not to render when crawled by Google, display content it shouldn’t, or worse still, hide content.

Pro Tip: Go to Google Search Console, inspect the URL, live test the URL, then navigate to the view tested page option and look at the Screenshot and More Info sections.

In the screenshot you will see how Google actually renders your page; large JavaScript and CSS errors will be apparent here, even if the page loads fine in your desktop web browser or mobile device.

The more info tab elaborates on where Google had issues loading resources.

It is important to note that a “broken” site can still perform fairly well on multiple devices but have significant issues that affect search engine rankings.

(This is in fact one of the main reasons to do an SEO audit and to regularly audit to catch any such errors)




Images Don’t Have Alt Attributes

Category: On-Page SEO

Severity: Medium

Difficulty: Easy

Score: 6/10

Details: I consider missing ALT attributes – not “alt tags” as they are often described; the ALT text is an attribute of the HTML img element, for example:

<img src="image.jpg" alt="descriptive text" />

– to be a fairly high priority audit finding.

The reason is it’s an easy fix, it helps with context, and, especially if the images are links, you are otherwise missing out on valuable anchor context.

The other reason this finding is high up there is that there is no visual cue that the ALT attribute is missing.

ALT text was originally introduced to describe what images were about for non-image displaying browsers or text-only browsers. ALT attributes are also a key element of accessibility and something that screen readers and accessibility tools rely on.

Side note: Read up on ADA compliance to understand its importance.
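Because there is no visual cue, a scripted check helps; this sketch (Python with requests and beautifulsoup4; the URL is a placeholder) lists images with a missing or empty ALT attribute:

import requests
from bs4 import BeautifulSoup

page = "https://example.com/"  # placeholder
soup = BeautifulSoup(requests.get(page, timeout=10).text, "html.parser")

for img in soup.find_all("img"):
    if not img.get("alt"):  # missing or empty; note an empty alt is valid for purely decorative images
        print(img.get("src"))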




5/10

Pages without Title Tag

Category: On-Page SEO

Severity: Medium

Difficulty: Easy

Score: 5/10 

Details

Really?

Seriously though, how do you press publish without a title? It’s literally the single most important on-page SEO parameter there is.

If you have pages without titles make sure you add one. 

This is a quick win if you have this issue.

Page Title

It should be noted that Google will rewrite page titles a lot of the time, and it can and will truncate longer titles. This leads to the advice to keep titles below 70 characters.

This is good advice in general, as it gives more control over what is displayed in the SERP; however, Google will still assess (and rank) the page based on the full title even if it truncates it or uses a different title.

So if you need a super long title to describe the page fully, generally that’s OK.




Page couldn’t be crawled

Category: Crawling

Severity: Medium

Difficulty: Medium

Score: 5/10

Details: Usually this is not a good one, and the root cause should be resolved. It may, however, be completely on purpose. For example, a login or private section might be protected but still have a link pointing to it.

This would be an example of a perfectly reasonable blocked from crawling result.




HTTP URLs in sitemap.xml for HTTPS site

Category: Technical SEO

Severity: Medium

Difficulty: Easy

Score: 5/10

Details: This is where you have old HTTP-only URLs that don’t automatically default to the HTTPS protocol, and those URLs are listed in your sitemap. You may have updated your site to default to HTTPS, which is great, but if you are putting HTTP URLs in your sitemap this is similar to the mixed content issue.

Google has made it very clear that security is a top priority. Act accordingly and get rid of anything non HTTPS.
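Scanning a sitemap for stray HTTP URLs is a one-minute job with the Python standard library plus requests (the sitemap URL is a placeholder):

import requests
import xml.etree.ElementTree as ET

sitemap = "https://example.com/sitemap.xml"  # placeholder
root = ET.fromstring(requests.get(sitemap, timeout=10).content)
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

for loc in root.findall(".//sm:loc", ns):
    if loc.text and loc.text.strip().startswith("http://"):
        print("HTTP URL in sitemap:", loc.text.strip())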




Pages Have no Hreflang and Lang Attributes

Category: International SEO

Severity: Medium

Difficulty: Easy

Score: 5/10

Details: This is another difficult one to score: whether or not international SEO is a consideration greatly affects the significance of this audit finding.

I will assume that it is for a site that cares about international SEO. In which case this is definitely a problem.

If you are targeting multiple countries, languages and regions you should have valid Hreflang and lang attributes in your site.

<link rel="alternate" href="url_of_page" hreflang="lang_code" />

This article from Semrush on hreflang is useful.

This is one of the most misunderstood areas of international SEO and it can lead to some significant problems for geographical targeting.




Page is Blocked From Crawling

Category: Technical SEO

Severity: Minor

Difficulty: Easy

Score: 5/10

Details: Depending on the number of pages and what is blocked this item can range from a minor issue to a relatively major one.

Given that crawlability is one of the fundamental aspects to get right in SEO, if there are pages that should be able to be crawled but they are blocked this can be a major issue.




Issues with Blocked Internal Resources in Robots.txt

Category: Technical SEO

Severity: Medium

Difficulty: Easy

Score: 5/10

Details: This is one of those issues that is difficult to see, because a site can render perfectly well for you, yet when Google crawls it, it does not. This can be because robots.txt instructions prevent Google from using certain parts of the site content.

This can break the way the site renders.

You can check this with the Google Search Console inspection tool and debug if you have this issue or not.




External Images are Broken

Category: On-Page SEO

Severity: Medium

Difficulty: Easy

Score: 5/10

Details: This occurs when you have previously linked to an image that is not on your domain (hence external) and it is now broken.

This error should highlight the potential issues with relying on external sources for the stability and quality of your website.

There are legitimate reasons for linking to an external source, and you should always cite the source and have permission to use it.

The practice of linking to external images is often referred to as hotlinking, and some server configurations and/or updates can prevent this or cause a different image to load.




Structured Data Items are Invalid

Category: Technical SEO 

Severity: Minor

Difficulty: Medium

Score: 5/10

Details: This isn’t a break-the-bank kind of issue but it’s definitely something to tidy up. 

Structured data provides search engines with specific details about the content of a webpage. There are many ways to implement structured data, and sometimes the methods don’t use complete data or don’t check all of the boxes. Furthermore, sometimes invalid data can find its way into the structured data markup and lead to an error.

Commonly these issues arise when there is missing or incomplete data and it doesn’t fully meet the scope of the structured data item.

While this isn’t a critical error that would drastically affect a site’s ranking, it can impede the site’s ability to have enhanced search results, such as rich snippets or knowledge panels.

For large e-commerce sites, structured data plays a crucial role in displaying product details, ratings, availability, and more in search results.

Invalid structured data can therefore limit these enhancements, potentially reducing click-through rates and user engagement.

Addressing this issue involves using tools like Google’s Rich Results Test or the Schema Markup Validator to identify and rectify invalid structured data items. Regular audits, especially after making updates or changes to the site, will ensure that structured data remains compliant and effective in enhancing search results.

Given the medium complexity of structured data, it’s advisable to have a collaborative effort between SEO specialists and web developers to ensure accurate and valid implementations.
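A first-pass syntax check is straightforward to script. This sketch (Python with requests and beautifulsoup4; the URL is a placeholder) only catches malformed JSON-LD – it won’t validate schema completeness, which is what the dedicated validators above are for:

import json
import requests
from bs4 import BeautifulSoup

page = "https://example.com/product"  # placeholder
soup = BeautifulSoup(requests.get(page, timeout=10).text, "html.parser")

for script in soup.find_all("script", type="application/ld+json"):
    try:
        data = json.loads(script.string or "")
        print("Parsable JSON-LD block:", data.get("@type") if isinstance(data, dict) else "(list)")
    except json.JSONDecodeError as err:
        print("Invalid JSON-LD:", err)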




4/10

Pages Have Too Many Parameters in their URLs

Category: Technical SEO

Severity: Medium

Difficulty: Medium

Score: 4/10

Details: This issue can arise when there are lots of options. How many is too many? Well, if you have URLs with 5 or more URL parameters, you are possibly in that territory.

The real question is why they exist: are they the only unique source of that content, or are there multiple faceted navigation options that can drive people to the same content, available at multiple URLs with different parameters?

The next consideration is how similar the other URL parameter variants are. Should there be a main category canonical, or are unique individual canonicals OK?

Multiple parameters are not necessarily a terrible thing, however, it can make for some ugly-looking URLs (which is probably less of a real issue than it is perceived).

One of my clients has their highest traffic page at a URL called my-first-blog-post

and no I am not about to change the URL from an indexed and high-ranking page just to make it look pretty – the reality is most people don’t care.

One problem with an unchecked and uncontrolled multiple-URL-parameter site is that it can give you more options than Starbucks, and that may be problematic for index bloat, crawlability, and a host of other ramifications.

As with many things in SEO (and life itself for that matter), context matters. Be aware of the reasons and understand them, then judge if it’s a real problem or not.




Pages have Incompatible Plugin Content

Category: Technical SEO

Severity: Minor

Difficulty: Medium

Score: 4/10

Details: This one has a varied scope as it is broad and vague.

One little secret hack I have is to search a site for the classic “Lorem Ipsum” text – which can be present and crawlable in your code but not displayed. Some plugins ship with default placeholder text and output it in hidden layers, even though there is no visible “Lorem Ipsum” content on the site. This is an example of bad plugin content.

Search for: site:domain.com “Lorem Ipsum”

Other issues may be more technical; this can often be the case with dynamic feeds, APIs and integrations.




Pages Contain Frames

Category: On-Page SEO / Technical SEO

Severity: Medium

Difficulty: Medium

Score: 4/10

Details: Frames – typically an iframe – have long been considered an absolute no-no for SEO.

This SEJ article describes the more modern treatment of iframes with commentary from Google on the topic.

Iframe content is content from another site; it’s basically not part of your site, yet it is displayed on your site.

Historically, iframes have not been followed, crawled or indexed through the site that is displaying the iframe; however, this treatment has changed.

Iframes are less common than they used to be. They do present challenges for tracking and looking at metrics. They can still have some valuable uses but generally, we prefer to have on-page content. 

Today in the vast majority of cases you can achieve anything you want with AJAX that an Iframe may have accomplished in the past.




Incorrect pages found in sitemap.xml

Category: Technical SEO

Severity: Minor

Difficulty: Easy

Score: 4/10

Details: Having incorrect pages in a sitemap.xml file can lead to search engines attempting to crawl and index pages that might not exist, have been moved, or are not intended to be indexed. The sitemap acts as a guide for search engines to understand the structure of the site and prioritize the crawling of its pages.

Common issues with incorrect pages in the sitemap include:

  1. URLs leading to 404 error pages.
  2. Including URLs that are set to “noindex.”
  3. Outdated URLs from previous site structures or versions.
  4. URLs leading to duplicate content or non-canonical versions of pages.

While this issue is fairly minor and won’t necessarily harm the site’s SEO directly, it can waste crawl budget — the number of pages a search engine is willing to crawl during a specific period. For large sites, optimizing crawl budget is essential to ensure important pages are indexed in a timely manner.

Resolving this issue involves auditing the sitemap.xml file, ensuring it only contains valid, live, and intended URLs. Using SEO tools that can validate sitemap entries against actual site content can be valuable.

Regularly updating the sitemap, especially after making structural changes to the site, will keep it accurate and beneficial for search engines.
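Validating sitemap entries against live responses can be scripted along these lines (Python with requests; the sitemap URL is a placeholder) – anything that is not a clean 200 probably doesn’t belong in the sitemap:

import time
import requests
import xml.etree.ElementTree as ET

sitemap = "https://example.com/sitemap.xml"  # placeholder
root = ET.fromstring(requests.get(sitemap, timeout=10).content)
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

for loc in root.findall(".//sm:loc", ns):
    url = loc.text.strip()
    status = requests.head(url, allow_redirects=False, timeout=10).status_code
    if status != 200:
        print(url, status)  # 404s, redirects and friends don't belong here
    time.sleep(0.5)  # be polite to the server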




Sitemap.xml files are too large

Category: Technical SEO

Severity: Minor

Difficulty: Medium

Score: 4/10

Details: There are limits on sitemap XML files – e.g., 50,000 URLs (and 50MB uncompressed) per sitemap. It is best practice to have a sitemap index file that links to specific section XML sitemaps.
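Splitting a large URL set into compliant child sitemaps plus an index is mechanical; a rough sketch (plain Python; the domain and URL list are placeholders):

urls = [f"https://example.com/page-{i}" for i in range(120000)]  # placeholder URL list

LIMIT = 50000  # sitemaps.org cap per file (also 50MB uncompressed)
chunks = [urls[i:i + LIMIT] for i in range(0, len(urls), LIMIT)]

index = ['<?xml version="1.0" encoding="UTF-8"?>',
         '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
for n, chunk in enumerate(chunks, 1):
    # in practice each chunk is written out as its own <urlset> file, e.g. sitemap-1.xml
    index.append(f"  <sitemap><loc>https://example.com/sitemap-{n}.xml</loc></sitemap>")
index.append("</sitemapindex>")
print("\n".join(index))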



No Viewport Tag

Category: Technical SEO

Severity: Minor

Difficulty: Easy

Score: 4/10

Details: The viewport meta tag is essential for responsive web design. It controls how a website is displayed on mobile devices by adjusting the view according to the device’s screen size. In the age of mobile-first indexing, where search engines prioritize mobile-friendly websites, missing a viewport tag can result in suboptimal rendering on mobile devices.

Adding the viewport tag is straightforward. A typical tag might look like:

<meta name="viewport" content="width=device-width, initial-scale=1">

Implementing the tag ensures that the website content scales appropriately for every device, providing an optimal user experience. Regularly testing the website’s mobile responsiveness after adding or making changes to the viewport tag ensures consistent performance across devices.




Sitemap.xml file error

Category: Technical SEO

Severity: Minor

Difficulty: Easy

Score: 4/10

Details: The sitemap.xml file serves as a roadmap for search engines, guiding them through the important pages on your website. Errors in this file can lead to issues with how search engines crawl and index your site.

Common issues related to sitemap.xml file errors include:

  1. The file being inaccessible (e.g., server errors or incorrect file permissions).
  2. Incorrect syntax or structure within the sitemap.
  3. Listing URLs that return error codes like 404 (Not Found).
  4. Outdated or redundant URLs listed in the sitemap.
  5. Sitemap exceeding the size limit or containing more URLs than allowed.

To resolve these issues, verify the sitemap using SEO tools or platforms like Google Search Console. Ensure that it’s correctly formatted, up-to-date, and free from errors.

Regularly reviewing and updating the sitemap.xml file, especially after significant site changes or updates, will ensure that search engines always have an accurate guide to your site’s content.




Slow Load / CWV 

Category: Technical SEO

Severity: Medium

Difficulty: Medium

Score: 4/10

Details: Much noise was made about the Core Web Vitals (CWV) and I will also admit I was expecting the introduction of these into the ranking algorithm to be more significant than they were.

A slow-loading page, for whatever reason, is not good at all. 

This should be resolved independently of any SEO considerations.

Google has given us clearly defined page experience and Core Web Vitals (CWV) metrics with well-established thresholds and tolerances. It behooves any webmaster or business owner not to ignore them.

It is also often the case that one change in code, template, theme or load sequence has a site-wide effect. For this reason, these fixes should be prioritized higher, however from a purely SEO perspective, as much as it pains me to say, this doesn’t reach the top priority.

However, bear in mind SEO considerations are not made in a vacuum, they must be weighed in the full context of the entire digital experience and for that reason, these issues may be prioritized higher than they would if we were just considering SEO.




Duplicate title tags

Category: On-Page SEO

Severity: Minor 

Difficulty: Easy

Score: 4/10

Details: Duplicate title tags usually occur when you have a multi-page article on a topic, making this a pagination issue.

Having duplicate title tags is considered by some SEO tools as a warning and given a high priority. 

However, I consider this to be a minimal issue. It should be rectified but is definitely lower on the priority list.




Duplicate content issues

Category: Content

Severity: Minor

Difficulty: Easy

Score: 4/10

Details: Duplicate content on your own site is not as severe as having duplicate content to an external site.

Duplicate content is common on some platforms where there are categories and tags that deliver very similar content.

This can be related to Canonicalization which is something that is of high importance.

It’s best to ensure that if any duplicate content exists, correct canonical tags are in place.

Where it occurs due to faceted navigation it’s important to understand all the different paths to the content and manage how it is presented.

PRO TIP: Make sure any staging, test or development environment is not publicly discoverable and is protected from being indexed. I have seen too many times where a development site got indexed and was considered the original, meaning the actual live site always suffered and didn’t rank well.

Also, speaking from experience (unfortunately – see below) when you deploy a site, make sure that the items restricting indexing are removed!




HTML size too large

Category: Technical SEO

Severity: Medium

Difficulty: Medium

Score: 4/10

Details: There is a 6MB limit for the size of an HTML file for crawling. For most practical purposes this is plenty, and if you have code this large you should evaluate why and whether it’s appropriate.

If you are seeing these types of errors there may be something broken on your site. You should have a chat with your web developer.

Note that the 6MB limit is the size of the code, not the combined elements of a page.




Orphaned Pages in Sitemaps

Category: Indexing 

Severity: Minor

Difficulty: Medium

Score: 4/10 

Details:

This is considered an FYI by some SEO tools, however, I rank this higher than most.

The main reason is if you have orphan pages you are losing an opportunity to spread link equity and diversify your keyword base with more anchor text variation.

Identify appropriate internal linking opportunities for the orphaned pages.




Pages With a Broken Canonical Link

Category: Technical

Severity: Medium

Difficulty: Medium

Score: 4/10

Details: Although this is in fact a true error, I am putting this in the warning section with a score of 4/10. If the canonical tag is broken Google will determine what the appropriate canonical can be.

Don’t get me wrong here, canonicalization is a very important issue to deal with, however, if the canonical tag is broken Google will assign one.

This will also likely result in the Google Search Console (GSC) status “Google chose different canonical than user”.

If there is a site or section-wide issue with this type of category that would elevate it into the true error category, however, this one on its own is less severe than many SEO tools call out.




Links Have No Anchor Text

Category: Linking

Severity: Minor

Difficulty: Easy

Score: 4/10

Details: Links should always have anchor text. There are some occasions where they don’t – for example on an image (which should have descriptive ALT text to effectively serve as the anchor text). Sometimes page templates can get broken and there can be a link with no clickable anchor text. This is often not visible but should be fixed when found in an audit.




Pages Have Only One Internal Link

Category: Internal Linking

Severity: Minor

Difficulty: Easy

Score: 4/10

Details:

Internal links are very powerful and a good internal linking process is one of the most underutilized SEO techniques there is.

In the case of there only being one internal link, it usually makes sense to add additional links to and from the content to help spread around link equity and add in additional keywords through anchor text variation.

You also have to ask yourself: how valuable is this piece of content? Does it deserve only one link? Could it be consolidated or improved? Furthermore, you can ask yourself whether it is still needed.




Issues With Broken External JavaScript and CSS Files

Category: Technical SEO

Severity: Minor

Difficulty: Major

Score: 4/10

Details: Well… this is an issue, and there may or may not be something that you can do about it.

Generally, anything that is broken is an error. This is categorized as a warning and not prioritized higher because it’s likely difficult to fix. Note that it can sometimes be dealt with via an updated library, a new URL reference, or a note to a friendly developer.

These issues are fairly rare.

If you can fix it, please do so, but it’s not super high on the list of overall priorities.

For example, on several of the audits I run, a popular call tracking software code has no minified version so it is flagged as an error in the audit – it’s not a big deal, so this is a reasonable error to suppress.




Notices

Notices are where I feel there is a lot of ambiguity in SEO audits. A notice doesn’t necessarily mean a problem and can often be more of an FYI or “just so you know”; however, some notices are looked upon as major issues.

3/10

Pages are Missing the Viewport Width Value

Category: Technical SEO

Severity: Minor

Difficulty: Easy

Score: 3/10

Details: The viewport width value, usually set as “width=device-width”, ensures that your web content adjusts to the screen width of the device it’s viewed on, providing an optimal user experience. When pages are missing this value, they might not display correctly on all devices, particularly mobile ones.

Potential implications of missing the viewport width value include:

  1. Content not scaling correctly, leading to a zoomed-out or zoomed-in view on mobile devices.
  2. Misalignment of web elements or inconsistent layouts across different devices.
  3. Decreased mobile user experience, potentially resulting in higher bounce rates.
  4. Potential penalties or reduced ranking in mobile search results, as search engines favor mobile-friendly pages.

To rectify this, ensure that every page’s meta viewport tag includes the “width=device-width” value. Regularly testing pages for mobile responsiveness can also help identify and address any related issues. Making this simple adjustment can significantly enhance the mobile user experience and improve compatibility across various devices.




Pages Don’t Have Enough Text Within the Title Tags

Category: On-Page SEO 

Severity: Low

Difficulty: Easy

Score: 3/10

Details: This issue is touched upon in other areas of audits too.

It should be noted that a title is evaluated in its entirety whether it is displayed or not.

Titles such as Home, About or Contact are leaving a lot on the table. The title tag is one of the most significant on-page SEO factors there is, so you should be as descriptive as possible and echo the theme of your site at a minimum.

There is really no specific minimum; however, many tools consider 30 characters to be a minimum threshold.




Pages Have a Low Word Count

Category: Content

Severity:  Minor

Difficulty: Medium

Score: 3/10

Details: Word count is one of the most misunderstood and misinterpreted parameters out there in SEO.

Word count itself is NOT a ranking factor!

Let me just say that one more time – Word Count is NOT a ranking factor!

A lot of poor-quality content will not rank as well as a little bit of high-quality content.

It is true that most of the sites that rank at the top of the SERPs tend to have longer content than those that rank lower down; however, that is symptomatic of a high-quality piece of content, not causal of the higher rankings. That’s a very important distinction that is often overlooked.

This article is turning into a gargantuan one – which I didn’t intend – however, its length should have nothing to do with how it ranks. Hopefully, you will see this as a well-researched and discussed perspective drawn from my many years of experience auditing websites for SEO. This article could well rank for a multitude of terms due to the broad audit-related topics it covers. This should actually work very well with passage indexing; however, the key point here is I created it to be helpful – I didn’t fill it with fluff.

OK, now I will get on with it.

That being said, if your pages have too little content and could potentially be classified as thin content, they may not meet the quality and helpful content requirements; as such they may not rank well and may not even get indexed (this is becoming more and more common).

Before just adding content though, think about the intent of the page and what you could do to improve it, don’t just mindlessly stuff it full of other headings and content to check some SEO boxes.




Pages Don’t Have an H1 Heading

Category: On-Page SEO

Severity: Minor

Difficulty: Easy

Score: 3/10

Details: The H1 tag has long been a mainstay of on-page SEO and is one of the most important on-page elements. It is, after all, supposed to be the first heading!

However, times are changing, and if a heading renders as such but uses different markup elements, it’s unlikely to make a significant difference.

H1 Heading Tag

There are case studies that show no effect from changing the H1, and there are other case studies that show a positive effect from adding a heading tag to text.

It is still considered best practice to make your main page heading be the H1 tag, but it is not as critical as it used to be.




Pages Don’t Have Meta Descriptions

Category: On-Page SEO

Severity: Minor

Difficulty: Easy

Score: 3/10

Details: This is a minor one. Google will dynamically write meta descriptions to fit the SERP the majority of the time.

This doesn’t mean we should completely ignore meta descriptions however.

It is still a decent chunk of SERP real estate that we can have influence over. Any chance we have to increase CTR is valuable.

The advice is always to provide a good concise description that compels searchers to click.

Bear in mind also that meta descriptions are not considered in Google’s ranking algorithm. (Something that is often overlooked by many SEOs)




Pages Have More Than One H1 Tag

Category: On-Page SEO

Severity: Minor

Difficulty: Medium

Score: 3/10

Details: Sometimes pages, templates or even other SEOs will make more than one heading an H1 tag.

The H1 heading (main page heading) is the most important On Page SEO element next to the page Title.

Side Note: Some SEO Audits highlight a page title and heading being the same as a finding. I don’t agree with this. I think its perfectly OK for the page title and heading to be the same. Its also OK for them to be different, but when you think about what they are, the main heading in most documents is essentially the title.

For a long time the H1 heading has been considered the most important heading, and for good reason. However, in SEO things change (quite frequently). There have been studies showing that you can change an H1 heading with little or no effect. There have also been studies showing that adding heading tags led to a positive result.

Heading tags often get misused as a function of their styling. Many people will change the level of a heading tag to fit in with their style or theme.

NOTE: It is entirely possible to have your <p> paragraphs styled as an H1 heading and conversely have your H1 heading styled as a regular paragraph text.

It is also possible to have specific style classes that can change the look and feel of specific elements while retaining the general styling site wide.
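To illustrate the rendering-versus-markup point, here is a hedged sketch where CSS inverts the visual roles (the class names are hypothetical):

<style>
  h1.quiet { font-size: 1rem; font-weight: normal; } /* H1 styled like body text */
  p.loud { font-size: 2rem; font-weight: bold; } /* paragraph styled like a main heading */
</style>
<h1 class="quiet">The actual H1 heading</h1>
<p class="loud">A paragraph that looks like the main heading</p>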

In the case of pages that have multiple H1 elements, it is best practice to have only one H1, and it should be the main page heading. However, this is not as severe an issue as many SEO tools suggest.

NOTE: In today's SEO world, how the page renders matters more than the code behind it (and as a code purist and old-school SEO, that's hard for me to say, but it is what it is).

There are also some cases where templates have a common header that is styled as the H1 heading and the real main page heading is an H2 (or can also be an H1).

[Back to Top | Errors | Warnings | Notices | Forget about it | Summary | Resources | Severity 10, 9, 8, 7, 6, 5, 4, 3, 2, 1, 0]



Pages Need More Than 3 Clicks to be Reached

Category: Technical

Severity: Minor

Difficulty: Medium

Score: 3/10

Details: This is one of those legacy items that still has some relevance today.

In e-commerce (and really everywhere) it is considered that you should not need to click more than three times from the home page to achieve the result you were looking for (e.g. buying a product).

This is sometimes also referred to as crawl depth or click depth.

However, a search engine can index a page that takes 10 clicks from the home page just the same as one that takes a single click, and both are ranked by the same algorithm, so you can start to see why this is considered more of an informational item than an error or a warning.

The site hierarchy and internal linking have a big effect on how many clicks it takes to reach a page.

It is best practice to have a relatively flat, well-clustered and categorized site structure; internal linking then naturally follows this path.

In my opinion, if the user experience (UX) is good, it is fine for pages to sit at a deeper click depth.

[Back to Top | Errors | Warnings | Notices | Forget about it | Summary | Resources | Severity 10, 9, 8, 7, 6, 5, 4, 3, 2, 1, 0]



URLs With a Permanent Redirect

Category: Technical SEO

Severity: Minor

Difficulty: Medium

Score: 3/10

Details: This is usually just an informational notice. A permanent redirect is a 301 redirect. It does exactly what it says, permanently redirecting a URL to another URL.

There are often very good reasons for permanent redirects to exist and as long as those reasons exist no action is needed in this category.
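For reference, a permanent redirect is usually implemented at the server level. Here is a minimal Apache sketch (the paths are illustrative, and it assumes mod_alias is available):

# .htaccess or vhost config, requires mod_alias
Redirect 301 /old-page/ https://example.com/new-page/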

If there is a significant percentage of URLs with 301 redirect status it may be worth considering why that is and if any action should be taken. 

However, this is usually just firmly in the FYI informational category and no action is needed.

[Back to Top | Errors | Warnings | Notices | Forget about it | Summary | Resources | Severity 10, 9, 8, 7, 6, 5, 4, 3, 2, 1, 0]



URLs Return a 403 Status Code

Category: Technical

Severity: Minor

Difficulty: Easy

Score: 3/10

Details: This is a minimal issue. A 403 status code means forbidden (401 is the unauthorized response); usually this is the result of a link to a login or other protected, private area of the website.

Ideally, the links pointing to such pages would carry a nofollow attribute and the pages themselves a noindex directive; however, audit tools can still crawl them depending on how they are configured.
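Below is a hedged sketch of that setup (the /login/ path is hypothetical): the link carries a nofollow attribute, and the protected page itself declares noindex.

<!-- On the linking page -->
<a href="/login/" rel="nofollow">Log in</a>

<!-- In the <head> of the protected page -->
<meta name="robots" content="noindex">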

[Back to Top | Errors | Warnings | Notices | Forget about it | Summary | Resources | Severity 10, 9, 8, 7, 6, 5, 4, 3, 2, 1, 0]



Page URLs are Longer than 200 Characters

Category: Technical SEO 

Severity: Medium

Difficulty: Easy

Score: 3/10

Details: This issue is not common, but there is an upper limit on the length of URLs. If they are too long, they can't be read, parsed, and indexed easily.

The limit actually varies depending on browsers and some other factors, but 200 characters is a suggested upper bound as a best practice.

[Back to Top | Errors | Warnings | Notices | Forget about it | Summary | Resources | Severity 10, 9, 8, 7, 6, 5, 4, 3, 2, 1, 0]



Pages Have Too Much Text Within the Title Tags

Category: On-Page SEO 

Severity: Minor

Difficulty: Easy

Score: 3/10

Details: This is related to some other audit findings and best practices with title tags. However, unless we are talking about being really excessive here I don’t feel as if there is an upper limit.

If you don't mind Google dynamically rewriting your titles, and you aren't worried about keeping some control over the title presented in the SERP, then this isn't a big issue.

Bear in mind too that Google will evaluate your whole title regardless of whether it chooses to display it.
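For context, titles around 50 to 60 characters usually display in full in the SERP. This illustrative example is roughly 50 characters:

<title>SEO Audit Priorities: What to Fix First | Example Co</title>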

[Back to Top | Errors | Warnings | Notices | Forget about it | Summary | Resources | Severity 10, 9, 8, 7, 6, 5, 4, 3, 2, 1, 0]



2/10

External Links Contain a Nofollow Attribute

Category: Linking

Severity: Minor

Difficulty: Easy

Score: 2/10

Details: There could be perfectly valid reasons for placing a rel=nofollow attribute on an external link. That is why this is just a notice and not a warning or an error. 

You may well want to review it and you should be aware of it and understand the reasons for it, but it may be just fine.

Excessive use of nofollow should be checked though. A nofollow essentially says "I can't vouch for this content", and if you can't vouch for a lot of your links, what does that say about your site?
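Also worth knowing: Google supports more specific rel values alongside nofollow, which let you say why you aren't vouching for a link (the URLs are illustrative):

<a href="https://example.com/partner" rel="sponsored">Paid or affiliate link</a>
<a href="https://forum.example.com/post" rel="ugc">User-generated link</a>
<a href="https://example.com/unvetted" rel="nofollow">Link I can't vouch for</a>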

[Back to Top | Errors | Warnings | Notices | Forget about it | Summary | Resources | Severity 10, 9, 8, 7, 6, 5, 4, 3, 2, 1, 0]



Pages Have Too Many Internal Links

Category: On-Page SEO, Linking

Severity: Minor

Difficulty: Easy

Score: 2/10

Details: Internal linking is a valuable and severely underutilized SEO technique.

So if your site has triggered this finding I applaud you!

There is no such thing as too much of a good thing right?

Well, maybe there is, however unless there is an obscene amount of internal linking going on this one can be safely disregarded.

The types of pages that might trigger this are lists and directories.

As a general guide and rule of thumb I advise 3-5 internal links per 1000 words of content, but that is a very loose guideline.

I also advise you to look to Wikipedia for internal linking, since it is essentially the blueprint for how to do it. It links far more than most sites, don't you agree?

If it’s relevant and you have a resource on it – link to it!

You do need to bear in mind that every link dilutes some of the PageRank a page can pass; however, if you are distributing it around internally, that's great.

Also, if there are useful relevant external links to make – make them!

[Back to Top | Errors | Warnings | Notices | Forget about it | Summary | Resources | Severity 10, 9, 8, 7, 6, 5, 4, 3, 2, 1, 0]



Robots.txt not found

Category: Technical SEO

Severity: Minimal

Difficulty: Easy

Score: 2/10

Details: The robots.txt file has long been the mainstay of crawl control, used to tell search engines what they could and could not look at and, ultimately, include in their index.

Below is an example of a basic robots.txt file. It says that all user agents are allowed to visit and that nothing is disallowed; there is also an XML sitemap declaration, which is an SEO best practice.

User-agent: * 
Disallow:

Sitemap: https://stolber.com/sitemap_index.xml

However, robots.txt is no longer the dominant index control method: it governs crawling rather than indexing, and Google dropped support for the unofficial noindex directive in robots.txt in 2019. Many SEOs still miss this.

It is now treated more as a hint than a strict directive. It still has an effect, but for index control the recommended best practice is a page-level meta robots tag or the X-Robots-Tag HTTP response header.

Below is an example meta robots declaration.

<meta name='robots' content='index, follow, max-image-preview:large, max-snippet:-1, max-video-preview:-1' />
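And here is the equivalent X-Robots-Tag response header, shown as a hedged Apache sketch (it assumes mod_headers is enabled; the PDF pattern is purely illustrative):

# .htaccess or vhost config, requires mod_headers
<FilesMatch "\.pdf$">
  Header set X-Robots-Tag "noindex"
</FilesMatch>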

[Back to Top | Errors | Warnings | Notices | Forget about it | Summary | Resources | Severity 10, 9, 8, 7, 6, 5, 4, 3, 2, 1, 0]



URLs with a Temporary Redirect

Category: Technical

Severity: Medium

Difficulty: Medium

Score: 2/10

Details: This one is difficult to score because context matters a lot here.

A 302, or temporary redirect, is essentially an SEO brick wall, so if you have a lot of 302 redirects you should investigate and understand why they are there.

A 302 pointing to a page that genuinely is only temporarily redirected is perfectly OK.

I have seen a major insurance company website with an antiquated backend that had the home page 302 to another page. The home page had 800 high-quality referring domains pointing to it.

Here is the kicker: the 302 was not really temporary, as it had been in place for a long time. Google (per Google Search Console) had correctly understood the real canonical URL, was treating the redirect target as the home page, and, I would guess, was also passing link juice through the 302.

That being said, 302 temporary redirects are for short durations, roughly three months at most. If a 302 has been in place for longer than that, you should really make it a 301 permanent redirect.
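A quick way to confirm which status code a URL actually returns (assuming curl is installed; the URL is illustrative):

curl -I https://example.com/old-page
# HTTP/1.1 302 Found  <- if this has been in place for months, make it a 301
# Location: https://example.com/new-page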

[Back to Top | Errors | Warnings | Notices | Forget about it | Summary | Resources | Severity 10, 9, 8, 7, 6, 5, 4, 3, 2, 1, 0]



Issue with Blocked External Resource in Robots.txt

Category: Technical SEO

Severity: Minor

Difficulty: Easy

Score: 2/10

Details: The robots.txt file is used to instruct web robots (typically search engine robots) about which pages or files on your site should not be processed or scanned. If an essential external resource is blocked in robots.txt, it can prevent search engines from accessing or understanding key components of a site, potentially affecting site performance and search rankings.

Examples:

Essential CSS or JavaScript files that help render the page correctly are blocked, making it hard for search engines to understand the page’s layout or interactivity.

External resources that provide critical information for the site (like APIs or databases) are blocked.

Analytics or tracking scripts are blocked, potentially affecting the data collected.

Resolution:

Review the robots.txt file to identify which external resources are blocked.

Determine if these blockages are intentional. If not, modify the robots.txt to allow access to these resources.

After changes, use tools like Google’s Robots Testing Tool to ensure that the resources are accessible to search engines.

Monitor the site’s performance and search rankings to ensure there are no related issues arising from these blockages.

NOTE: It’s good practice to use the live test view in Google Search Console and see how a page is rendered. This is where you can catch these issues.
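As a hedged sketch, a more specific Allow rule can unblock the CSS and JavaScript that rendering depends on while keeping the broader Disallow in place (the paths are hypothetical; Google supports wildcards in robots.txt):

User-agent: *
Disallow: /assets/private/
# Re-allow the render-critical files within the blocked directory
Allow: /assets/private/*.css
Allow: /assets/private/*.js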

[Back to Top | Errors | Warnings | Notices | Forget about it | Summary | Resources | Severity 10, 9, 8, 7, 6, 5, 4, 3, 2, 1, 0]



Pages use too Many JavaScript and CSS Files

Category: Technical SEO

Severity: Minor

Difficulty: Medium

Score: 2/10

Details: This can be an issue, however it is fairly low on the list.

Load sequence, cache control, and read-time latency all matter for a site's performance. As an SEO, this is probably one to kick back to the dev team and ask: hey, can you do anything here?

This is not commonly a significant issue for any site. If it is excessive then the dev team should be able to fix it.
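If the dev team wants a starting point, bundling many small files into fewer requests and deferring non-critical scripts is the usual fix. A minimal sketch, with hypothetical file names:

<!-- Before: several render-blocking requests -->
<script src="/js/menu.js"></script>
<script src="/js/slider.js"></script>
<script src="/js/forms.js"></script>

<!-- After: one bundled file, deferred so it doesn't block rendering -->
<script src="/js/bundle.min.js" defer></script>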

[Back to Top | Errors | Warnings | Notices | Forget about it | Summary | Resources | Severity 10, 9, 8, 7, 6, 5, 4, 3, 2, 1, 0]



Sitemap.xml not found

Category: Technical SEO

Severity: None

Difficulty: Easy

Score: 2/10

Details: An XML sitemap is commonly called sitemap.xml – note that Sitemap.xml and sitemap.xml are different files on a *nix system (but not on a Windows server). It is common to list the URLs you want indexed in a file called sitemap.xml.

Oftentimes, the file will be called sitemap_index.xml and this index file will list other XML sitemap files.

Not having an XML sitemap has no effect on how your site ranks, but it is considered a best practice to have a valid and up-to-date XML sitemap file and submit it to the search engines such that you give them every chance to find your content. 
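For reference, a minimal valid sitemap.xml looks like this (the URL and date are illustrative):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
</urlset>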

[Back to Top | Errors | Warnings | Notices | Forget about it | Summary | Resources | Severity 10, 9, 8, 7, 6, 5, 4, 3, 2, 1, 0]



Pages Don’t Have a Doctype Declared

Category: Technical SEO

Severity: Insignificant

Difficulty: Easy

Score: 2/10

Details: The Doctype (Document Type Declaration) informs the browser about the version of HTML that the page is using. When not declared, the browser may render the page in “quirks mode,” which can lead to inconsistent and unpredictable display issues.

Examples:

Pages that display differently across browsers.

Web elements not aligning properly or appearing broken.

Inconsistent styling due to browser-specific interpretation.

Resolution:

To resolve this, ensure that every page on your site begins with the correct Doctype declaration, typically placed at the very top of your HTML documents. For example, for HTML5, the declaration should be:

<!DOCTYPE html>

By including this declaration, you can ensure a more consistent and standard-compliant rendering across various browsers.

This one isn't going to be a make-or-break item, but it is best practice to include the appropriate doctype.

[Back to Top | Errors | Warnings | Notices | Forget about it | Summary | Resources | Severity 10, 9, 8, 7, 6, 5, 4, 3, 2, 1, 0]



Pages Have Hreflang Language Mismatch Issues

Category: Technical SEO

Severity: Minor

Difficulty: Medium

Score: 2/10

Details: This issue can arise for a few different reasons, but the base case is an hreflang or alternate-page declaration that doesn't agree with either the referring page or the page being referred to.

Hreflang is implemented with some kind of error about 80% of the time in my experience. A lot of it stems from a mismatch between language and region and the way those designations are made.

You can use the hreflang testing tool to verify your declarations.
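For reference, a correct reciprocal set of declarations looks something like this (the URLs are illustrative). Note that the language code comes first and the region second (en-gb, not gb-en), which is where many of the mismatches come from:

<link rel="alternate" hreflang="en-us" href="https://example.com/en-us/" />
<link rel="alternate" hreflang="en-gb" href="https://example.com/en-gb/" />
<link rel="alternate" hreflang="x-default" href="https://example.com/" />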

[Back to Top | Errors | Warnings | Notices | Forget about it | Summary | Resources | Severity 10, 9, 8, 7, 6, 5, 4, 3, 2, 1, 0]



Resources Are Formatted as Page Links

Category: Technical SEO

Severity: Minor

Difficulty: Medium

Score: 2/10

Details: Web resources such as images, CSS files, and JavaScript files should ideally be linked using their direct paths or URLs. If they are formatted as page links, it can lead to inefficient page loading, broken elements, or even increase the number of unnecessary requests to the server.

Examples:

Instead of linking directly to an image (e.g., /images/pic.jpg), it’s linked as a page (e.g., /images/pic.html).

CSS or JavaScript files being accessed through a redirect rather than directly.

Increased server requests as browsers might try to load these as pages first before realizing they are resources.

NOTE: There are some fairly popular plugins that have this default behavior and will trigger these notices. It's not ideal, but it usually doesn't rise to the level of a redesign or code refactor.
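A hypothetical before/after to illustrate the difference:

<!-- Resource referenced as a page link (triggers the notice) -->
<a href="/images/pic.html">Product photo</a>

<!-- Resource referenced directly -->
<img src="/images/pic.jpg" alt="Product photo">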

[Back to Top | Errors | Warnings | Notices | Forget about it | Summary | Resources | Severity 10, 9, 8, 7, 6, 5, 4, 3, 2, 1, 0]



Links Have Non-Descriptive Anchor Text

Category: Linking

Severity: Minor

Difficulty: Easy

Score: 2/10

Details: It is general best practice to have descriptive anchor text. When you just have a "click here" or "Learn more" you are leaving context and relevance on the table. Sometimes it's not convenient to put additional descriptive text in your links (anchor text), however where it is possible you should do it.
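A quick illustration (the URL is hypothetical):

<!-- Non-descriptive -->
<a href="/guides/seo-audit/">Click here</a> to learn about SEO audits.

<!-- Descriptive -->
Read our <a href="/guides/seo-audit/">guide to prioritizing SEO audit findings</a>.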

This is a minor finding. You should have a good internal linking process in place and take advantage of links everywhere that you can.

NOTE: Look at how Wikipedia uses internal linking to link to any related piece of information. Wikipedia is the blueprint for internal linking.

For a bit of irony here is Wikipedia’s page on Internal Linking.

[Back to Top | Errors | Warnings | Notices | Forget about it | Summary | Resources | Severity 10, 9, 8, 7, 6, 5, 4, 3, 2, 1, 0]



1/10

Blocked by X-Robots-Tag: noindex HTTP header

Category: Technical SEO

Severity: Medium

Difficulty: Medium

Score: 1/10

Details: With this issue and with most index control-related issues, context matters. If a page is supposed to be noindexed then this is just a notice. If it is not supposed to be noindexed then this is an error.

Most SEO audit tools do not make this distinction; they can't know the context of your site or the reasons content may or may not need to be indexable.
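To check whether a page is actually sending the header (assuming curl is installed; the URL is illustrative):

curl -sI https://example.com/some-page | grep -i x-robots-tag
# X-Robots-Tag: noindex  <- fine if intentional, an error if not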

[Back to Top | Errors | Warnings | Notices | Forget about it | Summary | Resources | Severity 10, 9, 8, 7, 6, 5, 4, 3, 2, 1, 0]



Outgoing Links Contain a Nofollow Attribute

Category: Linking

Severity: Minor

Difficulty: Easy

Score: 1/10

Details: There are some valid reasons for putting a nofollow attribute on outgoing links. This is especially true when you have a lot of User Generated Content (UGC) and/or directory links that you do not want followed.

For most article and information-based sites consider what the nofollow attribute is saying. Nofollow essentially says “I don’t vouch for this content” and if you have content on your site that you don’t vouch for, what does that say about your site?

So this is not necessarily a problem, but as always, make sure you understand the reasons behind the finding and action or not as appropriate.

[Back to Top | Errors | Warnings | Notices | Forget about it | Summary | Resources | Severity 10, 9, 8, 7, 6, 5, 4, 3, 2, 1, 0]



Pages Don’t Have Character Encoding Declared

Category: On-Page SEO

Severity: Insignificant

Difficulty: Easy

Score: 1/10

Details: This one is of minor significance, and for the most part it will not have a significant bearing on how your HTML code is interpreted. There are some caveats when non-standard or non-typical characters are being used, depending on the region and the browser.

Here is a useful article describing how to use character sets in your code.

There are some best practices, and the likely default you will want to use is:

<meta http-equiv="Content-Type" content="text/html; charset=utf-8">
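In HTML5 the shorter form is equivalent and is what most modern sites use:

<meta charset="utf-8">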

This is a best practice, but it is likely near the bottom of most SEO audit prioritizations. That said, a single change usually has a site-wide effect and is easy to make, which improves its position on the impact-versus-effort scale and can justify bumping it up the queue.

[Back to Top | Errors | Warnings | Notices | Forget about it | Summary | Resources | Severity 10, 9, 8, 7, 6, 5, 4, 3, 2, 1, 0]



Duplicate meta descriptions

Category: On-Page SEO 

Severity: Insignificant

Difficulty: Easy

Score: 1/10

Details: The only reason this isn’t a 0 out of 10 is because it’s easy to resolve.

Duplicate meta descriptions are unlikely to be an issue for Google. They may be more of an issue for Bing; however, this is very much something that should be at the bottom of your priority list.

If there are obvious choices and changes to make and you have gone through the rest of your list then this is fine to focus on.

It’s also a nice one to check off the list but do the important stuff above first!

This is probably one of the most fitting uses for ChatGPT or other generative AI. I have a particular method where I pull in the context of the page (the title, the headings, etc.) and produce valid meta description summaries in bulk, which can then be batch-uploaded.

[Back to Top | Errors | Warnings | Notices | Forget about it | Summary | Resources | Severity 10, 9, 8, 7, 6, 5, 4, 3, 2, 1, 0]



Sitemap.xml not indicated in robots.txt

Category: Technical 

Severity: Insignificant

Difficulty: Easy

Score: 1/10

Details: This one is close to being on the forget about it list, however it is considered a best practice to put your XML sitemap declaration in your robots.txt file.

Example: Sitemap: https://stolber.com/sitemap_index.xml

Not having this certainly won't make or break anything, but if all other higher-prioritized SEO findings have been addressed, it is a quick and easy win.

[Back to Top | Errors | Warnings | Notices | Forget about it | Summary | Resources | Severity 10, 9, 8, 7, 6, 5, 4, 3, 2, 1, 0]



Pages have a JavaScript and CSS Total Size that is too Large

Category: Technical SEO

Severity: Minor

Difficulty: Medium

Score: 1/10

Details: This one is hard to score: if the issue is breaking the site, or breaking the way it renders for Google, it could be a major-severity issue.

It’s possible that one massive file becomes so large that it is too big for some browsers.

These issues are rare and are usually indicative of a wider web development issue.

[Back to Top | Errors | Warnings | Notices | Forget about it | Summary | Resources | Severity 10, 9, 8, 7, 6, 5, 4, 3, 2, 1, 0]



Forget About It

This section is for those legacy SEO issues, myths and misunderstandings that find their way into SEO audits.

Sometimes these items are prioritized as significant and look very scary. They look even scarier when the output of the audit is sent to the client as a simple list: "Here is what this automated audit we ran found. You should be very worried!"

Low Text-HTML Ratio

Category: Content 

Severity: None

Score: 0/10

Details: This is something that some SEO tools consider a warning or even higher; in my opinion, however, it is of no concern.

This is a legacy SEO finding. Way back in the day, circa the early 2000s, huge bloated code did cause some issues with crawling.

This was back when search engine bots were just code parsers and weren’t true headless browsers that render pages.

Today this is a non-issue, with the very slight caveat that there is a maximum HTML size limit (around 6 MB). For all intents and purposes, this means it is irrelevant.

Spend exactly 0 time worrying about this one.

[Back to Top | Errors | Warnings | Notices | Forget about it | Summary | Resources | Severity 10, 9, 8, 7, 6, 5, 4, 3, 2, 1, 0]



Pages Have Duplicate H1 and Title Tags

Category: On-Page SEO

Severity: Insignificant

Difficulty: Easy

Score: 0/10

Details: There are some audit findings that consider having a Title and an H1 heading with the exact same text as an issue.

I do not consider this so!

It is perfectly natural and reasonable for a page title and heading to be the same. The heading is, after all, what the page is about… that sure sounds like a title, don't it?

I put this one firmly in the forget about it category.

[Back to Top | Errors | Warnings | Notices | Forget about it | Summary | Resources | Severity 10, 9, 8, 7, 6, 5, 4, 3, 2, 1, 0]



Page Has an Underscore in the URL

Category: Technical

Severity: None

Difficulty: N/A

Score: 0/10

Details: This is another legacy SEO finding. Technically, an underscore (_) and a minus/hyphen (-) character are very different in terms of separating words in URLs.

Minus signs were preferred in the early days of SEO because they acted as true logical separators, whereas an underscore was treated as just another ASCII character and therefore did not separate words at all. Before search engines got really good at understanding text and context, it was best practice to separate words in URLs with a - sign.

The other option for separating words in a URL is a space character. You will likely have seen URLs with %20 in them; %20 is the URL encoding of a space character.

For this reason, people preferred to use - or _ as separators.

Today I don't believe it is any concern at all, so I demote this finding to, at most, a notification.

NOTE: It is almost NEVER a good idea to change a URL to be “more keyword friendly” when it is already indexed and ranking. You take a URL that Google knows about and sends traffic to and replace it with one that it doesn’t.

There are lots of SEO gurus who advise on the perfect URL structure for SEO, and while there are some best practices, this advice usually misses the VERY important distinction between creating a new page (and thus a new URL) and changing the URL of an existing page. They are very different cases.

[Back to Top | Errors | Warnings | Notices | Forget about it | Summary | Resources | Severity 10, 9, 8, 7, 6, 5, 4, 3, 2, 1, 0]



NOTE on Compressing and Minifying

Compressing and minifying is a category that generates lots of findings in audits. 

On the face of it, they look quite scary and result in a large number of errors & warnings.

However, often what we are talking about here is already handled by a good CDN setup.

CDNs create optimized versions of a site: resizing images, loading the appropriate code libraries, and serving pre-built versions of pages for different device and screen-size combinations, all stored on local nodes around the world.

This fixes and negates the vast array of issues in this category.

It's still best practice to fix this at source; there is no point shipping a lot of dead white space around the web.

Minification is a fancy way of saying stripping out the unused white space.
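A trivial illustration of what minification does:

/* Before minification */
body {
  margin: 0;
  color: #333;
}

/* After minification: the same rules, with the white space stripped out */
body{margin:0;color:#333}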

Also, most modern secure transfer protocols do some degree of compression too.

I am certainly not advocating ignoring these but I am putting them into context.

170 KB of unminified JS & CSS files is not the reason your site is not ranking.



Unminified JavaScript and CSS files

Category: Technical SEO

Severity: None

Difficulty: Medium

Score: 0/10

Details: Minification is the process of stripping the white space out of code. A space character represents one byte of data, so stripping out all the white space can reduce the code size by a fair bit.

With today’s modern web technology, CDNs and parsing ability I put this firmly in the does not matter category.

Don't get me wrong here: there is nothing wrong with doing this, and it's not a bad practice; it's just usually not necessary.

[Back to Top | Errors | Warnings | Notices | Forget about it | Summary | Resources | Severity 10, 9, 8, 7, 6, 5, 4, 3, 2, 1, 0]



Uncached JavaScript and CSS files

Category: Technical SEO

Severity: None

Difficulty: Medium

Score: 0/10

Details: Caching and cache control are important for many reasons. However, modern CDNs and browsers with the latest transfer protocols have advanced methods of file handling and request management. From an SEO perspective only, this is a non-issue.

[Back to Top | Errors | Warnings | Notices | Forget about it | Summary | Resources | Severity 10, 9, 8, 7, 6, 5, 4, 3, 2, 1, 0]



AMP – Generalization 

AMP is essentially redundant nowadays.

AMP (Accelerated Mobile Pages) was a good idea, however it didn't get widespread acceptance and technology moved on, making it pretty much redundant.

For this reason I don’t pay much attention to AMP pages anymore.

The principles behind AMP were very sound.

Web technology moved in a different direction, favoring functionality and feature-rich experiences over raw speed and performance; however, several other web technologies have since evolved to enhance site performance.

Google’s use of page experience and core web vitals in their algorithms and in search console reports should also serve as a good reminder of how important these factors are.

Typical AMP-related audit findings, all of which can be safely deprioritized:

  • AMP pages have no canonical tag
  • AMP HTML issues
  • AMP style and layout issues
  • AMP templating issues

Summary & Conclusion

In this article, I have taken a deep dive into the different types of findings that SEO tools provide.

I have given some context and prioritization to these issues.

Note that individual situations are usually different and this is generalized advice. There are many reasons why the severity and difficulty of an issue would be more or less important depending on the all-important context of the finding, the site, and the audit itself.

If you don’t agree with something here let me know.

If you have an update or useful reference let me know and I will consider amending and crediting you for the resource.

References & Resources

  1. https://developer.mozilla.org/en-US/docs/Web/HTTP/Status
  2. https://hreflang.org/
  3. https://en.wikipedia.org/wiki/Internal_link
  4. https://www.semrush.com/blog/hreflang-attribute-101/
  5. https://www.seobility.net/en/wiki/Character_Encoding
  6. https://www.searchenginejournal.com/google-iframes-debunking-myths-understanding-seo-impact/484037/
  7. https://www.w3schools.com/tags/tag_iframe.ASP
  8. https://developers.google.com/search/docs/appearance/structured-data
  9. https://developers.google.com/search/docs/specialty/ecommerce/pagination-and-incremental-page-loading
