How to Perform an Effective SEO Audit

If you need to perform an SEO audit, what are the first steps to take? The first step is to determine your goals for the audit. Ranking higher in Google is a different goal than improving conversion rates or increasing traffic, and whatever your goals may be, they will dictate how the rest of your audit goes. The next step is figuring out where problems occur on the page and why they exist. For example, if people leave before filling out your form because they can’t find it, there may be something wrong with the button’s location on the screen or its size. It could also be that there was no call to action at all! This article walks through the steps needed to perform a thorough SEO audit.

Crawl your website

Every SEO audit starts with crawling your website using an automated tool, which checks for errors quickly and efficiently. Manually checking for errors would be too time-consuming, and you could easily miss faults.

To automate the process, I recommend using a dedicated crawler such as Screaming Frog or SEMrush’s Site Audit, both of which are referenced later in this article.

With one of these tools, you can schedule a weekly technical SEO audit, and the crawler will automatically re-crawl your website on that schedule.
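
If you want a quick first pass without a commercial crawler, the sketch below is a minimal Python crawler that walks internal links and records any 4XX/5XX responses. It assumes the requests and beautifulsoup4 packages and a hypothetical example.com site, and it is a rough sketch for small sites rather than a replacement for a dedicated audit tool.

```python
# Minimal internal-link crawler that flags 4XX/5XX responses.
# Assumes: pip install requests beautifulsoup4
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

START_URL = "https://example.com/"  # hypothetical site
seen, queue, errors = set(), [START_URL], {}

while queue:
    url = queue.pop(0)
    if url in seen:
        continue
    seen.add(url)
    try:
        resp = requests.get(url, timeout=10)
    except requests.RequestException as exc:
        errors[url] = str(exc)
        continue
    if resp.status_code >= 400:
        errors[url] = resp.status_code  # 4XX or 5XX response
        continue
    if "text/html" not in resp.headers.get("Content-Type", ""):
        continue
    soup = BeautifulSoup(resp.text, "html.parser")
    for a in soup.find_all("a", href=True):
        link = urljoin(url, a["href"]).split("#")[0]
        if urlparse(link).netloc == urlparse(START_URL).netloc:
            queue.append(link)

for url, status in errors.items():
    print(status, url)
```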

What is an SEO Audit?

An SEO audit is the process of analyzing how well your web presence aligns with best practices – it’s the first step in creating an implementation plan with measurable results. The purpose behind this analysis? To identify as many foundational issues affecting organic search performance as possible so that you can start fixing them right away!

Technical Errors

Starting with the scan, you will want to look for errors related to:

  1. 4XX, 5XX errors

    (Image: SEMrush Site Audit errors)

    If people keep hitting error pages (such as a 404 – page not found) caused by broken internal links, broken SERP links, or broken CSS and JS files, they will go somewhere else to find the answers or products they are looking for.

    Google Search Console’s crawl errors report is a great way to identify the errors Googlebot encounters when crawling your website. To view these, you will need to set up your site as a GSC property.

  2. Missing & Duplicate Title Tags and Meta Descriptions

    Google uses this metadata as the tiny billboards within the Search Engine Results Pages (SERPs); it is what entices people to click on your link. If it’s not optimized, there is a significant chance they will go somewhere else.

    Title tags and H1s (header tags) should not be the same: the title should attract people to click, whereas the H1 should explain what the article is about. (A small script for spotting missing or duplicate titles and descriptions is sketched after this list.)

  3. Duplicate content

    When you talk to people, you don’t want to hear the same thing repeatedly. You want a new and engaging conversation. The same can be said for website copy. So keep your articles unique and captivating.

    Copyscape is an excellent tool for finding plagiarized content.

  4. Broken images (Missing ALT tags)

    If your website’s images are broken, people viewing your site will question its authority and how well it’s maintained, and they will probably bounce to somewhere more relevant. (A quick check for missing alt text is sketched after this list.)

  5. Missing Sitemap.xml

    (Image: an XML sitemap)

    XML sitemaps are one of the first things search engines use to crawl your website. Web crawlers can find new pages without an XML sitemap, but a sitemap makes it much easier for them to discover new content.

    The robots.txt file in your root directory is where you indicate to crawlers where your sitemap is located. (A short script for reading the Sitemap: entries out of robots.txt is sketched after this list.)

  6. Redirect chain loops

    A redirect loop occurs when a URL is redirected to another URL, which redirects back to the previously requested URL, leading to an infinite cycle of redirects. These redirect loops are a closed chain of redirects.

    When someone encounters one of these errors on your website, the page will fail to load, providing a negative experience for the user. (A redirect-tracing sketch appears after this list.)

  7. Incorrect Canonical tags

    A canonical tag tells search engines that a specific URL represents the master copy of a page. Using the canonical tag prevents problems caused by identical or “duplicate” content appearing on multiple URLs.

    An example of this is reflected in these URL structures:
    http://domain.com/seo-audit/
    https://domain.com/seo-audit
    https://domain.com/seo-audit/
    https://www.domain.com/seo-audit/

    I’ve commonly seen the canonical tag still pointing to the non-SSL version of a website because it was never updated after migrating to HTTPS, which shows search engines multiple versions of your site. (A quick canonical check is sketched after this list.)

  8. Invalid Structured data (Schema)

    Adding Schema/structured data is a great way to give search engines important information about your website, but that information needs to be validated.

    When the data is invalid, it might not be used by search engines, so it is essential to test the results with Google’s developer tools (a basic JSON-LD sanity check is also sketched after this list):
    Schema Markup Validator
    Rich Results Test

  9. A large number of 301 redirects

    Having a few 301 redirects is fine for a website; if over 20% of its links are redirects, that could be a cause for alarm. Every time a URL is redirected, there is a slight lag in performance, and a large number of redirects signals to Google that your site is sluggish and doesn’t offer the best user experience.

    A 301 redirect is a permanent redirect configured on the host/server that tells browsers and crawlers where the new page is located.

  10. Thin content

    (Image: Screaming Frog SEO audit)

    Thin content, dead pages, and pages with low word count will lower your overall quality score. This doesn’t mean you need to write extra-long articles, but you must provide engaging value to readers.

    Dead pages are articles that no one visits or reads. You can identify dead pages by looking at analytics data for how many visitors a page receives over a six-month to one-year timeframe; if no one has visited a page in the past few months, it’s time to rewrite or delete it.

    Screaming Frog automates this process through its Google Analytics and Google Search Console API integrations.

  11. Slow pages

    It’s vital to have a website that loads quickly and efficiently. You can test your page speeds using PageSpeed Insights or Lighthouse (a small API sketch appears after this list).

    Website speed is essential for a good user experience. If your website speed is low, your conversion rates are likely to be low as well. On average, if your website takes longer than 3 seconds to load, approximately 40% of your visitors will abandon your site.

  12. HTTPS leads to HTTP or Staging

    If you want your website to rank well in Google, Google must know which version of it to index. It is very easy to accidentally leave a staging server indexable and linked from the production website, or to have links pointing at an HTTP version of your site.

    To manually check pages for this error, you can hover over links and images in your browser to see the URL path. (An automated check for insecure or staging references is sketched after this list.)

  13. Orphaned pages & Crawl depth (Internal linking)

    PageRank is an algorithm that measures the importance of web pages using anchor text, links, and content; it helps rank pages in the search engine results by calculating how important they are.

    When a page has no internal links pointing to it (orphaned content) or has a crawl depth over three, this tells Google it is not a valuable page. (A crawl-depth and orphan-page sketch appears after this list.)

    It’s important to link pages accordingly and not to go overboard: overlinking a page dilutes the PageRank it passes along. Also, when writing anchor text, try to be descriptive and non-repetitive.

  14. Check Indexation

    (Image: Google Search Console Coverage report)

    Google does not index every page you publish; it only indexes pages it finds valuable to searchers.

    You can check how many pages are indexed by using:

    In Google Search Console’s Index section, the number of indexed pages is listed under Coverage >> Valid.

    The second way to see how many pages Google has indexed is to use an advanced search operator in Google Search. Type site:ReplaceWithYourDomainURL.com into the search box.

  15. Analyze Content

    To have an engaging website, it’s essential to view it as if you were a reader who has never seen it before. With these fresh eyes, ask yourself: Is your site appealing? Does it outperform the competition? Is it functional and easy to use? Do you adequately answer readers’ questions? Then see where you can improve on a page-by-page level.

    SurferSEO is a great automated tool that can help with this on-page SEO optimization and with fixing pages on your site.

    This analysis will help you silo (organize) your content in straightforward steps, give you ideas for better articles and quick wins (keyword research), and provide insight into external linking opportunities.
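
For item 2, here is a minimal sketch of the missing/duplicate title and meta description check. It assumes the requests and beautifulsoup4 packages and a hypothetical, hand-made list of URLs; in practice, a crawler export would supply that list.

```python
# Collect <title> and meta description per URL, then flag missing and duplicate values.
from collections import Counter

import requests
from bs4 import BeautifulSoup

urls = ["https://example.com/", "https://example.com/seo-audit/"]  # hypothetical list
titles, descriptions = {}, {}

for url in urls:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    titles[url] = soup.title.string.strip() if soup.title and soup.title.string else ""
    meta = soup.find("meta", attrs={"name": "description"})
    descriptions[url] = (meta.get("content") or "").strip() if meta else ""

for url in urls:
    if not titles[url]:
        print("Missing title:", url)
    if not descriptions[url]:
        print("Missing meta description:", url)

for value, count in Counter(titles.values()).items():
    if value and count > 1:
        print(f"Title used on {count} pages:", value)
```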
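
For item 4, this short sketch lists the images on a single page that are missing alt text (same package assumptions, hypothetical URL):

```python
# List <img> tags that are missing alt text on a single page.
import requests
from bs4 import BeautifulSoup

url = "https://example.com/seo-audit/"  # hypothetical page
soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

for img in soup.find_all("img"):
    if not (img.get("alt") or "").strip():
        print("Missing alt text:", img.get("src"))
```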
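
For item 5, Python’s standard library can read robots.txt and report any declared sitemaps; site_maps() needs Python 3.8 or newer, and the domain below is hypothetical:

```python
# Read robots.txt and print any Sitemap: entries it declares.
# RobotFileParser.site_maps() requires Python 3.8+.
from urllib.robotparser import RobotFileParser

parser = RobotFileParser("https://example.com/robots.txt")  # hypothetical domain
parser.read()
sitemaps = parser.site_maps()  # returns None when no Sitemap: lines are present

if sitemaps:
    for sitemap_url in sitemaps:
        print("Declared sitemap:", sitemap_url)
else:
    print("No Sitemap: directive found in robots.txt")
```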
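
For item 6, the sketch below follows redirects one hop at a time so chains and loops become visible (assuming requests and a hypothetical starting URL):

```python
# Follow redirects one hop at a time to expose chains and loops.
import requests

def trace_redirects(url, max_hops=10):
    hops = [url]
    while len(hops) <= max_hops:
        resp = requests.get(hops[-1], allow_redirects=False, timeout=10)
        if resp.status_code not in (301, 302, 307, 308):
            return hops, "ok"
        next_url = requests.compat.urljoin(hops[-1], resp.headers.get("Location", ""))
        if next_url in hops:
            return hops + [next_url], "redirect loop"
        hops.append(next_url)
    return hops, "redirect chain too long"

hops, verdict = trace_redirects("http://example.com/seo-audit")  # hypothetical URL
print(verdict + ":", " -> ".join(hops))
```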
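
For item 7, this sketch compares a page’s canonical tag with the URL that actually serves the page; the preferred URL is hypothetical and mirrors the example structures above:

```python
# Compare the canonical tag on a page with the URL that actually serves it.
import requests
from bs4 import BeautifulSoup

url = "https://www.domain.com/seo-audit/"  # hypothetical preferred URL
resp = requests.get(url, timeout=10)
soup = BeautifulSoup(resp.text, "html.parser")
canonical = soup.find("link", rel="canonical")

if canonical is None:
    print("No canonical tag found on", url)
elif canonical.get("href") != resp.url:
    print("Canonical mismatch:", canonical.get("href"), "vs", resp.url)
else:
    print("Canonical OK:", resp.url)
```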
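
For item 8, nothing replaces Google’s validators, but a quick sanity check that each JSON-LD block on a page at least parses as valid JSON can catch obvious breakage early (hypothetical URL, requests and beautifulsoup4 assumed):

```python
# Extract JSON-LD blocks from a page and confirm they at least parse as JSON
# before running them through the Schema Markup Validator / Rich Results Test.
import json

import requests
from bs4 import BeautifulSoup

url = "https://example.com/seo-audit/"  # hypothetical page
soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

for i, script in enumerate(soup.find_all("script", type="application/ld+json"), start=1):
    try:
        data = json.loads(script.string or "")
        schema_type = data.get("@type") if isinstance(data, dict) else "(multiple items)"
        print(f"Block {i}: parses OK, @type = {schema_type}")
    except json.JSONDecodeError as exc:
        print(f"Block {i}: invalid JSON ({exc})")
```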
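
For item 11, page speed checks can also be scripted against the public PageSpeed Insights v5 API; light use works without an API key, heavier use needs one, and the URL below is hypothetical:

```python
# Query the public PageSpeed Insights v5 API for a Lighthouse performance score.
import requests

endpoint = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
params = {"url": "https://example.com/", "strategy": "mobile"}  # hypothetical URL

report = requests.get(endpoint, params=params, timeout=60).json()
score = report["lighthouseResult"]["categories"]["performance"]["score"]
print("Mobile performance score:", round(score * 100))
```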
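
For item 12, this sketch flags links and images that still point at http:// or at a staging hostname from an HTTPS page (the staging hostname is a hypothetical placeholder):

```python
# Flag links and images that point at http:// or a staging host from an HTTPS page.
import requests
from bs4 import BeautifulSoup

url = "https://example.com/"  # hypothetical production page
staging_hosts = ("staging.example.com",)  # hypothetical staging hostname

soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
targets = [a.get("href", "") for a in soup.find_all("a")]
targets += [img.get("src", "") for img in soup.find_all("img")]

for target in targets:
    if target.startswith("http://") or any(host in target for host in staging_hosts):
        print("Insecure or staging reference:", target)
```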
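
For item 13, a breadth-first crawl from the homepage gives each page’s crawl depth, and comparing the crawled URLs against the sitemap surfaces orphaned pages. This rough sketch assumes a small site with a single flat sitemap.xml and uses hypothetical URLs:

```python
# Breadth-first crawl from the homepage to record crawl depth per page,
# then compare against the sitemap to surface orphaned URLs.
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

START = "https://example.com/"               # hypothetical site
SITEMAP = "https://example.com/sitemap.xml"  # hypothetical flat sitemap

depth, queue = {START: 0}, [START]
while queue:
    url = queue.pop(0)
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    for a in soup.find_all("a", href=True):
        link = urljoin(url, a["href"]).split("#")[0]
        if urlparse(link).netloc == urlparse(START).netloc and link not in depth:
            depth[link] = depth[url] + 1
            queue.append(link)

sitemap_soup = BeautifulSoup(requests.get(SITEMAP, timeout=10).text, "html.parser")
sitemap_urls = [loc.get_text(strip=True) for loc in sitemap_soup.find_all("loc")]

for url in sitemap_urls:
    if url not in depth:
        print("Orphaned (in sitemap, not reachable by internal links):", url)
for url, d in depth.items():
    if d > 3:
        print(f"Crawl depth {d}:", url)
```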

Tracking Results

After performing an SEO audit and fixing what you can, it is vital to track the changes, analyze your results, and see their effect on your organic traffic. The optimization work is never done; it is an ongoing process.
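
If you prefer to pull those numbers programmatically rather than from the Search Console interface, the sketch below queries the Search Console API for daily clicks and impressions. It assumes the google-api-python-client package and that creds already holds OAuth credentials authorized for your property; that part is a placeholder, not working code.

```python
# Pull daily clicks and impressions for a verified property from the Search Console API.
# Assumes google-api-python-client is installed and OAuth credentials are already set up.
from googleapiclient.discovery import build

creds = ...  # placeholder: load OAuth credentials authorized for your GSC property

service = build("searchconsole", "v1", credentials=creds)
response = service.searchanalytics().query(
    siteUrl="https://example.com/",  # hypothetical verified property
    body={
        "startDate": "2024-01-01",
        "endDate": "2024-03-31",
        "dimensions": ["date"],
    },
).execute()

for row in response.get("rows", []):
    print(row["keys"][0], "clicks:", row["clicks"], "impressions:", row["impressions"])
```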

About the author
Isaac Adams-Hands

Isaac Adams-Hands is an SEO freelancer in Ottawa, where he helps clients plan marketing goals that are keyword-optimized and measurable.

He has worked at Microsoft, the Institute of Chartered Accountants in Australia, Auto Trader, Le Cordon Bleu, and Algonquin College in various digital marketing roles.

Isaac is qualified as a full-stack developer, server administrator, and cybersecurity expert, which adds further depth to his Search Engine Optimization knowledge.

His Inuit heritage brought him to the Arctic to hunt and fish for most summers, which grew his passion for 4-wheelers and dirtbikes.