How to Conduct a Website Audit: A Step-by-Step Guide
What is a Website Audit?
A website audit, often called a technical SEO audit, is the practice of examining the technical aspects of a website to understand how it performs.
It is crucial for ensuring your website functions efficiently and ranks well in search engines.
It helps identify issues that affect user experience, indexing, and site speed.
Ensuring your site is in good working order makes it easier for search engines to crawl and rank it.
Crawling is the process where search engines like Google use automated bots (called spiders or crawlers) to scan websites. These bots move through links on your site, collecting data on each page’s content, structure, and keywords.
Indexing is the next step, where the crawled pages are analysed and stored in the search engine’s database so they can appear in search results.
The terms "crawl" and "index" are often used interchangeably, although they are different (but closely related) actions. Crawling and indexing help search engines understand what your site is about so they can rank it appropriately in search results, making it easier for people to find your website online.
Why Perform a Website Audit?
Regular audits help prevent penalties and ensure your site operates smoothly. When Google can easily crawl your site, it improves your chances of ranking higher.
Conducting audits frequently, especially if your website is constantly updated, also helps catch and fix issues promptly.
If you have a website that is constantly being updated, we recommend running tools like Lighthouse and checking Google Search Console on a monthly basis. We do this for clients to make sure new pages are being indexed and that, when pages are removed, no broken links or 404 errors are left behind.
Tip: A more thorough technical SEO audit should be carried out once per year.
The Key Areas of a Technical SEO Audit:
1. Website Structure & Crawling
A flatter structure (fewer subfolders) makes a site easier for both users and search engines to navigate. When a page takes more than 3 clicks to reach from the homepage, it is considered a deep page and is harder for users to reach.
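For example, the same page might sit at very different depths depending on how the site is organised (the URLs below are purely illustrative):

```
Deep structure (4 clicks from the homepage):
https://www.example.com/services/digital/seo/technical/audit

Flatter structure (2 clicks from the homepage):
https://www.example.com/services/technical-seo-audit
```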
Tools like Screaming Frog and Octopus.do are invaluable in visualising a website’s architecture and identifying "deep" pages or revealing poor site navigation (if present).
The way some sites are built means a flat structure isn’t always possible, but either way, visualising the architecture helps you understand the relationship between different pages and where they should live.
You can use paid tools like Screaming Frog, Ahrefs, and Semrush to crawl a website, or free tools like Google Search Console and Lighthouse to find errors and issues.
Common crawling issues include errors in the robots.txt or sitemap.xml file - both can prevent Google from crawling your site properly. A slow server response or a particularly slow site may also contribute to this.
2. Indexing
To ensure all important pages on your site are being indexed, you should submit a sitemap.xml to Google Search Console (most CMSs generate a ready-made sitemap for you).
Once done, regularly check the robots.txt file for pages that are accidentally being blocked.
You can check this by using the "robots.txt Tester" tool in Google Search Console.
Warning: When editing the robots.txt file, proceed with extreme caution. Making incorrect changes can prevent search engines from indexing your site properly, which may negatively impact your SEO.
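To illustrate why care is needed, the difference between blocking a single folder and blocking the entire site can be as small as one character (a contrast sketch, not a recommendation for your site):

```
# Example A - blocks crawlers from the /admin/ folder only
User-agent: *
Disallow: /admin/

# Example B - blocks crawlers from the ENTIRE site
User-agent: *
Disallow: /
```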
If you’re unsure, it's best to consult an expert to avoid unintended issues. Our team are on hand to help and can be reached here.
The URL Inspection Tool in Search Console also allows you to check individual page statuses and request indexing if needed.
3. Site Speed & Performance
Google may not always appear to prioritise site speed, but users will still struggle to use a slow or unresponsive website.
Site speed directly affects user experience, especially on mobile devices. Large, uncompressed images, unnecessary plugins, and a bloated Google Tag Manager container are common speed offenders.
At JL Creative, we use various tools to perform individual tests; however, tools like Lighthouse offer a quick and free way to test performance.
To improve loading speeds, optimise and compress images, remove unnecessary plugins, and get rid of any old or unused tags in Google Tag Manager.
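As a simple sketch (the file names and sizes are hypothetical), a compressed, lazy-loaded image with explicit dimensions is usually a good starting point:

```
<!-- Compressed WebP image, lazy-loaded, with explicit width/height -->
<img src="/images/hero-1200w.webp"
     srcset="/images/hero-600w.webp 600w, /images/hero-1200w.webp 1200w"
     sizes="(max-width: 600px) 100vw, 1200px"
     width="1200" height="600"
     loading="lazy"
     alt="Team reviewing a website audit">
```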
If you are using tools like Hotjar or Clarity, pause them when you have finished using them, as their tracking scripts add extra requests to every page load.
However, you should be aware of false positives: some CMSs trigger errors in Lighthouse and other tools when there is actually nothing wrong.
4. Mobile Optimisation
Mobile-friendliness and performance can be assessed using Google Search Console’s Core Web Vitals report, but Screaming Frog and Ahrefs provide a more in-depth analysis.
We also use heat maps and user testing to check the mobile user experience. Common issues include overlapping text caused by poor responsive design, or images and elements shifting around as the page loads. It is important to check for these issues and resolve them to ensure a genuinely responsive design.
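One common fix for elements shifting around as they load is to reserve their space up front; a minimal sketch (the class name is hypothetical):

```
<!-- Explicit width/height lets the browser reserve space before the image loads -->
<img src="/images/team-photo.jpg" width="800" height="450" alt="Our team">

<style>
  /* Reserve a 16:9 slot for an embed or video so the content below it doesn't jump */
  .embed-slot {
    width: 100%;
    aspect-ratio: 16 / 9;
  }
</style>
```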
5. HTTPS and Security
Ensuring your site is secure (using HTTPS) is critical, especially if you handle sensitive data. On older pages, we often discover hard-coded HTTP links - we recommend updating them manually rather than relying on a redirect (where possible). Another common issue is the site being accessible at multiple addresses, e.g. both the https:// and http:// versions.
If both the HTTP (non-secure) and HTTPS (secure) versions of your site are accessible, the same page can be reached via two different protocols, which can lead to duplicate content and mixed content issues (secure pages loading non-secure resources). Remember to add canonical tags in the <head> section of your pages to indicate the preferred version of each URL.
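How you consolidate the two versions depends on your server or CMS; as a sketch, on an Apache server a site-wide redirect to HTTPS might look like the snippet below (test carefully before deploying):

```
# .htaccess (Apache) - send all HTTP traffic to the HTTPS version with a 301
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [L,R=301]
```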
6. URL Structure & Redirects
Similar to website structure, a clean URL structure reflects your site hierarchy, helping search engines navigate it. 301 redirects are permanent, while 302s are temporary.
301s play a critical role in scenarios like migrating a site or retiring old content that still drives traffic (check out the website migration we conducted for GTEC).
These redirects ensure that users and search engines are directed to the correct pages without losing SEO value.
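For example, a retired page can be pointed at its closest replacement; on an Apache server this can be a single rule (the URLs are hypothetical):

```
# .htaccess (Apache) - permanently redirect a retired page to its replacement
Redirect 301 /old-services-page/ https://www.example.com/services/
```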
To identify issues like 404 errors, we can use Google Search Console or tools like Screaming Frog and Ahrefs. Based on findings, we can recommend 301 redirects to preserve traffic or opt for de-indexing if the page no longer serves a purpose in search results.
7. Sitemaps & Robots.txt
Your XML sitemap should be submitted to both Google Search Console and Bing Webmaster Tools. Make sure it contains all critical pages and doesn’t include "noindex" URLs.
Proper XML formatting is essential: each URL entry must include a <loc> tag and can also carry optional tags like <lastmod>, <changefreq>, and <priority>. We also use tools like Screaming Frog or Ahrefs to crawl the sitemap and check the status of each URL.
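A single entry in a well-formed sitemap looks something like this (the URL and date are placeholders):

```
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/services/</loc>
    <lastmod>2024-06-01</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```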
Because there is quite a long and exhaustive list of checks when it comes to an XML sitemap, we have a separate blog about How to Evaluate the XML Sitemap.
Issues with your robots.txt configuration, such as blocking important pages, can seriously harm your website’s performance, so it is important to work with an experienced SEO professional and developer when dealing with it.
The robots.txt file’s main purpose is to tell search engines what not to crawl, so changing something here could be detrimental. At JL Creative we check whether there are any pages that need to be blocked from crawling - this usually includes admin or login areas or staging sites, but could also include other pages or files.
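As a sketch (the paths are hypothetical), a typical robots.txt that keeps crawlers out of admin and staging areas while pointing them at the sitemap might look like this:

```
# Keep crawlers out of admin, login, and staging areas
User-agent: *
Disallow: /admin/
Disallow: /login/
Disallow: /staging/

# Point crawlers at the XML sitemap
Sitemap: https://www.example.com/sitemap.xml
```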
8. Duplicate Content & Canonical Tags
Duplicate content can confuse search engines about which version of a page to rank, and Ahrefs is a great tool for identifying duplicates.
Fixing this issue is either a case of updating the content or adding a canonical tag to inform search engines which is the preferred piece of content, helping consolidate page authority.
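For example, if the same page is reachable at several URLs, each duplicate can carry a canonical tag in its <head> pointing at the preferred version (the URL is hypothetical):

```
<!-- Placed in the <head> of every duplicate or variant page -->
<link rel="canonical" href="https://www.example.com/services/technical-seo-audit/">
```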
Tools like Screaming Frog, Ahrefs, Semrush, or Sitebulb can crawl your website and report on canonical tags for each URL. These tools highlight pages that are missing canonical tags, have conflicting or inconsistent tags, or point to incorrect URLs.
In Screaming Frog, for instance, you can check the Canonical Links tab to see the status of canonicalisation across the site.
9. Structured Data & Schema Markup
Using schema markup helps search engines better understand what your website does, or what a page is for, particularly for local SEO.
This can be checked in Google Search Console’s Enhancements reports and in tools like Ahrefs. JL Creative also looks at similar websites and competitors to see if any structured data opportunities are being missed.
In our opinion, the most important schema type to include on a site is breadcrumbs, particularly for sites with an unavoidably deep structure: breadcrumbs are a trail of links that show where you are on a website and help both users and search engines understand its structure.
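Breadcrumbs are usually marked up with JSON-LD; a minimal sketch (the names and URLs are placeholders):

```
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1, "name": "Home", "item": "https://www.example.com/" },
    { "@type": "ListItem", "position": 2, "name": "Services", "item": "https://www.example.com/services/" },
    { "@type": "ListItem", "position": 3, "name": "Technical SEO Audit" }
  ]
}
</script>
```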
Always run your markup through a schema checking tool (such as Google’s Rich Results Test) to make sure your structured data will work. These tools can be temperamental, so you may also have to wait and see whether your structured data appears under Enhancements in Google Search Console.
Final Recommendations
After an audit, we come together as a team to discuss what will make the biggest difference. We prioritise fixing critical issues like sitemap errors, indexing problems, and performance bottlenecks.
We also have a checklist to make sure we don’t waste time on false positives. We then have a meeting with our client to go through our findings and recommendations, as there may be some tasks that require developer or in-house involvement.
Our Technical SEO Audit Services are designed to make the process as transparent as possible. We stay away from jargon because our job is to help our clients understand what is going on with their website, not cause confusion! We are super responsive and transparent throughout the entire process. We are also extremely thorough and a perfect choice for medium-sized businesses.
After we conduct your audit, you can either take the audit away and work on the recommendations yourself, or we can become an extension of your team and help you grow your online footprint.
Contact us today to schedule your website audit and start improving your online presence.