Technical SEO matters! A lot.
In this article, I'm going to share a guide to the technical, on-page SEO checks that blog and website owners, from beginners to SEO pros, need in order to make sure a site is optimized for search engines like Google and Bing.
You will find lots of blog posts already written on this topic by technical SEO experts, so we will link this checklist to that published content. That way you get a detailed guide in one place rather than having to refer to multiple articles.
Every website needs a technical SEO audit because it's a core part of the SEO process. Without an audit in hand, you cannot find the mistakes or bugs you need to improve on.
If you want to learn to conduct a technical SEO audit yourself but don't know where to begin, don't worry. We are going to take you through a detailed guide that will help you audit your SEO.
It is certainly not rocket science, but it does require basic technical knowledge of HTML and programming, along with FTP.
Step this way… Whether you are auditing for the first time or investigating issues on a client's website, I'll cover the key methods and tools, along with the main areas to focus on, that will help you conduct the audit.
So, let’s begin!
#1. What is Technical SEO?
Technical SEO might sound terrifying, but actually, it is straightforward to understand. Technical SEO refers to the process of optimising your website to ensure it is meeting the requirements set in place by a search engine, such as Google, with one ultimate goal: to improve your organic rankings!
How does it work, and why is it important?
Search engines will send bots, or spiders, as they are commonly called, to find, crawl, and then index your site, and they want to do this in the most efficient manner possible. If your website is hitting their requirements, they will be able to effortlessly crawl and index your site, increasing your chances of ranking higher already! If your technical SEO is not to standard, then you won’t rank. It is as simple as that, and why the importance of technical SEO cannot be dismissed.
#2. What is a Technical SEO Audit?
In layman's terms, a technical SEO audit is the process of identifying errors in a website that might be causing issues for search engines when their crawlers visit its pages, and then creating a report for improvement.
Wait a moment… Before we move ahead, read these important technical SEO audit points.
Technical SEO Audit Points
- The SEO audit takes time and requires basic knowledge of on-page optimization. If you are a newbie in SEO, read the Beginner's Guide to SEO by Moz
- You can't use this SEO audit checklist for all types of websites/blogs
- An SEO audit for a static, dynamic or e-commerce website is different from an SEO audit for a blog
- It requires some programming and technical knowledge of website design, development and hosting
- You must know SEO terms like robots.txt, .htaccess, website structure, URLs and HTML in depth
- The audit requires SEO tools to be used on an advanced level for websites and blogs.
- Every situation is unique – if you have a penalty or serious technical issue, you will need to consult with an SEO expert or company to research. This guide cannot cover everything.
Today, I am going to share with you the same checklist that I use to audit my company's and our clients' websites.
Not to boast, but we rank on the first page of Google. However, let me clarify that this guide's goal is to teach you audit techniques to find errors, not to fix them.
When Should You Do a Technical Audit?
An on-page audit is always one of the best ways to evaluate a website's performance, and there are two occasions when you should conduct one:
- At the beginning of every new campaign
- Once a quarter for running campaigns
Now that you know when you will have to conduct an SEO audit, let’s discuss the techniques.
Technical SEO Audit Checklist:
- Meta Data / “On-the-Page SEO” Checks
- Crawlability & Indexation
- URL Structure Analysis
- Secure Protocol HTTPS
- Canonicalization & Redirects
- Site Speed Optimization
- Rich Snippets/Schema Structured Data
- Google Search Console Audit
- Required Tools (Free and Paid)
#1 – Meta Data and On-the-Page Checks
Meta tags, chiefly the title tag and the meta description, are the two main elements of HTML code that inform users and search engines about a page's subject.
If you optimize both correctly, it will improve your ranking and click-through rate. To perform proper optimization, follow these guidelines.
Title and Description Tags:
- Your page description should be within the limit of 156 characters. Anything more than that will get truncated by Google in search results. Although Google has experimented with showing longer descriptions, for now you should follow the current Google guidelines.
- Your web page should have all necessary meta tags, none missing.
- Do not repeat Meta tags and ensure there are no duplicates.
- Your web page title should be 65 to 70 characters long. Anything more will get truncated by Google on the search engine results page.
- Each page of the website should have unique title tags, none missing or duplicate.
How can I check if I have Meta Tags?
- You can check the source code of your page by pressing Ctrl+U, then Ctrl+F to search for <title> and <meta name="description"> in the page.
- You can use Screaming Frog to check for meta tag errors and generate a report.
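If you'd rather script these checks than eyeball the source, here is a minimal sketch using only Python's standard library. The 70- and 156-character limits mirror the guidelines above; the `audit_meta` helper and its warning strings are my own naming, not part of any tool mentioned in this article.

```python
from html.parser import HTMLParser

TITLE_MAX = 70          # characters before Google truncates the title
DESCRIPTION_MAX = 156   # characters before the description is cut off

class MetaAudit(HTMLParser):
    """Collect the <title> text and the meta description from a page."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.description = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self.in_title = True
        elif tag == "meta" and attrs.get("name", "").lower() == "description":
            self.description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

def audit_meta(html):
    """Return a list of human-readable warnings for a page's metadata."""
    parser = MetaAudit()
    parser.feed(html)
    issues = []
    if not parser.title:
        issues.append("missing <title> tag")
    elif len(parser.title) > TITLE_MAX:
        issues.append(f"title is {len(parser.title)} chars (max {TITLE_MAX})")
    if parser.description is None:
        issues.append("missing meta description")
    elif len(parser.description) > DESCRIPTION_MAX:
        issues.append(f"description is {len(parser.description)} chars (max {DESCRIPTION_MAX})")
    return issues
```

Run it over saved page source to get a quick per-page report before reaching for a full crawler.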
A few more on-the-page checks include:
Hreflang Tag: Added as <link rel="alternate" hreflang="…"> in the page head, this tag tells Google which language a page is in, if you have content in different languages on your website.
H1 Heading Tag: Generally, the <h1> tag is used at the top of the page to display the page's main heading. It should be unique, and there should be exactly one <h1> on each page.
Image Alt Attribute: This attribute supplies alternative text when an image fails to load and gives search engines a description of the image.
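The heading and alt checks can be scripted in the same stdlib style. A minimal sketch; the exactly-one-<h1> rule mirrors the guideline above, and the class and function names are my own:

```python
from html.parser import HTMLParser

class OnPageAudit(HTMLParser):
    """Count <h1> tags and record <img> tags without alt text."""
    def __init__(self):
        super().__init__()
        self.h1_count = 0
        self.images_missing_alt = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "h1":
            self.h1_count += 1
        elif tag == "img" and not attrs.get("alt"):
            self.images_missing_alt.append(attrs.get("src", "(no src)"))

def check_headings_and_alts(html):
    """Return warnings for pages that break the <h1> and alt-text guidelines."""
    parser = OnPageAudit()
    parser.feed(html)
    issues = []
    if parser.h1_count != 1:
        issues.append(f"expected exactly one <h1>, found {parser.h1_count}")
    for src in parser.images_missing_alt:
        issues.append(f"image missing alt text: {src}")
    return issues
```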
#2 – Crawlability & Indexation
Search engine bots perform two major functions, crawling and indexing, to provide users with a list of web pages they have gauged to be most relevant. For detailed information, read the complete guide on how search engines work by Moz.
Crawlability determines whether a search engine bot can reach your pages at all, and therefore which pages can be returned for a Google search.
If a search bot can't crawl your web pages, it can't rank them or show them to users who might find the information relevant to their query.
Consider these 3 components that will help you facilitate the crawlability and indexation of a webpage:
- XML Sitemap: An XML sitemap file is used to help the search engine bots to better crawl and understand your website and blog. It helps to improve your web pages’ indexation. It also helps you to protect your website against duplicate content.
- HTML Sitemap: A sitemap that helps website visitors easily navigate your pages; it lives on a web page, not in an XML file. Watch Matt Cutts on HTML sitemaps for more. You can generate sitemaps with an online sitemap generator tool.
- Robots.txt: This file is used to instruct search engines on how to access your website. With it you can control which pages or folders a bot can and can't access. Read more about it on Robots.txt optimization.
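You can sanity-check robots.txt rules without crawling anything, because Python's built-in urllib.robotparser can evaluate them offline. The rules and example.com URLs below are a hypothetical illustration, not a recommended configuration:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt: block all crawlers from /admin/, allow the blog.
robots_txt = """\
User-agent: *
Disallow: /admin/
Allow: /blog/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# can_fetch answers: may this user agent crawl this URL under these rules?
print(parser.can_fetch("Googlebot", "https://example.com/blog/seo-audit"))  # True
print(parser.can_fetch("Googlebot", "https://example.com/admin/login"))     # False
```

This is handy during an audit for confirming that a Disallow rule isn't accidentally blocking pages you want indexed.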
#3 – URL Structure Analysis
This check will help you in analyzing a web page URL to ensure proper optimization for visitors as well as search engines. You can use Screaming Frog SEO Spider for a complete analysis of your website and blog URLs.
Some additional guidelines to adhere to are as follows:
- Use your main keyword in URLs; it represents your page in the best way
- Avoid excessive use of folders
- Always use hyphens, not underscores, to separate words in URLs
- Remove unnecessary punctuation from URLs
- URLs should be clean, short and shareable
A few examples of URLs (illustrative only):
- Good: https://example.com/technical-seo-audit
- Bad: https://example.com/Blog/2018/posts/Technical_SEO%20audit.php?id=17
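The guidelines above can also be turned into a quick automated check. A minimal stdlib sketch; the MAX_FOLDERS threshold, the character whitelist and the warning strings are my own assumptions, so tune them to your site:

```python
import re
from urllib.parse import urlparse

MAX_FOLDERS = 3  # assumed threshold for "excessive use of folders"

def audit_url(url):
    """Flag URL paths that break common structure guidelines."""
    issues = []
    path = urlparse(url).path
    segments = [s for s in path.split("/") if s]
    if len(segments) > MAX_FOLDERS:
        issues.append("too many folders")
    if "_" in path:
        issues.append("use hyphens instead of underscores")
    if re.search(r"[A-Z]", path):
        issues.append("avoid uppercase letters")
    # Allow letters, digits, hyphens, slashes and dots (file extensions).
    if re.search(r"[^a-zA-Z0-9\-/.]", path):
        issues.append("remove unnecessary punctuation")
    return issues
```

Feed it the URL list exported from a crawler run to spot offenders in bulk.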
#4 – Secure Protocol HTTPS
More than half of all websites are now on the secure HTTPS protocol since Google announced it as a ranking signal back in 2014. To make your website or blog secure, you have to switch from HTTP to HTTPS.
“According to Wikipedia: The Hypertext Transfer Protocol (HTTP) is an application protocol for distributed, collaborative, and hypermedia information systems. HTTP is the foundation of data communication for the World Wide Web. Hypertext is structured text that uses logical links (hyperlinks) between nodes containing text.”
To provide a more secure web experience to users, Google announced that HTTPS is a ranking signal, and around 70% of websites that rank on Google's first page use HTTPS. Google's bots give priority to secure pages over unsecured ones. When you do an audit, check this element as well, since it is a factor in Google rankings.
Through HTTPS (Hypertext Transfer Protocol Secure), the exchange of authorizations and transactions is protected by an additional layer, SSL/TLS (Secure Sockets Layer / Transport Layer Security), when transferring sensitive data.
When you run an audit in Screaming Frog, you can see how many URLs resolve over HTTP versus HTTPS and identify any duplicates.
Check the SEO Ranking Factors Study by SEMrush to find out more about the importance of HTTPS in ranking a website.
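One HTTPS pitfall worth scripting for is mixed content: a page served over HTTPS that still loads images, scripts or styles over plain HTTP, which browsers will warn about or block. A minimal stdlib sketch (the class and function names are mine, not part of any tool named here):

```python
from html.parser import HTMLParser

class MixedContentCheck(HTMLParser):
    """Collect resources loaded over plain HTTP on a page served via HTTPS."""
    RESOURCE_ATTRS = {"img": "src", "script": "src", "link": "href", "iframe": "src"}

    def __init__(self):
        super().__init__()
        self.insecure = []

    def handle_starttag(self, tag, attrs):
        attr = self.RESOURCE_ATTRS.get(tag)
        if attr:
            url = dict(attrs).get(attr, "")
            if url.startswith("http://"):
                self.insecure.append(url)

def find_mixed_content(html):
    """Return every http:// resource URL referenced by the page."""
    checker = MixedContentCheck()
    checker.feed(html)
    return checker.insecure
```

Run it over the source of your HTTPS pages after migration; anything it returns should be switched to https:// or a protocol-relative reference.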
Before summing it up, let me also tell you my favourite tools to conduct a Technical SEO Audit:
- Screaming Frog SEO Spider
- IIS SEO Toolkit
- SEMrush Site Audit
- Pingdom DNS Check
- Google Search Console (formerly Google Webmaster Tools)
- Reverse IP Lookup
- Bonus Tool: Your Brain
Summing It Up
I hope this guide has been helpful to you. These are just guidelines to begin with. If you want to fully audit your site, look out for the second part of this guide, where we will move on to rankings.
Don’t forget to check our complete SEO Audit Video guide.