Are you trying to develop an effective SEO strategy for your business but aren’t sure where to start? Your first step should be to conduct an SEO audit.
Search engine optimization is not an easy subject. Without the right guidance, you can easily get lost in murky waters trying to figure out what’s working and what’s not. Luckily, an SEO audit can help answer that question and uncover opportunities to improve your site’s performance in search engine results pages (SERPs). This article covers a simple SEO audit checklist for your business. Read along to learn how you can improve your organic search performance.
Page Loading Speed
Page speed refers to how long it takes for a web page to load. It affects SEO and Google rankings; in fact, page speed is one of Google’s ranking factors, although many people still overlook it.
Website speed doesn’t just affect SEO and Google rankings; it also affects user experience (UX). Users are likely to leave a page that takes more than three seconds to load, so a slow-loading page increases your bounce rate and reduces dwell time.
Many factors affect a page loading speed, including:
- Slow server response time
- Unoptimized images
- Large page file size
- Long redirect chains
To improve page loading speed on your site, you first need to determine what is slowing your page load time or ask your developer to help if you don’t have the required skills.
How to Check Page Loading Speed
Google has a free tool called PageSpeed Insights that helps determine which elements need to be optimized to reduce page loading time. That said, here are the three most common ways of measuring page speed:
- Time to First Byte (TTFB):
TTFB refers to the time it takes for a page to start the loading process. A good example of TTFB at work is when you land on a page but have to stare at a white screen for a few seconds before seeing the page’s content.
- Fully Loaded Page:
This is the simplest way to determine page loading speed. It measures how long it takes for a page’s resources to load 100 percent.
- First Meaningful Paint:
This is the time it takes for a page to load enough of its resources for a user to view its main content. For example, a page might take 10 seconds to fully load, which is a long time to wait. However, if the First Meaningful Paint occurs much earlier, say after 1.5 seconds, users can start reading and interacting with the page well before it finishes loading, and they won’t perceive it as slow.
As stated earlier, there are many ways of measuring page speed, and no single metric matters more than the others. The best approach is to follow the diagnosis from PageSpeed Insights and take the necessary steps to improve your page loading speed.
Mobile-Friendliness
Mobile-friendliness measures how well a website is optimized for mobile devices like smartphones and tablets. Mobile SEO isn’t something to prepare for in the future; it’s already here.
If you didn’t know, mobile searches make up about 60 percent of organic searches. Moreover, Google moved to mobile-first indexing for all websites in March 2021, meaning the mobile version of your site is indexed first and used for both mobile and desktop searches.
That means websites that don’t optimize for mobile will lose many opportunities for getting mobile traffic, negatively impacting their rankings in search engine results pages (SERPs).
There are three ways you can design your website to be mobile-friendly:
- Dynamic design
This option detects whether a user is on mobile or desktop and serves different HTML accordingly. It requires the Vary HTTP header to ensure that caching servers serve the right version to each device, and it lets mobile user agents access the mobile version of the page at the same URL.
- Mobile Subdomain
This option, sometimes called an mDot site, involves building a separate mobile site and hosting it on a subdomain, usually something like m.domain.com or mobile.domain.com. It requires significant developer work and resources. To make sure Google can figure out the relationship between your main site and the subdomain, include a rel="canonical" tag on the mobile pages pointing to the desktop version. This option is generally not recommended for large or complicated websites.
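As an illustration (the URLs below are placeholders), the annotations for separate mobile URLs typically work in pairs: the desktop page declares a mobile alternate, and the mobile page points back to the desktop version as canonical:

```html
<!-- On the desktop page (https://www.example.com/page) -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="https://m.example.com/page" />

<!-- On the mobile page (https://m.example.com/page) -->
<link rel="canonical" href="https://www.example.com/page" />
```

This pairing helps search engines consolidate ranking signals onto the desktop URL while still serving the mobile version to mobile users.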
- Responsive design
Unlike the mobile subdomain, this option doesn’t require much developer time or resources. In its simplest form, it involves adding a single meta tag, with no changes to your existing code.
To create a responsive site, you need to add the meta viewport tag in the page’s <head>:
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
The viewport meta tag tells the browser to render the page based on the device’s screen size.
How to Check Your Site for Mobile-Friendliness
Google’s Mobile-Friendly Test tool helps test whether your site is mobile-friendly and highlights the elements that need to be improved if your site doesn’t pass the test.
Additionally, you can get the Coverage Report in Google Search Console, which provides information on any mobile crawl issues on your site.
Pro Tip: Set the primary crawler as a smartphone before checking the Coverage Report in Google Search Console.
Core Web Vitals
Another factor to include in your SEO checklist is Core Web Vitals. The Core Web Vitals report shows how your pages perform based on real-world usage data. According to Google’s 2021 Page Experience update, Core Web Vitals rest on three pillars: speed, responsiveness, and visual stability.
In other words, Core Web Vitals measure how fast the biggest piece of content loads in the initial viewport (loading), how responsive a page is (interactivity), and how visually stable it is. Core Web Vitals comprise three metrics: Largest Contentful Paint, First Input Delay, and Cumulative Layout Shift.
Check this post to learn more about these Core Web Vitals metrics: https://alchemyleads.com/core-web-vitals-how-to-improve-mobile-pagespeed-cwv/
Google prioritizes providing a good user experience to its users and will favor websites with a good user experience when providing results to a search query. As such, you should analyze how users interact with your website and optimize your site for UX signals. Some of the metrics to analyze in that regard include:
- Bounce rate: How many visitors left your site without interacting with it?
- Dwell time: How long does a user spend on your site before returning to the SERPs?
- Organic CTR: How many users clicked through to your website after seeing it in the search results?
To check your website’s performance, check the Core Web Vitals report in Google Search Console.
Check for Manual Actions
Manual actions are another thing that can impact your ranking on search engine results pages. You get a manual action when a human reviewer at Google determines that your site isn’t complying with their search guidelines. As a result, some content on your site won’t appear in Google search results.
Of course, you won’t receive a manual action unless you have violated Google’s webmaster guidelines, but you should still check whether you have one, as it can seriously hurt your visibility in search results.
Again, this doesn’t mean that a manual action is a ranking factor, since it’s something Google applies directly. However, a manual action overrides all other ranking signals because it can remove your website from search results entirely.
How to Tell if A Manual Action Impacts Your Website
Google communicates clearly when a website has been hit by a manual action. Sometimes you may be warned in advance before the manual action is implemented on your site so you can correct the issue.
Google communicates this information through Google Search Console, making it an invaluable SEO tool. This is where Google will contact you if a manual action impacts your site. You can get information about any manual actions against your site by checking the Search Console Manual Action Report.
To recover from a manual action, you must fix all the issues that Google highlighted on all offending pages. Once you resolve all the issues, submit a reconsideration request asking Google to re-evaluate the penalty issued to your site.
Google will reverse the manual action if it is convinced that the issues have been fixed. Of course, your site’s rankings won’t immediately return to their initial position, but nothing will be holding you back from rising again in the SERPs.
Indexing Errors
Indexing errors occur when Google cannot crawl your website or add it to its index correctly so that it can appear in search results. Identifying all indexing errors on your website is essential, as it helps detect any technical issues preventing Google’s robots from indexing your pages.
How to Check Indexing Errors
The fastest and most straightforward way to check whether all your pages are crawled and indexed is via site search. This option doesn’t require any advanced tools—you only need to enter this command into Google: site:website.com.
Google will give results that will help determine the number of pages indexed on your site. This option isn’t ideal for large websites but provides accurate results for sites with few pages.
Another great place to get information about indexing issues is the coverage report in Google Search Console. Google Search Console helps analyze your webpages’ indexing status to determine what pages were successfully indexed and those that were not.
Here is the list of common indexing errors you may see in the Search Console Index Coverage Report:
- Submitted URL marked ‘noindex’
- Submitted URL blocked by robots.txt
- Submitted URL not found (404)
- Submitted URL returns unauthorized request (401)
- Server error (5xx)
- The submitted URL has a crawl issue
Remove all the indexing errors on your site to avoid losing valuable web traffic.
Organic Traffic
The next element to include in your SEO audit is the amount of organic traffic you’re getting. Google’s algorithms are updated regularly, and these updates can affect the amount of organic traffic you receive.
These updates target many things, including content quality, link spam, etc. That is why you should check if there is a traffic drop to your site due to specific Google updates, as this may indicate serious underlying issues.
To check your traffic, head to Google Search Console and check the Search results report for a specific period. Pay attention to whether your organic traffic has increased, decreased, or remained flat during that period.
If your drop in traffic is due to a Google update, check the update to see what you should change or improve.
Website Security
Launching your website and doing everything possible to ensure its success is great, but it’s not enough. You should also take measures to secure your website; otherwise, your Google rankings will suffer.
Google prioritizes website security and wants users to feel safe and secure when browsing the web. In other words, Google favors sites that encrypt data and have taken the necessary measures to ensure website data isn’t exposed to cybercriminals or exploitation.
Having a proper security solution will protect your site from common security threats, including:
- DDoS attacks
- Vulnerability exploits
It also protects users from many security risks, including:
- Phishing schemes
- Stolen data
- Malicious redirects
- SEO spam
- Session hijacking
As a rule of thumb, install a Secure Sockets Layer (SSL) certificate to protect users’ sensitive data. An SSL certificate authenticates a website’s identity and enables an encrypted connection.
When your website is secured with an SSL certificate, users will see HTTPS (HyperText Transfer Protocol Secure) at the start of the URL. Without this certificate, your URL will begin with plain HTTP, which is not secure.
If you haven’t switched from HTTP to HTTPS, it’s about time you did that. After all, according to Google, HTTPS protocol will impact your rankings in search engine result pages (SERPs).
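If your site runs on Apache, a minimal sketch of the HTTP-to-HTTPS switch might look like the rules below (this assumes mod_rewrite is enabled and an SSL certificate is already installed; other servers, like Nginx, use different directives):

```apache
# Hypothetical .htaccess rules: permanently (301) redirect all HTTP traffic to HTTPS.
# Assumes Apache with mod_rewrite enabled and a valid SSL certificate in place.
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```

Using a 301 (permanent) redirect tells search engines to transfer ranking signals from the HTTP URLs to their HTTPS counterparts.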
Broken Pages
Nothing is as bad as visitors coming across broken pages on your site. It leads to a poor user experience and a higher bounce rate, and it ultimately impacts your rankings in the SERPs.
Broken links are links to pages that no longer exist. In most cases, this happens when you delete a page from your site without setting up a redirect. The most common sign of a broken link is the “404” error, which occurs when the server cannot find the requested resource.
Many broken links on your site lead to a bad user experience and a gradual decline in traffic and conversions. In the end, you’ll lose your rankings since Google favors websites that provide a good user experience when providing results for a search query.
One of the tools you can use to find the number of broken pages on your website is Ahrefs. Head to the Internal Pages in Site Audit and select “Broken.” Once you identify the broken pages, remove or replace them as soon as possible to avoid impacting your site further.
Duplicate Content
You have duplicate content if the same content appears in more than one place on the internet. In other words, duplicate content is content that appears at more than one unique web address (URL).
While Google won’t penalize you for having duplicate content on your site, duplicate content can still impact your search engine rankings. Having “appreciably similar” content (as Google calls it) in more than one place on the web makes it hard for search engines to decide which version to show in search results when a user performs a relevant query.
As a result, search engines are forced to choose which of the versions is the best for the search query. In the end, this reduces the visibility of each duplicate, which affects your rankings.
Many tools can help check for duplicate content on your site, including:
- Semrush: Reports duplicate content, but only within your own domain
Of course, most of these tools aren’t designed specifically for finding duplicate content, but they can still help.
Once you discover duplicate content, you should fix the issue as soon as possible. Since Google values content that demonstrates expertise, authoritativeness, and trustworthiness (E-A-T), consider having a qualified content writer create the content afresh.
The best way to prevent duplicate content issues on your site is to use canonical link tags. These tags help organize your site and tell search engines which version of a page they should treat as the primary one.
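For reference (the URL below is a placeholder), a canonical tag is placed in the <head> of each duplicate page and points to the preferred version:

```html
<!-- Hypothetical example: tells search engines which version to index -->
<link rel="canonical" href="https://www.example.com/preferred-page/" />
```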
Robots.txt
Another crucial file to include in your SEO audit is robots.txt. You use this file when you don’t want search engine crawlers to visit every page on your site; strictly speaking, robots.txt blocks crawling rather than indexing, but it is the standard way to keep Google’s bots away from pages you don’t want crawled.
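For reference, a minimal robots.txt might look like this (the paths below are placeholders for your own site’s private sections):

```
# Hypothetical robots.txt — the paths below are placeholders
User-agent: *
Disallow: /admin/
Disallow: /cart/

Sitemap: https://www.example.com/sitemap.xml
```

The file lives at the root of your domain (e.g. example.com/robots.txt), and the optional Sitemap line helps crawlers find your sitemap.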
Use the robots.txt Tester tool in Google Search Console to verify whether your robots.txt actually blocks the URLs you want to keep out of search. The tool operates like Googlebot to confirm that robots.txt properly blocks those URLs. Keep the following limitations in mind:
- Robots.txt Tester tool doesn’t work with Domain properties—only with URL-prefix properties
- This tool only works with Google user-agents such as Googlebot; it can’t tell you how other web crawlers interpret your robots.txt file.
- You should copy and paste the changes you make in the tool editor into the robots.txt file on your server, as changes are not automatically saved to your web server.
Backlink Profile
Backlinks (when done right) can enhance your SEO efforts. While some think that the more backlinks they have, the better, that isn’t the case. Quality over quantity is the name of the game when building a robust backlink profile.
Here are the elements that make up a high-quality backlink:
- Relevance: Your backlinks should come from legitimate websites in your industry. For instance, if you run an eCommerce store, your backlinks should come from legitimate sites in the eCommerce space.
- Authority: Relevance is a crucial factor when building better backlinks, but it isn’t enough. To have a more robust backlink profile, you must get backlinks from high-authority websites. Getting backlinks from spammy sites will negatively affect your site’s authority.
- Diversity: Ensure your backlinks come from different types of pages and not from a single site
If you notice any illegitimate backlinks, try to have them removed as soon as possible. You can also disavow spammy backlinks via Google Search Console’s disavow tool.
Voice Search Optimization
The way users conduct searches on the internet has changed over time. Today, we experience rapid growth in voice search. To stay ahead of the game and increase your site traffic, you should ensure that your site is voice search-friendly.
The best way to achieve this is by creating content your target audience wants to see. Ideally, your content should provide answers to your target audience’s questions. For instance, if someone conducts a query like “how to choose the best SEO agency?”, your content should provide a direct answer to that question.
Also, your content should be clear and easy for a voice assistant to read aloud. Providing relevant, valuable, and clear content improves your chances of appearing in a featured snippet, which matters because voice search devices often pull their answers from featured snippets.
On-Page Content Optimization Audit
Keywords and other on-page elements help search engine crawlers know what your page is about. Optimize your copy and build it around high-value keywords to enhance your on-page SEO efforts.
Here are the crucial aspects of on-page SEO you should optimize:
- Meta description
- Heading structure
- Title tag
- Images (file names, alt text, etc.)
- Internal and external linking
- URL structure
- HTML emphasis tags
That said, here are actions to take to optimize your on-page SEO:
- Break up walls of text with H2s, H3s, and H4s, lists, and bullet points
- Use keywords strategically in title tags, meta descriptions, and H2s and H3s
- Apply your main keyword or phrases naturally within the first 100 words
- Use a conversational tone and ensure your content is easily understandable
- Optimize navigation for a better user experience
- Avoid grammatical errors
Sitemap
Your sitemap helps search engine crawlers understand the relationships between the pages on your site. In other words, it lists the pages that search engines should index. It shouldn’t include dead pages, redirects, non-canonical pages, or other pages that send mixed signals to search engines.
Submit your sitemap directly to Search Console to ensure that Google can find it. If you’ve already submitted it, confirm that your sitemap is accurate and up to date.
Competitor Analysis
Our SEO audit checklist would not be complete without analyzing competitors to see how they approach their SEO.
Analyze competitors who rank for certain keywords in your industry and see what strategies you can borrow from them. Look at the types of keywords they are using and their content strategy. If they’re attracting more leads using their content than you are, that means there’s something you can learn from them.
During competitor audits, pay attention to the type of content that resonates with your target audience in your industry and how your content strategy compares to your competitors. This will give a clear idea of what you’re doing wrong and the steps you should take to enhance your search results.
AlchemyLeads Can Help with Your SEO Audit
The need to conduct an SEO audit for your website or eCommerce store cannot be overstated. It helps lay the foundation of a great and robust SEO strategy for your project and forms a crucial part of its ongoing success.
It helps find and fix issues in your strategy as early as possible to ensure your business remains on the path to success.
At AlchemyLeads, we have many years of experience optimizing SEO for clients in various industries for better search visibility and conversions. As a full-service digital marketing agency, we specialize in SEO audits, content creation, on-page optimization, local SEO, and building better backlinking strategies.
If you’re looking for an award-winning SEO agency to help perform SEO audits and provide actionable solutions, look no further. Our SEO experts are detail- and results-oriented and will handle your SEO successfully for more valuable results. Contact us today to help audit your website for better rankings and conversions.