SEO Audit
This guide shows you how to conduct a thorough search engine optimization analysis the way a proficient SEO specialist would.
Here is a comprehensive SEO audit checklist that lets you examine any website's SEO in depth. I personally use this checklist every time I perform an SEO audit, and I encourage you to do the same.
This list covers practically every technical aspect of SEO you can imagine. Based on reader feedback, it also includes some equally important non-technical factors, such as E-A-T.
The purpose of your SEO audit
Before you begin the SEO audit, answer these five introductory questions; they will help you determine its objective.
Here are some questions you can ask yourself:
- Do you want to discover why the website's search visibility and traffic declined?
- Do you want to understand the technical problems that may be holding the website back?
- Do you want to identify the most critical technical errors and propose quick fixes?
- Do you want to conduct a comprehensive technical SEO examination, scrutinizing all the crucial and less crucial elements?
- Are you auditing your own website?
Before starting the audit, it helps to have a clear idea of your desired outcome; you cannot hit a target you cannot see.
Once you have identified your objective, it is time to get technical.
Gathering basic data for your SEO audit
In these steps, you will collect basic information about the site you are about to audit.
How long should a technical SEO audit take?
In this video, I share my perspective on how long a technical SEO audit should take and how much time I typically devote to one.
No. 1: Do a manual review of the website
Starting your audit by manually reviewing the website and noting whatever catches your attention is an excellent practice. This preliminary pass gives you a fresh perspective that complements the data from SEO tools, and it makes it easier to connect the dots later when you collate that data.
No. 2: Review the website with Semrush or a comparable SEO tool
After the manual review, and before delving into the specifics of the website, the next logical step is to run it through a well-known SEO tool like Semrush or Ahrefs.
The Semrush Domain Overview is a quick way to collect fundamental information about a domain.
Examine the important metrics: domain metrics, backlinks, overall visibility, organic traffic, organic keywords, and traffic trends. This step is crucial for forming an overall impression.
No. 3: Check the backlink profile
Although this is a technical SEO review rather than a link audit, a basic analysis of the website's backlink profile is still crucial. The auditor should know about any inferior link building strategies the website has used, because they can adversely affect its search performance and visibility.
Semrush offers a great Backlink Audit tool that provides a speedy overview of the site's backlink profile.
It organizes your links and highlights any that may be of poor quality. Keep in mind, though, that it is only a tool; your own judgment is what counts here.
No. 4: Check the CMS of the website
The website you are examining most likely runs on a CMS, probably WordPress, which currently powers around 40% of all websites.
As an auditor, you need to know which CMS (Content Management System) the website uses so that you can suggest appropriate solutions for issues specific to that CMS.
A tool such as CMS Detect will tell you which CMS the audited website runs on.
No. 5: Check the company providing the website hosting
Knowing the website's hosting provider is important because it lets you make specific recommendations about server settings and SSL certificates.
With some prior experience, you will find it easy to tell whether a particular hosting provider is a good fit for the website.
The Hosting Checker tool, or a comparable one, will quickly show you a website's hosting provider. Note that if the website sits behind a Content Delivery Network (CDN), the tool will report the CDN instead.
No. 6: Determine whether the website is on shared hosting
Opinions vary widely on how shared hosting influences rankings. Either way, it is wise to find out whether the website uses shared hosting and which other websites share the same IP address; a reverse IP domain check will tell you.
A tool such as You Get Signal, or a comparable alternative, lets you efficiently check which other websites are hosted on the same IP address.
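If you like to script these lookups, a quick DNS resolution gives you the IP to feed into a reverse IP tool. Here is a minimal Python sketch using only the standard library (example.com is a placeholder for the audited domain):

```python
import socket

# Resolve the domain to its IP address; reverse IP tools like
# You Get Signal start from this same lookup.
domain = "example.com"  # placeholder for the audited domain
ip = socket.gethostbyname(domain)
print(f"{domain} resolves to {ip}")

# A reverse DNS lookup (PTR record) often hints at the hosting provider.
try:
    hostname, _, _ = socket.gethostbyaddr(ip)
    print(f"Reverse DNS: {hostname}")
except socket.herror:
    print("No PTR record found")
```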
No. 7: Check the domain history
You cannot fully understand a website without some knowledge of its history, so investigate the domain's age, registration history, and current registrant.
A WHOIS lookup tool such as WhoISrequest lets you promptly examine a domain's previous records. This data is particularly valuable when you are trying to understand the causes of a drop in rankings.
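You can also pull the basic registration data programmatically. The sketch below assumes the third-party python-whois package (pip install python-whois); the exact fields returned vary by registry, so treat it as a starting point:

```python
import whois  # third-party package: pip install python-whois

# Pull basic registration data for the audited domain (placeholder).
record = whois.whois("example.com")
print("Registrar:   ", record.registrar)
print("Created:     ", record.creation_date)
print("Last updated:", record.updated_date)
print("Expires:     ", record.expiration_date)
```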
No. 8: Check the Wayback Machine
Besides the domain's history, it is useful to know how the website used to look and what kind of content it carried. This helps you determine whether significant changes or redesigns were made.
The Wayback Machine is an important resource for SEO experts, particularly when you are trying to understand the causes of a drop in traffic.
Use it to view the website as it appeared in previous years, and also review my SEO checklist for website redesigns.
No. 9: Check whether the website has recently undergone a significant change or redesign
Knowing about significant alterations to a website is especially helpful when you are auditing to find out why traffic dropped. The Wayback Machine is also a useful tool for checking whether any redesigns were implemented.
Let us now delve into the data provided by Google's tools.
Analysis of Data Provided by Google's Tools
Google's tools offer a wealth of valuable information about the technical SEO of a website. A thorough and effective technical SEO audit begins with analyzing the data they provide.
Google Search Console (GSC)
GSC is the cornerstone of any technical SEO assessment. If you can get access to the website's GSC data, begin by examining the areas below.
If you love Google Search Console as much as I do, check out my guides on auditing a website using only Google Search Console and on using Google Search Console for keyword research.
No. 10: Check whether the website has a GSC account
It may sound unbelievable, but some websites still have no GSC account. In the rare case that you come across one, you can disregard everything in this section.
First, set up a Google Search Console account for the website; if you lack the authority to do so, make sure the client treats it as a top priority.
No. 11: Check the Performance report
Open the Performance report to see a summary of how the website performs in search. Examine the metrics, including total clicks, impressions, average position, and CTR.
Set the time frame to the past year and look for patterns in the traffic.
The Google Search Console Performance report provides a wealth of data about your website's organic search performance.
No. 12: Check the Coverage report
The Coverage report will tell you:
- which pages are indexed and can appear in Google (Valid and Valid with warnings),
- which pages are not indexed and why (errors or deliberate exclusion).
This report is extremely valuable for technical SEO.
The Google Search Console Coverage report is the most reliable way to check whether a website's pages are indexed.
No. 13: Check XML sitemaps
Go to Index > Sitemaps to verify whether an XML sitemap or sitemap index has been submitted to Google. If none has, move on to the next step; a more thorough examination of sitemaps comes later.
If the Status says "Success", the sitemap was processed without issues. You can also click a particular sitemap to view its details.
The Sitemaps report in Google Search Console lets you submit XML sitemaps and check their status.
If sitemaps are new to you, start with Google's sitemap guide.
And if you are not sure whether the audited website has an XML sitemap, learn how to find a website's sitemap.
No. 14: Check Removals
You also want to know whether someone, deliberately or accidentally, requested the removal of the website's content. Just go to Index > Removals and review what is listed there.
The Removals feature in Google Search Console lets you quickly remove any page from Google's index.
No. 15: Check Enhancements
The next task is to check whether there are any enhancements to review. Go to the Enhancements section and look at every item listed there.
The items may include Core Web Vitals, Mobile Usability, Breadcrumbs, FAQ, and others, depending on the type of structured data the website uses.
Pay particular attention to the Core Web Vitals and Mobile Usability reports; both carry significant weight.
UPDATE: Check the Page Experience report
Google Search Console has a new report, located in the newly created "Experience" section, which now contains three reports: Page Experience, Core Web Vitals, and Mobile Usability.
Visit the Page Experience report to check whether the website satisfies all five of Google's page experience signals. Ideally, you want to see a lot of green here.
The Page Experience report is the latest addition to Google Search Console. The reports for each of Google's page experience signals are available here.
No. 16: Check whether the website has been hit by a manual action
Manual penalties are not as frequent as they used to be, but they still happen. It is essential to determine whether the website has been, or currently is, under a manual penalty. Go to Security & Manual Actions > Manual actions; ideally, no issues are reported.
When auditing a website, the Manual Actions report should be among the first places you check in Google Search Console.
If the website is under a manual action, fixing the issue and filing a reconsideration request should be the top priority. Google explains manual actions in a simple and comprehensive article.
No. 17: Check whether the website has any security issues
Security issues are just as bad as manual actions. To check, go to Security & Manual Actions > Security Issues. In a perfect SEO world, nothing shows up here.
The Security Issues report tells you whether the audited website has security problems that could drag its rankings down.
No. 18: Check Links
Finally, links. This is not a formal link audit, but you still need a basic picture of the backlinks. The Links section gives you that overview, and you can cross-reference it with the data collected from tools like Ahrefs or Semrush.
Use the Links report in Google Search Console to examine a website's backlinks.
You will find the Links section under "Legacy tools and reports"; it contains tables covering both external and internal links.
Here is what to do:
- Examine the Top linked pages table and flag any URLs with an exceptionally large number of links compared to the other pages on the website.
- Combine that with what you see under Top linking sites: do most of the links originate from a single source or a small number of sources?
- Examine the Top linking text table; heavy use of exact-match keyword anchors is a warning sign. Click MORE for the full list.
The GSC Links report showing the top linking text for my site's backlinks.
No. 19: Check the Crawl Stats report
The Crawl Stats report lets you examine in depth how Google crawls your website.
Go to the Settings section, select Crawling, and click OPEN REPORT next to Crawl stats.
This is how you access the Crawl Stats report in Google Search Console.
The report gives a brief overview of the total number of crawl requests, the total download size, and the average response time. Under Hosts, it also shows the health of your hosts: whether there have been problems fetching robots.txt, resolving DNS, or connecting to the server.
The GSC Crawl Stats report is a great resource for understanding how Googlebot navigates your website.
Review this section carefully to promptly detect any crawling problems.
No. 20: Check the disavow file
During the audit, verify whether a disavow file has been submitted and whether it contains the appropriate links.
This is especially important when you are auditing a website whose organic visibility is declining.
The Disavow tool has a new version; this is what it looks like.
The disavow tool is not part of Google Search Console; it is a sensitive tool that can cause significant damage if mishandled, and you can only reach it directly on Google.
Your job is to verify that the disavow file has been submitted and, if feasible, examine its contents to make sure it is being used as intended.
No. 21: Check the primary crawler
All websites are scheduled to switch to mobile-first indexing in March 2021, so until then it is worth checking which primary crawler the website uses.
To check a website's primary crawler, open Google Search Console and go to Coverage; the information about the primary crawler is displayed at the top of the page.
The Coverage report in Google Search Console shows which primary crawler is indexing the website.
Most websites have already been moved to mobile-first indexing. But what if a website has not been moved and still primarily uses the Desktop crawler?
- The website may be outdated and have significant problems with loading and rendering on mobile devices.
- If the website is not mobile-friendly, making it mobile-friendly should be the top priority.
- A mobile version may be available even though the site simply has not been moved yet.
- The website may be new while its domain was registered some time ago. In that case it is fine for the site to stay on the desktop crawler for now; this is the case for my own website.
Now let's move on to Google Analytics. Below is a brief summary of the key information it provides; for further details, refer to my Google Analytics 4 guide for beginners interested in SEO.
No. 22: Check whether the website has a Google Analytics profile
If the answer is no, you can skip the remaining checks in this section. Alternatively, set up a GA profile for the website.
No. 23: Check the data for the past 12-18 months and look for trends
To check this, go to Audience > Overview and set the date range to cover the past year.
An overview of the audience over a given period in GA.
No. 24: Check how the website acquires its traffic
To check this, go to Acquisition > Overview. Organic search typically generates the most traffic, although this varies from site to site.
The GA Acquisition overview shows whether the website gets a large share of its traffic from search engines.
If you are new to GA, learn how to find organic traffic in Google Analytics.
No. 25: Check whether traffic trends in Google and Bing are similar
Comparing the trends of organic traffic from Google and Bing is crucial for understanding traffic drops. Go to Acquisition > All Traffic > Source/Medium and compare the figures for "google / organic" with those for "bing / organic".
This is how you compare Google and Bing organic traffic in GA.
There are two scenarios:
- If the traffic trends in Google and Bing are alike, the website probably has technical problems, such as broken URLs.
- If there is a noticeable drop in traffic from Google only, the website may have been penalized by Google.
No. 26: Check the bounce rate
To check this, go to Behavior > Overview, where the Bounce Rate metric is displayed.
The bounce rate of one of my websites in GA.
No. 27: Check the average time spent on page
To check this, go to Behavior > Overview, where you will find the average time visitors spend on a page.
The average time visitors spend on the pages of one of my websites.
No. 28: Check where most of the visitors are located
To check this, go to Audience > Geo > Location.
No. 29: Check which language most of the visitors speak
To check this, go to Audience > Geo > Language.
No. 30: Check which pages on the website get the most traffic
To see the most-visited pages, go to Behavior > Site Content > All Pages.
Here you can see the top pages of this site.
No. 31: Check which devices the site's visitors use most
To find out whether visitors browse on mobile or desktop, go to Audience > Mobile > Overview.
Webmaster tools from other search engines
Most website traffic usually comes from Google, with Bing accounting for only a small share. Some websites, however, get significant traffic from other search engines such as Yandex or Baidu.
Such cases are infrequent, but they do happen, so check whether the website uses, or should use, the webmaster tools of other search engines.
No. 32: Check whether the website has a Bing Webmaster Tools account, and create one if necessary
Verifying the website with Bing is worthwhile, because Bing Webmaster Tools offers useful features like Site Scan, the Robots.txt Tester, and Site Explorer.
I highly recommend checking those tools out; they can be extremely helpful.
No. 33: Check whether the website has a Yandex Webmaster account, and create one if necessary
It is up to you to decide whether the website needs a Yandex Webmaster account; if it does, create one. Also check my list of Yandex search operators if you work with this search engine.
No. 34: Check whether the website has, and should have, a Baidu Webmaster Tools account, and create one if necessary
This tip mainly applies to websites targeting China. If it is relevant, create an account and use Baidu's Webmaster Tools.
Visibility in popular SEO tools
Besides examining the data from Google's tools, you should evaluate the website's visibility in an SEO tool such as Semrush or Ahrefs.
No. 35: Check the website's visibility in Semrush
A look at the Domain Overview is enough to get a basic picture of the website's performance.
Here is the Domain Overview in Semrush.
Examine the Authority Score, organic search traffic, the traffic and keywords trends, SERP features, top organic keywords, and the organic position distribution.
The Semrush Domain Overview for moz.com.
This will give you a fairly accurate picture of where the site stands.
No. 36: Check the site's visibility in Ahrefs
All you have to do is enter the domain into Site Explorer in Ahrefs and hit enter.
You can use Ahrefs Site Explorer to examine the site's visibility.
Look at UR, DR, organic keywords, organic traffic, and organic positions. Reviewing the top pages is also worthwhile.
Backlink Profile
No technical SEO review is complete without at least a brief look at the website's backlink profile.
No. 37: Check the website's backlink profile
The Semrush Backlink Analytics and Backlink Audit tools are the quickest and easiest way to examine a website's backlink profile.
Backlink Analytics in Semrush.
Mobile-Friendly Test
The next crucial stage of your technical SEO analysis is the Mobile-Friendly Test: a simple but effective way to see how Google renders and displays a page, identify loading problems, and confirm mobile-friendliness.
The Mobile-Friendly Test tells you with a single click whether a website is mobile-friendly.
After the test runs, you will see the rendered page and its HTML code.
The Mobile-Friendly Test results for my website.
No. 38: Check whether the website is mobile-friendly
Run the Mobile-Friendly Test to verify that the website really is mobile-friendly by Google's standards.
No. 39: Check whether there are any loading issues
Then click VIEW DETAILS under Page loading issues to examine any specific problems with loading the page.
Fortunately, my website has no loading issues. If any show up, they deserve a closer look.
No. 40: Review the rendered page and its HTML markup
Examine the rendered image, compare it with what you see in the browser, and verify that critical content, such as navigation links, is present in the rendered HTML.
There is also a handy tool called JavaScript rendering check, which takes any URL and compares the original source with the rendered HTML, flagging discrepancies.
The results of the JavaScript rendering check for my website.
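If you want to approximate this raw-vs-rendered comparison yourself, here is a minimal Python sketch. It assumes the requests library and Playwright are installed (pip install requests playwright, then playwright install chromium); it only compares document sizes, so treat it as a rough smoke test rather than a substitute for the tools above:

```python
import requests
from playwright.sync_api import sync_playwright

url = "https://example.com/"  # placeholder for the audited URL

# Raw HTML, roughly what a bot sees before executing JavaScript.
raw_html = requests.get(url, timeout=30).text

# Rendered HTML after JavaScript execution, via headless Chromium.
with sync_playwright() as p:
    browser = p.chromium.launch()
    page = browser.new_page()
    page.goto(url, wait_until="networkidle")
    rendered_html = page.content()
    browser.close()

print(f"Raw HTML length:      {len(raw_html)}")
print(f"Rendered HTML length: {len(rendered_html)}")
# A large difference suggests content is injected client-side
# and deserves a closer look in the rendered DOM.
```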
Google PageSpeed Insights
Google PageSpeed Insights is a fantastic tool that analyzes a page's speed and offers practical recommendations for improving its speed and overall performance.
Note that Google PageSpeed Insights reports on individual pages: the score applies only to the specific URL (page) being tested, not to the website as a whole.
Google PageSpeed Insights instantly evaluates the speed and performance of any page (including Core Web Vitals).
If the website runs on WordPress, the WP Rocket plugin can likely resolve most speed and performance issues. For details, see my WP Rocket review.
No. 41: Check the website with Google PageSpeed Insights
Review the scores for both the mobile and desktop versions of the site. If a score falls below 80/100, look closely at the issues the tool reports. Anything below 50 (highlighted in red) demands immediate attention, investigation, and fixing.
The Google PageSpeed Insights analysis of SEOSLY's speed and performance (the site is optimized with WP Rocket).
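PageSpeed Insights also has a public API, which is handy when you want to test many URLs in one go. Here is a minimal sketch (the URL is a placeholder; an API key is optional for light use but needed at volume):

```python
import requests

# Query the public PageSpeed Insights v5 API for a single URL.
API = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
params = {"url": "https://example.com/", "strategy": "mobile"}  # placeholder URL

data = requests.get(API, params=params, timeout=120).json()
score = data["lighthouseResult"]["categories"]["performance"]["score"]
print(f"Mobile performance score: {score * 100:.0f}/100")
```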
No. 42: Check whether the website passes Core Web Vitals
Core Web Vitals are now a ranking factor.
Google's page experience signals, which include the Core Web Vitals.
With that in mind, make passing Core Web Vitals a priority; it is an investment that pays off in the long run.
If the website fails the assessment, fixing the Web Vitals issues should be treated as essential.
Check out my comprehensive guides on Core Web Vitals, Largest Contentful Paint, First Input Delay, and Cumulative Layout Shift.
No. 43: Check whether the tool reports unoptimized images
Fixing uncompressed, unoptimized images is the simplest and most effective way to speed up a website.
The tool identifies images on the page and, where applicable, suggests how much their size can be reduced. Optimize all the images it flags.
No. 44: Check whether the tool reports unoptimized JS, HTML, and CSS
These improvements are usually easy to implement, especially if the website runs on WordPress.
Google PageSpeed Insights advising how to remove unused CSS.
You will find my WordPress plugin recommendations at the end of this guide.
No. 45: Check whether the tool shows any other significant opportunities or diagnostics
The recommendations under Opportunities and Diagnostics do not affect the performance score directly, but acting on them can speed up loading, sometimes significantly.
Google PageSpeed Insights suggesting opportunities to speed up a page.
Diagnostics in Google PSI.
The tool usually gives you enough data and recommendations to improve the speed and performance score of a given page.
Adherence to Google's quality guidelines
No. 46: Check whether the website adheres to Google's quality guidelines
Make sure the website commits no obvious violations such as keyword stuffing, doorway pages, sneaky redirects, or any other clear breach of Google's quality guidelines.
A summary of the Google quality guidelines on the Google Search Central website.
Many of these techniques are outdated and widely recognized as bad practice, yet some websites still use them.
At this point you should have a good understanding of the website and can proceed to the detailed technical SEO evaluation.
For most of the tasks below you will need a site crawler. I rely on Screaming Frog and Semrush, whose SEO tools and crawlers I am fond of, but any other reasonable crawler will do. Most of the screenshots below come from these two tools.
Indexing, Crawling & Rendering
Now let's delve deeper into the specifics of how Google is indexing, crawling, and rendering the website.
⚡ Make sure to check my guide to crawl budget optimization.
Status in the index
An overview of my site's indexability in Sitebulb.
No. 47: Check how many of the website's pages are indexed
Use the site: operator to get a rough estimate of the number of indexed pages. The first result of a site: query should normally be the homepage.
This is how I check my domain: typing "site:seosly.com" into the search engine.
The site: command is a quick way to check whether a specific URL, or an entire website, is in the index.
Note that the "site:" command gives only an estimate of the pages in Google's index, and that it also works on Bing.
If you want to dig deeper, I have comprehensive guides to both Google search operators and Bing search operators.
No. 48: Check whether the number of indexed pages matches the number of valid pages in Google Search Console
The page count returned by the site: operator is not exact, but it should be close to the number of indexable pages on the website.
For the precise number of indexed pages, check the Valid section of the Coverage report in Google Search Console.
The Coverage report in GSC shows how many pages are currently indexed; keep in mind that the data in GSC is about a week old.
Any discrepancies deserve special attention.
No. 49: Check whether any weird or unrelated pages are indexed
The site: command is also useful for checking whether weird or unrelated pages have made it into the search results. Scan through 2-5 pages of results and you may be surprised by what you find. For even more precise results, put the domain in quotes to run an exact-match search, such as "seosly.com".
Robots.txt
If you are not well-versed in robots.txt, start with Google's documentation on it.
No. 50: Check whether the website has a robots.txt file
To see whether a website has a robots.txt file, and what it contains, simply append /robots.txt to the domain.
My website's robots.txt file lives at https://seosly.com/robots.txt. Note that if there is no robots.txt file, the request must still return an appropriate status code (200, 403, 404, or 410); otherwise the site may not get crawled by search engines.
The content of my website's robots.txt file, reviewed directly in Sitebulb.
🔍 In another article, you will learn how to access and edit robots.txt in WordPress.
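A quick programmatic version of this check, as a minimal Python sketch using the requests library (the domain is a placeholder):

```python
import requests

domain = "https://example.com"  # placeholder for the audited domain
resp = requests.get(f"{domain}/robots.txt", timeout=30)

# 200 means the file exists; 403, 404, and 410 are also acceptable answers.
# Other codes (e.g. 5xx) can stop search engines from crawling the site.
print("Status code:", resp.status_code)
if resp.ok:
    print(resp.text[:500])  # preview the first rules
```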
No. 51: Check whether robots.txt blocks website resources that should be crawled
Mistakes in robots.txt can prevent search engine robots from reaching important website resources. A resource blocked in robots.txt will not be crawled, although it can still end up indexed if it is linked to from somewhere on the web.
No. 52: Check whether the resources that should be blocked really are blocked in robots.txt
Large websites with millions of pages benefit greatly from using robots.txt to keep certain resources from being crawled: URLs with parameters, thin tag pages, and other low-value pages. As the SEO expert, it is your call which pages should be blocked from crawling.
No. 53: Check that the robots.txt file is valid
Make sure the robots.txt file contains no errors or mistakes. To validate it, you can use these tools:
- The robots.txt Tester in Google Search Console
- The improved and upgraded Bing robots.txt tester
No. 54: Check for other mistakes in the robots.txt file
Even a syntactically error-free robots.txt can contain other mistakes, such as blocking the wrong directory. Make sure the file actually does what it is meant to do and gives Googlebot the right instructions.
No. 55: Check whether the XML sitemap address is specified in the robots.txt file
There is no need to list a sitemap that sits at the standard location (/sitemap.xml) in robots.txt. But if there are multiple XML sitemaps, or the sitemap is not at the standard location, its URL should be declared in the robots.txt file.
This is how it looks on my website: the sitemap lives at https://seosly.com/sitemap_index.xml.
An XML sitemap can even sit on a completely different domain; the only requirement is that it is listed in the robots.txt file.
Robots meta tag
No. 56: Check whether the robots meta tag blocks website resources that should be indexable
The robots meta tag can keep a page out of the index, whether that was intended or not. To quickly check the indexability of all pages, crawl the website in bulk.
Sitebulb lets you assess the indexability of a website's pages in bulk.
For individual pages, you can use the SEO Indexability Check Chrome extension. Remember that if a page carries two different robots meta tags, Google follows the more restrictive one; this is rare, but it happens.
The SEO Indexability Check Chrome extension from DeepCrawl.
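If you would rather script a bulk check, here is a minimal sketch using requests and BeautifulSoup (pip install requests beautifulsoup4); the URL list is a placeholder for an export from your crawler:

```python
import requests
from bs4 import BeautifulSoup

# Placeholder URLs; in practice, paste in an export from your crawler.
urls = ["https://example.com/", "https://example.com/about/"]

for url in urls:
    soup = BeautifulSoup(requests.get(url, timeout=30).text, "html.parser")
    tags = soup.find_all("meta", attrs={"name": "robots"})
    directives = [t.get("content", "") for t in tags]
    # Flag pages blocked from indexing, and pages carrying more than one
    # robots meta tag (Google follows the more restrictive directive).
    if any("noindex" in d.lower() for d in directives):
        print(f"NOINDEX   {url}  {directives}")
    elif len(tags) > 1:
        print(f"MULTIPLE  {url}  {directives}")
    else:
        print(f"OK        {url}  {directives or ['(no robots meta tag)']}")
```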
No. 57: Check whether any indexable website resources should be blocked with the robots meta tag
Conversely, a large number of low-quality indexable pages can also be harmful. Your job is to review all the indexable pages and decide whether each one deserves to be indexed. Rely on your SEO knowledge and judgment!
Checking the meta robots tag values across the website in Sitebulb.
X-Robots-Tag
No. 58: Check whether the X-Robots-Tag blocks website resources that should remain indexable
A page can also be kept out of the index with the X-Robots-Tag HTTP header, so your investigation does not stop at meta tags. Identify and examine the resources restricted by X-Robots-Tag.
You can check the X-Robots-Tag values in the Screaming Frog SEO Spider.
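Since X-Robots-Tag lives in the HTTP response headers rather than the HTML, a header check is enough. A minimal sketch (placeholder URLs; note that some servers answer HEAD requests differently than GET):

```python
import requests

urls = ["https://example.com/", "https://example.com/report.pdf"]  # placeholders

for url in urls:
    # X-Robots-Tag is an HTTP header, so a HEAD request is enough here.
    headers = requests.head(url, allow_redirects=True, timeout=30).headers
    print(f"{url}  ->  X-Robots-Tag: {headers.get('X-Robots-Tag', '(not set)')}")
```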
Directives in X-Robots-Tag, robots.txt, and the robots meta tag
No. 59: Check that the robots.txt, robots meta tag, and X-Robots-Tag directives do not conflict
Robots can be given instructions in several ways: through the X-Robots-Tag, the robots.txt file, and the robots meta tag. Make sure these directives do not contradict one another; otherwise the bots will get confused and may not behave the way you want.
No. 60: Check whether pages with privacy policies or terms of service are indexable
Traditional SEO teaching recommended keeping privacy policy, terms of service, and similar pages out of the index because they contain no unique content. That advice no longer holds: these days it is better to index such pages to strengthen the website's E-A-T (Expertise, Authoritativeness, and Trustworthiness), which I explain later in this guide.
Rendering
No. 61: Check whether the site's pages render correctly
Rendering a page means viewing it the way a bot does, and it is essential to understand how the page looks to the bot. The Mobile-Friendly Test shows you the rendered page and its HTML code.
To analyze many pages on a website at once, I use either Sitebulb or Screaming Frog.
When starting a new project in Sitebulb, choose the Chrome Crawler and the tool will do the dirty work for you, examining the differences between the source and rendered code.
Select Chrome Crawler in the Sitebulb project settings. Once the crawl is complete, open Response vs Render to detect any discrepancies or issues between the source and rendered code of the website.
To make Screaming Frog execute JavaScript, go to Configuration > Spider > Rendering, then choose the JavaScript option and Googlebot Mobile.
The JavaScript rendering settings in the Screaming Frog SEO Spider.
XML Sitemap
Now let's put on a detective hat and investigate the XML sitemaps.
No. 62: Check whether the website has an XML sitemap
Verify that the website has an XML sitemap. Without one, search engine robots may fail to discover all of the site's pages, especially those buried deep in the structure.
So where do you look for an XML sitemap?
- Check the default location, which is /sitemap.xml or occasionally /sitemap_index.xml.
- Check the robots.txt file.
- Check Google Search Console under Index > Sitemaps.
- If it is feasible and relevant, log in to the website's CMS and look for the sitemap settings.
This is the XML sitemap of my site.
If the website has no sitemap, you can skip the rest of this section, but consider creating an XML sitemap for the website or recommending that someone do so.
My tutorial on finding a website's sitemap describes seven different ways to locate one.
No. 63: Check whether all the appropriate URLs are in the XML sitemap
An XML sitemap should contain every indexable, canonical page of the website. In practice, this is not always the case.
The most efficient way to verify that an XML sitemap contains the site's proper canonical URLs is to crawl it; Sitebulb is highly recommended for the job.
No. 64: Check whether the sitemap contains faulty records
A clean sitemap contains no faulty entries such as redirected URLs, pages returning 4xx errors, or password-protected URLs. A quick review of the website's XML sitemaps will verify this.
Sitebulb automatically crawls the XML sitemap of the website you are auditing and presents a comprehensive list of any problems it finds.
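For a quick scripted version of the same check, here is a minimal Python sketch that fetches a sitemap and flags every entry that does not answer 200 directly (the sitemap URL is a placeholder):

```python
import requests
import xml.etree.ElementTree as ET

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
sitemap_url = "https://example.com/sitemap.xml"  # placeholder

root = ET.fromstring(requests.get(sitemap_url, timeout=30).content)
for loc in root.findall(".//sm:loc", NS):
    resp = requests.head(loc.text, allow_redirects=False, timeout=30)
    # Every sitemap entry should answer 200 directly; redirects (3xx)
    # and errors (4xx/5xx) are faulty records that need cleaning up.
    if resp.status_code != 200:
        print(resp.status_code, loc.text)
```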
To examine the contents of sitemaps in Screaming Frog, here is what I do:
- I go to the Spider Configuration, select XML sitemaps, enable Crawl Linked XML Sitemaps, and paste in the URLs of the website's sitemaps.
The settings in Screaming Frog SEO Spider that let you crawl XML sitemaps.
- After the website crawl completes, I run a Crawl Analysis.
How to run Crawl Analysis in Screaming Frog SEO Spider.
- Then I review the Sitemaps section of the Overview tab, which shows all the essential information I need.
The results of the sitemap crawl in the Sitemaps section.
No. 65: Check whether the sitemap uses the outdated <priority> and <changefreq> parameters
A quick look inside the XML sitemap will show whether the <priority> and <changefreq> parameters are present. Google currently ignores them, so there is no need to keep them in the sitemap; my advice is to remove them entirely.
No. 66: Check whether the website uses the <lastmod> parameter and, if so, whether it is used correctly
The <lastmod> parameter tells Google the exact date a page was last modified. If it is not used properly, Google will simply ignore it.
If <lastmod> shows the same date for every entry in the sitemap, it is almost certainly not being used correctly.
Here is what the <lastmod> parameter looks like in a real XML sitemap.
In such cases, I generally suggest removing it altogether.
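A quick way to spot this pattern programmatically is to parse the sitemap and count the distinct <lastmod> values. A minimal sketch (placeholder sitemap URL):

```python
import requests
import xml.etree.ElementTree as ET

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
sitemap_url = "https://example.com/sitemap.xml"  # placeholder

root = ET.fromstring(requests.get(sitemap_url, timeout=30).content)
dates = {el.text for el in root.findall(".//sm:lastmod", NS)}

# One identical date across every entry usually means <lastmod> is
# auto-generated rather than reflecting real modification dates.
if len(dates) == 1:
    print(f"Suspicious: every <lastmod> is {dates.pop()}")
else:
    print(f"{len(dates)} distinct <lastmod> values - looks plausible")
```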
No. 67: Check whether the website has an image sitemap
If a website has many important images and depends on Image Search, an image sitemap is important: without one, search engine robots may fail to discover and index all the images. The images can sit in a separate sitemap or be combined with the regular XML sitemap.
These are the image links in my XML sitemap, generated by Rank Math.
No. 68: Check whether the website has a video sitemap
This applies only when a website hosts videos on its own server; a video sitemap is not meant for embedded YouTube videos.
Language Versions
Almost any crawler can supply the details you need to spot problems with the hreflang implementation. If the audited website is not available in multiple languages, you can skip this section entirely.
No. 69: Check whether the language versions are clearly separated
A multilingual website needs clear separation between its languages. Without it, some language versions may be indexed incorrectly; for example, some pages may get indexed only in one language and others only in another.
To avoid this issue, a website should:
- Ideally, place translations in separate directories corresponding to their languages.
- Offer a language switcher that is readily available on every page.
- Use hreflang tags, which you can learn more about below.
No. 70: Check whether the website uses hreflang tags
Multilingual websites should use hreflang tags for the following reasons:
- Hreflang tags let search engine bots discover the different language versions of the pages.
- Hreflang tags help address the duplicate-content problem that arises when pages share the same language but target different geographical locations.
When using Sitebulb to examine the hreflang tags on a website, make sure to select "International" in the project settings; you will get unambiguous recommendations and suggested fixes.
Select this option so that Sitebulb verifies the correctness of the hreflang annotations and HTML lang attributes on your website.
You can also examine hreflang tags with Screaming Frog.
The hreflang issues that Screaming Frog SEO Spider checks for.
No. 71: Check whether the hreflang tags are implemented correctly
Hreflang tags are only effective if they point to the right pages. Your job is to verify that the hreflang tags really do lead to the corresponding language versions of the website.
No. 72: Check whether any return links are missing
If the return links are missing, Google ignores the hreflang tags on the page; hreflang is one of the rare cases where reciprocal links are essential. Tools like Screaming Frog can check for missing return links.
Screaming Frog will tell you whether any return links are missing.
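If you want to spot-check return links by hand, here is a minimal Python sketch using requests and BeautifulSoup. It assumes the hreflang annotations live in link rel="alternate" tags in the HTML head (they can also be declared in HTTP headers or the sitemap, which this sketch ignores):

```python
import requests
from bs4 import BeautifulSoup

def hreflang_map(url):
    """Return {hreflang: href} declared on a page."""
    soup = BeautifulSoup(requests.get(url, timeout=30).text, "html.parser")
    return {
        link["hreflang"]: link["href"]
        for link in soup.find_all("link", rel="alternate", hreflang=True)
    }

page = "https://example.com/en/"  # placeholder page URL
for lang, alt_url in hreflang_map(page).items():
    # Each alternate must link back to the original page; otherwise
    # Google ignores the whole hreflang cluster.
    back_links = hreflang_map(alt_url).values()
    status = "OK" if page in back_links else "MISSING RETURN LINK"
    print(f"{lang:8} {alt_url}  ->  {status}")
```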
No. 73: Check whether the x-default hreflang attribute is used
Every page that uses hreflang tags should also reference the default language version. For more on hreflang tags and how they work, see Google's resource on managing multi-regional and multilingual sites.
You can check whether x-default is used across the whole website with the Screaming Frog SEO Spider.
Internal linking
Internal links can vastly impact a website's success, positively or negatively, so the technical SEO of internal linking deserves priority. A thorough analysis of the internal linking with a reliable tool such as Sitebulb will alert you to flaws and opportunities for improvement.
A condensed view of the internal link analysis in Sitebulb.
No. 74: Check whether the homepage has too many links
A homepage with an excessive number of links, including duplicated ones, signals a problem. There is no fixed definition of "excessive", but anything over a hundred should be treated as a warning sign and investigated further.
Sitebulb does an excellent job of analyzing a website's internal links and showing how they are implemented. Open the Link Explorer in the audit to quickly get a picture of the internal linking.
The Sitebulb Link Explorer permits a thorough examination of the internal linking, level by level. Here I am scrutinizing the links on the homepage.
Here is how to check the number of links on the homepage with Screaming Frog:
- Go to Internal > HTML.
- Select the homepage URL (the canonical version).
- Open the Outlinks tab.
- Click Export to export all the links.
- Analyze!
Screaming Frog lets you review the outlinks of a page.
No. 75: Check whether every page links back to the homepage
This problem is uncommon, but it happens. Every page on the website should link back to the homepage.
The examples shown here are links to SEOSLY's homepage.
That link typically takes the form of the logo (a graphic link) or a "Home" link in the main navigation.
No. 76: Check whether the site's pages contain duplicate links
The issue of duplicate links matters less than it used to. According to John Mueller in recent Google SEO office hours, the first-link-priority principle no longer applies. Still, as the SEO expert, you should review all the links on the website and judge whether they pose a problem.
The Screaming Frog SEO Spider makes it simple to spot duplicate links on a page: sort the links alphabetically and any duplicates become visible.
Once again, lean on experience and common sense. If a page carries a very large number of links, it may call for deeper analysis with Screaming Frog or another crawler.
No. 77: Check whether lower-level pages link to other thematically related pages
Product pages and blog articles are where internal linking shines. Interlinking thematically related pages strengthens both the linking page and the linked page; a product page, for instance, should link to other related products.
This is also a valuable chance to use keyword-rich anchor texts, which help search engine algorithms understand what those pages are about.
No. 78: Check whether the website uses contextual linking
Contextual linking is comparable to, and just as useful as, other types of linking. It helps search engines associate pages with the specific keywords found in the anchor text; keyword-rich anchors are perfectly acceptable in internal links.
Contextual links work best inside blog articles that point to product or offer pages. If a website publishes articles or tutorials, implementing contextual linking is a wise move.
To verify whether a website has meaningful contextual links, examine the anchor text of its internal links, for example under Link Explorer > Internal Anchor Text in Sitebulb.
No. 79: Check whether any links point to missing or blocked resources
Links to blocked or non-existent resources hurt the user experience, which in turn hurts SEO. Your job is to remove any links leading to non-existent or password-protected resources, or to replace them with links that resolve with status 200 (OK).
Sitebulb makes it effortless to examine the status of both internal and external links.
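For a single page, you can script the same check: extract every link and flag anything that does not resolve with a 200. A minimal sketch (placeholder page URL):

```python
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin

page = "https://example.com/"  # placeholder page from the crawl
soup = BeautifulSoup(requests.get(page, timeout=30).text, "html.parser")

# Collect the absolute URLs of all links on the page.
links = {urljoin(page, a["href"]) for a in soup.find_all("a", href=True)}

for link in sorted(links):
    if not link.startswith("http"):
        continue  # skip mailto:, tel:, fragment anchors, etc.
    try:
        status = requests.head(link, allow_redirects=True, timeout=15).status_code
    except requests.RequestException as exc:
        status = f"ERROR ({exc.__class__.__name__})"
    # Anything other than 200 deserves a look: 4xx/5xx are broken,
    # 401/403 point at password-protected or blocked resources.
    if status != 200:
        print(f"{status}  {link}")
```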
Low-value links
No. 80: Check for text links with low-value anchor text
When a link's anchor text is insignificant, such as "Click here" or "Read more", it becomes difficult or impossible for search engine algorithms to associate the linked page with the keywords relevant to its content. Every low-value anchor is a lost opportunity for search engines to learn what the linked page is about.
No. 81: Check for image links with an incorrect or missing ALT attribute
For image links, the ALT attribute plays the role that anchor text plays for text links. Use it to your advantage and make sure graphic links carry meaningful value.
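Here is a minimal sketch that flags image links with a missing or empty ALT attribute on a single page (placeholder URL, using requests and BeautifulSoup):

```python
import requests
from bs4 import BeautifulSoup

page = "https://example.com/"  # placeholder page URL
soup = BeautifulSoup(requests.get(page, timeout=30).text, "html.parser")

# Image links are <a> elements whose visible content is an <img>;
# for these, the image's ALT text plays the role of anchor text.
for a in soup.find_all("a", href=True):
    img = a.find("img")
    if img is not None and not img.get("alt", "").strip():
        print(f"Image link without ALT text -> {a['href']}")
```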
Website structure
Any site crawler can help you analyze a website's structure, but I have a particular appreciation for the way Sitebulb does it.
The crawl tree in Sitebulb presents the site's structure in a simple, easy-to-read way.
Here is how Screaming Frog handles the same task.
The Screaming Frog SEO Spider offers a wealth of information to aid the analysis of a website's structure.
Do not rely solely on the tool's suggestions, though. Apply your SEO expertise and practical judgment: the ideal website structure steers clear of both extremes described below.
No. 82: Check whether the website's structure is too flat
One extreme is a homepage that links to every single page on the site.
No. 83: Check whether the website's structure is too deep
The opposite extreme is a structure that is overly complex and hard to navigate, with more than 4-5 levels and many pages disconnected from the rest.
Breadcrumbs
Breadcrumbs help both users and search engine robots navigate a website and better understand its structure. Breadcrumb navigation forms a trail from the currently viewed page back to the higher-level pages, up to the homepage.
No. 84: Check whether the website uses breadcrumb navigation
Breadcrumbs are good practice on websites of all sizes; on larger websites they are a must.
A breadcrumb on one of SEOSLY's pages.
No. 85: Check whether the breadcrumbs are implemented correctly
Two things to check here:
- Breadcrumb navigation should be implemented with structured data.
- Breadcrumbs should include every page in the path, and the final item (the current page) should not be clickable.
To see whether the breadcrumbs on the audited site were added with Schema.org markup, open the Structured Data section in Sitebulb and check for "Breadcrumb" under the Search Features tab.
This Sitebulb report shows that the breadcrumbs on my website are implemented correctly, with no errors or warnings.
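You can also spot-check a single page for BreadcrumbList markup yourself. The sketch below looks only at top-level JSON-LD blocks (breadcrumbs can also be marked up with microdata, RDFa, or nested under @graph, which it ignores); the URL is a placeholder:

```python
import json
import requests
from bs4 import BeautifulSoup

url = "https://example.com/some-page/"  # placeholder page URL
soup = BeautifulSoup(requests.get(url, timeout=30).text, "html.parser")

found = False
for script in soup.find_all("script", type="application/ld+json"):
    try:
        data = json.loads(script.string or "")
    except json.JSONDecodeError:
        continue  # skip malformed JSON-LD blocks
    blocks = data if isinstance(data, list) else [data]
    for block in blocks:
        if isinstance(block, dict) and block.get("@type") == "BreadcrumbList":
            found = True
            # Print the breadcrumb trail: position, name, and target URL.
            for item in block.get("itemListElement", []):
                print(item.get("position"), item.get("name"), item.get("item"))

if not found:
    print("No BreadcrumbList structured data found on this page")
```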
No. 86: Check whether breadcrumbs are used consistently across the entire website
The website should use breadcrumbs consistently on every page. You can learn more about breadcrumb trails from Google.
Navigation
The website's main navigation tells both users and search engine robots which pages matter most.
Sitebulb has a unique feature for analyzing the position of internal links, which lets you examine the links placed in the main menu or the footer.
The distribution of internal links on my website in Sitebulb.
No. 87: Check whether the main navigation links to the most important pages
The navigation should link to the main category pages, hub pages, and key informational pages such as the contact and about pages.
The essential links in the navigation menu on SEOSLY.
No. 88: Check whether the navigation is built with text links
It may seem simple and obvious, but it is still worth verifying.
No. 89: Check whether the navigation is built with list tags (<ul> and <li>)
Make sure the navigation elements are constructed with HTML list markup.
No. 90: Check whether navigation links are discoverable by search engine crawlers
Navigation links are the most important links on the site, so it is crucial that robots can see them. This matters even more on websites that rely heavily on JavaScript; comparing the source HTML with the rendered HTML will confirm it.
No. 91: Check whether the navigation works properly on mobile devices
Besides being discoverable by search engine bots, navigation links must work correctly from the user's perspective. Open the website on a mobile phone and check that menus drop down when they should and that links open the intended pages.
The main menu on the mobile version of SEOSLY.
External Links
To examine numerous external links at once using Screaming Frog, navigate to Overview > SEO Elements > External.
Using Screaming Frog SEO Spider enables the user to verify multiple external links at once.
No. 92: Check whether external links that are not genuine endorsements carry the rel="nofollow" or rel="sponsored" attribute.
External links that are not genuine endorsements should carry a "nofollow" or "sponsored" attribute. At the same time, legitimate, high-quality, non-sponsored links to thematically related pages should remain followed. In short, there needs to be a healthy balance between the two types of links.
No. 93: Check whether user-submitted links carry the "ugc" attribute.
When a website has content created by its users, it is recommended to utilize the “ugc” attribute. This is especially crucial when there are links within the comments and forum areas. It is essential to verify this aspect carefully.
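As a quick reference, here is roughly what each link qualifier looks like in HTML (the example.com URLs are placeholders):

```html
<!-- A paid or sponsored placement -->
<a href="https://example.com/partner" rel="sponsored">Partner offer</a>

<!-- A link that is not a genuine endorsement -->
<a href="https://example.com/untrusted" rel="nofollow">Some page</a>

<!-- A link added by users (comments, forums) -->
<a href="https://example.com/user-site" rel="ugc">Commenter's site</a>

<!-- A genuine editorial endorsement needs no qualifier -->
<a href="https://example.com/great-resource">Great resource</a>
```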
No. 94: Check for site-wide dofollow links.
Most of the time, it is recommended to add a rel=”nofollow” attribute to site-wide dofollow links. Even though it’s almost 2021, there are still some websites that continue to use site-wide links for SEO purposes!
No. 95: Check whether there are dofollow external links pointing to valuable resources.
The website requires good quality external links that are dofollow. It is common practice to link out to web pages that are valuable and helpful to the website author. In my guide on conducting an SEO audit, I have included links to various external resources that I consider valuable and can be beneficial to you.
URL addresses
To examine many URL addresses at once using Screaming Frog, navigate to Overview > SEO Elements > URL.
The Screaming Frog SEO Spider is capable of examining every single URL present on a website.
No. 96: Check whether URLs include parameters, such as session or user identifiers, that have no impact on the displayed content.
URLs should avoid including parameters that have no bearing on the displayed content, like session or user identifiers. In case such URLs do exist, a canonical link needs to direct to the URL version that excludes these parameters.
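For example, a parameterized URL can point search engines to its clean version with a canonical link element like this (placeholder URLs):

```html
<!-- In the <head> of https://example.com/products?sessionid=12345 -->
<link rel="canonical" href="https://example.com/products" />
```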
No. 97: Check whether the URLs include relevant keywords.
During a recent Google SEO office hours session, John Mueller expressed that the importance of keywords in a URL is minimal. Nevertheless, the perception of users differs. Users find clear URLs desirable, and this preference aligns with what Google also favors.
To make URLs user-friendly and informative, use keywords related to the content of the page instead of unattractive strings such as "/?p=123".
For example, an article about Google search operators should contain those keywords in its URL, as here: https://seosly.com/google-search-operators/.
No. 98: Check whether the URLs include words in a language different from the main language of the website.
It is important that URLs include terms in the same language as the webpage. If the URL is in a different language, it could potentially create confusion for both website users and search engine algorithms.
No. 99: Check if hyphens are used to separate words in URLs.
Using dashes in URLs is recommended as it benefits both users and bots.
No. 100: Check if there are any superfluous words in the URLs.
The best practice for URLs is to avoid using extraneous words which could unnecessarily lengthen them.
Redirects
You can use any crawling tool to verify the redirects employed on your website. To do so in Screaming Frog, you should navigate to Overview > SEO Elements > Response Codes to review all the redirects.
Examining the response codes in Screaming Frog SEO Spider.
No. 101: Check for multiple redirects (redirect chains).
Ideally, a URL should be redirected only once. Keep in mind that Googlebot may stop crawling a page if a chain contains more than two or three redirects.
No. 102: Check whether any redirects use incorrect status codes.
It is recommended to use 301 redirects for most situations as they indicate a permanent change. 302 redirects should only be used for temporary changes. Some individuals mistakenly use 302 redirects for permanent changes, such as redirecting from a non-HTTPS to HTTPS version of a site. It is important to ensure that redirects are used appropriately based on their intended purpose.
No. 103: Check whether there are any meta refresh redirects.
Unlike 301 or 302 redirects, meta refresh redirects are executed on the client side, not the server side: a command tells the browser to load a new page after a set amount of time. Google may treat such redirects as sneaky redirects.
To determine if a website uses meta refresh redirects, you can utilize Screaming Frog.
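For reference, a meta refresh redirect is easy to spot in the page source. It looks something like the sketch below (placeholder URL) and is exactly the pattern you want replaced with a server-side 301:

```html
<!-- Client-side redirect after 0 seconds: avoid this -->
<meta http-equiv="refresh" content="0; url=https://example.com/new-page/" />
```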
It is recommended to use standard HTTP redirects instead of meta refresh redirects. If you want to know more about sneaky redirects and 301 redirects, Google has further information available.
Status Codes
The HTTP status code is the server's answer to the request made by the browser. It tells us whether the request succeeded (2xx range), was redirected (3xx range), failed with a client error such as a missing page (4xx range), or failed because of a server problem (5xx range).
No. 104: Check whether any pages return 5xx errors.
Many web pages showing a status code of 5xx may suggest that the server is encountering issues. These problems could be due to an excessive workload or a requirement for further setup and customization.
No. 105: Check whether any pages return 4xx errors.
We've briefly mentioned this before. A large number of pages returning status 404 (not found) or 410 (gone) results in a poor user experience. This applies to links both inside and outside the website.
Google won't pass any value through links that lead to 4xx pages. Internal links pointing to 404 pages should be removed or updated to point to working pages. If external links point to those 404 URLs, it's best to 301-redirect them to a working URL.
Error Page
We have not finished discussing status codes.
No. 106: Check whether a non-existent page returns a 404 status code.
It is important for a website to effectively manage error pages. If a page does not exist, it should display a 404 status code indicating that it was not found, rather than a 200 status code indicating it is okay. If an error page returns a 200 status code, it may be categorized as a soft 404 and indexed as such.
Although Google is improving their ability to deal with soft 404 pages, it is still important to make sure your website handles errors in the best possible manner.
The Link Redirect Trace Chrome extension allows you to rapidly verify the status code of any webpage.
To determine if a page shows error 404, one may utilize the Chrome extension known as Link Redirect Trace.
No. 107: Check whether the website has a proper error page.
An error page that is completely empty and has the word “ERROR” written in red is not a pleasant experience for the user. A website should have an error page that explicitly states that it is an error page and that the user has landed on it because the URL they entered either does not exist or cannot be found.
No. 108: Check whether the website has a dedicated, SEO-friendly error page.
Ideally, an SEO-friendly error page should include links to the website’s most crucial web pages. Additionally, its design and formatting should match that of the rest of the website.
The primary purpose of a dedicated error page is to enhance the user’s browsing experience.
Duplication
It is true that Google keeps getting better at handling duplicate content. Nevertheless, an effective SEO can help Google considerably with a few technical adjustments.
No. 109: Check for duplicate content caused by an incorrect technical implementation of content sorting.
If the sorting of the content is not implemented properly, it can lead to a loss of authority over which URLs are indexed. To rectify this, a simple solution is to incorporate a canonical link element that directs to the URL without any sorting parameters.
It should be noted that the indexing criteria may differ depending on the website. In some cases, you may need to index URLs that have particular sorting or filtering settings.
No. 110: Check whether the website can be accessed through both the HTTPS and non-HTTPS URL versions.
We are revisiting the topic of redirects because this issue is common enough to bear repeating. When a website has an SSL certificate, every non-HTTPS version should be permanently redirected (301) to the HTTPS version, with only a single redirect in the chain.
Using the Link Redirect Trace tool, you can track the redirects on any given webpage. I am currently examining the efficacy of the redirect from HTTP to HTTPS.
No. 111: Check whether the website can be accessed through both the WWW and non-WWW URL versions.
The same applies to the WWW and non-WWW versions of the website. The site should not resolve under both the WWW and non-WWW URL versions.
The following steps can be taken to verify if a website accurately redirects from the WWW version to the non-WWW version.
One version of the content has to be selected as the main version and the other version should be redirected (using a 301 redirect) to the main version. The Link Redirect Trace Chrome extension can be useful in carrying out this task.
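If the site runs on Apache with mod_rewrite, a rule block along these lines can consolidate protocol and host in a single 301 hop. Treat it as a sketch to adapt (example.com is a placeholder), not a drop-in rule:

```apache
# Force HTTPS and the non-WWW host in one 301 redirect
RewriteEngine On
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteRule ^(.*)$ https://example.com/$1 [L,R=301]
```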
No. 112: Check whether the website's pages are accessible through URLs that differ only in letter case.
In a perfect SEO scenario, the capitalization of letters in URLs should not matter. However, if it does matter, there are certain steps that can be taken.
- In order to establish consistency, insert a canonical link that directs to the URL in lowercase format.
- Alternatively, you can opt for a URL rewriter such as URL Rewrite, URL Rewriter, or ISAPI Rewrite 2. This tool will alter the URL addresses by converting all the characters to lowercase.
No. 113: Check whether pagination is handled correctly.
Ensure that the web pages divided into pages can be accessed by search engine bots. If you are interested in an extensive reading about pagination in SEO, refer to this guide.
No. 114: Check whether there are any identical or very similar pages.
Your job is to determine whether such pages exist. Tools like Screaming Frog and Semrush can help you locate them.
Access the Duplicate Content feature in Sitebulb to verify the existence of any duplicate content problems on the website.
This is how Sitebulb reports on duplicate content. From this view, it is evident that there is no duplicated content on my website.
To find out which pages are listed under Exact Duplicates, Near Duplicates, and Low Content Pages in Screaming Frog, navigate to Crawl Overview, then Content.
The Screaming Frog SEO Spider examines the following content-related problems.
No. 115: Check whether the website is accessible through an alternate URL.
Better safe than sorry: make sure no indexable copy or staging version of the website exists on the web. Use Copyscape to identify any duplicated material.
The tool that is widely used to determine whether the content of a website is original or not is Copyscape.
No. 116: Check whether the website's content is unique and not found elsewhere on the internet.
To check whether a website's content is original, copy a distinctive passage of text, put it between quotation marks, and search for it in Google. The search should return only the page the text was taken from.
Canonicalization
Now, it’s time to explore the concept of canonicalization in detail. Utilizing tools like Sitebulb, Screaming Frog, or any similar website crawlers, you can investigate and identify any canonicalization problems on your website.
In Sitebulb, navigate to the section labeled Indexability and select URLs. From there, examine the entry for Canonical URL.
To examine how canonicals are utilized on the website, navigate to Overview > Canonicals while using Screaming Frog.
Screaming Frog SEO Spider allows you to verify all the canonical URLs in large quantities.
No. 117: Check whether the website uses canonical link elements.
I suggest that every page on the website should have a canonical link element, as per SEO standards. Make sure to verify if this is implemented on the website you are examining.
Also take a look at my guide to auditing a website with JetOctopus.
No. 118: Check whether the canonical link elements are used correctly.
While having canonical link elements is important, ensuring that they are properly implemented is equally crucial. You are required to verify canonicalized URLs and guarantee that a collection of distinct web pages is not consolidated into a single generic URL.
I have seen a situation where every page of a website was canonicalized to the homepage.
No. 119: Check whether Google has selected the same canonical URL for the key pages as the one declared by the webmaster.
It is impossible to compel Google to select the canonical URL that you have assigned for a particular webpage. A canonical link is not a command, but merely a suggestion that is provided to Google.
You can verify the URL that has been designated as the canonical one by using the URL Inspect tool available in the Google Search Console.
It is excellent to be aware if Google considered that suggestion. You can verify this through the URL inspection tool.
Title Tags
Page titles are more closely tied to content SEO, but for a moment let's analyze them from a technical viewpoint.
Users see the <title> element directly in search results and in the tab name of modern web browsers.
The homepage of SEOSLY displays a title tag.
The titles of pages give significant information about the web page to both users and search engine robots. It is advisable to use a crawler such as Sitebulb or Screaming Frog to analyze the meta tags of the website in bulk.
If using Sitebulb, navigate to On Page and keep scrolling until you come across Title Length and Title Identification. You can click on either of them to view additional information.
The process for verifying multiple title tags using Sitebulb will be explained below.
To examine a large number of page titles utilizing Screaming Frog, navigate to the tab named Overview, then select SEO Elements, and finally, click on Page Title.
The way to examine title tags in Screaming Frog SEO Spider is explained below.
No. 120: Check whether the website's pages contain the <title> tag.
Access the “Title Identification” section under “On Page” in Sitebulb and select the “Missing” option. You will only see an active link if there are web pages on your site that lack titles.
To verify if there are any titles absent on the website, you can utilize Sitebulb.
To verify whether there are any pages on the website without titles, one can easily access the Missing option under Page Titles in Screaming Frog.
The Screaming Frog SEO Spider enables easy listing of all the URLs without a title tag.
Make sure to examine all of the website's pages. Every valuable page should have a <title> tag with high-quality, relevant content.
No. 121: Check whether the <title> tags have the recommended length of 30-60 characters.
To look good in search results, page titles should be between 30 and 60 characters long.
Navigate to the On Page section of Sitebulb and select Title Length to examine the character count of the title tags present on the website.
This is how you verify the length of the title in Sitebulb.
To review web pages with either less than 30 characters or over 60 characters, navigate to the Page Titles section in Screaming Frog.
With Screaming Frog, you have the ability to show the title tags that are either too brief or excessive in length.
No. 122: Check whether any <title> tags are duplicated.
If you access Screaming Frog and navigate to Page Titles > Duplicate, you will find a compilation of web pages that have identical page titles.
Every page on the website needs to have its own special title tag. For further information on page titles and descriptions, take a look at this article from Google.
With a single click in the Screaming Frog SEO Spider, it is possible to exhibit all the replicated titles.
No. 123: Check whether the <title> tags have appropriate content.
The titles of web pages should be distinct and provide a brief overview of the page’s content.
You can analyze all the title tags’ content easily in Sitebulb by navigating to On Page > URLs and examining the enumerated list.
This is the manner in which Sitebulb enables you to examine every title tag present on your website.
No. 124: Check whether the <title> tags contain important keywords.
The titles of pages should include the most significant keywords, but avoid overusing keywords.
No. 125: Check whether the most important keywords are placed at the beginning of the <title> tag.
It is advisable to include the most crucial keyword at the beginning of the title and avoid overusing keywords.
No. 126: Check whether the brand name is placed correctly in the <title> tag.
To include the brand name in the page title, it should be placed at the end, except for the homepage where it can be placed at the beginning.
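Putting checks No. 124-126 together, a well-formed title could look like this (the page topic and brand are invented for illustration):

```html
<!-- Primary keyword first, brand name last -->
<title>Technical SEO Audit Checklist – ExampleBrand</title>
```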
Page Descriptions
Now, let’s examine a content-based SEO aspect – page descriptions from a technical perspective. If the page description is alluring and concise, it will compel users to click on the website’s synopsis displayed in the search results.
The meta description for SEOSLY’s homepage is provided below.
Navigate to the “On Page” option in Sitebulb and continue scrolling until the sections for “Meta Description Length” and “Meta Description Identification” become visible.
The meta description elements of your website can be easily reviewed in Sitebulb.
When utilizing Screaming Frog to examine a large number of page descriptions, navigate to the Meta Description option located under Overview > SEO Elements.
With the assistance of Screaming Frog SEO Spider, it is simple to acquire a roster of web pages lacking the meta description component.
No. 127: Check whether the website's pages have meta description tags with content.
In Sitebulb, navigate to the sections labeled “On Page” and “Meta Description Identification”. Review the content listed as “Missing”.
To find web pages that do not have meta descriptions, navigate to the Missing section under Meta Description in Screaming Frog. If the meta description element is empty, Google will generate its own description.
The absence of a meta description is not a significant problem because Google often rewrites the unique descriptions of web pages. Nevertheless, it is advisable to include unique page descriptions for the most essential web pages of a site as it is a good practice for SEO.
No. 128: Check whether the meta description tags have the recommended length (140-155 characters).
Assuming Google chooses to use our own meta description, it is advisable to keep it between 140 and 155 characters so that it is neither truncated in search results nor too short. It should be no shorter than 70 characters.
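For reference, the element itself looks like this; the copy below is an invented example of a description within the recommended length range:

```html
<meta name="description" content="Learn how to audit a website's technical SEO step by step: crawling, indexing, redirects, structured data, and speed, with tools and examples." />
```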
To view the specifics, navigate to the “On Page” and “Meta Description Length” sections in Sitebulb. Select the option to display the details.
This is the method you can use to verify the size of the meta description components using Sitebulb.
Navigate to the Meta Descriptions tab in Screaming Frog and choose options for descriptions longer than 155 characters or shorter than 70 characters.
With Screaming Frog SEO Spider, it is possible to observe all the web pages that have meta descriptions exceeding the acceptable length.
No. 129: Check whether any meta description tags are duplicated.
Similar to page titles, we desire the meta descriptions to have originality. Visit the option of Meta Description > Duplicate to observe the web pages that possess identical meta descriptions.
It is advised not to have multiple copies of the same page description on different web pages. If you need help with page titles and descriptions, there is an informative article available from Google.
Using Screaming Frog SEO Spider, it is possible to quickly and easily observe all instances of duplicated meta descriptions with a single mouse click.
No. 130: Check whether the content of the page descriptions is appropriate.
If the meta description tags contain unappealing or irrelevant content, users will be less inclined to click on the search results.
No. 131: Check whether the page descriptions include keywords.
The page descriptions need to have the significant keyword, its alternative, and a similar term, if achievable. However, this is primarily for the benefit of the users and not just for search engines.
Headings
The significance of headers for SEO cannot be ignored despite any current criticisms. These headings are crucial for improving both user experience (especially for those using screen readers) and helping search engine algorithms comprehend the primary topic and smaller subtopics on the webpage. In this technical SEO checklist, a more in-depth analysis of headers is presented.
H1 Heading
No. 132: Check whether the website's pages have an H1 tag.
Each page on the website must contain a single H1 heading.
To examine headings on Sitebulb, navigate to On Page and review the suggestions provided.
In Sitebulb’s On Page report, there are suggestions for H1 tags on my website.
To inspect a large number of H1 headings in Screaming Frog, navigate to Overview > SEO Elements > H1 > Missing. This will display a roster of web pages that lack an H1 tag.
The Screaming Frog SEO Spider tool will present a roster of web pages that lack an H1 tag.
If a web page does not contain an H1 tag, it is losing out on a great opportunity to provide significant information about itself to search engine algorithms. It is necessary for every page within the website, including the homepage, to have only one exceptional H1 heading.
No. 133: Check whether any pages have multiple H1 headings.
Navigate to the “On-Page” section in Sitebulb and continue to scroll downwards until you come across the “H1 Tag Identification” option, then select it to view further information.
Sitebulb allows you to verify the existence of H1 tags on a webpage and determine if they are repeated multiple times on the same page.
To see if there are web pages that have more than one H1 heading, navigate to the “SEO Elements” tab in Screaming Frog and select the “H1 and Multiple” option.
This is how you can verify which web pages contain several H1 tags.
Having multiple H1 tags is still preferable over having no H1 tags. However, it is recommended to use only one H1 tag whenever possible.
No. 134: Check whether the H1 heading content is optimized for SEO.
H1 headers should include the primary keyword that effectively conveys the topic of the webpage to both search engines and users.
Structure of headings
No. 135: Check whether the website's pages use subheadings.
Merely using the H1 tag is insufficient to convey the organization of a webpage to both humans and search engine crawlers. If a webpage lacks section headers or only includes one, it becomes challenging for both audiences to comprehend its message.
To examine the arrangement of headers on a webpage, you can utilize Chrome plugins like Web Developer or Detailed SEO Extension.
This is the heading structure of the SEO Blog section of my website.
No. 136: Check whether headings are overused.
In order to emphasize crucial information and distinct parts of a website, headings are necessary. However, overusing headings can cause confusion not just for users with screen readers, but also for search engine algorithms.
No. 137: Check whether the heading structure is broken.
A useful guideline for SEO is to organize headings in a coherent sequence. Consider headings and subheadings as equivalent to chapters and subchapters in a book. Your webpage signifies the book.
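Continuing the book analogy, a coherent heading outline looks something like this (indentation added only for readability; the topic names are placeholders):

```html
<h1>Technical SEO Audit</h1>
  <h2>Crawling</h2>
    <h3>Robots.txt</h3>
    <h3>XML sitemaps</h3>
  <h2>Indexing</h2>
    <h3>Canonical tags</h3>
<!-- A broken structure would be, e.g., an H4 appearing directly under an H2 -->
```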
Chrome extensions for checking headers:
- Web Developer Chrome extension
- Detailed SEO Chrome extension
Make sure to check my list of SEO Chrome extensions (79 extensions reviewed).
Graphic Elements
Properly optimized graphic elements provide extensive additional information about the webpage to both search engine algorithms and screen reader users.
To view an overview of the page resources on your website, navigate to the “Page Resources” section and select “Overview” within Sitebulb.
To view information about the images, such as the ALT attribute, file size, and compression, click on Images.
To verify a large number of images through Screaming Frog, navigate to Overview, then go to SEO Elements and select the Images option.
Screaming Frog SEO Spider has the ability to check a large number of images and their search engine optimization at the same time.
No. 138: Check whether images are embedded properly.
When it comes to website images, Google does not consider those that are embedded with CSS as webpage content. To verify how a specific image is embedded, you can easily right-click on it and select the Inspect option.
Images that are part of the page's content should support the message of the page and inform the user rather than serve as mere decoration; such images should be embedded using the <img> tag, not as CSS backgrounds.
You can rapidly verify whether an image is properly embedded by examining it.
No. 139: Check whether any images have low-value ALT attributes.
Alt attributes that are of little significance will cause confusion for both people using screen-reading software and search engine algorithms. These attributes offer crucial details about the image’s content. Therefore, each image on a webpage must have its own distinguishing and valuable alt attribute.
No. 140: Check whether any images are missing the ALT attribute.
To see all the images that do not have ALT text, you can click on Overview, then SEO Elements, and finally Missing Alt Text.
The Screaming Frog SEO Spider has the ability to display all the images without an ALT text.
Using the Detailed SEO Chrome extension, you have the ability to examine individual pages and their associated images.
The Detailed SEO Extension can also display information about the images used on a given page.
Every distinct picture present on a webpage needs to have a distinct and valuable alternative text (ALT text).
No. 141: Check whether any images have low-value file names.
To check multiple image filenames at once, navigate to Overview > SEO Elements > Images > All. ALT attributes carry more weight than image file names, and high-value ALT attributes can make up for poor filenames. Nevertheless, it's recommended to adopt SEO-friendly image filenames going forward.
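Combining checks No. 139-141, an image embedded with a descriptive filename and a meaningful ALT attribute might look like this (the file name and text are invented):

```html
<!-- Poor: <img src="IMG_0042.png" alt="image"> -->
<img src="technical-seo-audit-checklist.jpg"
     alt="Printable technical SEO audit checklist with 188 checks" />
```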
No. 142: Check whether the images used are appropriately sized.
Ideally, the pictures should be shown in their original size, which has already been compressed. One common mistake I often notice is when a webpage has large (usually PNG) images and resizes them using CSS/HTML.
If a webpage has this problem, Google PageSpeed Insights will alert you to it. If it happens frequently, it can significantly slow down and hinder the website’s efficiency.
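A sketch of the fix: serve the image at (or near) its display size and declare its dimensions, instead of letting CSS shrink a huge PNG (the file names and sizes are placeholders):

```html
<!-- Let the browser pick an appropriately sized file -->
<img src="chart-800.jpg"
     srcset="chart-400.jpg 400w, chart-800.jpg 800w, chart-1600.jpg 1600w"
     sizes="(max-width: 600px) 400px, 800px"
     width="800" height="450"
     alt="Organic traffic trend chart" />
```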
No. 143: Check whether the images used are optimized.
To determine whether there is any scope for enhancement, utilize Google PageSpeed Insights to examine the website. It is suggested that images on a website, particularly those with numerous images, be compressed and optimized, and the latest format should be utilized wherever feasible.
For WordPress websites, there are numerous beneficial plugins that can aid in image optimization and compression. It is recommended to review the best practices for Google Images.
HTML Code
No. 144: Check the code
This is highly dependent on the particular case. To analyze a website’s search engine optimization (SEO), examine the website’s source code and use your basic SEO knowledge. You can access the code of any website by adding “view-source” before its URL, such as: view-source:https://seosly.com/.
ADVANCED TIP: To see the code of a website on your phone or tablet, just insert "view-source:" before the web address.
If there are tens of thousands of lines of code and the homepage has relatively little content, there is likely an issue.
No. 145: Check whether any redundant comments exist in the HTML source code.
Check whether the website's code contains any unnecessary or unusual comments. What you occasionally find there may surprise you.
No. 146: Check whether JavaScript is interspersed with the HTML code.
The general guideline is to place JavaScript tags in the head section or just before the closing body tag. Your job is to verify that they are not scattered haphazardly throughout the markup.
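A tidy placement pattern looks roughly like this; the defer attribute keeps a script in the head from blocking rendering (the file names are placeholders):

```html
<head>
  <!-- Deferred: downloads in parallel, runs after HTML parsing -->
  <script src="/js/app.js" defer></script>
</head>
<body>
  <!-- ...page content... -->
  <!-- Or: non-critical scripts just before the closing body tag -->
  <script src="/js/analytics.js"></script>
</body>
```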
No. 147: Check whether any inline styles are used.
It is acceptable to use inline styles occasionally, but the HTML code should not be littered with them.
Content
To access the technical assessment of the site’s content, navigate to Duplicate Content in Sitebulb.
This is the appearance of the Duplicate Content report in Sitebulb.
To examine the technical aspect of the content, navigate to Overview > SEO Elements > Content within the Screaming Frog tool.
The Screaming Frog SEO Spider tool can be utilized to identify prevalent content problems on a webpage.
No. 148: Check whether there are identical or nearly identical pages.
Navigate to the Overview tab followed by the SEO Elements tab and then go to the Content section. From there, you should review the pages that appear in the categories of Exact Duplicates and Near Duplicates. It is generally not desirable for these pages to be indexed by Google. To address this issue, these pages should either be canonicalized or have a “noindex” robots meta tag applied to them.
No. 149: Check whether the website's homepage contains any written content.
The most important page of a website is undoubtedly the homepage, which is why it needs to include some text with a clear heading structure. It is recommended to have at least a few hundred words. To ensure this, use the Detailed SEO Chrome plugin, which can check the number of words and proper heading structure on any webpage.
To determine the quantity of words on a webpage, you could employ the Detailed Chrome Extension.
No. 150: Check whether any pages contain low-quality (thin) content.
Navigate to Overview > SEO Elements > Low Content Pages. A low-content page is a page with few words and no distinctive content. Users generally dislike such pages. If the low-content pages are category pages, they can be optimized; otherwise, consider pruning them.
No. 151: Check whether content is presented in the form of images.
Although Google has improved its ability to interpret images, it is still recommended to include written text in order to better optimize a webpage. To confirm whether or not this is necessary, examine the essential pages of the website and evaluate its use of imagery.
No. 152: Check whether Flash components are used in place of written content.
Although this question may be outdated, I suggest that you still verify whether or not Flash is utilized on the website. This can be done by navigating to the Overview, followed by the SEO Elements and Internal sections, and then locating the Flash category. If it reads zero, then there is no use of Flash on the site.
The Screaming Frog tool can assist in identifying web pages that utilize Flash.
No. 153: Check whether the website's content is embedded using iframes.
It is essential to avoid using iframes to display the content on the web page. Take necessary measures to prevent this from occurring.
No. 154: Check whether the content on the website is relevant and thematically consistent.
A Technical Look At Keywords
To effectively utilize keywords, technical expertise and a technical approach are necessary. Consider the following key points when examining keywords.
No. 155: Check whether appropriate tags are used to emphasize important words.
Emphasizing important words on a page can benefit both visitors and search engine algorithms. For this to work properly, semantic <strong> (or <em>) tags should be used instead of purely visual <b> or <i> tags. Your objective is to examine essential pages, articles, or guides and confirm that keyword highlighting is implemented correctly.
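Assuming the point is semantic emphasis over purely visual styling, a minimal example looks like this (the sentence is invented):

```html
<!-- Semantic emphasis, meaningful to crawlers and screen readers -->
<p>A <strong>technical SEO audit</strong> covers crawling and indexing.</p>

<!-- Purely visual bolding carries no semantic weight -->
<p>A <span style="font-weight:bold">technical SEO audit</span> covers crawling and indexing.</p>
```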
No. 156: Check whether thorough keyword research has been conducted for the website.
As an auditor, your responsibility is to investigate whether anyone has conducted keyword research for the website, and if feasible, review the chosen keywords for the site. This helps to gain further understanding of the website under scrutiny.
Although keyword research is not included in the technical SEO audit, you can suggest to your client that you can provide this service. Utilizing either Semrush or Ahrefs will aid in conducting keyword research. If you are unfamiliar with how to conduct keyword research, it may be beneficial to take advantage of the keyword research course offered by Semrush for free.
No. 157: Check whether specific keywords are mapped to specific pages.
If you are unaware of the keyword research for a website, you can manually go through its web pages to determine if they are associated with any specific keywords. Typically, a web page focused on a certain keyword will include that keyword in the title, URL, headings, and initial paragraphs of text.
It is clear that the webpage you are currently viewing is focusing on the term technical SEO audit. This should be evident to those who are knowledgeable about SEO.
To determine if a particular webpage focuses on a particular keyword in WordPress websites using an SEO plugin, you can examine the page’s metadata. Both Yoast SEO and Rank Math permit this. The following is an example of what it would look like in Rank Math.
This is Rank Math on my site.
It is important to have access to WordPress administration in order to confirm this.
No. 158: Check whether the pages are optimized to rank well for their target keywords.
This is the next stage after completing the previous step. Your aim now is to confirm that the web page you’re focusing on for a specific keyword is genuinely optimized for it. Apart from having the keyword in the title, URL, headings, and the initial paragraphs of content, the webpage should also contain helpful images (with ALT text), links to other related resources, and more.
Structured data
Google employs structured data to enhance its comprehension of webpage content and offer special search results features and rich results. A technical SEO audit must assess structured data on a website. Further information on structured data is available in this Google article.
No. 159: Check whether the website uses structured data.
To verify whether a website employs structured data, the most effective method is to conduct a crawl. Sitebulb or Screaming Frog can be utilized for this purpose.
To verify the types of Schema employed on the website, as well as any alerts or validation mistakes, navigate to the Structured Data and Schema section in Sitebulb.
This is how the Schema entities identified by Sitebulb are displayed on my website.
To ensure that Screaming Frog checks for structured data, verify that you have enabled the JSON-LD and Schema.org Validation options in the Spider Configuration menu under the Structured Data tab. Otherwise, the crawler will not scan for structured data.
Below are the Screaming Frog SEO Spider configuration options that enable checking and validating structured data.
After completing the crawl, it is possible to verify whether the website contains structured data. This can be achieved by going to the “Overview” section, followed by the “Structured Data” option, and then clicking on “Contains Structured Data”.
Screaming Frog offers structured data validation.
Here is a list of URLs that contain structured data. To check the structured data implementation of individual pages, you can use the Detailed SEO Chrome extension.
No. 160: Check whether the structured data used is valid.
To verify the validity of structured data implemented on a website, two tools can be utilized: the Google Structured Data Testing Tool and the Rich Results Test. These tools determine whether a web page qualifies for rich results.
If you use Sitebulb, run a crawl and visit the Structured Data section to see an extremely thorough and attractive analysis of the structured data used on the website.
This is the Structured Data report in Sitebulb.
In order to verify the use of structured data on a website while using Screaming Frog, one must go to the Overview section and select Structured data. If structured data is present, the number of URLs will be displayed next to Contains Structured Data and the Missing section will show either nothing or a negligible amount.
Here are the results of the Screaming Frog check of the structured data used on my website.
No. 161: Check whether additional types of structured data could be added to the website.
Your objective is to evaluate the website's most important pages, review the kinds of structured data they use, and determine which other types could be added. You can use a Chrome extension such as Detailed SEO to assess structured data types on a page-by-page basis.
The Detailed SEO Extension can be used to examine the types of structured data applied on a specific page.
If the audited website runs on WordPress, it may be worth considering an upgrade to Rank Math Pro, which enables the use of many additional types of structured data.
Website Speed
I am guessing that you have already evaluated the website using Google PageSpeed Insights. Now, let us obtain additional information concerning the website’s speed.
Keep in mind that you can check the PSI scores for all of your pages in bulk using Sitebulb. When configuring the crawler settings, make sure to tick Page Speed, Mobile Friendly, and Front-end.
This Sitebulb configuration lets you evaluate speed scores and issues across the entire website.
After completing the crawl, access Page Speed to thoroughly scrutinize this component.
The report on Page Speed is provided by Sitebulb.
No. 162: Check the website's speed using GTmetrix.
GTmetrix provides an excellent tool to assess website speed along with practical recommendations and identification of specific issues.
Below are the speed and performance outcomes of SEOSLY as indicated by the GTmetrix tool.
No. 163: Check the website's speed using WebPageTest.
To ensure the website’s functionality on mobile, assess it using WebPageTest while emulating a mobile device.
The speed and performance test outcomes of SEOSLY on WebPageTest are presented below.
No. 164: Check the website's speed using Google PageSpeed Insights.
If you have not already done so, run the test and compare its findings with the results of the other speed tools.
Security
Most websites fail to follow even basic security practices. Your objective is to confirm that the website you are auditing is not among those vulnerable sites.
If you want to examine the security concerns in Sitebulb in detail, conduct a crawl and proceed to the Security section. There, you’ll find a vast array of security components and their evaluations.
Sitebulb has the capability to assess the website with regards to the prevalent security concerns.
No. 165: Check whether the website has an SSL certificate.
The use of HTTPS has been considered as a factor in website ranking since 2014, and it is now recommended that all websites adopt it in 2020 and beyond. Failure to do so will result in the website being labeled ‘Not Secure’ in popular browsers like Chrome.
When a website is lacking an SSL certificate, Chrome will exhibit this cautionary message.
Ensure that the website employs the use of HTTPS. If it does not, it should be given immediate attention to transition to HTTPS at the earliest convenience.
If a website possesses an SSL certificate, this is what will be displayed.
Take a look at my guide explaining the difference between HTTP and HTTPS.
No. 166: Check whether the website has mixed content issues.
Mixed content refers to the situation where website resources are available on both HTTP and HTTPS. To check for such problems, navigate to the Security section in Sitebulb. If there is no mixed content, you will find it listed under the No Issue section.
The presence of mixed content on my website was not detected by Sitebulb, resulting in the addition of this component to the section of the website evaluated as without issues.
All HTTP resources should be redirected to their HTTPS versions. Note, however, that recent versions of Chrome can handle some mixed content on their own.
No. 167: Check whether basic security practices are implemented.
It can be hard to define the full range of basic security practices, and in most cases more is better. Some simple and effective practices include:
- using HTTPS,
- adding two-factor authentication to login panels,
- password-protecting the login panel,
- using strong passwords,
- regularly scanning the website for threats with security software,
- doing regular backups,
- making sure the website is not flagged by Google Safe Browsing,
- to name just a few.
Technical SEO Audit: Server Log Analysis
No. 168: Check server logs
Server logs are worth analyzing because they provide accurate, first-hand information about how crawlers actually hit the site. Focus on areas such as crawl volume, response code errors, crawl budget waste, temporary redirects, and the last crawl date. Semrush's Log File Analyzer can help you make sense of the raw data efficiently.
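For orientation, a raw entry in the common combined log format looks like the sketch below (the IP, path, and date are made up). During analysis you would filter lines like these by user agent, status code, and URL:

```
66.249.66.1 - - [12/Mar/2021:06:25:17 +0000] "GET /blog/seo-audit/ HTTP/1.1" 200 51230 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
```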
Semrush offers an excellent log file analyzer that can handle all of the difficult tasks on your behalf.
If you cannot access the server logs, examine the Crawl Stats report in Google Search Console instead.
WordPress Technical Checks & Quick Fixes
Below are some plugins that can help you resolve the problems mentioned previously. As a frequent WordPress user, I can recommend some of my favorites. You do not need to install all of them for a one-off technical audit, but it is worth knowing they exist.
In case you are conducting a website audit and plan to monitor it frequently, using plugins can significantly simplify your job.
No. 169: Install a backup plugin.
Prior to making any alterations to the website, it is important to create a backup. There are numerous methods to backup a website, but one option is to install a backup plugin (such as UpdraftPlus) that can produce a duplicate of all files and databases automatically. Ensure that the backup is kept separately from the other files. If the website’s hosting does not offer a backup plan, it is crucial to create one.
With Updraft Plus, you have the ability to easily update or restore your website by simply clicking a button.
No. 170: Install a security plugin
Websites created with WordPress are highly susceptible to cyber-attacks and breaches. To mitigate the risks, there are some valuable WordPress security plugins such as iThemes Security, that enable website owners to establish minimum security measures. For advanced security, purchasing the pro version of a security plugin is recommended to ensure optimum protection.
UPDATE: Security plugins may affect your website's speed, so make sure to test. Putting your site behind Cloudflare CDN can also enhance its security.
No. 171: Install Really Simple SSL.
In WordPress websites, encountering mixed content problems is a frequent occurrence. Usually, there is an issue with HTTP > HTTPS redirects that needs to be corrected. Fortunately, there is a solution with the help of an SEO plugin known as Really Simple SSL, as it only takes one click to fix the problem.
Recent testing indicates that this plugin causes sites to run slower. It is advisable to install and then remove it, while retaining the existing settings.
No. 172: Update WordPress and its plugins.
It is essential to update WordPress and its plugins regularly as it is a crucial security measure. However, some websites may encounter issues after updates, making regular backups crucial.
This dashboard on WordPress is where you have the ability to make updates to both WordPress and its plugins.
Your job is to back up the website and run any pending updates (if you are authorized to do so).
No. 173: Install Google Site Kit
Although it is not advisable to have an excessive amount of plugins, I firmly believe that installing Google Site Kit is highly beneficial. Google Site Kit is an authorized plugin by Google that allows one to access valuable information from various Google tools all in one location; the WordPress dashboard.
This is the Google Site Kit plugin.
UPDATE: I am now trying to use fewer plugins. Unless you specifically need this dashboard in your WordPress panel, I advise against adding this plugin to your site.
No. 174: Install an SEO plugin
Check whether the website uses an SEO plugin and, if necessary, install and configure one. There are two primary options: Rank Math and Yoast SEO. The choice is yours.
UPDATE: I now recommend Rank Math instead, as it does not slow websites down and offers several features that require payment in Yoast.
Rank Math provides a plethora of highly beneficial SEO features and choices.
No. 175: Optimize all images in bulk.
If images are not optimized, they can significantly decrease the loading speed of a website. However, there are plugins available that can optimize images all at once. One recommended plugin is called Imagify, although WP Smush is another good option.
The settings of Imagify can be modified to determine the degree of optimization for images present on your website.
No. 176: Improve site speed and performance.
I’ve tested many site speed and caching plugins, and some have improved my Google PageSpeed Insights score while others have lowered it. Through my own experimentation, I’ve found only one plugin that truly speeds up websites: WP Rocket.
WP Rocket is considered to be among the top caching and speed optimization plugins available.
No. 177: Regularly check the website for broken links.
Links do not last forever; they break and inconvenience users. Fortunately, you can keep track of all your links, receive alerts about broken ones, and conveniently fix them. I use the Broken Link Checker tool to monitor my links regularly.
UPDATE: I suggest avoiding this plugin because it caused significant site slowdowns during my testing. To identify broken links, consider using Sitebulb to monitor the site instead.
To check if there are any broken links on your website using Sitebulb, navigate to Links and then go to Internal Link Status and select Broken.
No. 178: Check whether the website needs a cleanup of unused or duplicate plugins.
While using WordPress, go through the list of installed plugins and ensure that each plugin is being used. If any plugin is not being utilized, delete it. Check if there are multiple plugins with the same purpose, like SEO or security, and remove the duplicates.
Technical SEO Audit: E-A-T
The importance of E-A-T (Expertise, Authoritativeness, Trustworthiness) cannot be overstated. Therefore, I believe it is necessary to analyze it, even briefly, while conducting a technical audit.
No. 179: Check whether the website has links from reputable websites in the same industry.
For an SEO blog, the desirable backlinks would be from well-respected SEO sources such as SEJ or Moz. A Backlink Audit conducted through Semrush can quickly provide a comprehensive overview of the backlinks. The links can be sorted according to their authority score. In the given example, the domain under examination does not possess highly authoritative links, which is regrettable.
Semrush can be utilized to examine the quality of backlinks. The procedure for doing so is discussed below.
No. 180: Check whether the website is mentioned on other trusted websites.
Acknowledgements may not necessarily be hyperlinked, however, they still hold significant value. It is crucial to verify whether the website or brand name is mentioned on reliable online platforms. The most straightforward technique involves conducting an exact search query for the brand name, such as “seosly” in my case.
Putting the brand name in quotation marks makes Google return results that match the query exactly.
No. 181: Check that the information on the website is up to date.
In certain cases, it may be difficult to determine whether the information on a website is current or not, depending on its subject matter. To verify the timeliness of the website’s content, examining the publication or update date of the articles may be an option. This is an example of how it appears on my personal website.
For all my blog posts, I make sure to display both the date of publication and the most recent update.
You can also check the last modification date of a specific URL, or of all URLs, in the XML sitemap.
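In the sitemap, the date to look at is the lastmod element, as in this sketch (the URL and date are placeholders):

```xml
<url>
  <loc>https://example.com/blog/seo-audit/</loc>
  <lastmod>2021-03-12</lastmod>
</url>
```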
No. 182: Check that the content is factually accurate.
It may not always be possible to confirm the truthfulness of the information on a website, especially if it pertains to a narrow area. However, it is important to examine whether any statements made on the site oppose widely accepted scientific beliefs. This pertains to websites that endorse alternative medicinal practices or conspiracy theories.
No. 183: Check whether the website's authors are recognized experts in their field.
E-A-T contains a component known as authority, where both the website and its creators must possess expertise in the relevant field. A convenient method to verify this is by conducting a precise search of the website’s author or authors. Are they acknowledged as experts in reputable publications or websites, and are they regarded as authorities?
No. 184: Check whether the website displays evidence of its legitimacy, such as awards, certifications, and badges.
If the website or brand has any accomplishments such as certifications, awards, trust badges, etc, they should all be showcased on the website. The ideal location to display these accomplishments would be the home page or the about page. I have placed all of my achievements on both my website’s home page and about page, specifically on my SEO consultant page. Below is how it is presented:
This FAQ section contains additional details about me and my qualifications.
No. 185: Check whether the website displays authentic reviews, and whether those reviews are favorable or unfavorable.
It is unacceptable for a website or brand to create fake reviews. While it may be difficult to confirm, some research can help distinguish genuine reviews from fake ones. A few negative reviews are normal, but if the majority of reviews are negative and customers are unhappy, the brand needs to take action. This should be a priority if the search results reveal overwhelmingly negative feedback.
No. 186: Check whether the website includes information about its content authors, such as author biographies.
If a website has a single author, the about page can provide the necessary information. However, if there are multiple authors, each author must have a biography in their respective articles. Despite being the sole author of SEOSLY’s content, I still include my bio at the end of each article.
This is the author bio box I use for my articles.
No. 187: Check whether the website provides contact information, including a dedicated contact page.
It is crucial for a website to have a means of communicating with its owner in order to be considered reliable. A preferred option is to have a designated contact page that contains all available methods of reaching the owner, including an email, phone number, and physical address. Other contact information may also be found in the website footer. If there is no visible way to connect with the website owner, it should be seen as a warning sign.
No. 188: Check whether the website has a Wikipedia page dedicated to it.
Most websites do not have a Wikipedia page, and that is fine, because obtaining one is very hard. However, if the audited website, brand, or author does have a Wikipedia page, it is a strong signal of credibility and expertise (E-A-T).
I can do an SEO audit of your website
My area of expertise within SEO is performing audits, which is why I offer my services for those looking for an experienced SEO to audit their site. You can find more information about my SEO auditing services or directly contact me through email or the provided contact form.