
Our Technical SEO Checklist

Organic and PPC SEO / SEM Checklist

Technical SEO plays a pivotal role in search engine rankings. Proper implementation of its strategies can make your site technically sound and ultimately boost your traffic.

So, what does technical SEO entail?

Technical SEO refers to the practices applied to a site and its server to optimize crawling, indexing, usability, and search engine rankings. In this article, we list the top seven factors we expect to feature prominently in technical SEO in 2017. Some may sound familiar, since they have been around for a while; others are relatively new, the result of recent changes in search engine algorithms.

Check indexing

There are various ways to check how many of your website's pages search engines have indexed. SEO crawlers such as Website Auditor can help you find this out, or you can enter site:domain.com into the search engine you're targeting. The number reported should ideally be close to the total number of pages on your site, minus the pages you do not want indexed.

Make sure useful information on your site is crawlable

Reviewing robots.txt gives you a first picture of your site's crawlability, though it is not always the full story: robots.txt restricts pages from being crawled, not from being indexed. An SEO crawler can count all blocked pages for you, whether the restriction comes from a noindex meta tag, robots.txt, or an X-Robots-Tag header.
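As a quick sketch of this kind of check, Python's standard library can evaluate robots.txt rules against specific URLs. The rules and URLs below are hypothetical examples, not taken from any real site:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules for illustration.
rules = [
    "User-agent: *",
    "Disallow: /admin/",
    "Disallow: /assets/css/",
]

parser = RobotFileParser()
parser.parse(rules)

# Check whether key pages and resources are crawlable by any bot.
for url in ("https://example.com/products/widget",
            "https://example.com/assets/css/main.css"):
    status = "crawlable" if parser.can_fetch("*", url) else "blocked"
    print(url, "->", status)
```

Running a list of your CSS and JavaScript URLs through a check like this reveals whether rendering resources are accidentally blocked.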

Throughout 2017, it will be important to ensure that all your pages and resources such as JavaScript and CSS files are crawlable. If they are not, Google may fail to index your website's dynamically generated content because it cannot crawl your JavaScript, and your pages will not render as they should if your CSS files are blocked from crawling.

A site that relies heavily on JavaScript, or one built with AJAX, will require a crawler that can render JavaScript. At the moment, Screaming Frog and Website Auditor are the only SEO spiders offering this option.

Crawl budget optimization

Crawl budget refers to the number of pages search engines crawl on your site in a given period. You can use Google Search Console to find your crawl budget.

However, if you are interested in more comprehensive crawl stats, you will need a log-analysis tool such as WebLogExpert to assess your server logs; Google Search Console alone will not give you a page-by-page breakdown of the data.
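The idea behind such log analysis can be sketched in a few lines of Python: filter your access log for search-bot requests and count hits per page. The log lines below are hypothetical samples in Apache combined format:

```python
import re
from collections import Counter

# Hypothetical access-log lines (Apache combined format).
log_lines = [
    '66.249.66.1 - - [10/Mar/2017:10:00:01 +0000] "GET /products HTTP/1.1" 200 5120 "-" "Googlebot/2.1"',
    '66.249.66.1 - - [10/Mar/2017:10:00:09 +0000] "GET /about HTTP/1.1" 200 2048 "-" "Googlebot/2.1"',
    '66.249.66.1 - - [10/Mar/2017:10:01:15 +0000] "GET /products HTTP/1.1" 200 5120 "-" "Googlebot/2.1"',
    '203.0.113.7 - - [10/Mar/2017:10:02:00 +0000] "GET /products HTTP/1.1" 200 5120 "-" "Mozilla/5.0"',
]

request_re = re.compile(r'"GET (\S+) HTTP')

# Count only requests made by Googlebot, keyed by page path.
hits = Counter(
    request_re.search(line).group(1)
    for line in log_lines
    if "Googlebot" in line
)

for page, count in hits.most_common():
    print(page, count)
```

Pages that never appear in the bot's log entries are the ones eating no crawl budget at all, which is exactly what a page-by-page breakdown reveals.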

Once you’ve figured out your crawl budget, you should aim to increase it. The key factors that Google uses to assign crawl budget are the number of internal site links and backlinks from other websites.


The following steps can help you improve your crawl budget:

  • Avoid duplicate pages. Canonical tags do not save crawl budget: search engines still crawl the duplicate versions, so consolidate or remove duplicates wherever possible.
  • Block pages with zero SEO value from crawling. Expired promotions, terms and conditions, and privacy policies can be disallowed in robots.txt. URL parameters should also be specified in Google Search Console, so that the same page with varying parameters is not crawled separately.
  • Fix broken links; otherwise, you waste a unit of your crawl budget every time a search bot hits a 4xx or 5xx page.
  • Update your sitemap and register it on Google Search Console.
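The blocking step above might look like the following robots.txt fragment; the paths here are hypothetical examples, so substitute your site's actual low-value URLs:

```
User-agent: *
Disallow: /promotions/expired/
Disallow: /terms-and-conditions/
Disallow: /privacy-policy/

Sitemap: https://example.com/sitemap.xml
```

Listing the sitemap location at the bottom of robots.txt also helps search engines find it without waiting for a Search Console submission.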

Internal links should be audited comprehensively

Internal links help distribute ranking power across your pages more efficiently. A site with a shallow, logical structure ensures good UX and crawlability.

Below is a checklist to help you audit internal links.

– Click depth

Your important pages should be kept at most three clicks from your website's homepage. Observing this three-click rule keeps your site structure shallow.
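Click depth can be measured with a breadth-first search from the homepage over your internal-link graph. The link graph below is a hypothetical example; in practice it would come from a crawl:

```python
from collections import deque

# Hypothetical internal-link graph: page -> pages it links to.
links = {
    "/": ["/products", "/blog"],
    "/products": ["/products/widget"],
    "/blog": ["/blog/post-1"],
    "/blog/post-1": ["/products/widget/specs"],
    "/products/widget/specs": ["/archive/old"],
}

def click_depths(graph, start="/"):
    """Breadth-first search from the homepage; returns clicks needed per page."""
    depths = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in graph.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

depths = click_depths(links)
too_deep = [page for page, depth in depths.items() if depth > 3]
print("Pages beyond three clicks:", too_deep)
```

Any page flagged by this check is a candidate for an extra internal link from somewhere nearer the homepage.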


– Broken links

Broken links waste ranking power and confuse people who visit your site. Most SEO crawlers detect broken links, but they don't always find all of them: in addition to HTML elements, broken links can hide in HTTP headers, tags, and sitemaps.
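The HTML side of this audit can be sketched with Python's standard library: extract every link from a page, then flag the ones whose status codes (collected earlier by a crawler) are 4xx or 5xx. Both the HTML and the status map here are hypothetical:

```python
from html.parser import HTMLParser

# Hypothetical status codes already collected for each URL by a crawler.
known_status = {
    "/products": 200,
    "/old-promo": 404,
    "/downloads/brochure.pdf": 500,
}

class LinkCollector(HTMLParser):
    """Collects href values from every <a> tag it encounters."""
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

page_html = """
<a href="/products">Products</a>
<a href="/old-promo">Spring sale</a>
<a href="/downloads/brochure.pdf">Brochure</a>
"""

collector = LinkCollector()
collector.feed(page_html)
broken = [url for url in collector.links if known_status.get(url, 0) >= 400]
print("Broken links:", broken)
```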


– Redirected links

Redirected links hurt crawl budget and load time: when users visit your site, they are taken through several hops even if they land on the right page. Find links that pass through three or more redirects and update them to point directly at the final destination.
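Given a redirect map from a crawl, chains can be resolved and flagged like this; the URLs below are hypothetical:

```python
# Hypothetical redirect map discovered during a crawl: source -> target.
redirects = {
    "/old-page": "/older-page",
    "/older-page": "/even-older",
    "/even-older": "/final-page",
}

def resolve(url, redirect_map, limit=10):
    """Follow redirects, returning the final URL and the number of hops.

    The hop limit guards against redirect loops.
    """
    hops = 0
    while url in redirect_map and hops < limit:
        url = redirect_map[url]
        hops += 1
    return url, hops

final, hops = resolve("/old-page", redirects)
if hops >= 3:
    print(f"Update links to /old-page to point directly at {final}")
```

Every hop removed from a chain is one less wasted request for both visitors and search bots.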


– Orphan pages

These pages are hard for search engines and visitors to find because no other page on your site links to them.
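Orphans are easy to surface once you have two URL sets: what the sitemap claims exists and what internal links actually reach. The URLs below are hypothetical examples:

```python
# URLs listed in the sitemap (hypothetical).
sitemap_urls = {"/", "/products", "/blog", "/landing/2015-campaign"}

# URLs actually reachable via internal links, e.g. from a crawl.
linked_urls = {"/", "/products", "/blog"}

# Pages in the sitemap that no internal link points to.
orphans = sitemap_urls - linked_urls
print("Orphan pages:", sorted(orphans))
```

Each orphan either needs an internal link from a relevant page or, if it has no value, removal from the sitemap.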

Sitemap Review

Sitemaps help search engines discover your new content faster by describing your site structure. Check your sitemaps against the following guidelines:


– Freshness

Update your XML sitemap whenever new content is generated on your site.

– Cleanness

Your sitemap should be free of 4xx pages, redirected URLs, pages blocked from indexing, and non-canonical pages. Otherwise, search engines may ignore your sitemap. Use Crawl > Sitemaps in Google Search Console to check your sitemap for common errors regularly.


– Size

Google limits a single sitemap file to 50,000 URLs. Staying well under that number ensures your most important pages are crawled more often and more effectively.
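For large sites, one way to respect the limit is to split the URL list into multiple sitemap files. This sketch uses Python's standard XML library and a tiny hypothetical URL list (with a small limit so the split is visible):

```python
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
LIMIT = 50000  # Google's per-file sitemap URL limit

def split_sitemap(urls, limit=LIMIT):
    """Yield <urlset> XML strings, each holding at most `limit` URLs."""
    for start in range(0, len(urls), limit):
        urlset = ET.Element("urlset", xmlns=NS)
        for url in urls[start:start + limit]:
            loc = ET.SubElement(ET.SubElement(urlset, "url"), "loc")
            loc.text = url
        yield ET.tostring(urlset, encoding="unicode")

# Hypothetical URL list; limit=2 makes the split visible on five URLs.
urls = [f"https://example.com/page-{i}" for i in range(5)]
for i, xml in enumerate(split_sitemap(urls, limit=2)):
    print(f"sitemap-{i}.xml: {xml.count('<loc>')} URLs")
```

The generated files would then be listed in a sitemap index file and registered in Google Search Console.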

Test and enhance your page speed

Page speed is a key Google ranking signal and one of its top priorities in 2017. You can use Website Auditor to run all your pages through Google's PageSpeed Insights tool to test their load time and speed. If a page fails any aspect of the test, Google gives you the details and recommendations on how to fix it. If your images are too heavy, for example, it provides a download link to compressed versions. That shows just how much speed matters to Google.

Get mobile-friendlier (responsive web design)

Google recently started "mobile-first indexing of the web," which prioritizes indexing the mobile versions of websites over their desktop counterparts. This means your mobile version will play a critical role in how well you rank in both mobile and desktop search results.


Here is a summary of the most essential things you need to prepare for this change to your site.

  • Test your website’s pages for mobile-friendliness with Google’s Mobile-Friendly Test tool.
  • Audit your mobile site comprehensively by using an SEO crawler with robots.txt and custom user agent settings.
  • Keep track of mobile rankings. Track your positions in mobile Google search, and bear in mind that with mobile-first indexing, your mobile rankings will soon influence your desktop rankings, too.

What do you think of these technical SEO tips for 2017? We’d like to hear from you.

2626 Harney Street
Suite D
Omaha, Nebraska 68131
t. 402.620.2633
https://www.92west.com

Strategically Creative
branding + design + web + apps + seo

Additional Works:
https://slate.adobe.com/cp/u52ZN/

Demo Reel:

https://youtu.be/eJbxtX6jxJY
