Search engine optimization is an ongoing process. It takes a lot of work to achieve those elusive Page One rankings—and a lot of work to retain them. Part of the required onsite work is a systematic review of sources that might indicate there's trouble brewing or already in full storm mode.
This article outlines 10 health checks we include at our firm as part of clients' long-term SEO strategy, along with suggested check frequencies (there are no hard-and-fast rules on frequency; every company will have a different requirement). All examples are based on issues we've actually encountered.
1. Search Console health check
Frequency: weekly
Search Console information is as close as you're going to get to understanding what Google does and doesn't want. It's advice straight from the horse's mouth. It contains lots of useful info (checked your average page CTR against site outliers recently, for example? Need to rework some meta descriptions to make them more compelling?), but at a minimum, review the Coverage and Performance sections and any messages related to slow-loading pages.
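If you'd rather keep an eye on this programmatically, here's a minimal sketch that pulls page-level CTR from the Search Console API and flags pages sitting well under the site average. It assumes the API is enabled for your property and that the site URL and credentials file below (both placeholders) are swapped for your own.

```python
# Sketch: pull page-level CTR from the Search Console API and flag outliers.
# SITE_URL and CREDS_FILE are placeholders; the API must be enabled and the
# service account added as a user on the property.
from google.oauth2 import service_account
from googleapiclient.discovery import build

SITE_URL = "https://www.example.com/"   # placeholder property
CREDS_FILE = "service-account.json"     # placeholder credentials file

creds = service_account.Credentials.from_service_account_file(
    CREDS_FILE, scopes=["https://www.googleapis.com/auth/webmasters.readonly"]
)
service = build("searchconsole", "v1", credentials=creds)

response = service.searchanalytics().query(
    siteUrl=SITE_URL,
    body={
        "startDate": "2024-01-01",
        "endDate": "2024-01-31",
        "dimensions": ["page"],
        "rowLimit": 1000,
    },
).execute()

rows = response.get("rows", [])
site_ctr = sum(r["clicks"] for r in rows) / max(sum(r["impressions"] for r in rows), 1)

# Pages with plenty of impressions but a CTR well below the site average are
# candidates for a more compelling title or meta description.
for r in rows:
    if r["impressions"] > 500 and r["ctr"] < site_ctr * 0.5:
        print(f'{r["keys"][0]}: CTR {r["ctr"]:.2%} vs site {site_ctr:.2%}')
```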
2. Google Analytics traffic analysis
Frequency: monthly
Review your organic traffic: Any peaks or troughs? Are pages performing as expected? Is there a correlation between offline marketing activities and branded searches?
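If you export organic sessions by month, a few lines of pandas will surface the peaks and troughs worth digging into. The column names below are hypothetical; adjust them to match your own export.

```python
# Sketch: flag unusual month-on-month swings in organic sessions.
# Assumes a CSV export with (hypothetical) columns "month" and
# "organic_sessions".
import pandas as pd

df = pd.read_csv("organic_traffic.csv", parse_dates=["month"])
df = df.sort_values("month")
df["pct_change"] = df["organic_sessions"].pct_change()

# Anything swinging more than 20% month on month deserves a closer look.
flagged = df[df["pct_change"].abs() > 0.20]
print(flagged[["month", "organic_sessions", "pct_change"]])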
3. Meta tag and HTTP header analysis
Frequency: monthly
To be honest, most SEO software will quickly pick up meta tag and HTTP header issues and alert you, but if you're not signed up with one of the SEO software houses, download the free version of Screaming Frog and crawl your site with that instead. You're looking for any noindex meta tags in the <head> section of the source code, or X-Robots-Tag noindex/nofollow directives in the HTTP headers.
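For a quick spot check between crawls, something like the sketch below will do for a handful of key URLs (the URL list is a placeholder); Screaming Frog remains the better tool for whole-site coverage.

```python
# Sketch: spot-check key URLs for noindex/nofollow directives, both in the
# <head> meta robots tag and in the X-Robots-Tag HTTP header.
# The URL list is a placeholder.
import requests
from bs4 import BeautifulSoup

URLS = [
    "https://www.example.com/",
    "https://www.example.com/key-landing-page/",
]

for url in URLS:
    resp = requests.get(url, timeout=10)

    # HTTP header check
    header = resp.headers.get("X-Robots-Tag", "")
    if "noindex" in header.lower() or "nofollow" in header.lower():
        print(f"{url}: X-Robots-Tag header says '{header}'")

    # <head> meta robots check
    soup = BeautifulSoup(resp.text, "html.parser")
    meta = soup.find("meta", attrs={"name": "robots"})
    if meta and any(d in meta.get("content", "").lower() for d in ("noindex", "nofollow")):
        print(f"{url}: meta robots says '{meta.get('content')}'")
```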
4. Technical health check
Frequency: quarterly
The most common technical errors we see are canonical errors and mislabeled hreflang tags ("uk" is not the country code for Great Britain; "gb" is...). Common canonical errors:
- They disappear; suddenly you've got massive duplicate content problems.
- They all start referencing the insecure HTTP version of the domain.
- They reference 404s. Yep. Here's the original version of my page...
- They clash with hreflang tags. If you have a regional page indicated appropriately with hreflang tags but the canonical suggests the original version is actually the page dedicated to another region, then this is problematic for a search engine trying to figure out what's going on.
I won't go into too much detail about hreflang tags, but suffice it to say, use the right country codes.
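As a belt-and-braces check between full audits, a short script can catch the worst of the above: missing canonicals, canonicals pointing at http or at 404s, and the "uk" hreflang slip. The sketch below is exactly that, a sketch; the URL list is a placeholder and the hreflang check only looks for the uk/gb mix-up rather than validating every code.

```python
# Sketch: basic canonical and hreflang sanity checks on a set of URLs.
# The URL list is a placeholder.
import requests
from bs4 import BeautifulSoup
from urllib.parse import urlparse

URLS = [
    "https://www.example.com/",
    "https://www.example.com/en-gb/pricing/",
]

for url in URLS:
    resp = requests.get(url, timeout=10)
    soup = BeautifulSoup(resp.text, "html.parser")

    canonical = soup.find("link", rel="canonical")
    if canonical is None or not canonical.get("href"):
        print(f"{url}: canonical tag missing")
    else:
        href = canonical["href"]
        if urlparse(href).scheme == "http":
            print(f"{url}: canonical points at insecure http ({href})")
        # Does the canonical target actually resolve, or is it a 404?
        if requests.head(href, allow_redirects=True, timeout=10).status_code == 404:
            print(f"{url}: canonical points at a 404 ({href})")

    # hreflang region codes: "gb" is the ISO code for Great Britain, "uk" is not.
    for link in soup.find_all("link", rel="alternate", hreflang=True):
        code = link["hreflang"].lower()
        if code.endswith("-uk"):
            print(f"{url}: hreflang '{code}' should use 'gb', not 'uk'")
```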
It's also well worth using the aforementioned Screaming Frog to run a custom search for Google Analytics/Tag Manager code on every page you're tracking. Massive traffic drop? Maybe the homepage has mysteriously shed its tracking code...
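If you don't have a crawler to hand, a quick spot check for the tracking snippet looks something like this (the container/measurement IDs and URLs are placeholders):

```python
# Sketch: confirm the analytics snippet is still present on key pages.
# The IDs and URLs below are placeholders; swap in your own.
import requests

TRACKING_IDS = ["GTM-XXXXXXX", "G-XXXXXXXXXX"]   # placeholder GTM / GA4 IDs
URLS = ["https://www.example.com/", "https://www.example.com/contact/"]

for url in URLS:
    html = requests.get(url, timeout=10).text
    missing = [tid for tid in TRACKING_IDS if tid not in html]
    if missing:
        print(f"{url}: tracking code missing for {', '.join(missing)}")
```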
5. Robots.txt review
Frequency: quarterly
A simple but important one. Has anyone been tampering with the robots.txt file? Accidentally copied the file over from a staging site and blocked all search engines from crawling any page on the site with a cheeky "Disallow: /"?
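Python's standard library can answer that question in a few lines: the sketch below checks whether a handful of must-stay-indexable URLs (placeholders here) are still crawlable by Googlebot according to the live robots.txt.

```python
# Sketch: check that robots.txt still allows crawling of a few key URLs.
# The URLs are placeholders; point this at pages that must stay indexable.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.set_url("https://www.example.com/robots.txt")
rp.read()

for url in ["https://www.example.com/", "https://www.example.com/pricing/"]:
    if not rp.can_fetch("Googlebot", url):
        print(f"Blocked for Googlebot: {url}")
```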