It’s that time of year again: spring cleaning (sigh). No, I’m not talking about dusting your house from top to bottom, but about cleaning up your SEO basics with a proper SEO Spring Clean.

After telling yourself you were going to keep on top of things this year, you’ve let it slip again; you got side-tracked by the big picture and large-scale strategies.

Don’t worry, it happens to the best of us.

In this post I’ll cover the importance of good housekeeping from an SEO perspective, and why getting these technical basics right is key to improving your organic visibility.

Ladies and gents, the ultimate SEO Spring Clean to-do list…

GOOGLE SEARCH CONSOLE (HTML IMPROVEMENTS)

Duplicate and missing meta-descriptions

In an ideal world, each page on your site should have an individual, page-specific meta description. However, in reality that’s not always possible, nor an effective use of your limited time.

For the more important, often visited pages on your site, such as your home page and key landing pages, you should aim to create unique meta descriptions. For the less important, less frequented pages, prioritise based on visibility; if you leave the field empty, Google will generate a snippet of text for you.
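
If you’d like to audit this outside of Search Console, a short script can surface missing or duplicate descriptions. Below is a minimal sketch in Python, assuming the requests and beautifulsoup4 packages and a placeholder list of URLs:

import requests
from bs4 import BeautifulSoup
from collections import defaultdict

# Placeholder list of pages to audit; swap in your own URLs.
urls = [
    "https://www.example.com/",
    "https://www.example.com/about/",
    "https://www.example.com/contact/",
]

seen = defaultdict(list)  # description text -> pages using it
for url in urls:
    html = requests.get(url, timeout=10).text
    tag = BeautifulSoup(html, "html.parser").find("meta", attrs={"name": "description"})
    content = (tag.get("content") or "").strip() if tag else ""
    if not content:
        print(f"MISSING: {url}")
    else:
        seen[content].append(url)

# Any description shared by two or more pages is a duplicate.
for description, pages in seen.items():
    if len(pages) > 1:
        print(f"DUPLICATE ({len(pages)} pages): {description[:60]}")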

Duplicate and missing title tags

The main purpose of a title tag is to give users and search engines such as Google some context and information about a particular page. Title tags should be unique: this helps search engines understand what your page is about, and can also increase your click-through rate (CTR).

Using Google Search Console is a great way to check whether you have issues in this area:

[Image: the HTML Improvements report in Google Search Console]

Here, you can easily identify whether you have duplicate and/or missing tags, find where they are located and rectify them. Alternative tools such as Screaming Frog also do a great job, without having to wait for Search Console’s data to refresh.
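
The same approach scripts nicely for title tags; here is a brief variation on the sketch above, with the same package assumptions and placeholder URLs:

from collections import Counter

import requests
from bs4 import BeautifulSoup

urls = ["https://www.example.com/", "https://www.example.com/services/"]  # placeholders

titles = {}
for url in urls:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    titles[url] = soup.title.string.strip() if soup.title and soup.title.string else ""

counts = Counter(titles.values())
for url, title in titles.items():
    if not title:
        print(f"MISSING TITLE: {url}")
    elif counts[title] > 1:
        print(f"DUPLICATE TITLE: {url} -> {title}")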

LINKING

Broken internal links

Broken internal links occur when a link within your website that is intended to point to another of your pages does not resolve. This can harm your SEO efforts for two main reasons.

Firstly, 404s make for a poor on-site user experience, which not only increases the likelihood that users will leave for a site that better suits their needs, but also undermines one of Google’s key ranking considerations: site usability.

Secondly, broken links within your site stop Google from crawling it effectively and efficiently. If Googlebot hits a broken link, the authority that would otherwise flow through to the linked page is lost, devaluing that page.
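
You can also surface these yourself by crawling a page and testing every same-site link it contains. A minimal sketch in Python, assuming the requests and beautifulsoup4 packages and a placeholder start page (note that some servers reject HEAD requests, in which case fall back to GET):

from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

start = "https://www.example.com/"  # placeholder page to audit
host = urlparse(start).netloc

soup = BeautifulSoup(requests.get(start, timeout=10).text, "html.parser")
for a in soup.find_all("a", href=True):
    link = urljoin(start, a["href"])
    if urlparse(link).netloc != host:
        continue  # internal links only
    status = requests.head(link, allow_redirects=True, timeout=10).status_code
    if status == 404:
        print(f"BROKEN: {link} (linked from {start})")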

Links to your site (Link Audit)

With the Penguin 4.0 update, Google announced that Penguin now runs in real time as part of its core algorithm, and that its demotions for poor quality linking are more granular, no longer applied site-wide.

However, penalties can still apply to the specific offending pages, and although Google now automatically devalues ‘bad’ links, it is still best practice to carry out a periodic link audit and submit any unwanted links in a disavow file via Google Search Console.
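
For reference, a disavow file is just a plain text file uploaded through Search Console’s disavow tool: lines starting with # are comments, domain: lines disavow a whole domain, and bare URLs disavow individual pages. The domains below are placeholders:

# Spammy directory links; removal requested with no response
domain:spammy-directory.example
# A single low-quality page rather than a whole domain
http://blog.example/low-quality-post.html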

Broken external links (Linking out to 404s)

As with most issues surrounding 404 errors, it’s best to rectify them as soon as possible.

Linking out to 404s, while not necessarily your fault, still creates a bad user experience and paints the picture of a poorly monitored site. A variety of online tools and programs can help you identify where this is happening, enabling you to resolve any issues quickly.
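
The internal link sketch above works here too; simply keep the links that point off-site instead. A self-contained variation, with the same package assumptions and a placeholder page:

from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

page = "https://www.example.com/blog/some-post/"  # placeholder page to audit
host = urlparse(page).netloc
soup = BeautifulSoup(requests.get(page, timeout=10).text, "html.parser")

for a in soup.find_all("a", href=True):
    link = urljoin(page, a["href"])
    if urlparse(link).netloc == host or not link.startswith("http"):
        continue  # keep only off-site links this time
    try:
        status = requests.head(link, allow_redirects=True, timeout=10).status_code
    except requests.RequestException:
        status = None  # unreachable hosts count as broken too
    if status is None or status >= 400:
        print(f"BROKEN OUTBOUND LINK: {link} ({status})")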

CRAWL ERRORS & SERVER ERRORS

404 Errors

A 404 error occurs when a user follows a link to a page that no longer exists, which can leave them frustrated or lost. Needless to say, 404s are generally considered bad news for SEO, and should be rectified or redirected because of their negative impact on user experience.

However, this is not to suggest that every single 404 error needs immediate action. Focus on the main offenders, such as pages that receive a number of internal or external links, or that generate high numbers of visits, and you’ll be in a good position to ensure your website isn’t negatively affected.
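
If your crawler or analytics package can export the errors, a short script makes that prioritisation easy. A sketch assuming a hypothetical CSV export named 404_errors.csv with url, inlinks and visits columns (adjust the names to whatever your tool actually produces):

import csv

# Hypothetical export: one row per 404, with url, inlinks and visits columns.
with open("404_errors.csv", newline="") as f:
    rows = list(csv.DictReader(f))

# Tackle the most-linked, most-visited broken pages first.
rows.sort(key=lambda r: (int(r["inlinks"]), int(r["visits"])), reverse=True)
for row in rows[:20]:
    print(row["url"], row["inlinks"], row["visits"])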

302 redirects that should be 301

All too often, website owners and developers implement a 302 redirect when they should really be using a 301. This can confuse crawlers, as a 302 essentially tells Google that a page has moved temporarily and will be returning at some point, when in many instances that isn’t true.

By using a 302, Google could still consider the original page to be the ‘main’ page, and potentially disregard the new page you wish to be indexed. Using a 301 redirect informs Google that the new page is undoubtedly the page you wish to pass authority to.
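
A quick way to audit your existing redirects is to request each old URL with redirects disabled and inspect the status code returned. A minimal sketch using the Python requests package against a placeholder URL:

import requests

old_url = "https://www.example.com/old-page/"  # placeholder: a URL you have redirected
resp = requests.get(old_url, allow_redirects=False, timeout=10)

# 301 = moved permanently (passes authority on); 302 = moved temporarily.
print(resp.status_code, "->", resp.headers.get("Location"))
if resp.status_code == 302:
    print("Consider switching to a 301 if the move is permanent.")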

INDEXATION

Google Search Console index status

The indexation feature in Google Search Console gives you an immediate visual insight into how many of your site’s pages are indexed in Google.

This tool is great for spotting sudden spikes in indexation, which can be a sign of index bloat after a migration, or of duplication issues.

[Image: the Index Status graph in Google Search Console]

Alternatively, you can simply search site:yourwebsite.com in Google and view your indexed page count. Note that this only provides an estimate of the number of indexed pages, so it is not 100% accurate.

[Image: a site: search for Silverbean showing the indexed page count]
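
Another quick cross-check is to count the URLs in your XML sitemap and compare that figure with the indexed count above; a large gap in either direction is worth investigating. A sketch assuming a standard sitemap at a placeholder address:

import requests
import xml.etree.ElementTree as ET

sitemap = "https://www.example.com/sitemap.xml"  # placeholder address
root = ET.fromstring(requests.get(sitemap, timeout=10).content)

# Standard sitemaps use this namespace, with each URL inside a <loc> element.
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
urls = [loc.text for loc in root.findall(".//sm:loc", ns)]
print(f"{len(urls)} URLs in the sitemap; compare this against your indexed count.")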

Page Speed

Google uses a variety of factors to decide where your site ranks, and page speed is one of them. Page speed has become increasingly important to Google’s algorithm, and later this year Google plans to change its index to prioritise mobile search over desktop.

The introduction of AMP (Accelerated Mobile Pages) and the move towards the mobile-first index mean it is now common to see AMP pages load in under a second, putting more pressure on your site to step up its game.

Google’s free PageSpeed Insights tool will test your current page speed, and even provides tips on how to improve your score.
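
The same tool is also available as an API if you want to test pages in bulk. A sketch assuming Google’s PageSpeed Insights v5 endpoint (check the current documentation for the exact response shape, and add an API key for regular use):

import requests

endpoint = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
params = {"url": "https://www.example.com/", "strategy": "mobile"}  # placeholder URL
data = requests.get(endpoint, params=params, timeout=60).json()

# The Lighthouse performance score is reported on a 0-1 scale.
score = data["lighthouseResult"]["categories"]["performance"]["score"]
print(f"Mobile performance score: {score * 100:.0f}/100")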

ROBOTS.TXT FILE & SITE STRUCTURE

In order for your site to be easily crawled and indexed, both your robots.txt file and your site structure should be carefully considered. Google assigns your website a crawl budget, which it uses to prioritise which of your pages to crawl and how much time to spend on your site.

It is your job as a site owner or SEO to assist crawlers where possible. If you run an eCommerce site, for example, you can block pages such as your checkout and login pages, allowing crawlers to focus on the pages that are relevant to search engine users. For example:

User-agent: *
Disallow: /checkout/
Disallow: /login/
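
Once your rules are live, it’s worth confirming they behave as intended. Python’s standard library includes a robots.txt parser that will tell you whether a given URL is blocked; a small sketch against a placeholder site:

from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.set_url("https://www.example.com/robots.txt")  # placeholder site
rp.read()

# Expect False for the blocked checkout, True for a normal product page.
print(rp.can_fetch("*", "https://www.example.com/checkout/"))
print(rp.can_fetch("*", "https://www.example.com/products/green-tea"))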

Similarly, your site structure can either assist or hinder crawlers. Google is getting better and better at understanding website usability and how humans interact with content, so it is important to structure your website logically, with content displayed where a user would expect to find it.

For more on site structure, read the latest blog post from Jack Nottidge.

SUMMARY

These SEO Spring Clean recommendations should help you build a solid, clean foundation for your bigger, long-term strategies to grow from this year.

Staying on top of these basics and keeping your house in order can work wonders for your SEO, and will go a long way towards ensuring your site is as user- and search engine-friendly as possible.

Already done your SEO Spring Clean?

We’d love to hear about the tasks you prioritise to keep your SEO shipshape. Please share them with us on Twitter, @Silverbean.