If you caught my post earlier this week about site architecture for user benefit, you’ll know how important it is to use the right structure to guide users through the customer journey on a site. But did you know that site architecture is just as important for search engines as it is for users? Typically, what works well for a search engine will work well for a user too. Here are my top points to consider from a search engine’s perspective when making architectural changes.

Visibility

As we are all well aware, there are hundreds of contributory factors to site visibility, but did you know that effective site architecture can actually help improve your rankings?

By simplifying your website and linking pages that are semantically related, you help the search engines understand exactly which subjects your site is authoritative in, and you use your allocated crawl budget effectively. Google assigns a certain amount of crawling time to each site, based on the size of your site, how popular it is in terms of users and time on site, and how fresh your content is within the Google index.

This means that the more popular your URLs are, the more often they are likely to be crawled. However, Google is always trying to prevent URLs from becoming stale within the index.

Furthermore, Google have stated that “having many low-value-add URLs can negatively affect a site’s crawling and indexing”.

What is a low-value-add URL? The list below looks at what constitutes a low-value-add URL in order of significance:

  • Faceted navigation and session identifiers
  • On-site duplicate content
  • Soft error pages
  • Hacked pages
  • Infinite spaces
  • Low quality content/Spam

The bigger your site, the more crawl budget is allocated; however, how that time is spent, and how efficiently, depends upon your site architecture and internal linking structure. If the search engine bots can crawl through your site unhindered by 404 errors or slow pages, your pages will be crawled and indexed much more quickly, which will help your overall rankings.

[Image: Googlebots crawling a site structure]

Internal linking structure

Internal linking and efficient site structure go hand-in-hand. Internal linking is the way that pages are connected to each other on the same domain.

There are many benefits that come with efficient and relevant internal linking and search engines will love it if you do this simple task and do it well.

Internal links help to pass authority from one page to another. When Googlebot is crawling your website to see what you may be talking about, multiple relevant pages linked together strengthen the authority that Google will pass to your pages. You are also making life easier for the crawler, and using your crawl budget effectively, by telling it which direction to head in.

Tip: Use relevant anchor text when linking internally and only link to pages that are relevant to the user. If it is helpful to the user then it will more than likely be helpful to crawler bots.
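To see how a crawler actually reads your internal links and anchor text, here is a minimal sketch in Python using only the standard library. The domain and HTML snippet are hypothetical examples, not a real site.

```python
from html.parser import HTMLParser

class InternalLinkParser(HTMLParser):
    """Collects internal links and their anchor text from an HTML page."""

    def __init__(self, domain):
        super().__init__()
        self.domain = domain       # e.g. "example.com" (hypothetical)
        self.links = []            # list of (href, anchor_text) pairs
        self._current_href = None
        self._current_text = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href", "")
            # Treat relative URLs and same-domain absolute URLs as internal
            if href.startswith("/") or self.domain in href:
                self._current_href = href
                self._current_text = []

    def handle_data(self, data):
        if self._current_href is not None:
            self._current_text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._current_href is not None:
            anchor = "".join(self._current_text).strip()
            self.links.append((self._current_href, anchor))
            self._current_href = None

# Hypothetical page snippet: one internal link, one external link
html = '<p>Read our <a href="/guides/seo">SEO guide</a> or <a href="https://other.com/x">this</a>.</p>'
parser = InternalLinkParser("example.com")
parser.feed(html)
print(parser.links)  # → [('/guides/seo', 'SEO guide')]
```

Running something like this across your pages quickly shows whether your anchor text is descriptive ("SEO guide") or wasted on generic phrases like "click here".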

Test, test, test

As with any project, it is vital to test any changes to make sure they have been implemented correctly. Site architecture updates can be a large project and making navigational changes can have a significantly positive impact on how your site performs and ranks.

They can also have a negative impact if there are errors within your structure. If your changes introduce 404s that you have not caught through testing, this is going to be detrimental to the performance of your site.

Tip: Use a crawler tool such as Screaming Frog or Deepcrawl to test your site and highlight any increases in 404s.
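If you want a rough sense of what those tools are doing, here is a simple sketch in Python that checks status codes with the standard library and flags 404s. The URLs in the example are hypothetical, and a real crawl would also need rate limiting and robots.txt handling.

```python
import urllib.request
import urllib.error

def fetch_status(url):
    """Return the HTTP status code for a URL (sketch; no retries or politeness)."""
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code

def find_broken(statuses):
    """Given a {url: status_code} map, return the URLs that returned 404."""
    return sorted(url for url, code in statuses.items() if code == 404)

# Hypothetical crawl results after a restructure:
crawl = {"/": 200, "/old-category": 404, "/blog": 200, "/contact-us": 404}
print(find_broken(crawl))  # → ['/contact-us', '/old-category']
```

Comparing the broken list before and after your architecture changes tells you immediately whether the migration has introduced new dead ends.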

Tell the search engines

Once you have made changes to your site structure, it is vital that you let the search engines know what you have changed and what they should be crawling. First and foremost, this can be done by updating your sitemap.

Any changes you make to your site architecture should also be reflected in your robots.txt file, and if there are pages that should not be crawled, disallow these to prevent crawlers from accessing them. Search engines will naturally pick up on the new structure, however if you can make it easier for them and simply tell them what you have done, the process will be more efficient.
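As a hypothetical illustration, a robots.txt file after a restructure might disallow the low-value URL patterns mentioned earlier (faceted navigation, session identifiers) and point crawlers at the fresh sitemap. The paths and domain below are examples only:

```
# Hypothetical robots.txt after a restructure
User-agent: *
Disallow: /search?
Disallow: /*?sessionid=

Sitemap: https://www.example.com/sitemap.xml
```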

It is always worth bearing in mind that if you are considering re-structuring content that will result in URL changes, then these will need to be re-directed effectively as well.

Finally, it is important to remember that you must always implement proper 301 re-directs to avoid any duplicate content issues and reduce the number of broken pages on your site.
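On an Apache server, for example, those permanent redirects can be declared in an .htaccess file. The old and new paths below are hypothetical; the point is that each old URL maps to exactly one new home:

```
# Hypothetical Apache .htaccess rules mapping old URLs to the new structure
Redirect 301 /old-category/widgets /products/widgets
Redirect 301 /old-category/gadgets /products/gadgets
```

Redirecting each URL individually, rather than sending everything to the homepage, preserves both authority and user intent.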

Tip: Once you have made architectural changes, whether this is to your whole site or just a new section, update your site map and fetch the pages through Google Search Console.
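For reference, a sitemap entry follows the standard sitemap protocol format. The URL and date below are hypothetical:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/products/widgets</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
</urlset>
```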

Technically sound website

A technically sound site is a huge prerequisite for successful site navigation and architecture. Issues such as poor page speed, high numbers of 404 pages or an insecure site can undermine any structural changes you may be thinking of making.

If you are internally linking to a page that does not exist, is extremely slow or is not relevant then you are going to piss off your users and make them want to go elsewhere as well as confusing crawlers.

An absolute cornerstone of the SEO work that Silverbean carry out is to conduct a thorough technical audit to highlight any issues that are a cause for concern.

Tip: Conduct a thorough technical audit to ensure there are no tech issues that are going to jeopardise your navigation work.

Map it out

At Silverbean we are well versed in planning out site architecture, whether it is a new hub for an existing site or an entirely new site – using the proper tools makes the job a whole lot easier.

Card mapping is an extremely effective way of visualising the user journey and where pages should sit within page hierarchies; however, sharing that map can be quite problematic.

Website mapping software can help overcome this. My two ‘go to’ tools when planning any new website structure or architectural updates are Slickplan and Visual Site Mapper.

Slickplan offers a simple user interface with many features allowing you to pull through the structure of an existing URL and simply ‘drag and drop’ the different pages. The software also makes it incredibly easy to share your work with colleagues, clients or just about anyone who needs to see it and have an input.

The image below demonstrates how an example site may look within Slickplan and how I use Visual Site Mapper to quickly see how many pages may be linked to a certain section of a website.

Tip: Slickplan offers a free-to-use version. Try it out and see how much time can be saved.

Whether you are mapping out an entirely new site, launching a new hub or looking to improve the navigation of an existing site, the approach should be the same and the user journey through your site should be as simple as possible.

If you always put your customer first (which any smart company will always do) and consider what it is that they are looking for, then you will naturally make smart decisions about your site architecture.

If you would like to know more about website architecture, any further best practices on how to structure your website, or how Silverbean can help you to re-structure your navigation, please get in touch or tweet me @nottidge4.

I’d love to see some of your best and worst experiences with site navigation and what makes a good site.