Technical SEO can often trigger thoughts of flux-capacitors and endless Matrix-style screens of code for those without a technical background. In reality, there are a handful of key areas to focus on, and failing to address them can put any website on the back foot, potentially undermining other efforts to improve organic visibility. As we know, digital practices are ever-evolving (which makes it such a darn exciting industry to work in!), so given the time of year, I figured it was time to update you all on the must-haves for technical SEO in 2016, along with some useful advice on how to implement key updates.

Technical SEO helps to improve the alignment between traditional website key performance indicators (KPIs) and search engine visibility criteria. The core benefit of improving technical foundations is often two-fold, aiding search engine crawling efficiency and benefitting the end user.

To help demystify the black hole associated with technical optimisation, I’ve listed the critical elements of technical SEO in 2016 and beyond!

1. Optimising website loading speed

The advancements made in internet speed have created a by-product of impatient users with deteriorating attention spans. The underlying issue with faster internet connections is that they raise expectations, more so when there is a discrepancy in loading times between websites.

Simple website tweaks can go a long way to vastly improving rendering times, giving you a solid advantage over competitors running slower-performing sites.

Reduce, consolidate and compress

As my colleague Tim Pike outlined recently, site speed is one of the key online customer expectations in 2016, so it's a natural area of focus for technical SEO. According to Yahoo, around 80% of page load time is spent downloading the various elements of a webpage, including images, CSS and scripts to name a few. This points to where the biggest potential gains lie, although the aim of loading speed optimisation is to strive for perfection across the board.

It could be compared to the task of making a car faster; it requires a collaborative effort.

Adding a bigger engine may improve speed, but there are also gains to be had from better-performing tyres, reduced weight, more aerodynamic bodywork and better fuel, all contributing (some more than others) to the end goal. Here are 10 ways to reduce website loading times:

• Losslessly compress images using a tool such as File Optimizer
• Enable Gzip compression and caching, with a long expiry date where applicable (the sketch after this list shows a quick way to check both)
• Minify HTML, CSS and JavaScript files
• Reduce or remove non-essential code and features
• Optimise CSS delivery, removing duplicate styling
• Consider installing the PageSpeed server module
• Integrate a reputable CDN (Content Delivery Network)
• Optimise JavaScript delivery so that critical resources load without hindrance (prevent render-blocking)
• Reduce the number of redirects
• Reduce server strain to improve response times
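
If you want a quick way to verify the compression and caching points above, here's a minimal sketch in Python (the URL is a placeholder and only the standard library is used) that fetches a page and reports the relevant response headers:

```python
# Minimal sketch: fetch a page's headers and report whether compression
# and caching appear to be enabled. The URL below is a placeholder.
import urllib.request

def check_speed_headers(url):
    # Ask for a compressed response, just as a browser would.
    request = urllib.request.Request(url, headers={"Accept-Encoding": "gzip, deflate"})
    with urllib.request.urlopen(request) as response:
        headers = response.headers
        print("Content-Encoding:", headers.get("Content-Encoding", "not set (no compression)"))
        print("Cache-Control:", headers.get("Cache-Control", "not set (no caching policy)"))
        print("Expires:", headers.get("Expires", "not set"))

check_speed_headers("https://www.example.com/")
```

If Content-Encoding comes back as "not set" for a text resource, compression probably isn't reaching visitors, whatever the server configuration claims.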

2. Structured Data/Rich Snippets

Structured data is one of the very few tools a website owner can use to influence and highlight non-conventional information displayed in the SERPs. The term Rich Snippets is actually an accurate description of what this mark-up does: it provides relevant snippets of information tailored to the user's request.

Rich Snippets are multi-functional, covering a wide range of data types, from reviews to the latest events. In 2016, Rich Snippets are expected to merge into the mainstream, and it's likely that this technology will stick around for the longer term, based on its ability to improve searcher experience and allow businesses to provide key information.

Although the mark-up has been available for a relatively long time, Google is now giving it more and more attention. Visit the Schema.org website for more information.
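
To make this less abstract, here's a minimal sketch (the product name, rating value and review count are invented placeholders) showing how a review-style Schema.org payload could be assembled in Python and embedded in a page as JSON-LD, one of the formats used to express this mark-up:

```python
# Minimal sketch: build a Schema.org review snippet and print it as a JSON-LD
# script tag. The product name, rating and review count are placeholders.
import json

snippet = {
    "@context": "http://schema.org",
    "@type": "Product",
    "name": "Example Widget",
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.6",
        "reviewCount": "128",
    },
}

print('<script type="application/ld+json">')
print(json.dumps(snippet, indent=2))
print("</script>")
```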

3. Google Tag Manager

Google Tag Manager (GTM) in isolation will not provide much benefit from an organic visibility perspective; however, the tool's capabilities can dramatically reduce the developer input needed for fundamental SEO elements such as meta tags and content. GTM is a relatively new technology, designed to implement on-page tags seamlessly, without hard-coded “rogue tags” and patchy workarounds.

Although GTM is still experimental technology for the search industry, its capabilities are quickly appearing on the radars of search marketers as a way to implement fundamental search elements without unwarranted development costs.

4. HTTPS or HTTP over TLS

HTTPS is one of the topics at the forefront of search discussions, including how viable it is to implement and the potential pitfalls along the way. Providing additional security to website visitors is always best practice, and it's a future-proof technology with very little risk of a rollback.

The implementation of HTTPS can be a daunting task; however, this additional layer of security is likely to provide long-term benefits and warrant the legwork involved. Google has also created incentives for early adopters, so we're likely to see an influx of sites migrating to HTTPS as businesses discover the benefits for both security and search visibility.
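
Once a migration is live, one simple sanity check is that the old HTTP URLs permanently redirect to their HTTPS equivalents. Here's a minimal sketch (the hostname is a placeholder) using Python's standard library:

```python
# Minimal sketch: request the HTTP version of the homepage and report the
# redirect status and target. The hostname below is a placeholder.
import http.client

connection = http.client.HTTPConnection("www.example.com")
connection.request("HEAD", "/")
response = connection.getresponse()

# After a migration you'd expect a 301 pointing at the https:// version.
print("Status:", response.status)
print("Location:", response.getheader("Location"))
connection.close()
```

Anything other than a single 301 to the equivalent https:// URL (for example, a chain of redirects) is worth investigating.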

5. Offering a helping hand to crawlers

As the web grows at a phenomenal rate, crawler bandwidth becomes a limited resource. 2016 is likely to see more emphasis on tools that communicate with search engine crawlers to provide additional guidance. There are countless examples of search engine crawlers using valuable bandwidth on pages of no interest or value, and eliminating the risk of a crawler hitting insignificant pages helps to put the emphasis on the pages that matter. Listed below are some proactive ways you can provide a crawler safety net:

• Download and review server logs: see which URLs are being visited by crawlers and react (the sketch after this list shows a starting point).
• Utilise the URL Parameters options (with caution) available in Search Console.
• Ensure sitemaps are accurate, with the last modified date present (accuracy is key).
• Provide guidance through robots.txt, excluding sensitive pages/directories.
• Make full use of secondary measures such as canonical tags to define structure and priority pages.
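
As a starting point for the first item on that list, here's a minimal sketch (it assumes a combined-format access log at a placeholder path and uses a crude “Googlebot” user-agent match) that tallies which URLs the crawler is spending its bandwidth on:

```python
# Minimal sketch: count which URLs Googlebot requests most often in an
# access log. The log path and the user-agent check are simplifications.
import re
from collections import Counter

LOG_PATH = "access.log"  # placeholder path to your server log
REQUEST_RE = re.compile(r'"(?:GET|HEAD) (\S+) HTTP')

crawled = Counter()
with open(LOG_PATH, encoding="utf-8", errors="replace") as log_file:
    for line in log_file:
        if "Googlebot" not in line:
            continue
        match = REQUEST_RE.search(line)
        if match:
            crawled[match.group(1)] += 1

# The most-crawled URLs; react if low-value pages dominate this list.
for url, hits in crawled.most_common(20):
    print(hits, url)
```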

6. Adapt to a user-focused site hierarchy

Owners of established websites tend to be reluctant to carry out a reality check of the current site hierarchy, often overlooking the potential gains from restructuring the site around user needs. As a website matures, you often see pages serving dual purposes or isolated segments of the site emerging. It's far more common for a business to work with the site structure already in place than to revisit it and conform to best practice and user signals.

Spending time re-engaging with user behaviour, search patterns and trends, and comparing them to the current website structure, can often reveal hidden opportunities. Don't be afraid to diversify your website, but keep it in tune with a well-researched structure, and consider defining your URL structure with breadcrumbs to help improve consistency (a minimal mark-up sketch follows below).
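
For the breadcrumb point, here's a minimal sketch (the page names and URLs are placeholders) of a Schema.org BreadcrumbList payload, generated in the same JSON-LD style as the earlier Rich Snippets example:

```python
# Minimal sketch: a Schema.org BreadcrumbList printed as a JSON-LD script tag.
# The page names and URLs below are placeholders.
import json

breadcrumbs = {
    "@context": "http://schema.org",
    "@type": "BreadcrumbList",
    "itemListElement": [
        {"@type": "ListItem", "position": 1, "name": "Home",
         "item": "https://www.example.com/"},
        {"@type": "ListItem", "position": 2, "name": "Blog",
         "item": "https://www.example.com/blog/"},
        {"@type": "ListItem", "position": 3, "name": "Technical SEO",
         "item": "https://www.example.com/blog/technical-seo/"},
    ],
}

print('<script type="application/ld+json">')
print(json.dumps(breadcrumbs, indent=2))
print("</script>")
```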

2016 is an exciting time for technical SEO, with the emergence of new technologies (such as AMP) and a better understanding of the capabilities of existing tools, such as Google Tag Manager (keep ’em peeled for another post on this very topic coming soon!). Considering each of the six topics raised in this blog post will certainly help strengthen the technical foundations of any organic search campaign, complementing and building upon other digital marketing strategies.

Did we miss anything? Add your comments in the box below or get in touch with us on Twitter!