Technical SEO


Technical SEO is the process of optimising your website for efficient crawling and indexing. Search engines crawl your website, its pages and its code in order to understand the information your website contains and which topics it should be indexed for.

Your website must be set up in a way which allows search engine crawlers (also known as spiders), like Googlebot and Bingbot, to efficiently crawl and index it. If your website relies on organic traffic from search engines, this is vital to the success of your business.

Search engines are customers

More businesses than ever rely on traffic directly from search engines, so ensuring that your website is set up so that search engines can effectively crawl, interpret, index and rank it is paramount.

We have experience optimising sites which have suffered from:

  • hierarchical & structural issues
  • on page technical problems
  • duplicate content
  • slow page speed
  • canonical tag inaccuracies
  • crawl inefficiencies
  • lack of optimisation for mobile devices
  • noindexing issues
  • legacy redirect & 404 issues
  • low crawl frequency

Here are some examples of each of these issues:

  • hierarchical & structural issues: is your website well structured, with direct links from your homepage to the rest of your site? Your most important pages should normally sit closer to your homepage.
  • on page technical problems: content must be able to be effectively crawled and assessed by search engine spiders. Sometimes content without HTML text around it, although very useful for users, can be difficult for search engine spiders to understand and therefore difficult to index. Other issues, like heavy use of JavaScript or content built with old or unsupported code, can slow down or stop search engine spiders from crawling. Missing title tags and meta content can also make accurate indexing more difficult.
  • duplicate content: multiple pages with the same or similar content can be inefficient to crawl and cause inconsistency in search engine indexes. Sometimes two pages targeting similar queries can switch in and out of search engine results pages, negatively impacting organic traffic and sometimes user experience.
  • slow page speed: the slower your pages load, the more lag time users and search engine crawlers face. Google likes fast websites because they are efficient with its computing resources as well as better for user experience.
  • canonical tag inaccuracies: canonical tags can cause technical issues if they’re incorrectly configured. Even something as small as an additional space or a single changed character in a self-referential canonical tag can lead to huge crawl inefficiencies.
  • crawl inefficiencies: most medium to large websites have URLs they’ll never want indexed. If this is the case, blocking crawling of these pages rather than using noindex tags can be the most efficient solution for search engine crawlers.
  • lack of optimisation for mobile devices: important for websites which rely on Google organic traffic. The majority of Google’s index is now “mobile first”. If the mobile and desktop versions of your website are not consistent, you may find your traffic dips.
  • noindexing issues: these occur when pages which are not needed in the index are included. Common examples include onsite search result pages and user profile pages. If search engine spiders are spending time crawling pages like these, they aren’t spending that time crawling the most important pages of your site.
  • legacy redirect & 404 issues: as you make changes to your site, such as adding and removing pages, URLs can be switched off or changed, which can leave internal links pointing at 404 pages or passing through redirects. These broken internal links waste resources for search engine crawlers, which dislike crawling dead pages or loading redirects to resolve a URL that returns a 200.
  • low crawl frequency: although crawl frequency cannot be directly controlled, there are ways to increase how often search engine spiders crawl your highest-priority pages. These methods aren’t hacks but a genuine mutual benefit to both website owners and search engines: the search engines gain a better understanding of your website’s content and can therefore index it accurately, and in return your most important pages get crawled more frequently.
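
Some of the on-page problems above, such as missing title tags, missing meta content and a stray character in a canonical tag, can be caught with a simple automated check. The sketch below uses only Python’s standard library; the `audit()` helper and the sample HTML are hypothetical and only illustrate the idea, not a production crawler.

```python
# A minimal on-page check: extract the <title>, meta description and
# canonical link from a page and flag common problems.
from html.parser import HTMLParser

class OnPageAudit(HTMLParser):
    """Collects the <title>, meta description and canonical link from a page."""
    def __init__(self):
        super().__init__()
        self.title = None
        self.meta_description = None
        self.canonical = None
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and attrs.get("name") == "description":
            self.meta_description = attrs.get("content")
        elif tag == "link" and attrs.get("rel") == "canonical":
            self.canonical = attrs.get("href")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title = (self.title or "") + data

def audit(html, page_url):
    parser = OnPageAudit()
    parser.feed(html)
    problems = []
    if not parser.title:
        problems.append("missing <title>")
    if not parser.meta_description:
        problems.append("missing meta description")
    # Even a stray space in a self-referential canonical points crawlers
    # at a "different" URL, so compare the raw value, not a cleaned one.
    if parser.canonical is not None and parser.canonical != page_url:
        problems.append(f"canonical mismatch: {parser.canonical!r}")
    return problems

page = ('<html><head><link rel="canonical" href="https://example.com/page ">'
        '</head><body>No title here</body></html>')
print(audit(page, "https://example.com/page"))
# Flags the missing title, the missing description and the trailing
# space in the canonical URL.
```

A real audit would fetch live pages and check many more signals, but even this small check shows how a single-character canonical error becomes machine-detectable.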
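
The crawl-inefficiency point above, blocking crawling of pages you never want indexed, is typically implemented with robots.txt Disallow rules. The sketch below uses Python’s standard `urllib.robotparser` to test such rules offline; the rules and URLs are made up for illustration.

```python
# Testing robots.txt Disallow rules offline with the standard library.
from urllib.robotparser import RobotFileParser

# Hypothetical rules blocking onsite search results and user profiles.
robots_txt = """\
User-agent: *
Disallow: /search
Disallow: /user/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Internal search results and user profiles are blocked from crawling...
print(rp.can_fetch("*", "https://example.com/search?q=shoes"))  # False
print(rp.can_fetch("*", "https://example.com/user/1234"))       # False
# ...while real content pages stay crawlable.
print(rp.can_fetch("*", "https://example.com/products/shoes"))  # True
```

Checking rules like this before deploying a robots.txt change is a cheap way to avoid accidentally blocking pages that should be crawled.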
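
The legacy redirect & 404 problem can also be modelled in a few lines. In this toy sketch the site is represented as a dictionary mapping each URL to a status code and an optional redirect target; all URLs and statuses are invented for illustration, but the logic, following each internal link to its final destination and flagging dead ends and redirect hops, is what link-audit tools do at scale.

```python
# A toy model of broken internal links: 404s and redirect chains.
SITE = {
    "/old-page":     (301, "/newer-page"),
    "/newer-page":   (301, "/current-page"),
    "/current-page": (200, None),
    "/removed":      (404, None),
}

def resolve(url, max_hops=5):
    """Follow redirects until a final status, recording each hop."""
    chain = [url]
    for _ in range(max_hops):
        status, target = SITE.get(url, (404, None))
        if status in (301, 302) and target:
            url = target
            chain.append(url)
        else:
            return status, chain
    return None, chain  # redirect loop or too many hops

def check_internal_links(links):
    """Flag links that 404 or that waste crawl resources on redirect hops."""
    issues = []
    for link in links:
        status, chain = resolve(link)
        if status != 200:
            issues.append(f"{link}: broken (status {status})")
        elif len(chain) > 1:
            issues.append(f"{link}: {len(chain) - 1} redirect hop(s) to {chain[-1]}")
    return issues

print(check_internal_links(["/old-page", "/removed", "/current-page"]))
# "/old-page" needs two hops to reach a 200 page, and "/removed" is a
# dead 404; only "/current-page" is a clean internal link.
```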

As search engines have advanced, their ability to truly understand website content, code and hierarchy has improved drastically.

As a result, optimising your website for technical performance is also more complex than ever before.

However, this doesn’t mean that technical optimisation is redundant. Getting ahead in search engine rankings today is not about getting any one aspect right but about ensuring that your content and link strategies are built on a website which offers as solid a technical foundation as possible, giving your website the best chance of success.

Where can I find technical issues?

Technical issues which can affect organic search performance can exist anywhere, on new or well-established websites. Many of these technical issues can be challenging to detect and even harder to fix.

Newly launched sites can be hit by technical issues that developers and SEOs missed before launch.

However, very often, technical issues creep in as websites age and iterations and changes become commonplace.

Technical issues can exist on any page of your website, even those that perform very well in search rankings.

Because these problems are deep, varied and often hidden from direct view within the code of your website, the best way to be confident of finding and fixing them is to contact a professional SEO.

Fixing these issues can take time and very often relies on the current policies of the search engines you are optimising your website for.

There’s not always one clear-cut solution to these issues, so a good SEO with plenty of experience will be able to talk you through the various options available to fix the problems, and explain what they are looking for to ensure the fixes have worked.

What’s the benefit of fixing technical SEO issues?

Unlike missed content opportunities, technical issues do not normally mean that you are missing out on a big slice of potential “pie” from a search query you are yet to think of, but can be considered more like a “limiting factor”.

Technical issues are more likely to prevent you from reaching your maximum potential in existing search results.

A number of technical issues on a single site could be the reason for a site ranking on page 2 of Google when in fact it may have the potential to rank on page 1.

Ultimately, like the majority of jobs that good SEOs do for site owners, the net result of technical issue fixes should be to grow the traffic to your website and effectively convert the traffic that you do get.

How Search/Natural can help you

We have extensive experience in optimising websites from a technical standpoint and have seen how critical technical optimisation can be to the performance of websites in search.

We can take care of the technical set up of your website so you can focus on your human customers.

Normally, a full website audit is our main priority when working with new clients so that we can fully understand where issues may exist and ensure that we have planned our strategy around fixing these first.

Once we have an idea of the scale of the technical issues and the work involved, we can explain them, and the problems they are likely causing, in as much detail as you’d like. We then organise a workflow so that the fixes that will make the biggest difference to you and your business are prioritised first.

If you are concerned about the technical SEO setup of your website, or have seen traffic dropping recently but do not know why, then please let us know and we would be happy to set up a call and free consultation to discuss potential issues.