SEO Predictions

Updated on November 14th, 2021 at 09:35 pm

Here are our thoughts on the biggest changes and trends happening in search engine optimisation in 2020.

We hope you find these useful when looking at what to include in your strategy for the year ahead.

Google Takes More of Your Traffic Than Ever Before

What does this mean? Click-through rate data collected over many years suggests that Google has been gradually eroding the number of clicks it sends to content publishers, year on year.

It’s not difficult to see how Google is achieving this: for several years now Google has pioneered low-click and no-click search result pages. Examples include informational queries like “what will the weather be like in London tomorrow?” or “Manchester United football scores”.

Initially, Google focused this type of content exposure on informational queries; more recently, however, it has begun to add similar features to other query types.

Featured snippets and knowledge graph content have evolved even more quickly as publishers add structured data mark-up to their pages.

This mark-up helps search engines understand the content, and the entities it relates to, in the context of the page. Search engines use the mark-up to decide which content should be shown for a certain set of query types.

Fresher implementations of these Search Engine Results Page (SERP) features now include queries with commercial intent behind them. “Flights to New York”, for example, arguably straddles the line between commercial and informational intent, but there is a good chance that someone searching for this query might end up booking a seat on a flight.

Learn more

What is a SERP?

A SERP (Search Engine Results Page) is the page which a search engine serves to you after you type in a query and press enter or search. Traditional SERP listings contain a link to the website, a meta title and a meta description for each ranked page. Normally, up to 10 organic results are displayed on each SERP, with paid placements listed above organic results.

What are Google’s SERP Features?

Google offers much more than SERPs with just 10 blue links. Their SERP features include:
1. Rich Snippets – a listing may include additional “rich” information like a photo from the page or star ratings for a product (see the mark-up sketch below)
2. Paid Results – normally found above organic results, ads are identified by the green or black “Ad” text next to them. Ads can also include “rich” features.
3. Universal Results – content of all types drawn from Google’s different databases, including videos, local business results and images, blended into a single results page.
4. Knowledge Graph placements – instant knowledge-based information like weather, flight times or football scores.
5. Featured Snippets – a paragraph of text from a website ranked among the top 10 results which Google algorithmically decides will meet the intent of the user entering a search query.

The appearance of these features will vary depending upon the type of search query entered, the geographic location the user is searching from and the sector in which a user searches.
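To illustrate the kind of structured data that powers rich snippets, here is a minimal sketch of product mark-up with review stars. The product name, image URL and rating values are invented placeholders rather than examples from a real page, so treat it as a starting point and validate your own mark-up with Google’s testing tools.

  <script type="application/ld+json">
  {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Cordless Lawnmower",
    "image": "https://www.example.com/images/cordless-lawnmower.jpg",
    "aggregateRating": {
      "@type": "AggregateRating",
      "ratingValue": "4.6",
      "reviewCount": "87"
    }
  }
  </script>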

Another example is “How to learn PHP”. Although many free resources exist on the internet to help people learn new skills, this is a query where a YouTube video (a Google property) is featured prominently among the search results.

This type of video content is doubtless a good fit for the user, who would probably appreciate and enjoy a free video on the topic. However, the fact that YouTube does not rank in the list of 10 blue links on page one at all indicates that, to some extent, YouTube gets special treatment from Google and is arguably an artificial placement.

Ultimately, while better for the user, these types of features can serve to take clicks away from websites and content creators who have invested heavily in SEO to seek returns on content hosted on their websites.

Further examples, which affect sites ranking lower down the page, include top story banners, featured video carousels and People Also Ask boxes.

We have already seen evidence of Google starting to implement changes that limit clicks to publisher sites: in January 2020, the publisher occupying the featured snippet stopped being listed a second time in the organic results below it.

Such an enormous change being made at once globally is sure to have a huge impact on CTR curves across the World.

How will it affect you? Some featured content does tend to help publishers gain clicks to their websites; paragraph-style featured snippets, for example, typically have a positive impact on overall click volumes to a website, although results can vary.

Ultimately, it is likely that the majority of these SERP “features” will erode your traffic, unless you’re competing for and taking positions in these spots.

The data to back this up is usefully collected by AWR, who have observed this gradual erosion in CTRs over time.

Average CTRs, mobile and desktop, November 2014.

Average CTRs, mobile and desktop, November 2019. Thanks to AWR for the data.

The above is due in some part to the effective monetisation of Google SERPs with paid ads, and in recent years, largely down to the growth of featured snippets, featured content and interactive tools in the SERPs.

What should you do? If your site relies on clicks from Google, not just impressions on SERPs, to continue operating as a business (and this is likely the case), then you must look to create relevant content and aim to take those top spots in search, using tools like structured data markup, video content and intent-based approaches to pages and content.

Ironically, competing by using tools like structured data will further aid Google in their quest to “understand” the internet, and more specifically, the code behind your website and how it relates to the entity for which users are searching. Effectively this further enables Google to refine SERPs and produce even more on-page features to keep users on their pages and not yours.

Your solution to keep your traffic comes in two parts:

Part 1

Ensure you are visible. Get the featured snippet content, produce a video and use structured data markup to ensure that your videos appear in the carousel etc. If you are not in that spot, then your traffic will suffer.
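As an aside, one way to signal video content to Google is VideoObject structured data on the page that hosts or embeds the video. The sketch below is a minimal, hypothetical example: the video name, URLs, date and duration are placeholder values, so swap in your own and check the result in Google’s rich results testing tools.

  <script type="application/ld+json">
  {
    "@context": "https://schema.org",
    "@type": "VideoObject",
    "name": "How to descale a washing machine",
    "description": "A step-by-step video guide to descaling a washing machine.",
    "thumbnailUrl": "https://www.example.com/images/descale-thumbnail.jpg",
    "uploadDate": "2020-01-20",
    "duration": "PT4M30S",
    "embedUrl": "https://www.youtube.com/embed/VIDEO_ID"
  }
  </script>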

Part 2

Optimise your content to encourage a click; don’t give away so much information on the SERP that users do not need to click at all.

This is where the difficulty lies: doing enough to win the featured content placement, then withholding just enough content that you don’t lose the placement but a user still needs to click on the listing.

It will be a defining skill in the SEO industry and one that will separate the best SEOs from the rest.

If your site ranks well on page 1 of the SERPs (positions 1–3), you could try adjusting the max-snippet length down. If your content is well structured enough, reducing the max-snippet length so that the final, critical part of the information is cut off could be enough to encourage a click from the SERP to your landing page.

If you are feeling really brave, setting the robots meta tag to “nosnippet” will not allow Google to scrape and replicate your content on the SERP at all, ensuring that you rely on your traditional ranking position to gain clicks.
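For reference, both controls sit in the page’s head as robots meta tags. The 50-character limit below is purely an illustrative value to experiment with, not a recommended setting.

  <!-- Cap how much page text Google may show in your snippet (value is illustrative) -->
  <meta name="robots" content="max-snippet:50">

  <!-- Or prevent Google from showing any text snippet for this page at all -->
  <meta name="robots" content="nosnippet">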

It isn’t guaranteed to improve CTRs, but strategised effectively it could make for a very interesting SEO CTR test. If you think you get more traffic from your traditional 10-blue-links ranking position than from your featured content, then nosnippet could help you prove that theory.

Google Changes How It Counts External Links

What does this mean? Google is changing how it supports external links on websites. This means that whenever you add a link to your website which targets a different domain there are now multiple things to consider.

Currently, the system is a very simple one. If you add a link without a rel=”nofollow”, then you are endorsing the linked domain with a vote, signalling to Google’s algorithm that the page on the linked domain is a trusted, authoritative source of information for the topic that page targets.

Any external links on a site which include the rel=”nofollow” attribute are treated the opposite way and discounted from Google’s algorithm: the links are not crawled by Googlebot and do not count as ranking signals (although some SEOs believe that past tests on nofollowed links have proven otherwise).

So, SEOs’ primary focus is on gaining as many followed links as possible to their websites while, in most cases, very rarely giving out followed links themselves.

Links, and the nofollow attribute, have been central to the ranking algorithm at Google since 2005, but this is all set to change in 2020.

Google announced back in September last year that they’ll support two additional attributes from March 2020. These are:

  • rel=”sponsored” – adding this to outbound links signifies that they are part of a paid advert, affiliate marketing or sponsorship.
  • rel=”ugc” – links marked up with this will be treated as user-generated content links and will likely appear in user-produced content such as forum posts, comments or possibly social media.
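As a quick illustration of how the attributes sit on a link (the URLs and anchor text below are placeholders, and attributes can also be combined, e.g. rel=”ugc nofollow”):

  <!-- Paid, affiliate or sponsored placement -->
  <a href="https://example-partner.com/offer" rel="sponsored">Partner offer</a>

  <!-- Link added by a user, e.g. in a comment or forum post -->
  <a href="https://example-user-site.com" rel="ugc">Commenter's site</a>

  <!-- Link you do not wish to endorse -->
  <a href="https://example.com/page" rel="nofollow">Untrusted link</a>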

Rather than firm signals, Google has suggested that links marked with sponsored or ugc will be treated as hints.

“Links marked with these rel attributes will generally not be followed.”

Google through their webmaster blog.

In typically woolly style, Google leaves this open to interpretation and, importantly, reserves the option to count these links, at least partially, towards (or perhaps even against) the ranking of a website in search.

For webmasters, the lack of clarity is a headache, and it may ultimately mean SEOs need to put more consideration into how their affiliate partners are using link attributes.

How will it affect you? When this rolls out, it could lead to drastic changes in search engine results pages, especially if sites get the implementation wrong, although this is unlikely given that there is no established right or wrong at this stage.

Generally, total numbers of followed and nofollowed links correlate: pages with lots of followed links usually also have lots of nofollowed links. In these cases, even if linking sites change to the new method, there may not be much change immediately.

However, we could see drastic changes where Google does begin to count links differently in specific sectors, for specific search types or even for specific websites. Their statements around the launch ultimately bring control of the link graph back to Google.

The impacts will ultimately depend upon many things: the vertical you’re in, your current backlink profile and where the bulk of your traffic comes from.

However, I wouldn’t be surprised if we see some big winners and big losers in the weeks following this new change.

Even if in theory nothing is changing, practically there is a shift in power towards Google and how it decides to manage the link graph.

Something that has been critical to websites’ positions in the rankings up to now.

What should you do? If you are worried, contact an SEO professional and have them review your backlink profile, your key competitors and their backlink profiles, and come up with a plan of action based on what they find.

Make the changes you need to your current set of external links, and ensure that anything added to the site going forward, especially by users in comment sections, follows the rules you set.

Keep an eye on your competitors and their performance, but do not copy them without reliable data; what works for their site may not work for yours.

My advice to almost every site, and especially to sites which invest heavily in affiliate marketing, rely on partners for traffic and leads, or have a high number of outbound nofollowed links, would be to read through Google’s publication around the link attribute changes, to understand the potential impacts and their importance directly from Google.

You may decide you do not need to make any changes at all in advance of the additional attribute support.

Whatever actions you end up taking, monitor your site’s traffic after the support rolls out and keep an eye on pages moving around in the rankings, especially on high-value queries. It normally makes sense to monitor the performance of your site when big changes are made.

Brand Trust and Brand Mentions Grow in Importance

What does this mean? A theoretical concept forms the basis of this argument, and that basis is that “you cannot buy trust”.

Unlike links, press coverage and even users, which all can be and in many cases are bought, you cannot directly “buy” trust in your brand. It is something that many companies would pay a considerable amount to have, but it remains so intangible that buying it would be almost impossible.

Very much an earned metric, it is something that well-established and respected brands have built up, normally over many years. It tends to arise from investment in the product, from offering value to customers, and from consistency and reliability.

So, how do you measure something as intangible as trust? Well, there probably isn’t an easy answer to this, but if something exists, even something as intangible as trust in a company, then Google will surely do their best to measure it.

Brand mentions have long been something that many marketers have suggested could be a way for Google to begin quantifying trust. Similar to links and how these are counted, brand mentions and sentiment around these mentions, could be a good way for Google to accumulate a picture of how companies treat their customers, how their products offer value and what strategies brands are employing to gain brand loyalty. Especially for digital businesses and e-commerce sites.

Google already visits every indexable website on the internet and collates a huge data set around what makes them work. Mentions of a specific brand or company would not be difficult for Google to collect, and may be something they already record and even use.

It’s not just Google that might be thinking more deeply about brand: the rise of social listening tools like Hootsuite, BuzzSumo and HubSpot shows that there is ample demand from marketers for insights into the overall perception of their brand.

Ultimately, brand mentions could evolve to become a more reliable metric than links. Links are easy to game: buy them, borrow them, swap them, or do nothing other than target growing them. Brand mentions are more difficult to game, because companies that don’t honour their customers or support their wider communities will be talked about less often than those that do.

How will it affect you? The effects on you as a business owner can only be good. If you’re a professional, reliable business that puts your customers first then you’re already getting lots right.

Every business, large or small, relies on its customers to make it a success, so shouting about what you are doing and encouraging those you serve to do the same for you should be high on your priorities list.

Gaining more brand mentions doesn’t mean that overnight your business will get national press coverage and explode into a £billion corporation, but you should also remember; you only need to beat your direct competitors to win in the trust game.

Ultimately, trust doesn’t have to be a competitive metric either; provided your customers trust you and know you put their interests first, then maintaining a successful organisation should be possible. If growing your business is your priority, then getting your brand mentions up is a sound investment both online and off.

What should you do? Set up a testimonials page on your website and host positive feedback there for all, including Google, to see. Encourage users to leave reviews and send you this feedback once you’ve completed a sale. Set up a review page for your company on Google reviews or a similar third-party service, and encourage users to write reviews and give feedback there also. Respond to the good and, more importantly, the bad reviews. Don’t be afraid of the reviews that raise issues; use the less positive ones as an opportunity to put things right for that customer and respond promptly with a suitable resolution.

This gives a platform for you to show how you are earning trust.

Other ways to increase brand mentions could include community project support, volunteering your services, or at the very least talking about the success that your business is having with and for customers. Ultimately, it doesn’t matter how you go about it, provided you generate a positive buzz around your brand.

Voice Continues to Be a Digital Marketing “Buzz” Word But Has Minimal Traffic Implications

What does this mean? Voice search has been talked about for several years, and new statistics published semi-frequently suggest that this year will be the year that voice jumps from a commonly discussed marketing buzzword into a full-on revenue stream.

You can be sceptical of this for two reasons:

  1. Not enough people think it can work, on either the consumer side or the marketer side.
  2. There’s nothing out there for users to get excited about in terms of voice search.

Sure, you can use your Alexa to put on Madonna songs when you get home from work, and yes, the Google Nest can tell you the weather for tomorrow, but there are still so many things it cannot help you with that, to users, it’s currently just a novelty.

Voice-activated technology can do some fantastic things: you can order your shopping from Ocado or get a takeaway from Just Eat. However, this is very much goal-orientated behaviour. I have a specific goal in mind when I want to order my shopping or a takeaway: have food delivered. Search, ultimately, is a less specific behaviour than buying food.

I may be interested in a new lawnmower, so I turn to search to help me decide. It is faster for me to type “lawnmowers for lumpy patchy lawns” and see the set of results, alongside images, where I can buy one and how much it costs, than to stand by and wait for my voice device to read all of this critical information back to me.

Besides, what if, after lengthy detailing of product specifications, pricing, locality and availability from my voice-enabled device, I load up a picture of my newly selected lawnmower on my phone and I hate the look of it, despite it being perfect for the specifications of my lawn? Of course, I could continue and buy it anyway, knowing it is likely the best option, or, more likely, I’ll find one less suitable that I like the look of. Why would I want something ugly just because it is technically perfect?

This, for me, is where the idea of voice-controlled search exploding in popularity falls apart. Without defined, specific goals or products in mind, or an app that connects the customer to the seller, there’s very little useful functionality for voice-activated search in search engines today.

If you have experience of using voice search for anything then please do get in touch, I’d love to hear some real-life stories of the usage scenarios in which voice has helped you.

How will it affect you? Thankfully, this is one that I think you can set aside for now. This isn’t likely to radically affect SERPs this year.

What should you do? Focus instead on producing great quality content, both written and video, for your site.

SEO Continues to Become More Important As Businesses Try to Understand It Better Than Ever

What does this mean? As more site owners realise the importance of organic search traffic to their websites, demand for SEOs will continue to increase. This combined with Google’s gradual erosion of organic clicks will send publishers into overdrive, trying to find new ways to ensure that they are the biggest beneficiary of Google’s changes, other than Google itself.

Content producers and marketers will feel even more under pressure to maximise the potential of the search engine visibility they already have and be forced to explore new sectors and areas to continue to grow.

This is in part due to Google’s slow consumption of organic traffic but also in part because of Google’s ever-improving ability to truly understand its users’ intentions.

BERT – Bidirectional Encoder Representations from Transformers

BERT, the natural language processing model developed by Google, now helps Google return results with even greater relevance for unconventional or previously unseen search queries. It is built on the “Transformer” architecture, which helps Google understand each word in the context of the whole sentence, and it was applied to search specifically to help interpret these queries.

Although this doesn’t sound like a huge leap, it could make a very big difference to the results you see in search every day. Or so Google claims.

Previously, sentences were broken apart and results were returned based upon the individual words in the sentence. Now, via BERT, Google can in effect “read” a sentence and understand the words in the context of the sentence as a whole. It can discern the meaning of the sentence and then return a result based on what that specific user meant, rather than a more generic set of closely related topical pages.

How will this affect you? This type of evolution in search engine marketing is very much that, an evolution, not a revolution. There’s no immediately obvious way to optimise for BERT-handled queries, and in fact, establishing which types of queries are handled by BERT has proven quite difficult in itself.

However, theoretically, as BERT “learns”, publishers may begin to see more specific (but tenuously related) search queries deliver traffic to their sites. So, this could mean that queries better suited to your pages drive traffic there more often. Examples of affected query types are listed here and illustrate the impacts they could have.

It may mean that better delineation through the use of subheaders, subsections and content entity modelling is required to ensure that your pages continue to show for queries that previously drove traffic there.

Some may interpret this to mean “my site needs more pages now” but that isn’t always the best option.

What should you do? There is not too much to change for this at the moment; BERT has proven incredibly difficult to optimise for, according to the majority of SEOs out there. Like Google’s other machine learning algorithms, it is driven by users and learns from the interactions that users have with SERPs to continually improve.

From the examples that Google has offered, it would appear to me as though effectively breaking up content into user search focused subsections makes a great deal of sense. However, it does appear as though optimising directly to make the most of or even benefit from BERT searches is difficult. You need to anticipate your users to make the most of it at this moment.

However, as BERT sees more and more searches, you may find that you get a broader set of queries, with differing intents, delivering traffic and fed back to you in Search Console. Those most effective in capitalising on BERT searches will be those that find a way to meet the intent of those searches without compromising overall website quality.

Some publishers will doubtless get this wrong and create incredibly narrow intent landing pages to cover these types of terms, choosing to target very specific searches with an equally specific landing page. These publishers may then end up being hit with quality issues or being visited by Panda, due to the newly created “bamboo” plantations across their sites.

Google SERPs Become More Personalised

What does this mean? Links have, for a long time, been the holy grail of search. They are extremely powerful both for Google, who make use of the link graph to formulate the basis of their index rankings, and for publishers, who have grown to understand their causal impacts. However, it is only recently that publishers have started to understand how powerful link acquisition is. There’s been a total explosion in searches for SEO and SEO jobs this January, with searches such as “link building” also growing.

However, the ability of big sites to “game” Google’s algorithm with content production and targeted outreach campaigns, establishing and maintaining dominance over their competitors, is causing aggravation for some Google users. I know of many Google users who lament the apparent lack of diversity in search results today, especially on high-volume, high-value queries. Many now claim that Google SERPs are, in some instances, less diverse and less useful than they have previously been, despite some attempts to help prevent this.

This is where there is a line for Google to find (or ignore, if they care only about the majority of users): surfacing relevant, useful content that meets the intent of the user has always been what has kept Google ahead of its rivals. However, as its market share has grown to a position of near-absolute dominance, there has, for some, been a regression in Google’s ability to deliver accurate results.

The way to think about this might be as accuracy vs precision. Some might argue that, for all of its accuracy, i.e. getting the right type of results in front of the bulk of its users, it has sacrificed the precision required to deliver the incredibly nuanced, precise results that a portion of its users want (this is certainly something that I seem to find more and more often).

Diversity in SERPs is something that BERT may also help to improve, although only if smaller sites can capitalise before the larger sites do.

But personalisation of search engine results could change everything.

John Mueller recently suggested that personalisation is already at play in Google SERPs but I can imagine it going even further than it does currently.

For example, Google property YouTube learns about user behaviour and understands at a topical level what you like. It takes these learnings about which videos you watch and interact with, and critically who publishes them, to surface more suggested content that’s similar to what it knows you already like.

Imagine now if Google began to do the same, learning which types of publisher you typically click on to read news articles, find out information, compare products or make purchases. Doesn’t it make sense for Google, a company that puts users first (jointly with profits), to surface content or content types that you are more likely to engage with, rather than content it knows you have little or no previous familiarity or interest in?

Of course, search can be very specific, much more so than YouTube browsing, but by understanding and grouping publishers into “types”, Google could begin to surface content that works for you on a demographic level, within the set of results voted “best” by users through years of data collection.

How will it affect you? This depends. Google may continue to run with its current strategy of meeting the needs of most users most of the time and accepting the leakage of users who cannot find what they want to other services. However, I think that now, as Google’s technology evolves, and having witnessed YouTube’s ability to serve hyper-relevant suggestions on its homepage, search could be overdue a radical change.

If you can learn to know your user, which YouTube has done incredibly effectively, you can learn to know what kind of result that user will expect when they type “x search term” into Google.

2020 is the year that you will begin to see Google move beyond the words-on-a-page search that has driven search engines for the past 15 years, employing insights from BERT to facilitate this change.

What should you do? To market your site to your customers, you must learn to market your site to Google (also your customer). My recommendation for this ultimately boils down to knowing your customers better. Learn to understand who they are, what they like and what types of digital marketing activities work for them, and reflect that on your website.

Google is not your friend; they don’t intrinsically care about you or your business. If you misstep and they think your site is irrelevant, you’ll be gone from search, although they do tend to give you a chance to notice issues before removing your site completely.

What makes a site relevant in Google’s eyes? Among other things, user behaviour is a common barometer.

So, the basis of your marketing plan for the decade (it’s probably a big job for just a year) is understanding your users better and positioning your brand to appeal to them directly. Try to offer your users the types of personalisation seen elsewhere on the web, to ensure your brand develops and users have good experiences with you.

Link Building Booms, But Not at the Expense of Technical SEO

What does this mean? Links have blown open SEO. It’s now more widely known than ever just how important links are to the ranking of every website out there. Given how critical they are, more and more publishers seek to reap the benefits of link building campaigns.

Effective link building, while incredibly important, should not be done at the expense of technical SEO but rather, alongside it to ensure the benefits of both are recognised.

In terms of raw growth potential, links are probably more important than a perfect technical set-up, although that claim comes with many caveats.

Ultimately, every site is different. Without a solid technical set-up, getting the full value from link building campaigns would be a huge challenge, and knowing the full potential value those links could have had would be even more difficult.

How will it affect you? The degree to which you will be affected depends upon the current link building strategy of your site. If you’re not currently focused on link building then, in my opinion, it would be sensible to start.

If you need convincing, the below may persuade you of the importance.

Link building has exploded: agencies have been founded on their ability to build links, and everywhere sites compete for links from those domains which still hand out followed links. However, as more and more sites understand the power of these links, fewer and fewer sites are likely to continue to distribute them as freely. Eventually, this may become an issue for search engines.

How does Google measure the authority of a website or page if followed links are handed out less and less? Well, like any winner playing a game, if they find they aren’t winning, they change the rules.

As of March 2020, rather than having to denote distrusted external links with a rel=”nofollow” and leave trusted links unmarked, sites will instead get to choose between the following:

rel=”nofollow”
  Usage scenario: distrusted links.
  Meaning: links will not be followed by Googlebot and will not count in the ranking algorithm (although Google may use them as hints).
  Impact: followed links remain a priority. Nofollowed links can now be seen as wins by SEOs and link builders, although their specific impact is mired in fog.

rel=”sponsored”
  Usage scenario: paid or financially rewarded links.
  Meaning: links with this attribute are marked as having a financial incentive behind their appearance. Again, it isn’t confirmed whether they are excluded completely as a ranking metric or reserved as a possible hint.
  Impact: rel=”sponsored” links are unlikely to ever count towards rankings. Will overuse cause issues?

rel=”ugc”
  Usage scenario: links added to your site by your users.
  Meaning: you and your site may not specifically “trust” the linked content, and it may or may not actively factor into the algorithm; however, these links are likely to be considered as ranking signals.
  Impact: reserved for the comments sections of sites, the implementation again has an unknown impact. It could mean more sites allow linking from comments sections, or even resurrect them.

The above removes the binary nature of link value assessments and means that Google can erode some of the value of links from sites trying to gain them to rank in search engines. Rather than nofollowed links automatically being ignored and followed links being counted, Google now has four variations to play with.

In addition to the introduction of rel=”sponsored” and rel=”ugc”, both of which will have a largely unknown impact on search ranking algorithms, Google also stated that they will begin, in some instances, to consider some nofollowed links as hints regarding the usefulness of the linked site. They said it themselves:

“When nofollow was introduced, Google would not count any link marked this way as a signal to use within our search algorithms. This has now changed. All the link attributes — sponsored, UGC and nofollow — are treated as hints about which links to consider or exclude within Search. We’ll use these hints — along with other signals — as a way to better understand how to appropriately analyze and use links within our systems.” – webmaster guidelines update.

When I read this, it got me thinking as to what the impact might be.

However, as alluded to previously, link building without good technical and content SEO is only half a job. All three should be prioritised to ensure the maximum growth potential of a site is fully realised.

The best-ranking sites are those with a handle on all three SEO priorities, not just those which are a link target.

What should you do? For now, focus on all three important aspects of SEO. Ensure links are built, but do not do so at the expense of everything else. Links are needed, but an effective site structure and hierarchy is also important to pass the ranking signals of these acquired links to relevant pages on your site.

Also, while building links carries some risk to the overall health of the site (unnatural link building patterns draw manual penalties), technical SEO does the opposite and mitigates risk to your SEO performance. Sites with good technical SEO and pages that offer useful, relevant content to users can still rank well without throwing every resource at one aspect of SEO.

Video Content Dominates Search Results

What does this mean? Video isn’t suggested as a direct ranking factor, nor do I think it ever likely will be (although I guess you never really know). However, the latest Brandwatch usage statistics for YouTube are phenomenal: 400 hours of video uploaded every minute, the world’s second-largest search engine and 1.9 billion monthly users. The amount of time people sink into YouTube is quite incredible.

Not only that, it’s effectively monetised, generating revenue for publishers and Alphabet (estimated at between $16 and $25 billion each year). Similar to Google before it, YouTube is a great example of a user-generated content website done right!

As clutter has increased on SERPs, YouTube videos have become a staple of search engine results on queries from “how to descale a washing machine” to “best Vietnam tourist sites”. Arguably, both of these example search queries benefit from video content being prioritised high up the page, as it doubtless better meets the intent of a larger number of users.

Ultimately, more video results in SERPs means fewer clicks going to your website and more circling around Google’s ecosystem (which = more money for Google).

How will it affect you? As Google continues to find more and more ways to self-promote its properties through results, it will gradually end up eating your traffic. That is, unless you are the publisher of these videos yourself and capture views either on your landing pages or through your effectively monetised YouTube channel.

What should you do? Think of ways to supplement your existing popular text-based content with video-based productions. Even if it’s not identical to the content that you are writing about, a supplementary video hosted on your important pages can help you gain an edge in search. Effectively produced, it can increase user interactions and the time that users spend on your website (as opposed to on Google), and if well optimised it could get you a placement in the video carousel in search results as well as your organic listing.

Not to mention the exposure that the video gets on YouTube.

Automation Becomes More Important to SEOs Than Ever

What does this mean? Nowhere in Google’s Search Quality Rater Guidelines does it say that automated content is low quality. I’m having a Winston Smith moment here, forgetting whether it ever actually did suggest automated content was equal to low-quality content, but I am semi-sure it did. However, fellow SEOs over the years have suggested automated content should largely be avoided.

That was until humans started to realise that computers are pretty good at writing content and that it saves a massive amount of time. There’s a distinction to make here between automated content and thin content. For example, if you’re going to put live 2,500 geo-targeted pages with content based on the same sentence, e.g. “Find the best solicitors in [place name] that will be able to help you with buying a house, probate, divorce, marriage, criminal defence…”, then you are likely to suffer quality issues in Google.

If, on the other hand, you’re able to commission writing software to create a tailored long-copy piece around the best picks for next year’s fantasy football teams, based on last year’s data set, then, provided search demand exists for this set of queries and your piece is accurate and meets Google’s content quality requirements, there is a good chance that it will rank well, generate links and traffic, and help you gain subscribers.

Content written by machines has evolved; it’s no longer a case of a simple find-and-replace on location terms or product-specific variants, but an actual system employed by many cutting-edge sites to handle their content needs.

As tools become more advanced and marketers gain access to a wider set of them, automation in how this content is marketed will also begin to take hold. For example, a blog post I read recently suggested that it wouldn’t be too difficult to automate a system in which the highest-visited content pieces had links automatically added to them, pointing to highly relevant areas elsewhere on the site.

How will it affect you? At this stage, a single tool to automate all of your SEO, content production and outreach activity does not exist, or at least isn’t widely accessible right now. However, tools do exist that can make the tedious, time-consuming jobs you do easier, and as demand for them grows, inevitably more competitors will be produced. It is worth taking note of your immediate competitors and trying to understand how frequently they publish and update posts; if they are a long way ahead, consider identifying ways to make this more time-efficient for you and your colleagues.

What should you do? One way I am planning to get ahead of the automation curve is by continuing to learn and develop complementary skills alongside my existing ones, and building on my knowledge of programming languages to help automate my most mundane tasks.

UK Search Results Become Dominated by US Brands

What does this mean? I am not sure if it is only me who has experienced this, or if it is commonplace amongst many other Google users, but I appear to come across a great deal more US and overseas domains dominating UK search results than I ever have previously.

There could be many explanations for this including US brands buying UK sites, properties migrating across from other English-speaking locations to compete in new markets or just more dominant, SEO led, marketing strategies by foreign sites, especially those based in the U.S.

This theory is almost in direct disagreement with Sistrix, who saw a large number of non-UK domains removed from UK SERPs over 2019 (although these sites had to have gained considerable visibility in the first place to then lose it again, right?). For me, as there are a large number of English-language domains in this list, this could be explained by sites failing to correctly implement hreflang, the eternal stumbling block of international SEO.

Again, this could be explained by links and their global prevalence versus those in the UK. The US had a population of 327.2 million as of 2018, approximately five times larger than the UK’s 66.4 million, and let us say for argument’s sake that 5% of each population has a website they can make changes to (a total guess; the nearest stats I could find were Netcraft’s January 2020 survey, which isn’t specific to geographies).

Then we’re looking at 16.36 million sites stateside against only 3.32 million sites here in the UK (also assuming one site per person for that 5% of the population).

Think about linking root domains and backlinks and how these matter to Google when ranking sites in its index and you will get some idea of how much additional authority U.S. sites could bring over with them to compete against our smaller sites.

How will it affect you? In reality, no one is safe from this. As the biggest and most successful brands abroad, especially in the US, grow their sites successfully, it seems logical that they may wish to expand outwards into UK markets in particular. There’s no language barrier to overcome, Brexit has slashed the price of the pound, and new trade agreements may even make working visas for US citizens easier to obtain, especially in skilled industries like tech. Severing ties with Europe and its governmental regulations may also make removing existing workforces in companies taken over by US businesses even simpler.

What should you do? Ensure your SEO game is up to scratch. It will take time for this to occur (if it does), and US sites from every sector won’t make the jump. Your strategy should be to define the unique value proposition of your business: what is it that you excel at that nobody else does quite so well? Then ensure that your SEO strategy is honed and targeted around that UVP.

That means improving content quality sitewide, understanding your users, building links, improving the overall quality of your product or service, and getting a good structural grip on how your site can most effectively rank new content.

More Search Engines to Choose from Than Ever – But Google Still Takes 95%+ of Market Share

What does this mean? As people begin to realise how invasive tech corporations are becoming, many millions of users will seek out search engines where privacy comes first, very often at the expense of the quality of the results returned to them. This isn’t just about Google; I am including other tech giants like Facebook and its properties here also.

Sites are becoming better able to track users between sessions and devices, and passive tracking of users is becoming more of a certainty, not just more likely. Search engines are ripe for data hoarding and could be crucial in building demographic profiles of users that would be incredibly valuable to those who can leverage the data.

Interest in the protection of privacy seems to have spiked, especially as users notice sites marketing products through voice recognition and passive listening techniques.

Therefore, as Google begins to prioritise personalisation of search results, users could decide to switch to search engines that protect their private data, where results may be less relevant but user data stays private.

Despite this switching, Google will still command a 95%+ market share, as the internet becomes more accessible globally, especially in developing nations, and as younger generations who have grown up using the internet make decisions and forge loyalties to brands of their own.

How will it affect you? If you aren’t too worried about tech companies using your data to sell to you, then the impact should be fairly minimal. For those who are concerned, search engines like DuckDuckGo, Search Encrypt and Qwant should help you find everything you need online.

While we’re on the topic of alternative search engines, there’s also Ecosia, the carbon-neutral search engine which offsets its footprint by using its profits to plant trees around the world. 82,326,950 trees planted and counting!

What should you do? You’ll still need to optimise for Google search. Very rarely does a search engine come along that tries to break the mould and reinvent how search is conducted and operated. Like Ecosia, the majority of search engines sell themselves on something other than being the best search engine. There just isn’t room for two Googles in search.

Google Core Quality Update January 2020: What Has Changed & Why Is This Relevant to Our Predictions?

Not that much seems to have changed with core updates, or at least, that’s how it would appear. Typically, the kinds of sites that have benefited in core updates are those that are prioritising SEO. This typically means improving the site for users, answering the queries that users have and offering the right mix of expert, authoritative and trustworthy content. Those that have dropped have generally done the opposite, deceiving or misleading users, and achieved the opposite result.

Core updates do seem to regularly deliver the biggest shake-ups in SERPs while very often being the hardest updates to suggest fixes for. This isn’t helped by Google’s consistent message of reassurance around core updates, stating that:

“There’s nothing wrong with pages that may perform less well in a core update. They haven’t violated our webmaster guidelines nor been subjected to a manual or algorithmic action, as can happen to pages that do violate those guidelines. In fact, there’s nothing in a core update that targets specific pages or sites. Instead, the changes are about improving how our systems assess content overall. These changes may cause some pages that were previously under-rewarded to do better.”

Google’s accompanying patter around update announcements

So, why do sites drop after core updates? More importantly, what can a site that does drop do to ensure that it recovers?

Essentially, core updates are a website- and page-level reassessment, and often a shake-up, of everything out there. What was good before may not be good now; what could have used some improvement before is now on the rise. In many cases, it again seems to boil down to optimising your website to ensure it delivers the correct mix of expertise, authority and trust alongside uniquely valuable content for your users.

The January 13th update highlighted some enormous drops for both some well-known and some less well-known domains.

For example, car marketplace carbuyer.co.uk lost close to 17% of its search visibility according to Sistrix and around 21% of its organic traffic according to SEMrush. When I looked at the site, mixed content warnings on the homepage (something which Google Chrome began blocking more aggressively from December 2019) and (in my opinion) heavy use of adverts above the fold could be contributing to some of the impact that the site has suffered since the 13th January update.

However, there do seem to be other problems too. Unlike many sites that have found ways to host multitudes of content on single pages efficiently and effectively, Carbuyer continues to use a stepped system where a new URL is used for each subsection of content around a car. For example, the VW Polo hatchback requires 10 pages to effectively deliver all of the content needed to get an accurate understanding of a single car.

URL – Purpose
https://www.carbuyer.co.uk/reviews/volkswagen/polo/hatchback/review – Main expert review of the car (the landing page in SERPs)
https://www.carbuyer.co.uk/reviews/volkswagen/polo/hatchback/mpg – A whole page for MPG, CO2 and running cost figures
https://www.carbuyer.co.uk/reviews/volkswagen/polo/hatchback/engines – A page about engines
https://www.carbuyer.co.uk/reviews/volkswagen/polo/hatchback/interior – Interior features
https://www.carbuyer.co.uk/reviews/volkswagen/polo/hatchback/practicality – Internal dimensions and practicality
https://www.carbuyer.co.uk/reviews/volkswagen/polo/hatchback/reliability – Reliability & safety
https://www.carbuyer.co.uk/reviews/volkswagen/polo/hatchback/pictures – Pictures of the car and stock images of the car
https://www.carbuyer.co.uk/reviews/volkswagen/polo/hatchback/owner-reviews#owner-reviews – Owner reviews (of which one is available to read)
https://www.carbuyer.co.uk/reviews/volkswagen/polo/hatchback/video#video – YouTube videos made by Carbuyer
https://www.carbuyer.co.uk/reviews/volkswagen/polo/hatchback/variants – Variations of the car models, engines, drivetrains etc.

All of these URLs appear for just a single model of car (the 2019 version). Granted, this would be a considerable amount of content to host on a single page, but the multitude of separate URLs (across hundreds of models) does little for content quality. Imagine also if a car comes in both saloon and hatchback guises, like the Audi A3: you’ve got the above number of URLs, with very similar or identical content, multiplied by two.

Structurally, this creates headaches. Multiple URLs at the same level do little to denote hierarchy across the site, something made especially worse by the fact that each of these pages has an identical h1.

This kind of set-up will certainly make it more difficult to rank a single page for high-volume, generic make-and-model queries, and arguably could cause users headaches if they’re trying to surface specific information from a certain page.

You’re also diminishing the value of these URLs as linkable assets. Rather than having a single strong page for sites to link to which will accumulate “link juice” over time, there are 11 different pages to choose from when adding your link.

It’s also not the greatest for users, who may need to load several pages to find the information they want, rather than a single page that has everything. In theory not too much of a problem, unless you’re on a slow or 3G connection.

An alternative site which seems to better understand efficient delivery of content is Carwow, who host all of their Hyundai Tucson deals content on a single page, with # navigation links to surface specific content for each model type: https://www.carwow.co.uk/hyundai/tucson/deals#trim-premium-se. The same is true of their review content: https://www.carwow.co.uk/hyundai/tucson
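As a rough sketch of that single-page approach, in-page anchors do the work that Carbuyer’s separate URLs currently do. The section names below are illustrative placeholders, not Carwow’s actual markup.

  <nav>
    <a href="#review">Review</a>
    <a href="#mpg">MPG &amp; running costs</a>
    <a href="#interior">Interior</a>
  </nav>

  <!-- Each section lives on the same URL and is reachable via its fragment, e.g. /polo/hatchback#mpg -->
  <section id="review">Main expert review…</section>
  <section id="mpg">Running cost figures…</section>
  <section id="interior">Interior features…</section>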

If Carbuyer were something I was working on, I’d be pulling out the specific issues related to these drops, highlighting the URLs where traffic has been lost and suggesting, very urgently, a reconsideration of how content is mapped on the site. Grouping content on a single strong page also helps as the content ages: relevance to users will always fade on content like this as models are replaced, but the old page can still provide value to owners and evolve into a second-hand buyer’s guide as time goes on.

URLs like this must be a nightmare to keep track of, not to mention a nightmare to keep up to date.

In other examples of core at work, I have seen organic traffic drop by around 45.7% on an anonymous domain since the January update. Unlike in previous updates, where some terms drop and others rise, changes this time seem to be in one direction only, and that direction is down.

Terms crucial not just for visibility but for revenue have plummeted across almost every sector for this particular website. According to my analysis, 299 terms have dropped by five or more places, with volume estimates totalling over 80,000 searches per month (admittedly, actual lost traffic volumes will be lower).

For this particular site, it seems obvious to me what is responsible for these huge losses. Google suggests there is nothing wrong with a site that has dropped, but upon deeper assessment and analysis I have pulled out some areas where this particular site must do better if it wants to recover the lost traffic.

  • Stagnating content: I spotted content on this particular site which hasn’t been updated in over two years; one example hadn’t been edited since 10th December 2018. For a Y.M.Y.L. (Your Money, Your Life) site, content freshness is critical to show Google that the site is regularly maintained and updated. If Google thinks content is out of date and risks jeopardising the health, wealth or future prosperity of users, they’ll drop it out of search fast. These low-update-frequency URLs were among those I saw plummet in Google’s index.
  • Other trust signals are missing from this site, including customer testimonials. A lack of testimonials means one less thing for search engines like Google to use to determine the trustworthiness and usefulness of a site for users. Testimonials and customer feedback on a website are something that Google actively looks for to ensure that sites are legitimate and provide value to users.
  • Thin new content being produced for the site: while not bad in terms of quality, I saw examples of the site recently producing and publishing thin, time-sensitive content that, rather than filling gaps in the site’s existing knowledge graph, adds very little value for a user. This kind of content production rarely does anything to help any site, let alone sites that rely on providing value to users through education and information.

Ultimately, as with many things in SEO, the fixes are unlikely to require that much of a drastic change in thinking. However, at scale, when there are multiple old and out of date pages, when the site does nothing to reflect user satisfaction and fresh content serves no discernible purpose, core is on hand to reshuffle the sector as it sees fit.

E-A-T Improvements Offer the Biggest Gains – Especially for Sites Yet to Optimise for E-A-T

So, how do you improve E-A-T? Well, speaking to an SEO makes sense. They will be able to recommend defined steps to take to help your site improve, whether that is improving the overall quality of content that your site offers, building links, getting accreditations for authors, removing old and low-quality pages, or thinking of ways to display the positive feedback your company gets on the site.

Why E-A-T Will Be Important This Year

As automation thrives across sectors and content can be produced to a better standard by computers, sites are going to need to prove that they are offering value to human users and not just churning out content that can rank using machine learning or content automation tools. Therefore, the E-A-T signals which Google currently uses to determine the quality of a site will grow in importance and value to all sites.

Like anyone who finds a company to provide a service for them, Google wants to know that you can be trusted with their users. Google will do almost anything to keep their users happy (to nearly the same lengths they go to in order to make millions of dollars). For Google, their users and their market share are their most precious commodities, and if you jeopardise this by running a site that deceives users then Google will take you down.

SEOs Should Aim for Profit & Rankings Over Good Scores in SEO Tools & Suites

The explosion onto the market of SEO tools, of course, means the industry has become more and more accessible. With greater accessibility comes a greater number of SEO professionals. More SEO professionals beginning their career working alongside SEO tools, services and systems means more of them relying on these tools and services for the right answers.

Unfortunately, this is where the problem may lie. SEMrush is great; I’d struggle to think of an SEO who didn’t find it useful. However, it isn’t absolutely perfect.

SEMrush offers one of the best up to date query tracking services for sites focused on a set of priority keywords, a comprehensive site audit feature that aims to highlight and clarify onsite and technical SEO issues and all sorts of other brilliant features for aspiring SEOs. It really is deep and useful.

However, for all its utility, or perhaps because of it, there lies a problem. SEO is very definitely an experience game: experience doesn’t make SEO simple, but it can make the identification, diagnosis and correction of SEO performance issues much faster and easier.

What SEMrush does is distil important ranking metrics into elemental problems and issues and assign a generic “score” to each one. Those who have always relied on SEMrush begin to see these scores as the pinnacle of success, critical to SEO growth and even, in some cases, a badge of honour attesting to their SEO proficiency.

In actual fact, SEMrush, like the vast majority of SEO tools, does not offer enough site-by-site detail to be considered effective for every site.

What I mean by this is the following; just because you have taken a site from a 67% audit score to a 100% audit score does not necessarily mean you will rank any better.

Although SEMrush does include a great deal of information around technical SEO best practice and suggests relevant optimisation tips for most sites, I still believe that there is no substitute for experienced, knowledgeable and expert SEOs that understand what causes a site’s organic traffic to grow and equally importantly, what can cause it to drop.

Ultimately, as much as I appreciate tools like SEMrush, I would encourage SEOs to use them as part of a holistic vision of the site they are working on, rather than in isolation as a way to indicate a well optimised or poorly optimised site.

I see it more frequently than ever before: SEOs jumping onto a message board or forum to ask questions about a particular tool or feature, querying whether something is a problem and should be fixed, despite it appearing to be fine in Google Search Console and other webmaster tools. I also notice SEOs pushing their ability to optimise sites to 100% on SEMrush as a selling point or hook on freelancer sites like Fiverr or freelancer.com.

I understand why they do it: company owners may not have the time or inclination to fully understand SEO (that is kind of what we are paid for), and so a single metric to show success might be a good way to illustrate what you will achieve. But I would urge caution: SEO success cannot be measured on a single data point.

Although optimising a site to score 100% in the SEMrush audits feels good and might be worthwhile (especially if the sites are suffering heavy technical issues) it does not guarantee increased rankings, more traffic or more profit.

Ultimately, SEOs should be optimising sites for search engines and predominantly at least for now, Google.

SEMrush does not bring your website traffic, so while optimising for a score of 100% in their site audit tool might feel like a priority and might offer you something to show off about within the SEO sphere, it should not become the primary focus of SEMrush customers.

Errors and issues flagged within the service should be weighed up against current priorities and fixed as and when suits existing road map schedules unless they are likely to be critical to current rankings or performance.

SEO is a deep, broad and varied industry. There are multiple methods to help ensure success and there are many ways to get things wrong. Ultimately, the experiences of those who have been there, seen it and done the work count for far more than those who look to gain the highest scores in their auditing software suites.

That is all for this post. Good luck with your SEO activity for 2020.

by Ben Ullmer


About Ben

SEO Director

Ben is the founder and SEO director of Search Natural. He spent 8 years working in SEO at some of the biggest comparison sites in the UK before setting up his own business to work as an SEO specialist with clients around the world.
