
Top 10 Best SEO Websites by the Metrics that Matter

We’ve compiled some big data to help determine which of the most popular SEO websites out there have the best SEO strategies. With this data and analysis, we want to help you understand which SEO metrics really matter in organic search for 2019 and beyond.

The Data Behind the Best SEO Websites

Site | URL | Site type | Indexed pages | Backlinks (live) | Backlinks (recent) | Backlinks (historic) | Followed backlinks (live) | Referring domains | KW top 3 | KW 4–10 | KW top 10 | KW 11–20 | KW top 20 | KW 21+ | Total KW
---- | --- | --------- | ------------- | ---------------- | ------------------ | -------------------- | ------------------------- | ----------------- | -------- | ------- | --------- | --------- | --------- | ------- | --------
Moz | https://moz.com/ | informational, rank tracking, automated SEO testers, keyword tools | 61,100 | 24,726,755 | 30,352,036 | 30,120,698 | 7,800,174 | 96,550 | 1,448 | 2,825 | 4,273 | 4,230 | 8,503 | 34,470 | 42,973
Search Engine Land | https://searchengineland.com/ | informational, SEO news | 30,800 | 1,000,000 | 961,188 | 1,322,483 | 935,086 | 72,175 | 996 | 3,494 | 4,490 | 5,548 | 10,038 | 35,535 | 45,573
Distilled | https://www.distilled.net/ | informational, consultants | 2,600 | 25,420 | 29,290 | 59,988 | 22,537 | 6,275 | 45 | 116 | 161 | 271 | 432 | 4,335 | 4,767
Search Engine Watch | https://searchenginewatch.com/ | informational, SEO news | 23,700 | 321,985 | 373,253 | 652,259 | 282,066 | 46,734 | 541 | 1,551 | 2,092 | 2,172 | 4,264 | 13,319 | 17,583
Search Engine Roundtable | https://www.seroundtable.com/ | informational, SEO news | 28,600 | 132,976 | 164,351 | 455,773 | 114,123 | 37,442 | 116 | 608 | 724 | 1,332 | 2,056 | 8,968 | 11,024
Search Engine Journal | https://www.searchenginejournal.com/ | informational, SEO news | 21,400 | 328,390 | 410,592 | 958,306 | 40,658 | 235,112 | 795 | 2,764 | 3,559 | 4,414 | 7,973 | 29,114 | 37,087
Hobo Web | https://www.hobo-web.co.uk/ | consultancy, informational | 105 | 8,096 | 9,728 | 45,463 | 6,302 | 3,807 | 257 | 626 | 883 | 705 | 1,588 | 5,176 | 6,764
GSQi | https://www.gsqi.com/ | informational, consultants | 190 | 4,339 | 4,981 | 12,295 | 3,684 | 1,690 | 4 | 19 | 23 | 95 | 118 | 1,252 | 1,370
Ahrefs | https://ahrefs.com/ | tools, backlink tool, keyword tool, site audit, informational | 2,000 | 164,085 | 118,738 | 449,090 | 64,239 | 28,319 | 573 | 995 | 1,568 | 1,091 | 2,659 | 9,528 | 12,187
SEMrush | https://www.semrush.com/ | tools, keyword tool, site audit, informational | 7,870 | 1,000,657 | 220,476 | 187,380 | 34,001 | 33,005 | 223 | 671 | 894 | 1,171 | 2,065 | 10,168 | 12,233
Sitebulb | https://sitebulb.com/ | tools, site crawler, SEO audit tool | 560 | 7,465 | 1,823 | 2,261 | 1,135 | 582 | 7 | 44 | 51 | 114 | 165 | 657 | 822
Screaming Frog | https://www.screamingfrog.co.uk/ | tools, site crawler, SEO audit tool | 323 | 20,149 | 24,190 | 25,892 | 16,428 | 10,157 | 130 | 215 | 345 | 421 | 766 | 4,639 | 5,405
DeepCrawl | https://www.deepcrawl.com/ | tools, site crawler, SEO audit tool | 1,150 | 3,448 | 5,108 | 7,413 | 2,839 | 1,970 | 16 | 78 | 94 | 22 | 116 | 1,585 | 1,701
SimilarWeb | https://www.similarweb.com/ | tools, engagement tracker | 216,000 | 1,000,000 | 184,628 | 168,659 | 559,851 | 19,641 | 1,256 | 7,378 | 8,634 | 10,806 | 19,440 | 33,535 | 52,975
Sistrix | https://www.sistrix.com/ | tools, visibility tracker, informational | 2,340 | 12,765 | 13,985 | 22,458 | 86,172 | 2,361 | 72 | 211 | 283 | 469 | 752 | 2,427 | 3,179
Searchmetrics | https://www.searchmetrics.com/ | tools, visibility tracker | 1,630 | 18,935 | 20,810 | 46,098 | 14,291 | 8,998 | 52 | 136 | 188 | 209 | 397 | 1,870 | 2,267
Yoast | https://yoast.com/ | tools, informational | 4,540 | 118,597 | 138,131 | 243,687 | 96,420 | 39,775 | 354 | 860 | 1,214 | 1,520 | 2,734 | 12,872 | 15,606

*all data correct as of April 2019.
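To show how the per-page figures used later in this post fall out of the raw data, here is a minimal sketch (the helper function and dictionary layout are our own; the numbers are taken from the Moz row above):

```python
# Minimal sketch: deriving "links per indexed page" from the raw data above.
# The helper name and data layout are our own; figures are from the Moz row.

def per_page(metric: int, indexed_pages: int) -> float:
    """Average of a site-wide count spread across the site's indexed pages."""
    return round(metric / indexed_pages, 2)

moz = {
    "indexed_pages": 61_100,
    "backlinks_live": 24_726_755,
    "followed_live": 7_800_174,
}

print(per_page(moz["backlinks_live"], moz["indexed_pages"]))  # backlinks per page
print(per_page(moz["followed_live"], moz["indexed_pages"]))   # followed links per page
```

Running the same division against any other row reproduces the ratios discussed in the analysis below.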

The Data Driving Organic Search Performance

Although SEO has come a long way, links, and specifically followed links, still play the single biggest part in getting a website to rank effectively in search engines.

Therefore, assessing the backlink profiles of several of the biggest and most renowned SEO websites can give us incredible insights into who is effectively generating links and implementing consistent link building strategies.

This section covers:

  • how links became the cornerstone of Google’s ranking algorithm
  • how more links usually mean greater ranking potential

Briefly, the importance of links can be summarised as:

  • a way for users and search engines to traverse the web
  • a method of denoting the relevance of one page to another based on topical associations
  • a quantified endorsement of that relevance within search engine algorithms

Followed backlinks are the cornerstone of Google’s search engine ranking algorithm.

Google observed how useful links were for users traversing the web: links guided users to information around topics of interest and helped them discover related websites that could assist in completing their goals.

Google wanted to, and arguably has been best at, replicating user behaviour with their web crawler.

So, within its development phase, Google began counting links as “votes” to help identify those which should rank well for early search engine users.

Links were not only beneficial for real human web users; they were also incredibly useful for Google itself as it developed the search engine bots, or spiders, which travelled the web on its behalf, reading pages, assessing content and keywords, and helping to determine which sites were relevant to a set of queries.

As Google developed their technology, they built a system which counted links while assessing the anchor text of each link, so that they could determine (at a simplistic level) two things about a page:

  1. which query or subset of queries the page should rank for (commonly used words in the anchor text of links)
  2. where in the rankings the page should sit (number of links)
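The two determinations above can be sketched with a toy link counter. This is purely illustrative, not Google’s actual algorithm, and the link data is invented:

```python
from collections import Counter, defaultdict

# Toy "links as votes" model: each followed link is one vote for its target,
# and the link's anchor text hints at which query the target should rank for.
links = [  # (anchor text, target page) pairs, entirely made up
    ("news", "bbc.com"),
    ("news", "bbc.com"),
    ("news", "cnn.com"),
    ("seo tips", "moz.com"),
    ("seo tips", "moz.com"),
    ("seo guide", "moz.com"),
]

votes = Counter(target for _, target in links)       # 2. position: count of links
anchors = defaultdict(Counter)
for anchor, target in links:
    anchors[target][anchor] += 1                     # 1. query: common anchor text

print(votes.most_common())                # moz.com leads with 3 "votes"
print(anchors["moz.com"].most_common(1))  # its most common anchor text
```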

With the assumption that more links equated to more relevance and a better chance of pleasing their users, Google added this into their ranking algorithm.

Google found it to be such a successful way of surfacing relevant pages for user queries that they have retained the importance of backlinks in every subsequent iteration of their algorithm.

Google’s current approach to backlinks has evolved to become markedly more complex, but don’t worry: for this piece it’s enough to understand that followed backlinks are a direct ranking factor.

Indexed Pages as an Indicator

Indexed pages alone tell us very little; many large sites rank very well in search results; many smaller sites don’t rank at all.

However, things get interesting when looking at links and indexed pages together.

Although not a ranking factor in itself, the number of indexed pages a site has can prove incredibly useful when assessing whether or not the targeting of a website is effective.

Simplistically, more followed backlinks pointing at a lower number of indexed pages can equate to high levels of link juice (PageRank) flowing through a website.

Conversely, a large number of indexed pages with a low number of incoming backlinks can equate to low levels of link juice spread across many more indexed pages of a website.

This section of the blog explains:

  • more pages do not always mean more organic search success
  • more pages can sometimes negatively impact your organic rankings
  • websites should be audited frequently and holistically, to help mitigate the risk of search engine ranking penalties and quality-related demotions

Many sites, especially large ones, host many thousands of indexed pages. On its own this does not mean that a site will have issues or be poorly optimised for search, but a greater number of pages invariably means greater complexity when managing them, and greater complexity very often leads to complications.

Therefore, large numbers of indexed pages can be a good indicator of the possible existence of quality issues, closely duplicated content and keyword cannibalisation.

Decisive, actionable insights cannot be drawn from indexed page numbers alone.

As each of the businesses behind these sites makes money in different ways, some will benefit more than others from hosting a larger number of indexed pages and legacy content.

Where total followed link numbers are low and page numbers are high, we can end up with a low number in our calculated table, indicating that perhaps:

  1. link building for the site may not be a priority (certainly not at page level)
  2. the content covered could be varied and spread across multiple pages
  3. sites may suffer from some close duplication, cannibalisation, or time-sensitive topics remaining on live, indexed pages despite the search potential of those pages having dropped off
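A crude way to screen for this pattern is to compare followed links against indexed pages; the figures below come from the data earlier in the post, while the five-links-per-page threshold is an arbitrary assumption of ours:

```python
# Flag sites whose followed links are spread thinly across many indexed pages.
# Figures are from the data table above; the threshold is our own assumption.
sites = {
    "similarweb.com": {"followed_live": 559_851, "indexed_pages": 216_000},
    "hobo-web.co.uk": {"followed_live": 6_302, "indexed_pages": 105},
}

THRESHOLD = 5.0  # hypothetical cut-off for "thinly spread" link equity

for name, data in sites.items():
    ratio = data["followed_live"] / data["indexed_pages"]
    verdict = "thinly spread" if ratio < THRESHOLD else "concentrated"
    print(f"{name}: {ratio:.2f} followed links per page ({verdict})")
```

A real audit would, of course, weigh this against the site’s business model before drawing conclusions.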

It may seem presumptuous to claim this without further analysis of each site (and without more detailed page- and link-level audits, it is).

However, you would be surprised at just how many websites I have come across, many of them multi-million-pound organisations generating large portions of their traffic from organic search, that suffer from these kinds of problems.

Linking Root Domains

Although not a page-level metric in my dataset, linking root domains are also an effective indicator of a site’s potential to perform in organic search.

Much like with the number of backlinks, the concept to remember is:

  • more = better (normally)
  • quality > quantity (almost always)
  • more + quality = success

Linking root domains are a really useful concept to get used to once you consider and understand the idea of link juice. Link juice as a term is often used interchangeably with PageRank (although they aren’t entirely the same thing).

Link juice is the conceptual flow of ranking potential passed when one page or website adds a followed link to another.

For example, if my page here adds a followed link to BBC News with the anchor text “news”, then, given what we know about followed links in Google’s algorithm, that link will add a “boost” to the BBC News page, which could improve its ranking potential for the search query “news”.
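The flow can be made concrete with a minimal PageRank power iteration. This is a textbook sketch of the published PageRank idea, not Google’s production algorithm, and the three-page graph is invented:

```python
# Each page splits its current "juice" evenly across its followed outlinks;
# the damping factor models a user occasionally jumping to a random page.
graph = {                      # page -> pages it links to (hypothetical)
    "mypage": ["bbc"],
    "blog": ["bbc", "mypage"],
    "bbc": ["mypage"],
}

def pagerank(graph, damping=0.85, iterations=50):
    n = len(graph)
    rank = {page: 1 / n for page in graph}
    for _ in range(iterations):
        new = {page: (1 - damping) / n for page in graph}
        for page, outlinks in graph.items():
            share = rank[page] / len(outlinks)  # juice splits across outlinks
            for target in outlinks:
                new[target] += damping * share
        rank = new
    return rank

ranks = pagerank(graph)
print(ranks)  # "blog" earns the least juice: nothing links to it
```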

If links are a core feature of ranking algorithms today, then linking root domains are their correlative cousins, providing insights into things like:

  1. the potential of a domain to attract links from other domains
  2. the quality of those domains (given the data offered behind the raw numbers)
  3. strategies to grow the authority of a domain if it already has a large number of followed backlinks to its priority pages

Linking root domains can help search engines like Google understand the relative quality of each site compared to another. As the quantity of followed links alone became unreliable, due to things like link farms and private blog networks, a need arose to distinguish high-quality, highly relevant links from good sites from the poor-quality, spammy links bought through link schemes.

Although we lack insider knowledge of the ranking algorithm and the weightings used to determine what each metric is worth, it is reasonable to assume that domains are assigned a strength or importance metric in Google’s index.

With this assumption, it also becomes reasonable to think that a followed link from a strong, authoritative and quality domain like bbc.com would be worth more than one from a less authoritative site like fakedomain.com, especially for a site which publishes news.

In addition to certain domains theoretically having a more valuable presence in SERPs, and therefore the ability to give out more valuable backlinks, there is also an assumption that a more varied spread of linking root domains helps: if site A has 200 links from 150 unique LRDs, it will likely rank higher than an equivalent site B which has 200 links from only 50 unique LRDs.
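Counting unique linking root domains from a raw backlink export is straightforward; here is a naive sketch (the URLs are invented, and the last-two-labels heuristic mishandles suffixes like .co.uk, so real tools use the public suffix list):

```python
from urllib.parse import urlparse

# Hypothetical backlink list; in practice this would be exported from a
# backlink tool. Root domain = last two host labels (a naive heuristic).
backlinks = [
    "https://www.bbc.com/news/article-1",
    "https://www.bbc.com/news/article-2",
    "https://blog.example.org/post",
    "https://example.org/about",
    "https://smallsite.net/links",
]

def linking_root_domains(urls):
    """Collect the unique root domains the backlinks come from."""
    return {".".join(urlparse(u).hostname.split(".")[-2:]) for u in urls}

lrds = linking_root_domains(backlinks)
print(f"{len(backlinks)} backlinks from {len(lrds)} unique root domains")
```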

Keywords, Content & SEO Strategies

It isn’t as difficult as many believe to get to the top of Google SERPs if you’re only looking to rank for low value or low volume queries. Get everything right and you could see your page at the top of a results page in a matter of days or even hours.

The real challenge in SEO comes from identifying the queries that you want to rank #1 for, and making that happen.

Gaining traction for low-value and low-volume keywords can often be reasonably simple. However, ranking sites for keywords that drive conversions is far harder; these keywords are normally extremely competitive and considerably more difficult to achieve page one rankings for.

Given our data set, we can quite simply see which of our chosen SEO sites rank for the highest number of keywords.

How Keywords Per Page Can Indicate SEO Performance

Keywords per page isn’t a ranking factor, certainly not directly. Gone are the days of Google ranking site A over site B because it mentioned the word “news” 15 times more than site B.

However, looking at the number of keywords a site is ranking for can, for our exercise, be a very useful metric to consider.

When comparing domain against domain, it helps us understand the overall potential of a website to rank well in search engines.

As well as giving an indication of overall traffic volume (does the site get traffic to all of its pages or not?), it can also help us determine how effective each site is at targeting and obtaining traffic from relevant keywords.

At domain level, it can also help us determine which sites with large numbers of indexed pages likely receive very low volumes of traffic per page.

Of course, these numbers will vary hugely depending upon the nature of the site (news sites are likely to produce a great deal more content which may have little or no long-term organic search endurance).

It can also help SEOs understand whether something like a website audit might be beneficial and if things like content consolidation could help grow search performance.

These kinds of assessments and insights are incredibly important, as search engines like Google have developed algorithms specifically to find and “penalise” sites that attempt to rank multiple pages for one keyword or have legacy pages that cover similar topics.

To get started with your own website, consider bulk keyword analysis against a keyword universe.

Our Analysis

Site | URL | Backlinks / indexed page | Followed backlinks / indexed page | Keywords / page | Top 10 KW / page | Top 20 KW / page | % top 3 | % top 10 | % top 20 | % 21+
---- | --- | ------------------------ | --------------------------------- | --------------- | ---------------- | ---------------- | ------- | -------- | -------- | ------
Search Engine Watch | https://searchenginewatch.com/ | 13.59 | 11.90 | 1.55 | 0.11 | 0.26 | 1.82% | 7.31% | 16.88% | 83.12%
SimilarWeb | https://www.similarweb.com/ | 4.63 | 2.59 | 1.39 | 0.12 | 0.24 | 2.29% | 8.29% | 17.51% | 82.49%
Screaming Frog | https://www.screamingfrog.co.uk/ | 62.38 | 50.86 | 0.39 | 0.03 | 0.07 | 1.05% | 6.57% | 18.65% | 81.35%
Search Engine Land | https://searchengineland.com/ | 32.47 | 30.36 | 1.73 | 0.17 | 0.37 | 2.14% | 9.60% | 21.50% | 78.50%
Search Engine Journal | https://www.searchenginejournal.com/ | 15.35 | 1.90 | 6.09 | 0.78 | 1.33 | 4.70% | 12.87% | 21.82% | 78.18%
Search Engine Roundtable | https://www.seroundtable.com/ | 4.65 | 3.99 | 1.36 | 0.12 | 0.32 | 2.26% | 8.90% | 23.66% | 76.34%
Hobo Web | https://www.hobo-web.co.uk/ | 77.10 | 60.02 | 0.74 | 0.09 | 0.18 | 3.08% | 11.90% | 24.25% | 75.75%
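The percentage columns are simple shares of each site’s total ranking keywords; a sketch with hypothetical counts (not taken from the tables):

```python
# Express ranking buckets as shares of all keywords a site ranks for.
# The counts passed in below are hypothetical, not from the tables above.
def bucket_shares(top3, top10, top20, total):
    pct = lambda n: round(100 * n / total, 2)
    return {
        "% top 3": pct(top3),
        "% top 10": pct(top10),
        "% top 20": pct(top20),
        "% 21 or greater": pct(total - top20),  # everything outside the top 20
    }

print(bucket_shares(top3=50, top10=200, top20=400, total=2_000))
```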

What Our Calculated Metrics Indicate

Backlinks Per Indexed Page

Winner: Moz – 404.69

Moz has an incredible amount of links per indexed page, especially when you consider they have over 61,100 pages indexed.

This is no doubt because of their exceptional, easy to understand and engage with SEO focused content, which will draw links organically as well as making outreach simple.

Also, Moz hosts a vast amount of user-generated questions within its /community/ subfolder. This dual-pronged strategy of creating great content and providing a forum that hosts direct answers to specific questions allows sites like Moz to tick along gaining links successfully.

What is interesting is that Moz does not lead the followed backlinks per page list, so let’s assess who does and the possible reasons why.

Followed Backlinks Per Indexed Page

Winner: Backlinko – 203.90

Backlinko has a great deal fewer indexed pages than Moz, approximately 59,900 fewer. However, looking at the indexed site as a whole, they have acquired more followed links per page than Moz.

Of course, there could be masses of potential explanations but here are some observations;

  • they produce content along the lines of “the definitive guide to” – content like this is fantastic for generating links organically and naturally from high-quality sites
  • given that they attract links anyway, outreach for them is also incredibly easy
  • followed links present a unique challenge in the search industry: very often, those who know the power of followed vs nofollowed links won’t offer them so freely
  • this underlines the huge importance of generating uniquely valuable content that contains actionable tips and helps users achieve a goal

Total Keywords Per Page

It is worth clarifying here that ranking for a large number of keywords does not necessarily indicate success. Lots of keywords exist, and it’s inherently easier to rank for some than others. With that caveat covered, let’s look at who is dominating, with the assumption that more keywords = better.

Winner: Yoast – 64.42

Another big victory for Yoast, a site that boasts exceptional SEO content. Despite not having a backlink profile anywhere near the size of sites like Moz or Backlinko, their content strategy and targeting seem centred on delivering actionable, useful content.

Amongst their “how to”, “what are” and “the definitive guide to” content, Yoast is fantastic at providing step-by-step pictures that support their instructional content.

With this content, alongside their plugins and services (products) which have mass appeal to WordPress site owners, they seem to have effectively grasped the value of sharing “recipes” for SEO success.

This is a testament to Google ranking pages that help users solve problems or achieve a goal above those that don’t.

Number of Keywords Ranking Top 10 per Page

Winner: Yoast – 8.41

No surprises seeing Yoast exactly where every site would want to rank. A cursory glance at their biggest keywords highlights some huge-volume queries that they’re dominating, like “search terms” and “meta description”.

Number of Keywords Ranking Top 20 Per Page

Winner: Yoast – 15.12

Again, within this set of top 20 queries we have some huge terms, including several related to other brands; “WordPress” is one, likely driven by Yoast’s hugely popular SEO plugin.

Another interesting one is “google news”, even though Google themselves host multiple domains ranking for this query, and several big sites, like the BBC and Wikipedia, also host pages dedicated to covering Google news.

Keywords by %

As with the overall keyword number, the % of keywords ranking isn’t a decisive metric. However, it is a useful leveller as an aggregation metric for our study, especially if we allow ourselves to assume that each of these sites is aiming to rank on page 1 for all of its targeted queries and is capable of doing so.

% Keywords Ranking Top 3

Winner: Search Engine Journal – 4.70%

Here we see Search Engine Journal returning the greatest % of queries in the top 3 positions. Given that their total number of targeted keywords is fairly low in comparison to other sites, it seems reasonable to assume that they effectively research and target queries before publishing content, ensuring they are capable of ranking well, and that content is optimised to capture bigger queries as the nature of search around each topic changes.

% Keywords Ranking Top 10

Winner: Sistrix – 16.30%

A good place to be: based on our data, 16.30% of the keywords that Sistrix ranks for appear within the top 10 positions on search engines.

Assuming that these keywords have been actively targeted, or are at least determined to be a good semantic fit for the content on the pages, this seems like a commanding position for Sistrix to be in.

% Keywords Ranking Top 20

Leader: Sistrix – 36.70%

Given that, within our data set, Sistrix also had the largest % of their total keywords ranking in top 10 positions, this is likely a positive position for them to be in.

% Keywords Ranking 21 Or Greater

Highest %: Distilled – 93.18%

Despite having great content, over 93% of Distilled’s keywords rank in positions 21 or greater. This doesn’t necessarily indicate an SEO or content quality issue; keywords across verticals of all types can be incredibly competitive, and this could be just one of the reasons why Distilled aren’t ranking as many of their pages within the top 10 positions in SERPs.

Regardless of the reason, it does appear as though Distilled aren’t quite nipping at the heels of the likes of Moz yet.

Get in Touch!

Here at Search/Natural we follow the latest news and keep in touch with organic search developments as they happen, so you can be sure we’re offering up-to-the-minute assistance in what is a fast-paced and dynamic industry.

We write SEO news and blog posts semi-regularly covering topics like modern trends in search and where we think organic performance marketing is headed.

If you’re interested in getting help with your organic search efforts, check out our services. If you need some advice or just want to have a chat, you can always drop us an email.

Why We Put This Post Together

We put this blog post together for a few reasons.

We wanted to found our organic search optimisation business on a data-led approach; we think that by leveraging data we can help any business grow its organic search performance.

There is also the fact that data offers invaluable insights for SEOs.

We at Search/Natural are truly passionate about organic search. We enjoy gathering and assessing website data, and we use it to understand, from a technical and strategic perspective, where growth in organic search performance can be achieved. We also think data is fun!

Are You Overpaying for Your SEO?

Are you worried you are overpaying for your SEO?

Do you think you could get it for less each month?

Check our SEO price calculator to see if we could offer you a better service each month and save you money in the process.

Data Caveats

Here are the caveats around the data:

  • This isn’t a full SEO audit, nor should it ever replace one.
  • SEO audits are vital to effective organic search performance, please get in touch if you’d like one completed for your site.
  • This data is also not intended to show which sites are the most profitable or the “best” businesses, nor does it show who is generating the greatest success in organic search. Google uses many hundreds of ranking factors that are not considered as part of this assessment. However, big data pieces centred around two critical ranking factors, links and content, can be both interesting and useful.
  • All of these sites are different and more than likely so are all of their audiences, targeting, priorities and strategies.
  • We haven’t factored in branded search; sites like Yoast likely have a great deal more search volume with “Yoast” included in the query due to their plugin, especially when compared to a site that offers much more generic informational content.
  • This data doesn’t assess the quality of the tools and services offered by each site; it’s more of a “how good is the website’s SEO?” big data assessment than a guide to which one you should use, although I do have some recommendations.
  • The real value of these sites comes from the great selection of tools they build, the content and news they produce, and the way their platforms enable knowledge sharing, which can empower SEOs [like me], content marketers and outreach experts to excel in their fields every day.

Which SEO Website Should You Use?

There are three sites in particular that we believe offer exceptional tools and services, and which, frankly, we would be lost without [Get these tools, people!]:

  1. SEMrush – their tools, SERP sensor, keyword tracking & tagging service, competitor insights and overall domain analytics software is super easy to use and can help anyone get an understanding of the priorities for their website to iterate on for SEO performance. You can start a 7 day free trial of their “Guru” membership package, normally priced at $199.95 (£159.89) per month.
  2. ahrefs – the undisputed leader in backlink monitoring, research and analysis. Most SEOs won’t leave home without it.
  3. Screaming Frog – a second to none web crawler which allows for custom extractions of almost anything on your website and enables analysis of your site’s pages in bulk. All for the bargain price of just £149 per year.

by Ben Ullmer


About Ben

SEO Director

Ben is the founder and SEO director of Search/Natural. He spent 8 years working in SEO at some of the biggest comparison sites in the UK before setting up his own business to work as an SEO specialist with clients around the world.
