How to know the authority of a domain and its pages?
At the time, knowing the authority Google assigned to a page was easy: PageRank was public. There were multiple tools (free browser extensions, etc.) to display the PR of any page.
But from 2013 Google began restricting access to PageRank, and in 2016 it shut it down for good. Although there is no official explanation, the reason is probably that Google felt that exposing this information encouraged Black Hat techniques.
However, this was not a dramatic loss, because there were already alternatives to Google's official PageRank on the market that even tried to improve on the original with finer scales (from 0 to 100), a distinction between the authority of a single page and that of the whole domain, and so on.
That said, you have to be very clear that all these alternative metrics are mere estimates, more or less reliable, but estimates after all. Official PageRank data no longer exists.
These estimates are based on simulations of Google's original algorithm, plus information about other relevant ranking factors that Google is known to have added in updates to its algorithm.
Let's now look at the most relevant authority metrics and the companies that created them.
Calculate domain and page authority with Moz (DA and PA)
The "classic" references in authority metrics, together with the Majestic metrics (which we will see next), are the Page Authority (PA) and Domain Authority (DA) metrics from Moz, one of the leading companies in global SEO.
The good thing is that you can access these metrics completely free of charge thanks to the MozBar, a free browser extension that overlays them on each of the Google results, as you can see below:
Roughly speaking, these metrics try to replicate Google's PageRank, but differentiate between the authority of a page and the authority of the domain (which would be something like the sum of the authority of all its pages).
Personally, they are the ones I use most in practice. They are simple and intuitive and, given that metrics of this type cannot be very precise by their very nature, I consider that they reach a sufficient level of precision and consistency to be useful in practice.
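To make the idea concrete, here is a toy sketch of what these tools broadly emulate. This is not Moz's actual algorithm (which is proprietary); it is a simplified PageRank iteration over a tiny, invented link graph, plus a naive "domain authority" that just sums the scores of each domain's pages:

```python
# Toy sketch, NOT Moz's real algorithm: simplified PageRank over a
# hypothetical link graph, plus a naive domain-level aggregate.

DAMPING = 0.85
ITERATIONS = 50

# Hypothetical link graph: page -> pages it links to.
links = {
    "a.com/1": ["b.com/1", "a.com/2"],
    "a.com/2": ["b.com/1"],
    "b.com/1": ["a.com/1"],
}

pages = list(links)
rank = {p: 1.0 / len(pages) for p in pages}

for _ in range(ITERATIONS):
    # Each page keeps a base amount and shares the rest via its links.
    new_rank = {p: (1 - DAMPING) / len(pages) for p in pages}
    for page, outlinks in links.items():
        share = DAMPING * rank[page] / len(outlinks)
        for target in outlinks:
            new_rank[target] += share
    rank = new_rank

# Naive "domain authority": sum of the scores of the domain's pages.
domain_rank = {}
for page, score in rank.items():
    domain = page.split("/")[0]
    domain_rank[domain] = domain_rank.get(domain, 0.0) + score

print(rank)
print(domain_rank)
```

Note how `b.com/1` ends up with the highest page score (it receives links from both `a.com` pages), while `a.com` gets the higher domain aggregate because it has two scoring pages; that is exactly the page-versus-domain distinction these tools make.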
Measure authority with Majestic’s Trust Flow (TF) and Citation Flow (CF)
As I said above, the other big alternative for estimating the authority of a page is Majestic's Trust Flow (TF) and Citation Flow (CF) metrics.
Majestic introduced an interesting new idea with these metrics: a ratio between the two that gives us additional insight into the quality of a site's links.
However, these metrics cannot easily be checked with free tools similar to the MozBar, because Majestic only provides them as part of a paid service.
Let's see how the idea behind TF and CF works:
What is the Majestic Trust Flow (TF) and how does it work?
Trust Flow, which is a trademark of Majestic, is a quality-based score on a scale of 0-100. It relies on manual work: Majestic compiles a large set of "trusted seed" sites based on a manual review of those websites.
The TF measures authority by taking into account the number of hops (clicks) that separate a given page from one or more of the "seed" pages.
That is, if page A links directly to page B, the link is direct. But it could be that A links to B and B links to C. In that case, authority is still transferred from A to C, but in a smaller amount, because the relationship is indirect.
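Majestic's exact algorithm is proprietary, but the hop-distance idea can be sketched in a few lines. In this illustrative toy version (all site names and the halving-per-hop rule are my own assumptions), trust starts at the seed pages and halves with each click away from them:

```python
# Illustrative sketch only: Majestic's real Trust Flow algorithm is
# proprietary. This toy version propagates "trust" outward from
# hand-picked seed pages, halving it with each hop (click).

from collections import deque

# Hypothetical link graph and seed set.
links = {
    "seed.org": ["blog-a.com", "blog-b.com"],
    "blog-a.com": ["blog-c.com"],
    "blog-b.com": [],
    "blog-c.com": [],
}
seeds = {"seed.org"}

def hop_distance(graph, seeds):
    """BFS: minimum number of link hops from any seed to each page."""
    dist = {s: 0 for s in seeds}
    queue = deque(seeds)
    while queue:
        page = queue.popleft()
        for target in graph.get(page, []):
            if target not in dist:
                dist[target] = dist[page] + 1
                queue.append(target)
    return dist

def toy_trust(graph, seeds, max_score=100):
    """Trust halves with each hop; unreachable pages score 0."""
    dist = hop_distance(graph, seeds)
    return {p: max_score / (2 ** dist[p]) if p in dist else 0
            for p in graph}

print(toy_trust(links, seeds))
# seed.org: 100, blog-a/blog-b: 50, blog-c: 25
```

The pages one click from the seed get half the trust, and the page two clicks away gets a quarter, mirroring the "direct versus indirect relationship" idea described above.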
What is Majestic’s Citation Flow (CF) and how does it work?
On the other hand, described simply, Citation Flow is Majestic's metric that emulates PageRank, similar in that sense to Moz's PA.
How to estimate authority with the Trust Flow / Citation Flow ratio
The truly differentiating contribution of these metrics appears when the two are combined. The basic idea is that the TF/CF ratio gives a more accurate estimate of the authority of the site.
For example:
Let's first take a page with a TF of 40 and a CF of 40. This gives a ratio of 1.
Now take another page that also has a CF of 40, but a TF of only 10. In this case, the TF/CF ratio of that page would be 0.25.
According to Majestic, given the same CF value (its emulated PageRank) but a lower TF, this second page would have significantly less authority than the first, because the proportion of links coming from sites Majestic has identified as "trustworthy" seeds would be much smaller.
In other words: the first page's link profile contains far more high-quality links and, therefore, the first site deserves more trust and can be considered more authoritative.
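The arithmetic of the example is trivial, but spelling it out makes the interpretation explicit. A minimal sketch (the interpretation thresholds in the comment are the article's, not Majestic's official guidance):

```python
# Worked numbers from the example above: same CF, very different TF.

def tf_cf_ratio(tf, cf):
    """TF/CF ratio: values near (or above) 1 suggest a healthy,
    trusted link profile; low values suggest many low-quality links."""
    if cf == 0:
        return 0.0  # avoid division by zero for pages with no CF
    return tf / cf

page_1 = tf_cf_ratio(tf=40, cf=40)   # ratio of 1
page_2 = tf_cf_ratio(tf=10, cf=40)   # ratio of 0.25
print(page_1, page_2)
```

Despite an identical CF, the second page's ratio of 0.25 flags it as the weaker of the two, which is exactly Majestic's argument.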
If the ratios of all the pages of a website are represented in a graph like the following, we can also have a global vision of the authority of the website:
On paper, this idea of Majestic's is very interesting, which leads many people to consider these metrics superior to Moz's. Personally, however, I'm not so convinced.
The biggest downside that I see in relation to the reliability of the TF is the fact that it is based on manual work .
With the current size of the web (over a billion sites) and the speed at which it changes, is this manual seeding job really feasible and accurate enough for a mid-sized company like Majestic?
In addition to this, there is also criticism of the quality of Majestic’s link database , compared to other major competitors like Ahrefs for example.
In any case, it is not the purpose of this post to enter into this debate; I simply wanted to raise these reservations about the Majestic system. If you want to dig deeper into this issue, I encourage you to look for more information in other sources.
Alternatives to check the authority of a website and the difficulty of a niche
Moz and Majestic metrics were the benchmark for many years, with hardly any alternatives. But over time several alternatives have emerged.
Ahrefs UR and DR metrics
This tool has gained a lot of popularity in recent years and has positioned itself as one of the largest in the SEO tools market.
It uses two metrics that, roughly speaking, can be compared to Moz's PA and DA, also on a scale from 0 to 100, since they follow a similar philosophy: UR (URL Rating), the equivalent of Moz's PA, and DR (Domain Rating), the equivalent of Moz's DA.
SEMrush Authority Metrics
SEMrush is, along with Ahrefs, another leader in the SEO tools market. In fact, it is the tool we use on this blog and my favorite.
This tool also has its own metric, called Authority Score, which in turn is calculated from the Page Score, Domain Score and Trust Score, along with a few other metrics.
Without going into the details, although these metrics are based on SEMrush's own methodology and algorithms, the idea is roughly a combination of the ideas we saw with Moz and Majestic.
Page Score would be similar to Moz's PA, and Domain Score to DA. Trust Score is a metric similar to Majestic's TF, and all of this combined gives the Authority Score, which is similar in spirit to Majestic's TF/CF ratio.
It should be added that SEMrush has some very interesting tools and metrics that feed into this, such as the Toxic Score, which comes from its "toxic" link detection tool (links from very low-quality domains, probably penalized by Google).
Ubersuggest metrics
One tool that I can’t overlook is the recent Ubersuggest .
Along with the MozBar, Ubersuggest is the only one that provides these metrics for free, or at least without the severe usage limits of the freemium versions of paid tools such as SEMrush, Ahrefs or Majestic.
Although it cannot be compared with the feature set of SEMrush or Ahrefs, the truth is that Ubersuggest is remarkably complete for a tool that, today (we'll see how long it lasts…), is 100% free.
In this sense, it also has an authority metric, the DS (Domain Score), which is similar to the other domain authority metrics, although it does not have a page authority metric.
And what are page and domain authority really for?
All this is very interesting, but you may be wondering what it is really for and how to use it for your own purposes.
The answer is simple: apart from letting you track the evolution of the authority of your site and its pages, these metrics serve to analyze to what extent you can compete in Google for a given search, and which strategies to use to do so in the best possible way.
What is the weight of page and domain authority in web positioning?
As I mentioned at the beginning, links (and therefore the authority of a website) are among the most important ranking factors. So much so that, with little authority, you are going to have a very hard time in any search that does not have a low level of competition.
Now, with that said, don't despair: you can still do things if you're in a good topic niche (one in demand, i.e. with good search volume).
On the one hand, it is a matter of being humble and focusing on more specialized searches with less competition . This is also known as the “long tail” strategy and we will see it below with a concrete example.
On the other hand, also remember that links are not the only ranking criteria .
You will often find results from domains with a lot of authority that do not respond 100% to the search intent. Good opportunities hide here: you can compete by responding exactly to that intent.
What they are and how to use the various “Keyword Difficulty” metrics
To reduce the difficulty of competing for a given search (which is what this is ultimately about) to a single value, almost all current SEO tools also offer "keyword difficulty" metrics, which typically also move on a scale from 0 to 100.
In the case of Ahrefs, for example, by its own definition, this metric “calculates the Ahrefs KD score by taking a weighted average of the number of domains linking to the current top 10 ranking pages and projecting the result on a log scale of 0 to 100.”
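Based on that public definition, the calculation can be sketched roughly as follows. The referring-domain counts, the position weights, and the exact log projection below are all my own assumptions for illustration; Ahrefs does not publish those details:

```python
# Rough sketch of the idea in Ahrefs' public KD definition; the
# numbers, weights, and log projection here are assumed, not Ahrefs'.

import math

# Hypothetical referring-domain counts for the current top 10
# results, position 1 first.
referring_domains = [320, 250, 180, 150, 90, 75, 60, 40, 30, 20]

# Assumed position weights: higher positions count more.
weights = [10, 9, 8, 7, 6, 5, 4, 3, 2, 1]

weighted_avg = (sum(w * rd for w, rd in zip(weights, referring_domains))
                / sum(weights))

# Project onto a 0-100 scale with an (assumed) log curve, capped.
kd = min(100, round(20 * math.log10(1 + weighted_avg)))
print(kd)
```

The log scale is the key design choice: going from 10 to 100 referring domains moves the score far more than going from 1,000 to 1,090, which matches how competition actually feels in practice.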
The details of the algorithm vary in each of the other tools (SEMrush, etc.), but the end goal is basically the same.
SEMrush, like Ahrefs, calls this metric KD, while Ubersuggest calls it SD (SEO Difficulty).
In any case, whichever you use, don't forget that this type of metric is absolute. That is, it does not take your own website's authority into account, so you lose the comparative information you would get from contrasting the authority of your site (or of specific pages on it) with that of the competition.
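One simple way to recover that comparative information (my own illustration, not a feature of any of these tools) is to contrast your DA with the DAs of the domains currently in the top 10:

```python
# Illustrative idea, not a feature of any SEO tool: make difficulty
# relative by comparing your DA with the top-10 results' DAs.

def relative_gap(my_da, top10_das):
    """Share of top-10 results whose DA your site already matches."""
    beatable = sum(1 for da in top10_das if my_da >= da)
    return beatable / len(top10_das)

my_da = 10                                    # a young site
top10 = [55, 48, 44, 40, 33, 28, 15, 12, 9, 7]  # hypothetical SERP

share = relative_gap(my_da, top10)
print(f"You match or beat {share:.0%} of the top 10")
```

A keyword with KD 40 can be out of reach for a DA 10 site yet trivial for a DA 60 site; a relative check like this, however crude, surfaces that difference.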
How to analyze in which searches you are most likely to position yourself
Now we have seen enough to apply everything we have learned to what really interests us: analyzing your chances of ranking for a certain search.
To be practical, here I am going to explain it to you with a very simple example. If you want to see this part in more detail, also download our SEO eBook for beginners:
Let’s go with the example:
Let’s say you’re a very young gardening blog author, with a Domain Authority of 10 , according to the Moz DA, and you’d like to publish an article on caring for bonsai .
You have analyzed the niche with an SEO tool and seen that there are many specific searches matching this intent, adding up to a total volume of thousands of searches per month. So far, so good.
The next step is to do the search “how to care for bonsai trees” in Google to analyze the level of competition . To do this, we are going to use the MozBar.
After doing so, you find this panorama on the first page of results (which is the only one that really interests us):
Wow… things look bad. All the results match the search intent quite exactly and have a lot of page authority; some even come from domains with very high authority (uncomo.com), although domain authority matters much less than page authority (as you can see from how they rank).
Let’s not despair, remember what I said above: in these cases a “long tail” strategy usually offers solutions .
This SEO concept of the "long tail" refers to the large number of specialized variants (longer in number of words) of generic searches like the one in the example.
In this case, for example, there is a long list of search variants by type of bonsai: "how to care for a ficus bonsai", "how to care for a carmona bonsai", "how to care for a ginseng bonsai"… and so on, dozens of them.
You have enough experience with Carmona bonsai, so you take a look to see if there is enough search volume. If you used SEMrush, you would see the following results:
Good news: although the total volume is noticeably lower than for the more generic search, the different long-tail variants add up to almost 500 searches per month, which is still an interesting volume.
So let's see what the competition looks like for this search intent. Searching for "how to take care of a carmona bonsai" we get:
At first glance, the search results might scare you: once again, everything seems full of websites with enough authority that you won't stand a chance of competing.
But analyze the results carefully: have you noticed that only one of the results really responds well to the search intent (the one highlighted)?
If you look closely, the others are results with generic information about this type of bonsai; they do not specifically answer how to care for them.
Furthermore, notice that the highlighted result, the one that answers the query well, has much less PA and DA than the ones below it. According to Moz, it doesn't even have a single link.
Why is this happening?
Well, because it is the only one that adequately responds to the search .
The result above it probably does answer the question in its content (in a subsection, for example), even if this is not reflected in the title (that is, it is not the main topic of the content); but thanks to its high PA and DA (and possibly some other factor), it manages to outrank the exact answer.
Probably, with just a bit more PA (between 15 and 20, I estimate), the exact result would already be on top.
Also look at the KD values shown by SEMrush: they indicate relatively high degrees of difficulty.
This is more or less consistent with the DAs of the results; however, it does not reflect the real situation, because there is hardly any content that accurately responds to these searches. Moreover, the content that does exist has very little authority.
Hence my criticism of this simplistic kind of metric: I think it trivializes the issue so much that it stops being useful in practice. That's why I personally hardly use these metrics.
Conclusions
In short, in the end all this is quite logical and intuitive, don’t you think?
However, if, after reading this post, you have been left with the feeling that all this also involves a lot of fuzzy logic, congratulations, you have hit the nail on the head. Indeed it does, and that is how you should take it.
In the example, everything worked out neatly, but in practice you will also run into figures, more than once, that don't seem to make much sense.
Other times you will find that a piece of content that, according to all the metrics, clearly should rank without problems turns out not to rank at all, no matter what you do.
And that is because we must not forget that, although this kind of analysis is essential when you want to rank a piece of content, you are still working with estimates; and, on top of that, the ways of Mr. Google are often as inscrutable as those of any other divine being.
As for the tools, although I am an absolute fan of SEMrush , especially for its wonderful keyword exploration tool (Keyword Magic Tool), when it comes to analyzing the level of competition I still use the MozBar practically exclusively .
Not only do the PA and DA metrics still (generally…) square with the search results for me, but I cannot conceive of doing this analysis anywhere other than on the Google results page itself.
The previous example showed this very clearly: it is not about crunching PA and DA figures like an automaton. There is a large subjective component, a lot of "art", in assessing to what extent the results match the search intent, often even requiring you to open the content of each result to fine-tune the analysis.
For all these reasons, in my personal experience, the MozBar approach, a browser extension that integrates with the Google results page, has turned out to be the best formula to date for this job.