By Eric Enge
Branko Rihtman has been optimizing sites for search engines since 2001, for clients and for his own web properties in a variety of competitive niches. Over that time, Branko realized the importance of properly done research and experimentation and started publishing findings and experiments at SEO Scientist. Branko is currently responsible for SEO R&D at RankAbove, provider of a leading SEO SaaS platform, Drive.
1. Freshness of data (delay between a link being detected and indexed)
Majestic – 8/10
Moz – 6/10
In a lot of cases, especially when the links involved are of low quality, they will not be shown in OSE, only in Majestic's Fresh Index. I have seen this myself and have heard similar reports from people in forums and on Twitter. For example, look at Danny Sullivan's post based on his link building rant at SMX Advanced. Majestic reports 183 links from 49 domains, even though they haven't crawled the page itself yet:
Open Site Explorer has no backlink data about that page:
This issue has become even more critical now that people are trying to identify low-quality links, which in a lot of cases are simply not reported in OSE. Another important use is in negative SEO: crappy links that competitors throw at your site by the thousands are usually not visible at all in OSE, making analysis and possible prevention steps impossible.
BTW, Bing Link Explorer reports only 47 links to that SEL URL.
2. Predictability of updates (conformance to scheduled times)
Majestic – 10/10
Moz – 5/10
There is no comparison here. Majestic's Fresh Index is updated daily and is pushed to the Historic Index on a monthly basis. As I mentioned above, I am already finding links in Majestic to articles published just a few days ago, while OSE has no data for those URLs. Majestic even reports links not yet reported by search engines, so Majestic takes this one by far.
3. API performance (uptime and speed of response)
Majestic – 7/10
Moz – 9/10
They both performed great for what I needed (and that was very small scale); however, the OSE API has a free version which, even though it is throttled, can help a lot for concept testing and small tasks. Majestic has, as of today, no official free API version (unless you ask for it and give something in return), so slight advantage to OSE.
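When comparing API uptime and response speed yourself, it helps to wrap each request in a timing-and-retry harness so transient failures don't skew the numbers. Below is a minimal sketch of such a wrapper; `flaky_fetch` is a hypothetical stand-in for a real API call (neither Majestic's nor Moz's actual endpoints or signatures are shown here), and the function names are my own inventions for illustration.

```python
import time

def timed_call(fn, retries=3, backoff=0.1):
    """Call fn with retries; return (result, elapsed_seconds, attempts).

    Only the successful attempt's latency is reported, so one transient
    error doesn't inflate the measured response time.
    """
    for attempt in range(1, retries + 1):
        start = time.monotonic()
        try:
            result = fn()
            return result, time.monotonic() - start, attempt
        except Exception:
            if attempt == retries:
                raise
            time.sleep(backoff * attempt)  # simple linear backoff

# Hypothetical API call that fails once, then succeeds.
calls = {"n": 0}
def flaky_fetch():
    calls["n"] += 1
    if calls["n"] < 2:
        raise ConnectionError("transient network error")
    return {"links": 183, "domains": 49}

result, elapsed, attempts = timed_call(flaky_fetch)
print(result, attempts)  # the fetch succeeded on the second attempt
```

Logging `attempts` alongside `elapsed` over many calls gives you both halves of this criterion: uptime (how often retries were needed) and speed (latency of successful calls).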
4. Percent of links reported by Google Webmaster Tools that Majestic/Moz know about from a variety of sites
Majestic – 7/10
Moz – 5/10
OSE has made great improvements in their percentage of coverage recently and I believe they will continue to improve here. I took a random sample of 10 URLs from 7 domains in very different industries. As can be seen from the chart below, the results vary, although in a lot of cases MajesticSEO (blue columns) gives numbers higher than GWT, even as OSE (red columns) reports numbers lower than GWT. If we take into account the high decay rate of Majestic's Historic Index, discounting those links would probably bring the link counts closer to the real number than OSE's. This is something I have noticed in real-life situations as well.
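The coverage comparison above boils down to simple set arithmetic: of the linking domains GWT reports, what fraction does the tool also know about? A sketch, using made-up domain sets purely for illustration (the real study used actual GWT exports):

```python
# Hypothetical linking root domains from a GWT export and from a link index.
gwt_domains = {"a.com", "b.com", "c.com", "d.com", "e.com"}
tool_domains = {"a.com", "b.com", "c.com", "x.com", "y.com", "z.com"}

def gwt_coverage(gwt, tool):
    """Percent of GWT-reported domains that the tool also reports."""
    return 100.0 * len(gwt & tool) / len(gwt)

coverage = gwt_coverage(gwt_domains, tool_domains)
print(coverage)  # 60.0
```

Note how the tool reports more total domains (6) than GWT (5) yet still covers only 60% of GWT's list; this is why a tool can show higher headline numbers than GWT while missing links GWT knows about.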
5. Percent of links reported by Moz & Majestic that still exist
I have actually done an extensive study on the percentages of live links reported by OSE and Majestic (and other tools). Majestic's Historic Index had the highest rate of decay, while the Fresh Index had the lowest decay rate of all the tools tested.
| Tool | URL 1 | URL 2 | URL 3 |
| --- | --- | --- | --- |
| Open Site Explorer | 3.07% | 34.80% | 7.46% |
| Link Research Tools | 42.21% | 54.22% | 25.84% |
However, this metric alone can be misleading: if one of the tools has a much larger database, it will naturally have more dead links in it. Furthermore, Majestic's numbers (as reported in their Site Explorer) include links that Majestic knows are deleted, and those can easily be filtered out in an advanced report. What I compared in the study is the number of reported links after discounting the dead links (both those removed through advanced reporting and those then checked manually):
| # | Ahrefs | Majestic Fresh | Majestic Historic | OSE | Link Research Tools |
| --- | --- | --- | --- | --- | --- |
And by that measure Majestic outperformed OSE by a huge margin.
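The manual check in the study above amounts to fetching each reported backlink page and verifying that it still contains a link to the target site. A minimal sketch of that live-link check, using Python's standard-library HTML parser on an in-memory page (a real run would fetch each URL over HTTP; the sample HTML here is invented for illustration):

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class LinkCollector(HTMLParser):
    """Collect all <a href="..."> values from an HTML document."""
    def __init__(self):
        super().__init__()
        self.hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.hrefs.append(value)

def still_links_to(html, target_domain):
    """True if the page HTML still contains a link to target_domain."""
    parser = LinkCollector()
    parser.feed(html)
    return any(urlparse(h).netloc.endswith(target_domain)
               for h in parser.hrefs)

# Hypothetical backlink page: the link to example.com is still live.
page = '<p>Read <a href="http://example.com/page">this</a> article.</p>'
print(still_links_to(page, "example.com"))  # True
print(still_links_to(page, "other.com"))    # False
```

Running this over every URL a tool reports, and dividing live results by the total, gives the decay percentages shown in the table.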
6. Correlation of key metrics (e.g. PA/DA and ACRank) with SERPs
If we compare PA/DA and ACRank, OSE's metrics outperform ACRank by a huge margin. I don't want to speak about correlation to rankings, since none of the metrics measured today correlates with rankings in a significant manner on its own, and all the studies saying otherwise are, IMHO, flawed. One example I recently saw is Ontolo's Industry Reports, where in one case there was a slight negative correlation between ranking and DA.
Majestic have recently released a new set of metrics which correlate with PageRank much better than any other metric (not that this implies anything about correlation to rankings) and are slowly moving away from ACRank. So I am not sure the comparison between PA/DA and ACRank is relevant anymore, but sticking to the original question, SEOMoz's metrics outperform ACRank by far.