Tag: Audits

52 Tools And Counting: Mostly Free SEO Tools – I Actually Use


Work In Progress

Thanks for checking in with Ultimate SEO. This site is a side project, as my clients’ sites are the main ones. That means there may often be edits and unfinished elements I still need to circle back to. Feel free to let me know how this site could be better using the contact form.

Ultimate SEO

Bad Backlinks: 100 Sites You Don’t Want A Backlink From.


Bad Backlinks

UltimateSEO.org has backlinks from about a thousand domains.  In a recent review of these I found an odd recurring link from multiple domains, all with the same content and titles.  That’s how I was introduced to “The Globe,” which makes money by charging sites (or the SEOs working on them) NOT to link to them.  At $36 a link they’re likely insane, but I bet it’s bringing in some money.  Before we go all crazy and start paying for Ransomlinks (if it’s not a word, I claim it: Ransomlinks are backlinks from bad sites meant to lower your SEO score unless you pay to not be linked to), let’s think through the options.

In reviewing the situation I ran across a list of the most disavowed sites.  I figured I’d share that with you below, but before I do, what outcome did I choose for these bad links pointing to my site?

  1. Option 1, Pay: Heck no! Then the terrorists win.
  2. Disavow: No! Don’t use disavow unless Google has placed a manual action against your site.  I’m skeptical of the tool’s purpose anyhow, and Google itself says there is no need to use it unless you’ve been penalized and told by Google that you are being penalized.
  3. Do Nothing: Yes! Don’t do anything. Google likely knows about the Ransomlinks scheme and has already penalized the site by deindexing it.  There are so many random domains that it’s going to be a mess to address, so let it be unless you have seen a negative effect.  In other words, before you saw your leg off wondering if that spot is cancer, stop and find out.
  4. An idea: 301 redirect them. Seriously: all of these links point to a subdomain that until now hasn’t existed, and most others talking about this site note a similar subdomain being targeted.  I could create the targeted subdomain and redirect all links to it from my site back to theirs (a rough sketch of this appears below). 🙂

I’m opting for the third as I don’t have any indication that Google cares about these Ransomlinks.  They may actually bring some random traffic of use, and redirecting them would take that away from my site.
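For anyone curious what option 4 could look like in practice, here is a minimal, purely illustrative sketch: a tiny Flask app served on the targeted (previously nonexistent) subdomain that answers every request with a 301 back to whichever domain sent the visitor. The fallback URL and hosting details are assumptions, not something I actually deployed.

```python
# Purely illustrative sketch of "option 4": serve the targeted subdomain and
# bounce every visitor back to whichever domain linked to it. Hypothetical;
# I opted not to deploy anything like this.
from urllib.parse import urlparse

from flask import Flask, redirect, request

app = Flask(__name__)

FALLBACK_URL = "https://ultimateseo.org/"  # where to send hits with no referrer

@app.route("/", defaults={"path": ""})
@app.route("/<path:path>")
def bounce_back(path):
    referrer = request.referrer
    if referrer:
        parsed = urlparse(referrer)
        # 301 the visitor straight back to the root of the linking domain.
        return redirect(f"{parsed.scheme}://{parsed.netloc}/", code=301)
    return redirect(FALLBACK_URL, code=301)

if __name__ == "__main__":
    app.run()
```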


And now the most disavowed sites…

Most popular websites disavowed by webmasters

  1. blogspot.com
  2. blogspot.ca
  3. blogspot.co.uk
  4. ning.com
  5. wordpress.com
  6. blog.pl
  7. linkarena.com
  8. yuku.com
  9. blogspot.de
  10. webs.com
  11. blogspot.nl
  12. blogspot.fr
  13. lemondir.com
  14. blog.com
  15. alonv.com
  16. tistory.com
  17. searchatlarge.com
  18. dvpdvp1.com
  19. typepad.com
  20. nju-jp.com
  21. bluehost.com
  22. wldirectory.com
  23. tumblr.com
  24. hyperboards.com
  25. directoryfuse.com
  26. prlog.ru
  27. informe.com
  28. ligginit.com
  29. theglobe.org
  30. pulsitemeter.com
  31. articlerich.com
  32. weebly.com
  33. the-globe.com
  34. blogspot.no
  35. theglobe.net
  36. articledashboard.com
  37. dig.do
  38. seodigger.com
  39. cybo.com
  40. fat64.net
  41. bravenet.com
  42. cxteaw.com
  43. askives.com
  44. mrwhatis.net
  45. insanejournal.com
  46. xurt.com
  47. freedirectorysubmit.com
  48. commandresults.com
  49. sagauto.com
  50. internetwebgallery.com
  51. freewebsitedirectory.com
  52. ewbnewyork.com
  53. 000webhost.com
  54. tblog.com
  55. directorylist.me
  56. analogrhythm.com
  57. snapcc.org
  58. bravejournal.com
  59. weblinkstoday.com
  60. m-pacthouston.com
  61. linkcruncher.com
  62. tripod.com
  63. cogizz.com
  64. niresource.com
  65. over-blog.com
  66. ogdenscore.com
  67. free-link-directory.info
  68. alikewebsites.com
  69. folkd.com
  70. djsonuts.com
  71. uia.biz
  72. bangkokprep.com
  73. forumsland.com
  74. punbb-hosting.com
  75. hostmonster.com
  76. blogspot.in
  77. siteslikesearch.com
  78. bookmark4you.com
  79. siliconvalleynotary.com
  80. listablog.com
  81. poetic-dictionary.com
  82. linkspurt.com
  83. cultuurtechnologie.net
  84. azjournos.com
  85. exteen.com
  86. articletrader.com
  87. blogspot.com.au
  88. delphistaff.com
  89. altervista.org
  90. media-tourism.com
  91. woodwardatelier.com
  92. holdtiteadhesives.com
  93. lorinbrownonline.com
  94. tech4on.com
  95. popyourmovie.com
  96. trilogygroveland.com
  97. foqe.net
  98. directorybin.com
  99. eatrightkc.com

Ultimate SEO

MAJOR GOOGLE SEO CHANGE FOR SOME: Website Traffic CREDITED To Where Google Chooses

Wednesday, February 06, 2019

In Search Console, the Performance report currently credits all page metrics to the exact URL that the user is referred to by Google Search. Although this provides very specific data, it makes property management more difficult; for example: if your site has mobile and desktop versions on different properties, you must open multiple properties to see all your Search data for the same piece of content.

To help unify your data, Search Console will soon begin assigning search metrics to the (Google-selected) canonical URL, rather than the URL referred to by Google Search. This change has several benefits:

  • It unifies all search metrics for a single piece of content into a single URL: the canonical URL. This shows you the full picture about a specific piece of content in one property.
  • For users with separate mobile or AMP pages, it unifies all (or most, since some mobile URLs may end up as canonical) of your data to a single property (the “canonical” property).
  • It improves the usability of the AMP and Mobile-Friendly reports. These reports currently show issues in the canonical page property, but show the impression in the property that owns the actual URL referred to by Google Search. After this change, the impressions and issues will be shown in the same property.

Google Search Console

When will this happen?

We plan to transition all performance data on April 10, 2019. In order to provide continuity to your data, we will pre-populate your unified data beginning from January 2018. We will also enable you to view both old and new versions for a few weeks during the transition to see the impact and understand the differences.

API and Data Studio users: The Search Console API will change to canonical data on April 10, 2019.

How will this affect my data?

  • At an individual URL level, you will see traffic shift from any non-canonical (duplicate) URLs to the canonical URL.
  • At the property level, you will see data from your alternate property (for example, your mobile site) shifted to your “canonical property”. Your alternate property traffic probably won’t drop to zero in Search Console because canonicalization is at the page, not the property level, and your mobile property might have some canonical pages. However, for most users, most property-level data will shift to one property. AMP property traffic will drop to zero in most cases (except for self-canonical pages).
  • You will still be able to filter data by device, search appearance (such as AMP), country, and other dimensions without losing important information about your traffic.

You can see some examples of these traffic changes below.

Preparing for the change

  • Consider whether you need to change user access to your various properties; for example: do you need to add new users to your canonical property, or do existing users continue to need access to the non-canonical properties?
  • Modify any custom traffic reports you might have created in order to adapt for this traffic shift.
  • If you need to learn the canonical URL for a given URL, you can use the URL Inspection tool.
  • If you want to save your traffic data calculated using the current system, you should download your data using either the Performance report’s Export Data button or the Search Console API (a minimal API sketch follows this list).
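A minimal sketch of that API export is below. It assumes you already have authorized OAuth credentials for the Search Console API and the google-api-python-client package installed; the site URL and date range shown are placeholders.

```python
# Minimal sketch: export Performance data through the Search Console API so you
# have a snapshot of the pre-change numbers. Assumes `creds` is an already
# authorized OAuth2 credentials object and SITE_URL is your verified property.
from googleapiclient.discovery import build

SITE_URL = "https://www.example.com/"  # placeholder property

def export_performance(creds, start_date, end_date):
    service = build("searchconsole", "v1", credentials=creds)
    body = {
        "startDate": start_date,           # e.g. "2019-01-01"
        "endDate": end_date,               # e.g. "2019-03-31"
        "dimensions": ["page", "device"],
        "rowLimit": 25000,
    }
    response = service.searchanalytics().query(siteUrl=SITE_URL, body=body).execute()
    # Each row carries "keys", "clicks", "impressions", "ctr", and "position".
    return response.get("rows", [])
```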

Examples

Here are a few examples showing how data might change on your site. In these examples, you can see how your traffic numbers would change between a canonical site (called example.com) and alternate site (called m.example.com).

Important: In these examples, the desktop site contains all the canonical pages and the mobile contains all the alternate pages. In the real world, your desktop site might contain some alternate pages and your mobile site might contain some canonical pages. You can determine the canonical for a given URL using the URL Inspection tool.

Total traffic

In the current version, some of your traffic is attributed to the canonical property and some to the alternate property. The new version should attribute all of your traffic to the canonical property.

                                Canonical property        Alternate property
                                (http://example.com)      (http://m.example.com)
Current                         [chart]                   [chart]
New, based on canonical URLs    [chart]                   [chart]
Change                          +0.7K | +3K               -0.7K | -3K

Individual page traffic

You can see traffic changes between the duplicate and canonical URLs for individual pages in the Pages view. The next example shows how traffic that used to be split between the canonical and alternate pages is now all attributed to the canonical URL:

                                Canonical property        Alternate property
                                (http://example.com)      (http://m.example.com)
Old                             [chart]                   [chart]
New                             [chart]                   [chart]
Change                          +150 | +800               -150 | -800

Mobile traffic

In the current version, all of your mobile traffic was attributed to your m. property. The new version attributes all traffic to your canonical property when you apply the “Device: Mobile” filter as shown here:

                                Canonical property        Alternate property
                                (http://example.com)      (http://m.example.com)
Old                             [chart]                   [chart]
New                             [chart]                   [chart]
Change                          +0.7K | +3K               -0.7K | -3K

In conclusion

We know that this change might seem a little confusing at first, but we’re confident that it will simplify your job of tracking traffic data for your site. If you have any questions or concerns, please reach out on the Webmaster Help Forum.

Ultimate SEO

Backlinks 101 – SEO’s Off-page Often Ignored Power Ranker


First off, a little disclosure: this article overlaps with the Backlinks category of the FAQ.

What Are Backlinks?

Backlinks are links from other sites.  Think of them as votes of affirmation.  Only one vote can come from a domain, so for SEO purposes it doesn’t matter if there are 100 links or 1 link from the same domain: one link is the count you gain.  Subdomains are viewed separately; that’s why yourblog.tumblr.com isn’t a Tumblr backlink.  The extra links may increase traffic to your site, but in terms of SEO value it’s one vote.  Some call this metric “Domain Pop”: how many different domains link to a site.  It’s also gotten more complicated because people would host multiple sites on a shared IP, so how many backlinks come from different IPs is “IP Pop.”  It’s common to see a slightly higher domain pop than IP pop, but if there’s a huge gap it’s suspect.
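To make domain pop versus IP pop concrete, here is a rough sketch (not any particular tool’s method) that takes a list of backlink source URLs and counts unique linking hosts and unique resolved IPs. Hosts that no longer resolve are simply skipped.

```python
# Rough sketch: count unique linking hosts ("Domain Pop") and unique
# linking IPs ("IP Pop") from a list of backlink source URLs.
# Subdomains count separately, matching the description above.
import socket
from urllib.parse import urlparse

def domain_and_ip_pop(backlink_urls):
    hosts, ips = set(), set()
    for url in backlink_urls:
        host = urlparse(url).hostname
        if not host:
            continue
        hosts.add(host.lower())
        try:
            ips.add(socket.gethostbyname(host))
        except socket.gaierror:
            pass  # dead or unresolvable host; skip it
    return len(hosts), len(ips)

domain_pop, ip_pop = domain_and_ip_pop([
    "https://blog.example.com/post-1",
    "https://blog.example.com/post-2",   # same host: still one "vote"
    "https://another-site.org/review",
])
print(domain_pop, ip_pop)
```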

The more domains that link to you, the more authoritative you must be, right?  Well, kinda.  If 1,000 domains link to your site, you likely are more authoritative than a site that 3 sites link to.  But not all domains, votes, or backlinks are the same.  A link to your site from UltimateSEO.org carries with it the weight attributed to that site by its own backlinks.  People refer to this as “link juice”: basically, the backlinks coming into a site fuel the backlinks leaving that site.

Link juice prevents someone from registering 10 new domains and making 10 backlinks to their original site, because those ten new sites probably lack link juice from their own backlinks.  Generally speaking, though, backlinks increase a site’s domain authority or citation flow.  Different companies refer to the authority of a site differently.  Beyond DA and CF there is also LIS, but I have found DA to be the best single indicator of a site’s worth.

You can see a site’s backlinks in many indexes; most are paid.  Ultimate SEO recommends Monitor Backlinks if you want a tool that is really good at backlinks.  Ultimate SEO received nothing for that endorsement.  The endorsement, or vote, as you see, is itself a backlink.

What Kinds Of Backlinks Are There?

No-Follow vs Do-Follow Backlinks

Beyond saying good and bad, there are a couple of types to be aware of: “follow” (or “do-follow”) and “nofollow.”  They get their names from the instruction they give search engine crawlers: a nofollow link says “don’t follow this link to that site.”  A regular link is a follow link and serves as the backlink you ultimately want.

For some reason folks went a little crazy and no-followed everything, even internal links.  No-follows were meant to combat link building schemes such as blog comments.  While it’s fine to have nofollow links to your site, there should be a limited mixture of them relative to your actual do-follow links.  Nofollow links are still indexed, and I feel strongly that they still have some SEO value, even if it’s just to drive traffic to your site.  In the end you want do-follow links because they come as full-fledged votes for your site, whereas nofollows more or less say “here is a link to a site I don’t necessarily want to be associated with.”

It’s that distance that makes nofollows a poor focus for SEO efforts, and it’s why you should use them sparingly in internal links.  Why would you send a signal to Google that you don’t stand behind an internal link to your own site?  Some try to hold onto all the link juice coming in and nofollow every external link; this is a poor practice, and it’s been shown that linking your content to relevant, good external content helps you.
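As a quick illustration of how you might audit this yourself, the sketch below fetches a referring page and sorts its links into follow and nofollow buckets based on each anchor’s rel attribute. The URL is a placeholder, and the requests and beautifulsoup4 packages are assumed to be installed.

```python
# Small sketch: classify a page's outbound links as follow vs. nofollow by
# inspecting each anchor's rel attribute. Placeholder URL.
import requests
from bs4 import BeautifulSoup

def classify_links(page_url):
    html = requests.get(page_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    follow, nofollow = [], []
    for a in soup.find_all("a", href=True):
        rel = [r.lower() for r in (a.get("rel") or [])]
        (nofollow if "nofollow" in rel else follow).append(a["href"])
    return follow, nofollow

follow_links, nofollow_links = classify_links("https://example.com/some-post/")
print(len(follow_links), "follow /", len(nofollow_links), "nofollow")
```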

No-Follow Internal Links…Just Don’t

I rarely use nofollow links; I kinda think the system there is broken, so I just follow them all.  Some time ago people started no-following everything to lead their link juice to specific pages they wanted to rank.  Since this was a misuse of the nofollow, Google changed how it handles nofollows: it doesn’t keep the juice in your site or on a page, it just disappears.  Nofollows take the same amount of link juice as a regular link, but no one gets it.  Pointless then, right?  We’ll debate that more another day.

What Is Anchor Text?

Anchor text is the “keyword” of a backlink.  “Ultimate SEO,” for example, could serve as the anchor text of a link pointing to http://blizzardseo.com, while a bare URL dropped into a page has no anchor text.  Anchor text defines the backlink vote.  If enough people make a link to your site with anchor text like “miserable failure,” it will teach Google that the target of that link is a miserable failure.  This happened to George W. Bush’s White House biography page long ago and is called a “Google Bomb.”  It’s that old saying: if you say something often enough, it can become true.
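If you want to see your own anchor text distribution (and catch any attempted Google Bombing early), a quick tally like the sketch below works against a backlink export. The file name and the anchor_text column are assumptions about whatever CSV your backlink tool produces.

```python
# Quick sketch: tally anchor text from a backlink export to spot anything
# unnatural. The file name and "anchor_text" column are assumptions about
# your backlink tool's CSV export format.
import csv
from collections import Counter

def anchor_text_distribution(csv_path):
    counts = Counter()
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            anchor = (row.get("anchor_text") or "").strip().lower()
            counts[anchor or "(no anchor text / bare URL)"] += 1
    return counts

for anchor, n in anchor_text_distribution("backlinks_export.csv").most_common(10):
    print(f"{n:5d}  {anchor}")
```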

How Many Do I Need?

A lot.  You need as many as you can get, from as many places as you can get them.  Just keep in mind that a backlink from my personal site isn’t as powerful as a backlink from the CDC.gov website; they have the authority.  That’s also why .edu and .gov backlinks are especially coveted.

A quick rule of thumb to determine how many you need is to simply Google the keyword you are attempting to rank for; let’s say “cool music from the 60s.”  I get pastemagazine.com leading the pack.  According to SEMrush.com, that site has 2.7 million backlinks coming from 36,700 domains on 44,300 IP addresses.  So, roughly, keep that as your target if you want to rank #1 for “cool music from the 60s.”

How Do I Get Backlinks?

Many ways, but the God’s honest truth is to do it the obvious way: by having content worth linking to.  If you want a page to rank at the top you need a site that’s fast, optimized on-page, and full of relevant, awesome content, and people will backlink to you.  Over years and years, though, you’ll need to keep that content better than everyone else’s, which isn’t super realistic.  Sometimes the best content is on page 2 and it’ll stay there.  I often Google something and skip the first results just because they’re often the most heavily SEOed things.  BUT most people by far pick the first result, then the second, and so on.

What Are Some Popular Link Building Techniques?

So you need to prime the pump and simulate organic growth and popularity, and now you’re in a link building scheme.  Some are looked down upon more than others, but make no mistake: any attempt to gain backlinks is a link building scheme.  Press releases, guest blogging, commenting in forums, making profiles on other sites, link swapping, selling or buying links, and finally PBNs.  PBNs are private blog networks where you make zombie sites that link to your important site.  But considering that the example above had 37,000 domains linking to it, how effective is a network of, say, 200 sites?  Surprisingly effective, and that’s why Google hunts PBNs like Buffy the vampire slayer.

That’s our Backlinks 101.  I’ll talk more about some of these concepts in future posts.

Ultimate SEO

Updates That Matter AND Updates That Don’t: SEO Basics


Adwords Template With Search Console, Google Analytics In Data Studio


SEO & PPC Data Studio Report Using Adwords, Google Analytics and Google Search Console All-In-One Template

Google Data Studio reports are fun.  Here at Ultimate SEO we love visualizations, and that’s partially why we like Data Studio.  Beyond the looks, it also integrates easily with Google Sheets, Google Analytics and Search Console, to name a few.  Those few alone create a powerful free SEO and PPC tool.

You can check out the report directly by clicking the link above; here is an embedded look at the nine pages of live data that’s basically always current.  It’s nice to be able to pull in data from two very different Google tools.  Lots of people know of Google Analytics and think it covers Google Search Console, but it doesn’t (I’ll discuss that more in another post); the unique data from these sources can all mix to form one handy live report.

You can check out all the information pulled into this report and change the dates as needed using the drop-down.  To personalize the report for your own site, simply copy it and set the data sources to your own Google Analytics and Search Console sources.  A word of caution on the Search Console aspect: there are two connections, one for the site and the other, I believe, for the page URLs, so make sure to connect those correctly.  Just like in electrical work, it’s like to like.

Across these nine pages you’ll find insights into any site with an Adwords campaign including keywords, search terms, CTR and CPC.

Ultimate SEO

SEO Site Auditing With SEOprofiler Reports Part 1

Ultimate SEO


A Comprehensive Analysis of the New Domain Authority


Moz’s Domain Authority is requested over 1,000,000,000 times per year, it’s referenced millions of times on the web, and it has become a veritable household name among search engine optimizers for a variety of use cases, from determining the success of a link building campaign to qualifying domains for purchase. With the launch of Moz’s entirely new, improved, and much larger link index, we recognized the opportunity to revisit Domain Authority with the same rigor as we did keyword volume years ago (which ushered in the era of clickstream-modeled keyword data).

What follows is a rigorous treatment of the new Domain Authority metric. What I will not do in this piece is rehash the debate over whether Domain Authority matters or what its proper use cases are. I have and will address those at length in a later post. Rather, I intend to spend the following paragraphs addressing the new Domain Authority metric from multiple directions.

Correlations between DA and SERP rankings

The most important component of Domain Authority is how well it correlates with search results. But first, let’s get the correlation-versus-causation objection out of the way: Domain Authority does not cause search rankings. It is not a ranking factor. Domain Authority predicts the likelihood that one domain will outrank another. That being said, its usefulness as a metric is tied in large part to this value. The stronger the correlation, the more valuable Domain Authority is for predicting rankings.

Methodology

Determining the “correlation” between a metric and SERP rankings has been accomplished in many different ways over the years. Should we compare against the “true first page,” top 10, top 20, top 50 or top 100? How many SERPs do we need to collect in order for our results to be statistically significant? It’s important that I outline the methodology for reproducibility and for any comments or concerns on the techniques used. For the purposes of this study, I chose to use the “true first page.” This means that the SERPs were collected using only the keyword with no additional parameters. I chose to use this particular data set for a number of reasons:

  • The true first page is what most users experience, thus the predictive power of Domain Authority will be focused on what users see.
  • By not using any special parameters, we’re likely to get Google’s typical results.
  • By not extending beyond the true first page, we’re likely to avoid manually penalized sites (which can impact the correlations with links.)
  • We did NOT use the same training set or training set size as we did for this correlation study. That is to say, we trained on the top 10 but are reporting correlations on the true first page. This prevents us from the potential of having a result overly biased towards our model.

I randomly selected 16,000 keywords from the United States keyword corpus for Keyword Explorer. I then collected the true first page for all of these keywords (completely different from those used in the training set.) I extracted the URLs but I also chose to remove duplicate domains (ie: if the same domain occurred, one after another.) For a length of time, Google used to cluster domains together in the SERPs under certain circumstances. It was easy to spot these clusters, as the second and later listings were indented. No such indentations are present any longer, but we can’t be certain that Google never groups domains. If they do group domains, it would throw off the correlation because it’s the grouping and not the traditional link-based algorithm doing the work.

I collected the Domain Authority (Moz), Citation Flow and Trust Flow (Majestic), and Domain Rank (Ahrefs) for each domain and calculated the mean Spearman correlation coefficient for each SERP. I then averaged the coefficients for each metric.
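For readers who want to reproduce something similar, here is a stripped-down sketch of that calculation, assuming you already have each SERP as a list of metric values ordered by rank; it is not Moz’s actual pipeline, just the Spearman-per-SERP-then-average idea.

```python
# Stripped-down sketch of the correlation methodology: for each SERP, compute the
# Spearman correlation between rank position and a strength-of-domain metric,
# then average the coefficients. This is an illustration, not Moz's pipeline.
import numpy as np
from scipy.stats import spearmanr

def mean_serp_correlation(serps):
    """serps: list of SERPs, each a list of metric values ordered by rank
    (position 1 first), with duplicate domains already removed."""
    coefficients = []
    for metric_values in serps:
        if len(metric_values) < 3:
            continue  # too few results to correlate meaningfully
        positions = list(range(1, len(metric_values) + 1))
        rho, _ = spearmanr(positions, metric_values)
        if not np.isnan(rho):
            coefficients.append(rho)
    # Higher metrics at better (lower-numbered) positions yield negative rho,
    # which is why the sign is flipped in the chart for readability.
    return float(np.mean(coefficients))
```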

Outcome

Moz’s new Domain Authority has the strongest correlations with SERPs of the competing strength-of-domain link-based metrics in the industry. The sign (-/+) has been inverted in the graph for readability, although the actual coefficients are negative (and should be).

[Chart: mean Spearman correlations with SERP rankings, by metric]

Moz’s Domain Authority scored ~0.12, or roughly 6% stronger than the next best competitor (Domain Rank by Ahrefs). Domain Authority performed 35% better than Citation Flow and 18% better than Trust Flow. This isn’t surprising, in that Domain Authority is trained to predict rankings while our competitors’ strength-of-domain metrics are not. It shouldn’t be taken as a negative that our competitors’ strength-of-domain metrics don’t correlate as strongly as Moz’s Domain Authority; rather, it’s simply exemplary of the intrinsic differences between the metrics. That being said, if you want a metric that best predicts rankings at the domain level, Domain Authority is that metric.

Note: At first blush, Domain Authority’s improvements over the competition are, frankly, underwhelming. The truth is that we could quite easily increase the correlation further, but doing so would risk over-fitting and compromising a secondary goal of Domain Authority…

Handling link manipulation

Historically, Domain Authority has focused on only one single feature: maximizing the predictive capacity of the metric. All we wanted were the highest correlations. However, Domain Authority has become, for better or worse, synonymous with “domain value” in many sectors, such as among link buyers and domainers. Subsequently, as bizarre as it may sound, Domain Authority has itself been targeted for spam in order to bolster the score and sell at a higher price. While these crude link manipulation techniques didn’t work so well in Google, they were sufficient to increase Domain Authority. We decided to rein that in.

Data sets

The first thing we did was compile a series of data sets that corresponded with industries we wished to impact, knowing that Domain Authority was regularly manipulated in these circles.

  • Random domains
  • Moz customers
  • Blog comment spam
  • Low-quality auction domains
  • Mid-quality auction domains
  • High-quality auction domains
  • Known link sellers
  • Known link buyers
  • Domainer network
  • Link network

While it would be my preference to release all the data sets, I’ve chosen not to in order to not “out” any website in particular. Instead, I opted to provide these data sets to a number of search engine marketers for validation. The only data set not offered for outside validation was Moz customers, for obvious reasons.

Methodology

For each of the above data sets, I collected both the old and new Domain Authority scores. This was conducted all on February 28th in order to have parity for all tests. I then calculated the relative difference between the old DA and new DA within each group. Finally, I compared the various data set results against one another to confirm that the model addresses the various methods of inflating Domain Authority.
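The relative-difference calculation itself is straightforward arithmetic; a small sketch with placeholder numbers (not Moz’s data) is below.

```python
# Sketch of the comparison: average relative change in DA within each data set.
# The score lists are placeholder values, not Moz's actual data.
import numpy as np

def mean_relative_change(old_scores, new_scores):
    """Old/new DA values for the same domains, in the same order; returns the
    average percent change for the data set."""
    old = np.asarray(old_scores, dtype=float)
    new = np.asarray(new_scores, dtype=float)
    return float(np.mean((new - old) / old) * 100)

datasets = {
    "random domains": ([40.0, 55.0, 62.0], [37.5, 51.9, 58.0]),  # placeholders
    "comment spam":   ([48.0, 52.0, 61.0], [31.0, 35.0, 40.5]),
}
for name, (old, new) in datasets.items():
    print(f"{name}: {mean_relative_change(old, new):+.1f}%")
```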

Results

[Chart: old vs. new average Domain Authority for each data set]

In the above graph, blue represents the Old Average Domain Authority for that data set and orange represents the New Average Domain Authority for that same data set. One immediately noticeable feature is that every category drops. Even random domains drops. This is a re-centering of the Domain Authority score and should cause no alarm to webmasters. There is, on average, a 6% reduction in Domain Authority for randomly selected domains from the web. Thus, if your Domain Authority drops a few points, you are well within the range of normal. Now, let’s look at the various data sets individually.

[Chart: Domain Authority change by data set]

Random domains: -6.1%

Using the same methodology for finding random domains that we use for collecting comparative link statistics, I selected 1,000 domains and determined that there is, on average, a 6.1% drop in Domain Authority. It’s important that webmasters recognize this, as the shift is likely to affect most sites and is nothing to worry about.

Moz customers: -7.4%

Of immediate interest to Moz is how our own customers perform in relation to the random set of domains. On average, the Domain Authority of Moz customers lowered by 7.4%. This is very close to the random set of URLs and indicates that most Moz customers are likely not using techniques to manipulate DA to any large degree. 

Link buyers: -15.9%

Surprisingly, link buyers only lost 15.9% of their Domain Authority. In retrospect, this seems reasonable. First, we looked specifically at link buyers from blog networks, which aren’t as spammy as many other techniques. Second, most of the sites paying for links are also optimizing their site’s content, which means the sites do rank, sometimes quite well, in Google. Because Domain Authority trains against actual rankings, it’s reasonable to expect that the link buyers data set would not be impacted as highly as other techniques because the neural network learns that some link buying patterns actually work.

Comment spammers: -34%

Here’s where the fun starts. The neural network behind Domain Authority was able to drop comment spammers’ average DA by 34%. I was particularly pleased with this one because of all the types of link manipulation addressed by Domain Authority, comment spam is, in my honest opinion, no better than vandalism. Hopefully this will have a positive impact on decreasing comment spam — every little bit counts.

Link sellers: -56%

I was actually quite surprised, at first, that link sellers on average dropped 56% in Domain Authority. I knew that link sellers often participated in link schemes (normally interlinking their own blog networks to build up DA) so that they can charge higher prices. However, it didn’t occur to me that link sellers would be easier to pick out because they explicitly do not optimize their own sites beyond links. Subsequently, link sellers tend to have inflated, bogus link profiles and flimsy content, which means they tend to not rank in Google. If they don’t rank, then the neural network behind Domain Authority is likely to pick up on the trend. It will be interesting to see how the market responds to such a dramatic change in Domain Authority.

High-quality auction domains: -61%

One of the features that I’m most proud of in regards to Domain Authority is that it effectively addressed link manipulation in order of our intuition regarding quality. I created three different data sets out of one larger data set (auction domains), where I used certain qualifiers like price, TLD, and archive.org status to label each domain as high-quality, mid-quality, and low-quality. In theory, if the neural network does its job correctly, we should see the high-quality domains impacted the least and the low-quality domains impacted the most. This is the exact pattern which was rendered by the new model. High-quality auction domains dropped an average of 61% in Domain Authority. That seems really high for “high-quality” auction domains, but even a cursory glance at the backlink profiles of domains that are up for sale in the $10K+ range shows clear link manipulation. The domainer industry, especially the domainer-for-SEO industry, is rife with spam.

Link network: -79%

There is one network on the web that troubles me more than any other. I won’t name it, but it’s particularly pernicious because the sites in this network all link to the top 1,000,000 sites on the web. If your site is in the top 1,000,000 on the web, you’ll likely see hundreds of root linking domains from this network no matter which link index you look at (Moz, Majestic, or Ahrefs). You can imagine my delight to see that it drops roughly 79% in Domain Authority, and rightfully so, as the vast majority of these sites have been banned by Google.

Mid-quality auction domains: -95%

Continuing with the pattern regarding the quality of auction domains, you can see that “mid-quality” auction domains dropped nearly 95% in Domain Authority. This is huge. Bear in mind that these drastic drops are not combined with losses in correlation with SERPs; rather, the neural network is learning to distinguish between backlink profiles far more effectively, separating the wheat from the chaff.

Domainer networks: -97%

If you spend any time looking at dropped domains, you have probably come upon a domainer network where there are a series of sites enumerated and all linking to one another. For example, the first site might be sbt001.com, then sbt002.com, and so on and so forth for thousands of domains. While it’s obvious for humans to look at this and see a pattern, Domain Authority needed to learn that these techniques do not correlate with rankings. The new Domain Authority does just that, dropping the domainer networks we analyzed on average by 97%.

Low-quality auction domains: -98%

Finally, the worst offenders — low-quality auction domains — dropped 98% on average. Domain Authority just can’t be fooled in the way it has in the past. You have to acquire good links in the right proportions (in accordance with a natural model and sites that already rank) if you wish to have a strong Domain Authority score.

What does this mean?

For most webmasters, this means very little. Your Domain Authority might drop a little bit, but so will your competitors’. For search engine optimizers, especially consultants and agencies, it means quite a bit. The inventories of known link sellers will probably diminish dramatically overnight. High DA links will become far more rare. The same is true of those trying to construct private blog networks (PBNs). Of course, Domain Authority doesn’t cause rankings so it won’t impact your current rank, but it should give consultants and agencies a much smarter metric for assessing quality.

What are the best use cases for DA?

  • Compare changes in your Domain Authority with your competitors. If you drop significantly more, or increase significantly more, it could indicate that there are important differences in your link profile.
  • Compare changes in your Domain Authority over time. The new Domain Authority will update historically as well, so you can track your DA. If your DA is decreasing over time, especially relative to your competitors, you probably need to get started on outreach.
  • Assess link quality when looking to acquire dropped or auction domains. Those looking to acquire dropped or auction domains now have a much more powerful tool in their hands for assessing quality. Of course, DA should not be the primary metric for assessing the quality of a link or a domain, but it certainly should be in every webmaster’s toolkit.

What should we expect going forward?

We aren’t going to rest. An important philosophical shift has taken place at Moz with regards to Domain Authority. In the past, we believed it was best to keep Domain Authority static, rarely updating the model, in order to give users an apples-to-apples comparison. Over time, though, this meant that Domain Authority would become less relevant. Given the rapidity with which Google updates its results and algorithms, the new Domain Authority will be far more agile as we give it new features, retrain it more frequently, and respond to algorithmic changes from Google. We hope you like it.


Be sure to join us on Thursday, March 14th at 10am PT for our upcoming webinar discussing strategies & use cases for the new Domain Authority.


Source

Ultimate SEO