Google Disavow Tool: when and how to use it

Written by Fernando Maciá

Google Disavow Tool: a Google Penguin consequence

Cutting off bad links with the Disavow Tool

Google announced the roll-out of its algorithm update known as Penguin in late April 2012. According to the search engine, this update focussed on detecting websites that violated Google's Webmaster quality guidelines. Google Penguin had two main targets: over-optimised websites (those abusing keyword density in their content) and websites with artificial backlinks (link farms, low-quality directories, etc.).

Whilst website over-optimisation is an on-page relevance issue and thus entirely under our control (we can easily increase or reduce keyword density by editing our content), links are another matter: the possibility of a penalty for artificial backlinks opened the door to "negative SEO" attacks. Why? Because any of our competitors could try to damage our website's rankings by generating links that Google would detect as artificial. For example: link anchor texts optimised with keywords (also known as exact match or partial match anchor texts) pointing to our home page, sitewide links (blogrolls, footers, etc.), links from websites not thematically related to our content, or links from low-quality websites (such as certain directories).

The fact that competitors could potentially use Penguin to damage another website's positioning prompted Google to announce, only a few months later –in October 2012–, a tool which would allow us to deauthorise selected inbound links to our website: the Google Disavow Tool. At least in theory, by using the Disavow Tool we as webmasters could deauthorise or "repudiate" those links which Google might detect as artificial or manipulative, and so avoid a penalty.

After implementing Penguin, Google began sending messages about the detection of artificial links through Google Webmaster Tools, and in August 2013 it added a dedicated section called Manual Actions, under Search Traffic, which warns about issues identified at a site-wide level –affecting the whole website– as well as partial matches –affecting specific content sections only–.

Nevertheless, Matt Cutts had already explained in this video that using the Disavow Tool wouldn't pose a problem, even if we hadn't received any manipulative-link alerts from Google. We can use it freely whenever we detect unnatural links, or whenever we suspect we are the target of someone else's negative SEO tactics.

The Google-recommended route for webmasters to disavow links includes the following steps:

  1. Identify all suspicious links: this means not only considering the links displayed by Google in its warning message or in the Manual Actions section of Google Webmaster Tools, but also identifying similar ones that could be flagged –by their placement on a page, domain of origin, anchor text, etc.– as artificial or manipulative. The data to start identifying these links can be found under Search Traffic -> Links to Your Site in Google Webmaster Tools, or in the "New" section of MajesticSEO.
  2. Try to remove all suspicious links: this means removing links we generated ourselves, as well as links created by others that match Google's criteria for classifying a link as manipulative. To do this, we should try to identify the webmasters responsible for these websites and ask them to remove the links. And only after this…
  3. Generate a disavow file: which we will then submit to Google through its Disavow Tool. It must be a plain .txt text file encoded in UTF-8 or 7-bit ASCII. We can create it with TextEdit (Mac) or Notepad (Windows). Later in this post we will see exactly what this file's syntax looks like.
  4. Upload the file to Disavow Tool: easy-peasy, we just go to Google Disavow Tool, choose our domain, choose our disavow.txt file and upload it.
  5. Some things to consider: first, Google warns that it will take some time to process the file of deauthorised links; second, we can upload updated versions of the file, and each new version replaces the previous one. Last, but not least, disavowed links will continue to appear in "Links to Your Site" in GWT, but this doesn't mean Google is ignoring our disavow.txt file.
  6. Request for review: once we've completed all the aforementioned steps, we can submit a request for review by clicking the appropriate button in Search Traffic -> Manual Actions in Google Webmaster Tools, where we must include all the relevant information on how we identified the problematic links and how we removed them.

Whether or not this tool should be used at all has been debated extensively. Sujan Patel, for example, argues against using it in one of his posts, and there are many threads on Google's forums discussing its effectiveness. Even though it's true that, used incorrectly, this tool can damage your rankings, there are particular circumstances –such as the one we're about to describe– when we think it's actually the best route to follow.

In any case, it's worth taking into account the recommendations and warnings collected by Razvan Gavrilas in his excellent post, particularly regarding the risks of deauthorising links globally at domain level, and the importance of gathering as much relevant information as possible to write the reconsideration request. Also keep in mind this post by Marie Haynes with advice on aspects of the Disavow Tool that are usually overlooked, in which she states:

  • Disavowed links can be reactivated.
  • The disavow.txt file has a 2MB size limit.
  • Deauthorising a link pointing to a redirected URL doesn't usually work (in this case, Marie Haynes recommends including both the URL where the link appears and the URL from which the redirection is made).

Google Disavow Tool in practice

Evidently, describing something is not the same as experiencing it, so we're going to share with you how we detected an abnormal link growth pattern towards our domain in July, and how we used the Google Disavow Tool to try to stop these suspiciously artificial links from damaging our domain's rankings.

First Manual Action warning

In April 2013 we received a warning in Google Webmaster Tools under Manual Actions -> Partial matches, with Google alerting us about the detection of "artificial, deceptive or manipulative links pointing to pages on this site". It was referring to links that were indeed included in some of our posts and had been replicated on other websites. In this case, our posts had been published on press-release reposting websites, and some went as far back as 2004-05!

Even though in some cases it wasn't easy, we managed to regain access to these websites to remove said links (incidentally, many of these links, at the time of the warning, pointed to URLs with 301 redirects, as they appeared in posts published on older versions of our website).

Google Webmaster Tools: partial matches in Manual Actions

Link removal and/or modification

Given that this warning concerned partial matches, and we hadn't detected any decline in traffic or rankings, we decided to limit our response to removing the links listed in the alert. As we removed these first links, others began to appear. In general, they fell into the following categories:

  1. Links in old reposted articles or content plagiarised years ago: this first case refers to posts we actively redistributed in the past (from 2003 until 2008 or perhaps 2009) through reposting websites. These were articles, news items and press releases, with their text barely modified, if at all. Many of these websites published feeds that in turn were used by a number of other websites, so our articles ended up appearing on many other domains. This content contained absolute links, so unless they were manually edited by the webmaster, each new repost gave our domain new links with highly relevant anchor texts, which back then positively influenced our rankings. Alongside this reposted content, we also detected a great number of websites that simply copied and pasted our posts, in most cases original links included. As these backlinks appeared in Manual Actions, we modified or removed them, or requested their removal where necessary, and so far we've been successful in this endeavour.
  2. Sitewide links: these were links published in blogroll sections or footers of websites. For the most part, they were detected on blogs belonging to friends, colleagues and students, among others, who had included a link to our domain. In general, Google flagged them as artificial when, besides being sitewide, they also had an exact match anchor text. For example, some detected links came from blogrolls with anchor texts such as "Search Engine Optimization", whereas links with the anchor text "Human Level Communications" were not identified as artificial. So, as these links appeared in Partial matches, we contacted the person responsible for each website on the list, asking them to remove or modify the links. Links to Human Level Communications located in page footers, on the other hand, usually came from some of the first websites we developed; back then it was common practice to credit the developer with "Website designed by Human Level…" or something along those lines, including a link to the domain.
  3. Links in directories: the last type of "odd" link corresponds to links included in directories (which we did not create), especially those using exact match anchor texts. In some cases we were able to identify the person managing the directory and request the removal of our links, but in others it wasn't possible.

After receiving the first warning in Manual Actions, we began to monitor inbound links to our domain much more closely, mostly using these two tools: Google Webmaster Tools, through Search Traffic -> Links to Your Site, and MajesticSEO.
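Part of this monitoring can be automated. As a rough sketch (not the actual tooling we used, and the `SourceURL` column name is an assumption based on a typical Majestic CSV export), a short script could diff two backlink exports and list the linking domains that appear only in the newer one:

```python
import csv
from urllib.parse import urlparse

def linking_domains(csv_path, url_column="SourceURL"):
    """Collect the set of domains linking to us in a backlink CSV export.

    Assumes one backlink per row, with the linking page's URL in
    `url_column` (adjust to match your export's real header)."""
    domains = set()
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            host = urlparse(row[url_column]).netloc.lower()
            if host:
                domains.add(host)
    return domains

def new_linking_domains(old_csv, new_csv):
    """Domains present in the new export but absent from the old one --
    the candidates worth reviewing for suspicious anchor texts."""
    return sorted(linking_domains(new_csv) - linking_domains(old_csv))
```

Running this against last month's and this month's exports surfaces only the newly discovered linking domains, which is exactly the subset worth checking by hand.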

Artificial links warning

In July 2014, while reviewing inbound links in MajesticSEO, we ran into the following chart: an abnormal increase, evident in both the Fresh and Historic indexes:

Majestic chart

We went from a more or less steady rate of new link discovery to a sudden increase in newly discovered links every day, starting in mid-June 2014. Had our domain's popularity really skyrocketed all of a sudden? Were we being subjected to some negative SEO technique? And, since this is what really matters here, could those new links benefit us in some way, or could they damage our rankings?

Identifying source domains of these links

Reviewing the list of domains from which we got new links, we discovered many legitimate ones, to whom we would publicly like to express our gratitude (thanks, María José Cachón), but there were also several directories that caught our attention. Amongst them we found:

  • Etc.

After visiting some of these websites, it was clear that:

  • Some of these directories were developed using the same tool and basically the same template. Their contact information confirmed that they did indeed belong to the same person, and luckily there was a telephone number to call. The main problem with these domains was that they used an over-optimised anchor text ("posicionamiento en buscadores", or "search engine positioning" in English). An excessively high number of links with this anchor text pointing to the root domain could get it penalised precisely for this search query:

Backlinks in 'Link Links'

Two more domains were also related. These domains didn't feature their owner's name, contact e-mail or telephone number, so we did a quick Whois lookup, which revealed the name of the owner of both domains, and even their mobile number. The main problem here was that the anchor text kept changing oddly, and the placement of the link was equally strange: the "scrape & spin" type that black hat practitioners are so fond of:

Backlinks to disavow

As for BigBozz, it was the Spanish version of a network of directories present in several countries. There wasn't any contact information, and a Whois search only revealed the owner's name, but not an e-mail address or telephone number.

Together with these directories, we also identified a domain which had plagiarised an old article of ours, absolute links included, and which appeared in the "suspicious links" sample of the Manual Actions report in Google Webmaster Tools.

Request for link removal

Following Google's recommendations, on 1 August we carried out the following actions:

  • We telephoned the managers of the first three directories, who were most kind and ready to carry out our removal request immediately.
  • In the case of InformaticaAutonomos and DirectorioInformatica, we also contacted their owner, who seemed rather surprised that we had called. He was willing to remove our links "when he had time", but a month later they remain in these directories.
  • We sent a message to DragonMoonDesign's contact e-mail address, also requesting the removal of links to our domain. To this day, we still haven't received an answer.
  • We weren't able to find contact information for the remaining directories, so we couldn't even submit a removal request.

Disavow file generation

On 19 August, after detecting the BigBozz directory in the Links to Your Site section of Google Webmaster Tools, we decided to submit an updated disavow.txt file including links from all these directories.

We generated this file using TextEdit on Mac and UTF-8 encoding, with the following content:

#Unsolicited links published in directories. Owner (xxxxxxxxx) contacted on 1 August 2014 via telephone. I requested the removal of all links towards our domain. Link removal confirmed on 8 August 2014
#Unsolicited links published in directories. Owner (xxxxxxxxx) contacted on 1 August 2014 via telephone. I requested the removal of all links towards our domain. The owner said they would remove them when they had time. On 19 August links were still active
#It was impossible to reach the owner (xxxxxxxxxxx). There isn't any contact information or telephone number in the domain registry or on their website
#We contacted their support service, requesting the removal of all links to our site from this URL, but didn't get a reply. On 19 August links were still active


As you can see in this example, the syntax of a disavow file is very simple. You only need to add a hash (#) before comment lines, and include each URL containing a link to your website that you want to deauthorise. If you want to deauthorise all inbound links from a particular domain in bulk, start the line with "domain" followed by a colon (:) and the domain you want to block. Make sure everything is entered correctly and save the file as UTF-8. Be careful when deauthorising entire domains, though, and make sure you're not blocking legitimate links that could be valuable to your domain.
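To make the two forms concrete, a minimal disavow file combining per-URL and per-domain lines would look like this (the domains below are made up for illustration):

```text
# Unsolicited directory link. Owner contacted 1 August 2014; still active 19 August
# Disavow a single linking page:
http://spammy-directory.example/seo-links.html
# Disavow every inbound link from an entire domain:
domain:link-farm.example
```

Each non-comment line is either a full URL of a page linking to you, or a `domain:` entry that disavows every link from that domain at once.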

It's worth noting that, according to John Mueller, comments are not read by Google reviewers; the file is processed automatically. So if you include comments, make sure they only record your own actions for monitoring purposes: in short, what you did and when you did it.

Despite our success in removing the links from the first three directories, we decided to include them in our disavow file anyway, as a precaution.

In the Disavow Tool, all we need to do is select the domain to which we want to apply the newly created disavow.txt file, locate the file on our hard drive and confirm the upload. The tool will upload the file and tell us whether it's correct. We will also receive a message asking us to confirm that we really want to apply this disavow file to our domain. This message is also sent to the Google Webmaster Tools account administrators, and can be seen in the Site Messages section of the same tool.

Requesting a review in Google Webmaster Tools

Once we had sent our disavow file to Google, we decided to wait and see whether it had any effect. This post is being published on 25 August, and we've already seen that the DragonMoonDesign URL no longer appears as a suspicious link in Manual Actions in Google Webmaster Tools, although it now indicates –in a somewhat generic way– that "some inbound links" (without pointing out any in particular) still seem suspicious.

From now on, we are going to give it some time before checking Google Webmaster Tools' Links to Your Site again, and we will then review all the inbound links listed there once more, to pinpoint which ones Google is flagging as manipulative.

We will also continue to periodically review the data provided by MajesticSEO, so as to discover new inbound links to our domain as soon as possible, analysing their Trust Flow, the anchor texts used, and the thematic authority of each source domain.

We will progressively add domains and/or URLs to our disavow file, with comments noting when and why we added each URL or domain to the list. We will use this information to complete the request for review only when we are completely certain that we've managed to remove all suspicious links.

To what extent can these inbound links be damaging?

Judging by the consequences Penguin has had on other domains, the detection of artificial links with over-optimised anchor texts usually harms a website's rankings for the exact terms used as anchor texts. We must therefore keep very strict and frequent control over the anchor texts of our inbound links, particularly those which could be harmful: that is, those too focussed on a specific keyword, or those far removed from our positioning goals, as we've seen in the two examples presented in this post.
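That control can be reduced to a simple measurement: what share of all inbound links uses each anchor text. As a hedged sketch (the 15% threshold is an arbitrary illustration, not a Google figure), something like this flags anchors whose share of the link profile exceeds a chosen limit:

```python
from collections import Counter

def overused_anchors(anchor_texts, threshold=0.15):
    """Return (anchor, share) pairs whose share of all inbound links
    exceeds `threshold` -- candidates for a closer manual review.

    `anchor_texts` is one entry per inbound link, e.g. taken from a
    backlink export's anchor-text column."""
    counts = Counter(a.strip().lower() for a in anchor_texts)
    total = sum(counts.values())
    return [(anchor, count / total)
            for anchor, count in counts.most_common()
            if count / total > threshold]
```

An anchor like "posicionamiento en buscadores" accounting for a large share of the profile would show up immediately, whereas brand anchors spread naturally across many links would not.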

Some of these links belong to the first type, whilst those from DirectorioInformatica belong to the second. That directory appears to be run by some kind of bot, which keeps changing the target URLs and the anchor texts pointing to them, which is why we consider them potentially harmful.

Given that links can be used to negatively affect another website's search rankings, it's certainly worth keeping a close eye on new links pointing to our website, to make sure they comply with Google's quality guidelines. In this post we've told you how we do it at Human Level Communications, but we would love you to share your own tactics and results too.

Author: Fernando Maciá
CEO of Human Level. He's an expert in SEO, online marketing plans and web internationalisation. He's also a digital marketing teacher and an accomplished writer, having published several books on web positioning, social media marketing, and strategies to earn customers on the Internet.
