The disavow tool was hugely anticipated when it rolled out in October, but since then the general feeling surrounding it has been one of confusion and disappointment. The biggest questions about the tool are how you use it, how it actually works and whether it really does what it claims to. On first impressions, using it appears to be very easy - you add a list of URLs/domains you want to disavow and there you go!
Not quite. The issue isn’t how you submit a disavow links list, it’s what you submit in it. The virtue of the tool is that you can use it to account for all the links you weren’t able to remove prior to a reconsideration request… The problem is that, for many, this doesn’t seem to be the case.
In a series of discussions on the new G+ communities (here and here), and also on many prominent SEO blogs, there is a generally shared feeling that disavowing links does not appear to take effect - or at least doesn't immediately aid the reconsideration request. This isn't an issue with the disavow tool itself, but with the whole process of manual penalties and how to get them removed. In fairness to Mr Cutts, during his unveiling of the tool at Pubcon, he did say that only a small number of sites should use this tool - whether this was to stop webmasters disavowing valuable links or for another reason we'll never know, but it was clearly a suggestion that it wouldn't be for everyone. At least that certainly seems true so far.
So what’s the problem? You’ve spent time exporting backlink data from different third-party providers, meticulously checked the quality and anchor text of each and finally produced a master list of the links you think are toxic. Then, cap in hand, you politely contact webmasters and ask for removal; taking note as you go of the sites that have a) blackmailed or b) ignored you - these all go on the disavow list. But to then be told you still have a toxic link profile, after you’ve taken these time-consuming actions, can leave you a little deflated to say the least.
Finding the Toxic Links
The trouble is, the link removal process can be a tricky one. One of the biggest issues is that how you personally value a link is, in many instances, going to be different from how Google itself values that link. This means that the process of scrutinising your links is going to take much longer, because you have to work harder to weed out the bad ones. Now, if you’ve exported from three or four different tools (Ahrefs, Majestic SEO, Open Site Explorer etc) and cut out the duplicates, you are still potentially looking at thousands of links that could be an issue! Dan Petrovic from DejanSEO has recently dug out this video, from a Google hangout with Swiss Googler John Mueller, where he indicates that Google Webmaster Tools’ backlink data will contain all the links you should be worried about.
“generally speaking, you can find all the links you need for webspam or algorithmic reasons in webmaster tools. So it’s not the case that you would need a third party or an external tool to kind of dig up all of those links to find the ones that are problematic, because usually if they are not showing up in webmaster tools then they are generally not something that you need to worry about.”
This makes the process easier (potentially) but raises more questions about GWT’s backlink data than it actually solves.
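If you do go the multi-tool route described above, the merge-and-dedupe step is easy to get wrong by hand. Here's a rough sketch of one way to do it in Python - the filenames and column headings are placeholders, since every tool labels its URL column differently:

```python
import csv

def merge_backlink_exports(files):
    """Merge several backlink CSV exports into one de-duplicated, sorted list.

    `files` is a list of (path, url_column_name) pairs - the column name
    varies per tool, so check each export's header row.
    """
    seen = set()
    for path, url_column in files:
        with open(path, newline="", encoding="utf-8") as f:
            for row in csv.DictReader(f):
                # Normalise case and whitespace so the same URL exported
                # by two tools only appears once.
                url = row.get(url_column, "").strip().lower()
                if url:
                    seen.add(url)
    return sorted(seen)
```

You could then feed the merged list into whatever quality checks you run, rather than eyeballing three overlapping spreadsheets.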
When Will Disavow Take Effect?
One of the most asked questions about the disavow tool is how long it takes for the effects to be felt. It is widely believed that you have to wait until Google reindexes those links (usually a couple of weeks at least).
“John Mueller has made the point often that Google doesn’t do anything with info in the disavow until the page the link is on is indexed….so really deep pages could take a long time to be indexed and acted on.”
This, for me, flags up something glaringly obvious: if Google only acts on disavowed links that it indexes, what about deindexed links? Surely these are going to be the ones you want removed the most when looking at the reconsideration request. Once again John provides some more clarity:
“If you have unnatural links from pages that have been removed for web-spam reasons, I’d still list those in the disavow tool — you never know when the removal situation will change.”
So still add these domains even though they might not be crawled and actioned at the moment - do it just in case. Including them at least shows that you're trying to clean up your act, and as John says, you never know when the situation will change.
The main issue I have here is that, at the moment, links from deindexed sites do not appear to show up within Google Webmaster Tools. I've just cross-referenced the links identified by the Link Detox tool as "TOX1" (deindexed) with GWT's backlink export, and they're absent. So if this export should provide all the links you need to worry about, but doesn't contain those that are deindexed, it seems to suggest that deindexed links aren't required when filing the disavow request - and quite possibly in the reconsideration process as a whole.
I know there are a lot of inferences here and more checks need to be made - I’m waiting on further clarification on this and will update the post when I have more information.
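For anyone wanting to repeat the cross-reference check above, it boils down to a set difference: which deindexed linking domains never appear anywhere in the GWT export? A minimal sketch, assuming you have both as simple lists of URLs (the inputs and domain names below are hypothetical):

```python
from urllib.parse import urlparse

def domains_from_urls(urls):
    """Reduce a list of URLs to a set of bare hostnames (www. stripped)."""
    hosts = set()
    for url in urls:
        host = urlparse(url).netloc.lower()
        if host.startswith("www."):
            host = host[4:]
        if host:
            hosts.add(host)
    return hosts

def missing_from_gwt(deindexed_urls, gwt_urls):
    """Deindexed linking domains that never appear in the GWT backlink export."""
    return domains_from_urls(deindexed_urls) - domains_from_urls(gwt_urls)
```

Anything this returns is a linking domain Google apparently isn't showing you - the gap discussed above.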
Whole Domain Disavow
Another of the lingering issues I’ve had is using the “domain:” command within the request. This is fairly self-explanatory: it disavows the entire domain rather than a single page. This is great for disavowing directories, rather than trying to find every page that contains your link - again, much harder with a deindexed site! But what happens if all you’re disavowing is directories? Isn’t a request full of “domain:” entries going to look a little lazy in the eyes of the Google Webspam Team? Or at least that was my concern. But Mueller has again given some clarity here, during the same G+ discussion:
“domain: entries make things much easier when you’re sure that you want to apply it to all links from that site. There’s no problem with submitting them like that.”
Nothing to worry about here then.
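For reference, the file you upload is just plain text: one entry per line, with lines starting with # treated as comments. The domains below are placeholders, but a mixed file along these lines is perfectly valid:

```text
# Directories that ignored or blackmailed us on removal requests -
# disavow the whole domain
domain:spammydirectory.example
domain:anotherdirectory.example

# Individual pages where a site-wide disavow would be too broad
http://someblog.example/comment-spam-page.html
```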
To Wrap Up
There are still a lot of questions yet to be answered about the disavow tool; however, here are some more concrete points to work from:
- GWT contains all the link data you’ll need - but (possibly) not deindexed sites
- Add deindexed links to your disavow request - just in case
- “Domain:” entries are perfectly acceptable en masse if required
So what questions do you still have about the disavow process? Is there anything that’s been missed here, or that you want further clarification on? Has this post created more questions than it’s answered?