Written by Craig Charley – Fri 25 Apr 2014
We always look forward to the panel debates at BrightonSEO as the discussion is more open. It's even better when they involve ex-members of the Google Search Quality team (and a "former" spammer).
A lot of the discussion centred around the disavow tool and disavow files.
The first question was - should you bother cleaning your links or should you just submit a disavow file?
A number of agencies & consultants claim that you don't actually need to do any link cleanup. Just identify all your bad links, disavow the domains, and submit a reconsideration request.
The consensus was 'don't believe everything you read on the web'.
Gareth was quick to point out that this is one area that Google is actually quite clear about. They explicitly state:
'You should still make every effort to clean up unnatural links pointing to your site. Simply disavowing them isn't enough.'
You have to show Google that you've had 'pain' during your cleanup attempts and that the disavowed links are the ones you genuinely cannot remove.
Kaspar and Fili both agreed. You can attempt it but a recovery is unlikely without putting the work in.
Fili used the analogy of a child spilling a drink and their mother handing them a towel to clean it up. The parent isn't cleaning the mess, they are just helping.
Google is the parent, the disavow tool is the towel; the webmaster still has to clean up their mess.
So why do so many agencies claim that only disavowing works?
It's quick, it's cheap and it's easy.
It can feel like a waste of time emailing a dead inbox for the 10th time, knowing that it won't be read or responded to, so why not skip that step and just disavow?
It probably won't work.
There are some cases where recovery is possible without removing a lot of links but Kaspar described these as unique, giving one example where the numbers just make link removal impossible (we're talking in the millions here).
Later on, Kaspar mentioned that the disavow file is simply a suggestion; Google is under no obligation to act on it. Therefore, it should never be your only bet.
Read our disavow tool guide for more information on when to use the tool and how it works.
An audience member asked if the disavow file is a giant crowdsourcing project to help Google identify bad links/domains.
The answer was a firm yes from Fili Wiese. Google are using the disavow tool to crowdsource bad links.
This shouldn't be a surprise; Google will use all the data they can get their hands on (last year we found out Google track every click in Chrome).
What they do with the data is more to the point.
The audience member was concerned that there could be collateral damage from webmasters adding innocent domains to their disavow file.
Although Google is known for large amounts of collateral damage, in this case there is a sensible system in place.
If a site really is innocent, it's only going to receive a couple of disavows - not enough to trigger a problem.
However, there is a critical mass: when a site comes up again and again in disavow files, that signals there's an issue.
Having seen his fair share of disavow files, Gareth said it's amazing what people think is a bad link - someone at some point has actually disavowed YouTube, despite the fact that all YouTube links are nofollow anyway...
This led to some worries in the audience that it's just too hard to know what a bad link is now. Could the panel help clear it up?
This is where Gareth came in with "made for SEO, has to go."
Fili's answer - anything that violates the webmaster guidelines.
There were a lot of laughs at this but it's the most sensible answer! Although Google does change its guidelines, this is usually to go after the latest link building trend.
If you're exploiting a tactic, it's going to end up on the blacklist at some point.
The other consideration is scale. Kaspar says that you won't be penalised for one or a handful of links (although we know that this is not always true!)
If you're unsure whether or not to start removing links, download our free Link Removal Guide.
Definitely don't worry about natural links to your site; if they're really natural then they won't set off any alarms at Google.
Fili brought up quite a common suggestion at the conference:
"Link building is fine, as long as you build it for the traffic."
Gareth raised another point about data - Google knows if a link is getting clicked. Why would they penalise a link that was attracting clicks? It's clearly not just for SEO.
Someone dared to ask - how many links would I need to buy to knock a competitor off the top spot?
Gareth pointed out that you can buy thousands of links on Fiverr, but asked why you would go down that route.
Why not spend time doing good things instead of bad things?
This led the panel into a discussion around negative SEO.
After Penguin, negative SEO was a big fear for webmasters. If Google is algorithmically penalising sites with bad backlink profiles, then surely every unscrupulous SEO is going to start building dodgy links to their competitors?
Interestingly, according to all three panel members, they've seen very few genuine successful negative SEO attacks.
Often it's actually a spam attack or miscommunication at the business.
Many negative SEO attacks could actually be a rogue employee or dodgy SEO agency building links without others knowing.
Kaspar says that he has never seen a case where a completely 100% clean site has been hit by negative SEO; it's usually the ones who have done plenty of spamming themselves.
If you're worried about negative SEO, it's all about the numbers. Five links is not a negative SEO attack - real attacks are characterised by thousands of domains suddenly linking to you.
While the panellists don't think you should be too worried about negative SEO, there are some ways that you can check for an attack. Run through our simple process for identifying spikes in links, and then make sure you check internally to make sure it wasn't a team member!
Someone claimed that algorithmic penalties (i.e. Penguin) are inherently unfair. There is no point of contact between Google and the penalised webmaster to explain why the site has been dropped.
Kaspar defends Google here, saying that Penguin has nothing to do with manual actions. People feel victimised by algorithm penalties but Google is not targeting their site.
A drop in search visibility could be for any number of reasons. Fili points out that algorithm updates are happening all the time, alongside general industry flux.
If your site has a problem then fix it, don't dwell on it.
There's a feeling that webmasters spend too much time listening to rumours and trying to solve their problems on blogs. Fili claims that there's usually a good reason you rank better or worse so find out why on a case by case basis.
Earlier in the debate, Gareth stated that his goal is to be mentioned in a Google team meeting.
Fili shattered this dream, claiming that Google is not interested in SEO at all, they only focus on the user.
Users rule Google, not SEOs.
There are 1 billion Google users - that's who they have to keep happy, not SEOs. So the changes they make are going to cause collateral damage, and they are going to continue to combat spam to improve user experience.
Here everyone agreed that SEOs have destroyed a lot of things that used to be fun for normal users - guest posts & infographics, for example.
Fili says that we should stop trying to think of new ways to spam Google, and instead switch our focus to user experience marketing.
Google's goal is to provide an excellent experience for the users, and they want their search results to reflect this, so you have to provide an excellent experience too.
Answering a question about the after effects of a penalty, Gareth compared Google to a cheated spouse - they might take you back but they won't trust you again.
Fili and Kaspar both disputed this, describing Google instead as 'a very forgiving wife', as long as you show that you have truly repented.
Gareth later asked what would happen if he was to remove 25% of links, submit a disavow file with the remaining 75%, wait until the penalty was lifted and then remove the disavow file.
Fili pointed out that Google does keep records on you, you're going to get flagged again and it's not going to end well. In the end, "it's all about the signals".
They all agreed that if you're caught again then it can be very, very hard to earn that trust back. Drawing on a lot of experience with penalty recovery, Gareth said that it's far more difficult to recover from a second penalty.
The final question came from an SEO asking if it's possible to be penalised by association and whether or not that is fair.
The panel asked for an example and the audience member mentioned MyBlogGuest & PostJoint.
Gareth pointed out that if he's using those sites, then he is breaking the rules. They might be about guest blogging, but they're made for SEO, and that's going to be noticed by Google.
Thank you to all the panellists, we really enjoyed the debate (even though we missed lunch) and hope that there's a good one in September.