Does Google use a set standard for manually removing webspam?

Today we have a really fun question from Billy in Sacramento.

Billy asks, "When Google does a manual review, do you guys use a set standard when banning or removing sites from the index?

Or do you guys ban based on if it looks bad or even smells like spam?"

So let me try to tackle the question that I think you're worried about and then tell you a little bit more about how we try to tackle webspam.

One thing that we don't do is just say, oh, someone is being critical of Google, therefore, take action, right?

We're big believers in the Voltaire saying, "I might not agree with what you say, but I'll defend to the death your ability to say it." So just because you're critical of Google, that's not the sort of thing where we're going to mark you as spam.

Now we do have very clear, in my opinion, webspam guidelines, the sort of things that cover all the normal, deceptive, manipulative stuff that people try to do to manipulate search engines.

So cloaking, JavaScript redirects that are really sneaky, thin affiliate sites that don't add any value, all kinds of spamming tricks.
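(To make that terminology concrete, here's a minimal, hypothetical sketch of what user-agent cloaking looks like. The Flask server and routes are invented for illustration and aren't anything from the video; the point is just the deceptive pattern the guidelines prohibit, namely serving one page to a crawler and a different one to real visitors.)

```python
# Hypothetical illustration of "cloaking": the server checks the User-Agent
# and returns different content to a search engine crawler than to a person.
from flask import Flask, request

app = Flask(__name__)

@app.route("/")
def page():
    user_agent = request.headers.get("User-Agent", "").lower()
    if "googlebot" in user_agent:
        # Keyword-stuffed page shown only to the crawler.
        return "<html><body>keyword keyword keyword ...</body></html>"
    # Entirely different page shown to regular visitors.
    return "<html><body>Buy our unrelated product!</body></html>"

if __name__ == "__main__":
    app.run()
```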

And I think we do a pretty good job on all those sort of simple gibberish, very clear-cut spam kind of cases.

We do try to work hard to make sure that we're consistent.

So when you start on the manual webspam team, we absolutely do have training, materials that people read up on to learn how to interpret things.

We have experimented with something that we call shadowing, which is one person will watch what another person is doing and give them coaching or advice.

Certainly the first few times you're checking in stuff, that will be subject to a review.

And then you'll get feedback to let you know, yes, this looks good, or it doesn't.

And then, over time, you'll build up more trust and more autonomy.

But at any given point, we do spot checks to measure the quality in terms of how consistent people are.

So there's a lot of stuff that we do to try to make sure that we train people up to be able to handle all these interesting corner cases and weird ways that people try to deceive search engines.

For a while, we would even have these sort of energizer sessions where people could try to stump Matt.

And so you'd get these really hard philosophical questions in the gray zone.

And we'd sort of, as a group, come to a consensus and say, OK, this is why this would be considered spam, or this is why this would be considered cloaking or not cloaking.

So we do try to make sure that we're consistent.

The creativity that spammers show in trying to spam rankings, and all the different tricks that they use, is pretty staggering.

And so there are some cases that are kind of unusual.

But most cases are actually relatively straightforward.

That's a very quick overview of how we do manual review, how we do training, the fact that we do try to be consistent, and that just because somebody's critical of Google doesn't mean that we're going to take action on a particular site.

One thing to be aware of is we don't have to be bound by a very narrow view.

If we have knowledge that this person is a repeat spammer, if we know that they've done severe stuff in the past, if it's something involving malware or hacked sites, really malicious stuff, we try to look at the whole situation holistically.

And, in fact, if you look at our webspam guidelines, we say, this is a list of stuff that's bad to do.

But anything that's essentially counter to the spirit of the quality guidelines-- deceptive or manipulative-- we do reserve the right to take action.

And so if we find some new attack, it's not as if we tie our hands behind our backs.

We are willing to respond to that and then find ways to make sure that it doesn't hurt the user experience.

But I think that we do try to be precise.

We try to make sure that we respond as well as we can.

And I think that if you were running Google, and we were on the outside, you'd want us to run it that way, so that we're not going to be fooled if somebody's using slightly different techniques.

We do try to respond to that.

But we also try to be fair.

We allow people to then file reconsideration requests.

We do take those appeals.

And we do review those.

So hopefully that's a little bit of an overview about the ways that we view spam, the ways that we take action, and the ways that the team tries to make sure that we protect users while also keeping the webmaster's perspective in mind-- a lot of us are webmasters as well.

And we try to step into that point of view and see how somebody could have gotten into a particular situation as well.