Forum Moderators: Robert Charlton & goodroi
[webmasterworld.com...]
Tedster observed:
This kind of intentional disruption of the competition is really not an 'seo specialist' kind of thing, in my opinion. It takes no expertise at all, and it is essentially a plague on the web that gives REAL seo a bad name. It's in the same category as the 302 hijacking wars of a few years back. It's like kids scuffling on the playground, and not professional seo work. Unfortunately, it can also be effective at times - and Google has their hands full combating the distortions it produces. So consider your own back link profile to be like your body's immune system. We all get exposed to the same germs, but only some of us get sick. Get the strong antibodies in there and you won't get sick as easily.
If the black hats out there can negatively impact your site's positioning by rounding up a bunch of spam links pointing to you, why shouldn't the site owner be able to defend himself?
It seems to me a useful thing Google could do would be to allow site owners themselves to flag and "delete" spammy and inappropriate links into their own sites. It's only fair. It would allow us to help Google minimize the "distortion it produces."
The entire process could be automated on Google's part, and could save on their algo tweaking and resources.
1. Google provides a comprehensive list of backlinks into your site.
2. Included in that list is a clickable button to "delete" that inbound link, which basically tells Google to ignore that link now and in the future.
3. It should also include a way to block links from a site just by typing in the domain of the inbound link source - and why not be able to block entire country codes as well? Email programs allow domain blocking; why shouldn't Google?
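If Google ever built something like this, the three steps above might boil down to a simple block list that a site owner submits. This is purely a hypothetical sketch - both the feature and the syntax are invented here for illustration, and the domains are placeholders:

```text
# Hypothetical block list submitted by the owner of example.com
url: http://spammy-links.example/page-linking-to-me.html   # ignore one specific inbound link
domain: spammy-links.example                               # ignore every link from this site
country: xx                                                # ignore links from an entire ccTLD
```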
I think such a program would cut down on artificial manipulation from external sources. Why shouldn't Google let US help fight the crap that hurts our sites by allowing us to protect ourselves?
How about it, Google?
Anybody see any real problems with this idea?
[edited by: tedster at 4:08 am (utc) on April 1, 2008]
[edit reason] add quote box [/edit]
I still say that it's a whole lot harder to hurt a site with both a solid backlink profile and a solid business or informational offering. You might say that content is not king, and links are not king -- real substance is king.
Also, that would just lead to a situation where I'd point loads of spam links at my own test sites and see how long they survive.
If a site gets penalised, I'd "figure out it is possible to report it" and wait for the penalty to be removed.
They are hiring daily and have a lot of employees these days, so it is feasible. The algo can easily narrow down the list of sites that need to be checked.
I like this idea, just add it to webmaster tools.
It could be very useful to Google's fight against spam. If a threshold number of webmasters block links from a particular domain, site or IP range, then Google could block everything coming out of those sources and save all of the webmasters who failed to spot the problem.
They could have a kind of spamrank, voted on by webmasters blocking bad links. Then they wouldn't need to pay so many people to do it. I bet we would do a much better job as well.
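The "spamrank" voting idea above is essentially a threshold count over webmaster reports. A minimal sketch in Python, assuming an invented report format and an arbitrary threshold (nothing here is a real Google mechanism):

```python
from collections import Counter

# Hypothetical: each report is (reporting site, blocked source domain).
THRESHOLD = 3  # illustrative cutoff, not anything Google has published

reports = [
    ("example-a.com", "spammy-links.example"),
    ("example-b.com", "spammy-links.example"),
    ("example-c.com", "spammy-links.example"),
    ("example-d.com", "other-site.example"),
]

# Count how many distinct webmasters blocked each source domain.
counts = Counter(source for _, source in reports)

# Flag any source blocked by at least THRESHOLD webmasters.
flagged = {source for source, n in counts.items() if n >= THRESHOLD}
print(flagged)  # {'spammy-links.example'}
```

A real system would obviously need to guard against webmasters gaming the vote, but the aggregation itself is this simple.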
Cheers
Sid
PS: They could also allow use to choose our dream Top10 and see how many match on those choices.
Google follows the link in from the bloody spammer and gets 403'd straightaway, and our serps are the better for it ... I'll blackhole those scrapes right down to my last knuckle if I have to, and not even give it a second thought.
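For anyone wanting to do the same, here's a minimal Apache mod_rewrite sketch that returns 403 Forbidden to requests arriving with a known spam domain in the Referer header. The domain is a placeholder - substitute the actual offender:

```apache
# Return 403 when the Referer matches a known spam domain (case-insensitive).
RewriteEngine On
RewriteCond %{HTTP_REFERER} spammy-links\.example [NC]
RewriteRule .* - [F]
```

Drop it in .htaccess (or the vhost config) and add one RewriteCond per blocked domain, chained with the [OR] flag.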