
Google SEO News and Discussion Forum

    
Google Should Offer Self Defense Against Spammy Inbound Links
Beachboy

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 3615824 posted 1:06 am on Apr 1, 2008 (gmt 0)

"Google Bowling" was touched on here:

[webmasterworld.com...]

Tedster observed:

This kind of intentional disruption of the competition is really not an 'seo specialist' kind of thing, in my opinion. It takes no expertise at all, and it is essentially a plague on the web that gives REAL seo a bad name. It's in the same category as the 302 hijacking wars of a few years back. It's like kids scuffling on the playground, and not professional seo work.

Unfortunately, it can also be effective at times - and Google has their hands full combating the distortions it produces. So consider your own back link profile to be like your body's immune system. We all get exposed to the same germs, but only some of us get sick. Get the strong antibodies in there and you won't get sick as easily.

If the black hats out there can negatively impact on your site's positioning by rounding up a bunch of spam links pointing to you, why shouldn't the site owner be able to defend himself?

It seems to me that a useful thing Google could do would be to let site owners themselves flag and "delete" spammy or inappropriate links pointing to their own sites. It's only fair. It would allow us to help Google minimize the "distortions it produces."

The entire process could be automated on Google's part, and it could save them some algo tweaking and resources.

1. Google provides a comprehensive list of backlinks into your site.
2. Included in that list is a clickable button to "delete" that inbound link, which basically tells Google to ignore that link now and in the future.
3. It should also include a way to block links from a site just by typing in the domain of the inbound link source, and why not be able to block entire country codes as well? Email programs allow domain blocking; why shouldn't Google? (A rough sketch of how such blocking rules might be applied follows below.)
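
Purely to illustrate the idea (nothing like this exists in Webmaster Tools as of this writing, and every name and rule below is made up), a rough sketch of how such block rules might be applied to a backlink list:

# Hypothetical sketch of the proposed "delete this inbound link" feature.
# Nothing here is a real Google API; all rules and data are invented.
from urllib.parse import urlparse

blocked_urls = {"http://spam-blog.example/cheap-widgets.html"}   # exact links to ignore
blocked_domains = {"link-farm.example"}                          # whole domains to ignore
blocked_cctlds = {".zz"}                                         # placeholder country-code TLD

def should_ignore(backlink_url):
    """Return True if Google should discount this inbound link."""
    host = urlparse(backlink_url).hostname or ""
    if backlink_url in blocked_urls:
        return True
    if any(host == d or host.endswith("." + d) for d in blocked_domains):
        return True
    return any(host.endswith(tld) for tld in blocked_cctlds)

backlinks = [
    "http://spam-blog.example/cheap-widgets.html",
    "http://link-farm.example/page37.html",
    "http://legit-news.example/review.html",
]
print([b for b in backlinks if not should_ignore(b)])   # only the legit link survives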

I think such a program would cut down on artificial manipulation from external sources. Why shouldn't Google let US help fight the crap that hurts our sites by allowing us to protect ourselves?

How about it, Google?

Anybody see any real problems with this idea?

[edited by: tedster at 4:08 am (utc) on April 1, 2008]
[edit reason] add quote box [/edit]

 

tedster

WebmasterWorld Senior Member, WebmasterWorld Top Contributor of All Time, 10+ Year Member



 
Msg#: 3615824 posted 4:44 am on Apr 1, 2008 (gmt 0)

I hear you. Having gone from the statement "There's nothing a competitor can do..." to "There's almost nothing a competitor can do...", Google has essentially admitted that backlinks can sometimes cause ranking disruptions. Now a lot of webmasters get concerned at the first sign of any backlink from the nether regions.

I still say that it's a whole lot harder to hurt a site with both a solid backlink profile and a solid business or informational offering. You might say that content is not king, and links are not king -- real substance is king.

idolw

WebmasterWorld Senior Member 5+ Year Member



 
Msg#: 3615824 posted 9:30 am on Apr 1, 2008 (gmt 0)

it is so difficult to define the meaning of "spammy" that I am not sure it can work.

Also, that would just lead to a situation where I'd point loads of spam links at my own sites and test how long they survive.
If the site gets penalised, I'll "figure out that it's possible to report those links" and wait for the penalty to be removed.

walkman



 
Msg#: 3615824 posted 7:57 am on Apr 2, 2008 (gmt 0)

Tedster,
I think that once a site has achieved a certain trust score (perhaps it has been manually reviewed and found legit, has sitelinks, solid backlinks, etc.), Google will probably review it manually if they notice a major influx of links or a sudden change in direction.

They are hiring daily and have a lot of employees these days, so it is feasible. The algo can easily narrow down the list of sites that need to be checked.
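
For what it's worth, that kind of trigger could be as simple as comparing a site's rate of new links against its own recent history. A speculative sketch, with invented numbers (this is not how Google says it works):

# Speculative sketch: flag a site for human review when its weekly count of
# new inbound links spikes relative to its trailing average. Numbers invented.
weekly_new_links = [120, 130, 110, 125, 950]     # last value is the current week

def needs_review(history, spike_factor=4.0):
    past, current = history[:-1], history[-1]
    baseline = sum(past) / len(past)
    return current > spike_factor * baseline

print(needs_review(weekly_new_links))            # True -> queue for a reviewer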

Hissingsid

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 3615824 posted 9:20 am on Apr 2, 2008 (gmt 0)

Hi,

I like this idea; just add it to Webmaster Tools.

It could also be very useful in Google's fight against spam: if a threshold number of webmasters block links from a particular domain, site or IP range, Google could just discount everything coming out of those sources and save all of the webmasters who failed to spot the problem.

They could have a kind of spamrank, voted on by webmasters blocking bad links. Then they wouldn't need to pay so many people to do it. I bet we would do a much better job as well.
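
The counting side of such a "spamrank" vote would be trivial; here is a toy sketch (threshold and data made up) of how the votes might be tallied:

# Toy sketch of the threshold idea: if enough independent webmasters block a
# source, discount it globally. Threshold and votes are invented.
from collections import Counter

THRESHOLD = 3    # invented number of independent complaints required

votes = [                                   # (webmaster_site, blocked_source) pairs
    ("site-a.example", "link-farm.example"),
    ("site-b.example", "link-farm.example"),
    ("site-c.example", "link-farm.example"),
    ("site-a.example", "honest-blog.example"),
]

complaints = Counter(source for _, source in set(votes))    # one vote per site per source
print({src for src, n in complaints.items() if n >= THRESHOLD})
# -> {'link-farm.example'}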

Cheers

Sid

5ubliminal

5+ Year Member



 
Msg#: 3615824 posted 5:37 pm on Apr 8, 2008 (gmt 0)

Who on Earth would take a few thousand links and analyze them to see which should be taken into consideration? Some people are actually busy.
It's a nice idea, but unimplementable. Let's be serious.
How could anyone go over so many links? How could Google present them in a usable manner?

PS: They could also let us choose our dream Top 10 and see how many of those choices match.

Beachboy

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 3615824 posted 6:20 pm on Apr 8, 2008 (gmt 0)

IdolW said:

<<it is so difficult to define the meaning of "spammy" that I am not sure it can work.>>

Think about this for a moment. Does it matter whether we are all on the same page as to whether an inbound link is "spammy"?

Bewenched

WebmasterWorld Senior Member 5+ Year Member



 
Msg#: 3615824 posted 4:06 pm on Apr 9, 2008 (gmt 0)

I believe we've touched on this here before.

We just need a way to let Google know not to count those links. I'd love to remove those scraper-site links pointing to us.

mcneely

10+ Year Member



 
Msg#: 3615824 posted 6:38 am on Apr 26, 2008 (gmt 0)

I've been writing out bad inbounders for years.
And I think Google knows it.

Google follows the link in from the bloody spammer and gets 403'd straightaway, and our serps are the better for it ... I'll blackhole those scrapes right down to my last knuckle if I have to, and not even give it a second thought.

Marcia

WebmasterWorld Senior Member, WebmasterWorld Top Contributor of All Time, 10+ Year Member



 
Msg#: 3615824 posted 7:35 am on Apr 26, 2008 (gmt 0)

It's not always competent black hats, either. There are scraper sites being sold with "feeds" and a promise of getting rich quick.

mcneely said:
<<Google follows the link in from the bloody spammer and gets 403'd straightaway, and our serps are the better for it ... I'll blackhole those scrapes right down to my last knuckle if I have to, and not even give it a second thought.>>

What's the code for doing that? I've got links pointing to a site from multiple XML "feed"-type pages on some domain; I found them with Site Explorer.

mcneely

10+ Year Member



 
Msg#: 3615824 posted 8:08 am on Apr 26, 2008 (gmt 0)

I'll write them out much the same way as I would referrer spam, through .htaccess.

Here's a bit of a sample for you (domain names are placeholders):

RewriteEngine On
# If the request was referred from the scraper's page...
RewriteCond %{HTTP_REFERER} ^http://(www\.)?the-scraper-site\.com/article-blah\.php [NC]
# ...refuse it with a 403 Forbidden
RewriteRule .* - [F]

Receptional Andy



 
Msg#: 3615824 posted 9:31 am on Apr 26, 2008 (gmt 0)

Google's spider doesn't send a referrer header, so that code won't prevent spidering - it just blocks human visitors who arrive via those links.
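
Anyone who wants to see that for themselves can compare the two cases; a small sketch (hostnames are placeholders, and it assumes the rewrite rules above are live):

# Why a Referer-based block can't stop Googlebot: the bot sends no Referer
# header, so the RewriteCond never matches. Hostnames below are placeholders.
import requests

page = "http://www.your-web-site.com/some-page.html"

# Googlebot-style request, no Referer: the rule never fires, the page is served.
print(requests.get(page).status_code)        # expect 200

# A visitor clicking the scraper's link does send a Referer: the rule fires.
print(requests.get(page, headers={
    "Referer": "http://www.the-scraper-site.com/article-blah.php",
}).status_code)                              # expect 403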
