Google SEO News and Discussion Forum

Blocking referrals from spammy domains to clean up backlinks - a way to deal with Penguin?
andy_boyd - msg:4498443 - 8:45 pm on Sep 21, 2012 (gmt 0)

I'm of the opinion that there is more to Penguin than just analysis of backlinks. However, backlinks are part of the problem, which is why I've been removing links for over a month.

However, as we know, not all webmasters respond to emails. So if a link is truly out of your control, is it an option to block referral traffic from that site's IP address and therefore 'kill' the link? Or would it just result in a world of pain?
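For concreteness, this kind of block might look something like the following WSGI middleware sketch. The blocked domain and IP are hypothetical placeholders, and in practice most people would do this in the server config rather than application code:

```python
# Hypothetical blocklist -- the domain and IP below are placeholders,
# not real spam sources (203.0.113.0/24 is a documentation range).
BLOCKED_REFERRERS = {"spammy-links-example.com"}
BLOCKED_IPS = {"203.0.113.7"}

def referrer_block(app):
    """Wrap a WSGI app and 403 any request from a blocked referrer or IP."""
    def middleware(environ, start_response):
        referrer = environ.get("HTTP_REFERER", "")
        client_ip = environ.get("REMOTE_ADDR", "")
        if client_ip in BLOCKED_IPS or any(d in referrer for d in BLOCKED_REFERRERS):
            start_response("403 Forbidden", [("Content-Type", "text/plain")])
            return [b"Forbidden"]
        # Note: a request with no Referer header sails straight through.
        return app(environ, start_response)
    return middleware
```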

 

Andy Langton - msg:4498478 - 11:29 pm on Sep 21, 2012 (gmt 0)

I would strongly agree that Penguin isn't just a backlink thing. Backlinks just happen to be a very strong signal.

As for blocking referrers, that isn't going to affect the algorithm in any direct way: Googlebot only sends a referrer in a fairly small percentage of cases, so Google will have no idea that you blocked anything. What would meet the requirement, I suppose, is the mythical 'disavow links' tool.

bluntforce - msg:4498518 - 5:40 am on Sep 22, 2012 (gmt 0)

@Andy,
Thanks for your 'disavow links' comment.
It seems like such a simple thing: letting webmasters point out inbound links from sites they want nothing to do with.
I don't see how it could be gamed, but it might impact sites that currently enjoy reasonable SERPs.

Your reference made me laugh, which is good enough.

tedster - msg:4498519 - 6:23 am on Sep 22, 2012 (gmt 0)

"I don't see how it could be gamed"

I could see people testing the waters by trying out various kinds of spam link networks and thinking they can just disavow the links if they get penalized.

One of the problems with "Neanderthal spam" is that once someone starts that ball rolling, they can't stop it or reverse it. A disavow links tool might be abused in this way, unless a preventative algorithm was already in place at launch time.

Otherwise, in addition to helping innocently punished websites, the tool might attract MORE people into trying the spam route.

--------------------

In recent years, I've seen many penalized sites whose owners proclaimed they never bought links or participated in link rings, etc. Yet when I checked their backlinks, the paid link pattern often jumped out at me, and I confronted them with it.

Sometimes they sheepishly admitted what they had done. Other times they located an employee who had actually done the deed - even in very large corporate environments. And sometimes (much more rarely) they seemed to have actually been the victim of a negative SEO campaign.

-------------

Another common reason for spammy backlinks is that your server has been hacked. Today many hacks are cloaked so that only googlebot sees the parasite links that were injected - links that your site IS ACTUALLY HOSTING without your knowledge.

The hacker who placed that parasite content then starts to throw link juice at those pages, in order to power up the hidden links that are in place.

So if you find spammy backlinks, especially ones pointing to internal URLs, check to see if you were hacked. You can switch to a Googlebot user agent, but that's not going to be good enough to resolve every case. The sure-fire test is Google's Fetch as Googlebot tool - because not only the user agent but also the IP address will be correct. Then even the best cloaked parasite content will show itself if it's there.
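As a rough illustration of that user-agent check, here is a sketch using Python's requests library, with a placeholder URL. Per tedster's caveat, IP-aware cloaks will slip past this, and only Fetch as Googlebot settles those cases:

```python
# Fetch the same URL as a normal browser and with Googlebot's user-agent
# string, then compare the two responses. A difference suggests cloaking.
import requests

URL = "http://www.example.com/some-page"  # placeholder URL

HEADERS_BROWSER = {"User-Agent": "Mozilla/5.0 (Windows NT 6.1; rv:15.0) Gecko/20100101 Firefox/15.0"}
HEADERS_GOOGLEBOT = {"User-Agent": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"}

browser_html = requests.get(URL, headers=HEADERS_BROWSER).text
googlebot_html = requests.get(URL, headers=HEADERS_GOOGLEBOT).text

if browser_html != googlebot_html:
    print("Responses differ -- inspect the Googlebot copy for injected links.")
else:
    print("No user-agent cloaking seen (IP-based cloaking still possible).")
```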

I would say be ruthless in asking whether your business actually created or accepted those backlinks, possibly through an employee, a contractor, or some third party. Most online niches are not competitive enough for negative SEO to be commonplace. If your site is in such a cutthroat market, you probably already know that ;)

However, the hacked server/parasite hosting scenario may be put in place to benefit a site in one of those cutthroat markets. Parasite hosting is actually illegal, not just a competitive tactic. These people aren't fooling around - they are criminals.

----------

I agree that Penguin is looking at a lot more than backlinks. However, safeguarding yourself against bad backlinks is an important step.

klark0 - msg:4498525 - 7:03 am on Sep 22, 2012 (gmt 0)

Google says that if you 404 the page that has undesirable incoming links, they ignore those links. It isn't exactly helpful if your homepage has the links in question, tho.

[productforums.google.com...]

Your block-the-referrer trick won't work.

andy_boyd - msg:4499314 - 12:45 pm on Sep 24, 2012 (gmt 0)

Thanks everyone.

@klark0 - I read John Mueller's comments, but it sounds like SEO suicide to 404 pages with 'bad' links because they're most likely your money pages.

triggerfinger - msg:4499414 - 5:27 pm on Sep 24, 2012 (gmt 0)

An SEO who was on the 404 train early after Penguin made the good point that it's probably much easier to contact webmasters with good links and get them to update a URL than it is to contact thousands of webmasters of crappier sites and get them to take a link down. Even if it's a money page, if Penguin is hurting the money, it may be worth it to drop the old URL (404) and start a new one.
Another problem is visitors arriving via bookmarks or direct traffic, who would see a 404 page. You may want to include a nofollow link to the new page there, to avoid a bad user experience and bounced visits.
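A sketch of that idea as a Flask error handler follows; the replacement path is hypothetical. The key design point is that the response status stays 404 (not 200 or a 301), so the old URL's inbound links are dropped while humans still find their way:

```python
# Custom 404 handler: keep the 404 status so Google drops the retired
# URL's bad links, but point human visitors at the replacement page.
from flask import Flask

app = Flask(__name__)
NEW_URL = "/new-money-page"  # hypothetical replacement for the retired URL

@app.errorhandler(404)
def page_not_found(error):
    body = (
        "<h1>Page not found</h1>"
        '<p>That page has been retired. The current version lives at '
        f'<a href="{NEW_URL}" rel="nofollow">{NEW_URL}</a>.</p>'
    )
    return body, 404  # status must remain 404 for the links to be ignored
```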

Sgt_Kickaxe - msg:4499439 - 7:19 pm on Sep 24, 2012 (gmt 0)

Some things we know:

- Google has made clear they don't want webmasters sculpting pagerank.
- Google doesn't follow links, they visit pages directly.

For those two reasons, I don't think the suggestion in the OP would work. Google wouldn't want it to, nor would they ever directly follow a link, so blocking the road between site A and site B would not help.

g1smd - msg:4499447 - 7:40 pm on Sep 24, 2012 (gmt 0)

Blocking referrers is a fruitless exercise. Don't go there.

Planet13 - msg:4500287 - 4:56 pm on Sep 26, 2012 (gmt 0)

"The sure-fire test is Google's Fetch as Googlebot tool - because not only the user agent but also the IP address will be correct."


I noticed that the Google cache version of a page will often show spammy links/text. In your opinion, is that a viable way of searching out hidden spam, or is Fetch as Googlebot the only way? (The Google cache is much easier on the eyes than the raw HTML returned by Fetch as Googlebot.)

Andy Langton - msg:4500298 - 5:16 pm on Sep 26, 2012 (gmt 0)

The Google cache is essentially Google's (separate!) store of the entire HTML of a page as Google retrieved it. From that point of view, it's an accurate record of what they retrieved at a particular point in time.

But you have to be a little careful: this isn't necessarily the copy that Google evaluates for search results, and because you would typically open it in your browser, you may execute JavaScript and parse CSS that distort "what you see". Still, if you know what you're doing, the cache is a reasonable way to assess "what Google sees".
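One way to get the readability of the cache without that browser-side distortion is to fetch the cached copy's raw HTML and read it as text, so no JavaScript runs and no CSS is applied. A minimal sketch, assuming the cache URL pattern Google exposed at the time and a placeholder target:

```python
# Pull the raw HTML of Google's cached copy and inspect it as plain text.
import requests

TARGET = "www.example.com/some-page"  # placeholder
CACHE_URL = "http://webcache.googleusercontent.com/search?q=cache:" + TARGET

html = requests.get(CACHE_URL, headers={"User-Agent": "Mozilla/5.0"}).text

# Crude first pass: print every line containing an anchor tag, then
# eyeball the output for injected links you don't recognize.
for line in html.splitlines():
    if "<a " in line.lower():
        print(line.strip())
```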
