| 11:29 pm on Sep 21, 2012 (gmt 0)|
I would strongly agree that Penguin isn't a backlink thing. Backlinks just happen to be a very strong signal.
As for blocking referrers, this isn't going to affect the algo in any direct way, since Googlebot only sends referrers in a fairly small percentage of cases, so it won't have any idea that you blocked them. What would meet the requirement, I suppose, is the mythical 'disavow links' tool.
| 5:40 am on Sep 22, 2012 (gmt 0)|
Thanks for your 'disavow links' comment.
It seems like such a simple thing: letting webmasters point out the inbound links from sites they want nothing to do with.
I don't see how it could be gamed, but it might impact sites that currently enjoy reasonable SERPs.
Your reference made me laugh, which is good enough.
| 6:23 am on Sep 22, 2012 (gmt 0)|
|I don't see how it could be gamed |
I could see people testing the waters by trying out various kinds of spam link networks and thinking they can just disavow the links if they get penalized.
One of the problems with "Neanderthal spam" is that once someone starts that ball rolling, they can't stop it or reverse it. A disavow links tool might be abused in this way, unless a preventative algorithm was already in place at launch time.
Otherwise, in addition to helping innocently punished websites, the tool might attract MORE people into trying the spam route.
In recent years, I've seen many penalized sites whose owners proclaimed they never bought links or participated in link rings, etc. Yet when I checked their backlinks, the paid link pattern often jumped out at me and I confronted them with it.
Sometimes they sheepishly admitted what they had done. Other times they located an employee who had actually done the deed - even in very large corporate environments. And sometimes (much more rarely) they seemed to have actually been the victim of a negative SEO campaign.
Another common reason for spammy backlinks is that your server has been hacked. Today many hacks are cloaked so that only googlebot sees the parasite links that were injected - links that your site IS ACTUALLY HOSTING without your knowledge.
The hacker who placed that parasite content then starts to throw link juice at those pages, in order to power up the hidden links that are in place.
So if you find spammy backlinks, especially pointing to internal URLs, check to see if you were hacked. You can switch to a googlebot user agent, but that's not going to be good enough to resolve every case. The sure-fire test is Google's Fetch as googlebot tool - because not only the user agent but also the IP address will be correct. Then even the best cloaked parasite content will show itself if it's there.
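A first pass at that check can be scripted: request your own page with a normal browser user agent and again with Googlebot's, then compare the links in each response. This is only a sketch (the user agent strings are illustrative, and the actual fetch is left commented out so you can plug in your own URL) - and as noted above, it only catches user-agent cloaking, so IP-based cloaking still requires Fetch as googlebot.

```python
import re
import urllib.request

BROWSER_UA = "Mozilla/5.0"  # illustrative; any normal browser UA works
GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

def fetch(url, user_agent):
    """Fetch a URL with the given User-Agent header and return the HTML."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req, timeout=15) as resp:
        return resp.read().decode("utf-8", errors="replace")

def injected_links(normal_html, bot_html):
    """Return href values served ONLY to the Googlebot user agent --
    a crude signal of cloaked, injected parasite links."""
    def hrefs(html):
        return set(re.findall(r'href=["\']([^"\']+)["\']', html, re.I))
    return hrefs(bot_html) - hrefs(normal_html)

# Example usage (network call commented out; substitute your own URL):
# url = "http://www.example.com/"
# extra = injected_links(fetch(url, BROWSER_UA), fetch(url, GOOGLEBOT_UA))
# if extra:
#     print("Links served only to Googlebot:", extra)
```

An empty result here does not clear you - it only rules out the crudest user-agent-based cloaking.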
I would say be ruthless in asking whether your business actually created or accepted those backlinks, possibly through an employee, a contractor, or some third party. Most online niches are not competitive enough for negative SEO to be commonplace. If your site is in such a cutthroat market, you probably already know that ;)
However, the hacked server/parasite hosting scenario may be put in place to benefit a site in one of those cutthroat markets. Parasite hosting is actually illegal, not just a competitive tactic. These people aren't fooling around - they are criminals.
I agree that Penguin is looking at a lot more than backlinks. However, safeguarding yourself against bad backlinks is an important step.
| 7:03 am on Sep 22, 2012 (gmt 0)|
Google says if you 404 the page that has undesirable incoming links, then they ignore it. It isn't exactly helpful if your homepage has the links in question, tho.
Your block the referrer trick won't work.
| 12:45 pm on Sep 24, 2012 (gmt 0)|
@klark0 - I read John Mueller's comments, but it sounds like SEO suicide to 404 pages with 'bad' links because they're most likely your money pages.
| 5:27 pm on Sep 24, 2012 (gmt 0)|
An SEO who was on the 404 train early after Penguin made the good point that it's probably much easier to contact webmasters with good links and get them to update a URL than it is to contact thousands of webmasters of crappier sites and get them to take a link down. Even if it's a money page, if Penguin is hurting the money, it may be worth it to drop the old URL (404) and start a new one.
Another problem is with bookmarked and direct visits, which would land on a 404 page. You may want to include a nofollow link to the new page so as to avoid a bad user experience/bounced visits.
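That suggestion might look something like this - a hypothetical custom 404 page where `/new-page/` stands in for the replacement URL:

```html
<!-- Hypothetical custom 404 page; /new-page/ is a placeholder for the new URL -->
<html>
  <head><title>Page not found</title></head>
  <body>
    <h1>That page has moved on.</h1>
    <!-- rel="nofollow" so the old, penalized URL passes nothing to the new one -->
    <p>You may be looking for
       <a href="/new-page/" rel="nofollow">our updated page</a>.</p>
  </body>
</html>
```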
| 7:19 pm on Sep 24, 2012 (gmt 0)|
Some things we know:
- Google has made clear they don't want webmasters sculpting pagerank.
- Google doesn't follow links, they visit pages directly.
For those two reasons I don't think the suggestion in the O.P. would work. Google wouldn't want it to work, nor would they ever directly follow a link, so blocking the road between site A and site B would not help.
| 7:40 pm on Sep 24, 2012 (gmt 0)|
Blocking referrers is a fruitless exercise. Don't go there.
| 4:56 pm on Sep 26, 2012 (gmt 0)|
|The sure-fire test is Google's Fetch as googlebot tool - because not only the user agent but also the IP address will be correct. |
I noticed that using the google cache version of a page will often show spammy links / text. In your opinion, is that a viable way of searching out hidden spam? Or is the fetch as googlebot the only way (google cache is much easier on the eyes than looking at the html code that is returned in fetch as googlebot).
| 5:16 pm on Sep 26, 2012 (gmt 0)|
The Google cache is essentially Google's (separate!) store of the entire HTML of a page Google retrieved. From that point of view, it's an accurate view of what they retrieved at a particular point in time.
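For what it's worth, that cached copy can be pulled up directly via the cache: operator. A small helper for building the webcache URL (this is the URL pattern as used circa 2012, and Google could change it at any time):

```python
from urllib.parse import quote

def google_cache_url(page_url):
    """Build the webcache.googleusercontent.com URL for a page's cached
    copy (URL pattern circa 2012; subject to change on Google's side)."""
    return ("http://webcache.googleusercontent.com/search?q=cache:"
            + quote(page_url, safe="/:"))

print(google_cache_url("http://www.example.com/page"))
```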