Home / Forums Index / Google / Google SEO News and Discussion

Google SEO News and Discussion Forum

Convoluted Backlink Fix - Is there a Better Way?

 1:05 pm on Jul 26, 2012 (gmt 0)

I have received Google Webmaster Tools messages informing me that some of the backlinks pointing at my site are not compliant with Google's guidelines. It's a fair cop: many of these links are 7 or 8 years old and seemed like a good idea at the time.

Anyway, the bulk of these are links that I cannot control, or that the hosting webmasters cannot be bothered to change. Google's advice in such a situation includes:

"Redirecting the links to an intermediate page that is blocked from search engines with a robots.txt file".

Ok. I can do that. The thing is that I want users hitting that page to be redirected straight to our genuine homepage so that if they want to bookmark or link to us then they will link to the homepage rather than some page that's blocked in robots.txt.

Our site is on a Microsoft server and I think we use ISAPI Rewrite for our 301s. I'm guessing that if I 301 from that blocked page's URL then Google might think I'm trying to sneak the juice through to the homepage. So, how about I use a JavaScript redirect (or even meta refresh?) from that blocked page to the genuine homepage?
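For anyone picturing the arrangement being described, here is a minimal sketch. All paths and filenames are hypothetical, and it assumes ISAPI_Rewrite 3, which uses Apache mod_rewrite-style syntax on IIS:

```
# robots.txt - block the intermediate landing page from crawlers
User-agent: *
Disallow: /intermediate/

# .htaccess (ISAPI_Rewrite 3 / mod_rewrite syntax)
# Send the old, link-targeted URL to the blocked intermediate page.
RewriteEngine On
RewriteRule ^old-landing-page\.html$ /intermediate/landing.html [R=301,L]
```

Because /intermediate/ is disallowed in robots.txt, Googlebot follows the 301 to the intermediate URL but never crawls it, so any further hop placed on that page is invisible to the crawler.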

It would get the user where they need to be while achieving Google's desired outcome of not passing PageRank from non-natural links...

Am I crazy or does this kind of make sense? Am I reinventing the wheel in a complicated and silly manner?

Many thanks for any input.



 1:41 am on Jul 27, 2012 (gmt 0)

Here's the big issue with your idea. If you treat googlebot differently from other visitors, then you may get slapped with another penalty - for "cloaking". Whatever you do here, you're going to need to treat all requests the same. And by the way, Google has been following meta-refresh and javascript redirects for a few years now, and they even send link equity through such redirects.

Are you sure there are visitors coming to these URLs with the bad backlinks? Couldn't you just serve that content at new URLs without doing much damage?

Also - roughly how many URLs are we talking about here?


 8:34 am on Jul 27, 2012 (gmt 0)

Thanks for the reply.

Yeh, kind of thought it probably wouldn't fly.

Actually the click through traffic from these links is low volume, but relevant and quite well pre-qualified from a sales POV. I can scratch the JavaScript/meta redirect idea and leave the traffic on that page. The number of bookmarks and links likely to be generated by that traffic is negligible. I just don't like incomplete solutions.

It's a low number of target URLs in this case, <=20, with the bulk of the links pointing at one URL. The number of links is in the tens of thousands.

Of course I had talked myself into that if-there-is-no-evil-intent-Google-won't-see-it-as-cloaking thing. I should know better.

I'll just leave the traffic on the robots.txt-blocked landing page and let it fend for itself. There really is no good reason to take such a small likelihood of links/bookmarks into account.

I'm grateful for your input. Amazing how dumbly one's perspective can get warped when under pressure. :)


 8:36 am on Jul 27, 2012 (gmt 0)

Is "dumbly" a real adverb? :/


 8:40 am on Jul 27, 2012 (gmt 0)

Oh, and while I was aware that G can follow JavaScript links and redirects, I hadn't realised that they passed juice.

I'm obviously not keeping up as well as I thought.

Thanks again.


 8:48 am on Jul 27, 2012 (gmt 0)

The problem with Google's advice is that you can do it for new links, but, if I understand it correctly, for an existing link you need to redirect all requests for the original URL through a robots.txt-blocked URL, so you lose the benefit of all links to that URL.

@buddhu, yes it is a real adverb [chambersharrap.co.uk ] :)


 9:21 am on Jul 27, 2012 (gmt 0)

Yeah, Graeme, that's about the size of it.

I was prepared to take the loss of pagerank by blocking the landing page (preferable to a penalty!). The issue was that I had got it into my head that I wanted to take the traffic off that page and onto a page that was not blocked so I'd get the benefit of any genuine links that might result from that traffic; obviously having them link to a robots.txt-blocked page would be of no ranking benefit.

There was no sneaky intent, but it was a clumsy idea and, as tedster points out, it is a kind of cloaking.


 9:28 am on Jul 27, 2012 (gmt 0)

Actually, tedster, thinking about it (now I've had coffee) the redirect would take the user traffic to a page that was identical to the landing page blocked in robots.txt (which would be basically a dupe of our homepage). So, there should not be an issue with the user and Google seeing different content.

I guess the remaining issue is whether I could redirect that traffic in a way that doesn't pass pagerank, and that would satisfy Google.

This is now kind of an academic exercise. In practical terms, as I say, I'm fine with forgetting the redirect bit and leaving the traffic to fend for itself.


 2:44 pm on Jul 27, 2012 (gmt 0)

Thinking about this a little bit more, if the intermediate URL is blocked from googlebot, then PR would not flow - because Google will not crawl and learn where the final destination is. This is similar to a scheme they used to recommend for affiliate links.
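The affiliate-link scheme mentioned above worked along the same lines. A hedged sketch (hostnames and paths are hypothetical): every outbound affiliate link on the site points at a local /out/ URL, that directory is disallowed in robots.txt, and the /out/ URL simply forwards the visitor:

```
# robots.txt - keep crawlers out of the redirect directory
User-agent: *
Disallow: /out/

# mod_rewrite-style rule: /out/widget forwards human visitors,
# but crawlers never fetch it, so the destination stays undiscovered
# and no PageRank flows through.
RewriteRule ^out/widget$ http://affiliate.example.com/?id=123 [R=302,L]
```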


 3:58 pm on Jul 27, 2012 (gmt 0)

My question is: what if these links are pointing to the homepage? Should we redirect them to an intermediate page and then do a JavaScript redirect back to the homepage?


 4:24 pm on Jul 27, 2012 (gmt 0)

"Redirecting the links to an intermediate page that is blocked from search engines with a robots.txt file".

Is Google thinking that people are clowns who will dance to their tune? I feel like giving amit singhal a tight slap...


 4:30 pm on Jul 27, 2012 (gmt 0)

@indyrank: yes and we will. If your audience uses Google, and 90+% of mine does, what choice do you have?


 4:33 pm on Jul 27, 2012 (gmt 0)

+1: Tight slap to Amit Singhal (& M. Cutts).


 6:33 pm on Jul 27, 2012 (gmt 0)

If your audience uses Google, and 90+% of mine does...

Then the question becomes, if these URLs are penalized then how much traffic are they still getting? And is the Home Page really relevant for the traffic that comes to an internal page?

Trying to preserve value from URLs that don't get search traffic is what it sounds like to me. It may be that you've thought yourself into a complex knot when there is a much simpler solution. Here are some more direct approaches.

1. Use the Reconsideration Request to explain the removal requests you made and list which ones were acted on.

2. Just remove those URLs and let the links point to a 404.
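Option 2 can be done in a single rewrite rule. A sketch in mod_rewrite-style syntax (URL patterns are hypothetical); the [G] flag returns "410 Gone", which Google treats as at least as strong a removal signal as a 404:

```
# Dead-end the penalized URLs so the bad links resolve to nothing
RewriteRule ^(old-page-1|old-page-2)\.html$ - [G,L]
```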

WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved