

Inaccessible linking setup in Google

9:03 am on May 3, 2011 (gmt 0)

5+ Year Member


I wondered if any other members had experience setting up inaccessible links on a page? I have a situation where a website I work on has a 'buy' option but uses:

<a href="#">Buy</a>

The page is automatically updated with the information (item added to checkout), but the page has around 500 'buy' links. So I was hoping I could set up a piece of code that doesn't pass PageRank on, whilst maintaining the user experience.

I have set up a JavaScript method; however, the destination page will still be found (although not as much weight will be carried).
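For what it's worth, the method I have is roughly like the sketch below. The `addToCart` function and the `data-item-id` attribute are just placeholder names for this example, not my real code:

```javascript
// Sketch only -- addToCart and data-item-id are placeholder names.
// The cart logic is a plain function so it stays easy to test.
function addToCart(cart, itemId) {
  cart[itemId] = (cart[itemId] || 0) + 1;
  return cart;
}

// Browser wiring: one delegated click handler instead of 500 inline ones.
if (typeof document !== 'undefined') {
  document.addEventListener('click', function (e) {
    var link = e.target.closest('a[data-item-id]');
    if (!link) return;
    e.preventDefault(); // stop the href="#" jump to the top of the page
    window.cart = addToCart(window.cart || {}, link.getAttribute('data-item-id'));
  });
}
```

Each anchor stays `<a href="#" data-item-id="...">Buy</a>`, so there is no crawlable destination URL in the markup itself.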

Any ideas?
4:49 pm on May 3, 2011 (gmt 0)

WebmasterWorld Administrator goodroi is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

If you do not want Google to see links you can use JavaScript. Over the years Google has been trying to crawl more and more JavaScript, with good success. Eventually they will likely be able to crawl any JavaScript they want to. Using JavaScript to hide links is probably not the best long-term solution.

Another approach you could try is to place the link in an iframe pointing to a page in a subdirectory that is blocked from Google. That is not the easiest thing to achieve or maintain over the long term.
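To illustrate, that would look something like the following. The directory and file names here are made up for the example:

```
# robots.txt -- block the directory that serves the buy-link pages
User-agent: *
Disallow: /buy-frame/
```

```html
<!-- On the main page: the buy control lives in a blocked iframe -->
<iframe src="/buy-frame/item-123.html" width="80" height="30"></iframe>
```

With ~500 buy links that means ~500 iframes (or one iframe per visible product), which is part of why it is painful to maintain.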

Personally I would not worry about your situation. PageRank has been decreasing in importance and will continue to do so as Google adds more quality signals to its algorithm. Leaking PageRank on one or two links on a page is not ideal, but from my perspective it would be a low priority.
11:48 am on May 4, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member

Your link <a href="#">Buy</a> does not pass PageRank, nor does it drop PageRank on the floor à la nofollow, even with an onclick event.

Google does scan the JavaScript in a page and can discover and start crawling URLs that are found only in JavaScript. It is not clear to me whether those URLs have a PageRank effect or not.

You could put the URLs in JS and hope Google doesn't assign any PR to them, scramble them so they don't look like URLs and Google's heuristics don't identify them, or put all the URLs in a separate JS file that is blocked by robots.txt.
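A crude illustration of the "scramble" idea: store the destination reversed so it doesn't pattern-match as a URL, and only rebuild it at click time. The `descramble` function and `data-dest` attribute are made-up names for this sketch:

```javascript
// Sketch: the page carries "lmth.321-meti/" instead of "/item-123.html".
// descramble() rebuilds the real path only when the user clicks.
function descramble(s) {
  return s.split('').reverse().join('');
}

// Browser wiring (made-up data-dest attribute holds the scrambled path)
if (typeof document !== 'undefined') {
  document.addEventListener('click', function (e) {
    var el = e.target.closest('a[data-dest]');
    if (!el) return;
    e.preventDefault();
    window.location.href = descramble(el.getAttribute('data-dest'));
  });
}
```

Whether that actually defeats Google's URL heuristics, today or next year, is anyone's guess, which is why I'd call it gray hat.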

I would consider any of that to be gray hat. I don't think any of it is specifically called out by Google as black hat, but it seems to be trying to fool Google. Google often doesn't look kindly on such things once it puts some algo in place to detect it.
