
The best way to add uncrawlable links?


jonny0000

3:34 pm on Mar 16, 2010 (gmt 0)

5+ Year Member



Following on from a post about spider-friendly JavaScript for collapsible content, I wondered if anybody had intentionally used uncrawlable pieces of JavaScript or Ajax to prevent search engines from crawling specific links.

As an example, many ecommerce solutions output multiple links to the same product, which skews the balance of the internal linking structure. If you could restrict access to the duplicate links and allow bots to crawl only a single text link (whilst not impacting usability), the internal linking structure would be a lot healthier.

For the greater good, make certain areas uncrawlable. Thoughts?
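A minimal sketch of what jonny0000 is describing (the js-link class, data-url attribute, and URLs here are illustrative, not from any particular ecommerce platform). The single text link stays crawlable; the duplicate product link carries no href, so a spider has nothing to follow, while a short script restores navigation for visitors:

    <!-- Crawlable: the one canonical text link -->
    <a href="/products/blue-widget">Blue Widget</a>

    <!-- Uncrawlable duplicate (e.g. the product image): no href to follow -->
    <span class="js-link" data-url="/products/blue-widget">
        <img src="/img/blue-widget.jpg" alt="Blue Widget">
    </span>

    <script>
    // Navigate .js-link elements on click so humans still get a working link.
    document.addEventListener('click', function (e) {
        var el = e.target.closest('.js-link');
        if (el) window.location.href = el.getAttribute('data-url');
    });
    </script>

One caveat: Googlebot now renders JavaScript and may still pick up URL-like strings in attributes, so sites that genuinely want the link uncrawled sometimes obfuscate the value (e.g. base64-encode it).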

goodroi

6:16 pm on Mar 16, 2010 (gmt 0)

WebmasterWorld Administrator goodroi is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



#1 It depends on the situation.

#2 For some situations it can definitely help to block the spiders from reaching certain links & pages. I have done it on a few projects and have been rewarded with better rankings & traffic.

dstiles

9:32 pm on Mar 16, 2010 (gmt 0)

WebmasterWorld Senior Member dstiles is a WebmasterWorld Top Contributor of All Time 5+ Year Member



All my websites (except where a customer requests otherwise) hide or remove links to certain pages (e.g. contact forms) if a bot of any kind is detected, in addition to those pages being blocked in robots.txt. If a bot hits the unlinked page anyway, it gets a 405 returned.
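A minimal sketch of that pattern, assuming a Node.js server with a crude user-agent test (dstiles doesn't specify his stack or how he detects bots, so both are illustrative):

    var http = require('http');

    // Crude bot test for illustration; real setups use UA lists, IP ranges, etc.
    function isBot(req) {
        var ua = (req.headers['user-agent'] || '').toLowerCase();
        return /bot|crawl|spider|slurp/.test(ua);
    }

    http.createServer(function (req, res) {
        if (req.url === '/contact') {
            if (isBot(req)) {
                res.writeHead(405);   // bot reached a page it was never linked to
                return res.end();
            }
            res.writeHead(200, {'Content-Type': 'text/html'});
            return res.end('<form action="/contact" method="post">...</form>');
        }
        // On every other page, omit the contact link entirely for bots.
        var link = isBot(req) ? '' : '<a href="/contact">Contact</a>';
        res.writeHead(200, {'Content-Type': 'text/html'});
        res.end('<p>Home page</p>' + link);
    }).listen(8080);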

jonny0000

9:09 am on Mar 17, 2010 (gmt 0)

5+ Year Member



Thanks dstiles. I am thinking about preventing access through certain channels (individual links) rather than preventing access to certain pages altogether, where of course the robots exclusion protocol could be used.

goodroi, what methods have you tested for preventing access through certain links, and which have you seen the best results with?

jameswsparker

7:47 am on Mar 18, 2010 (gmt 0)

5+ Year Member



You can request that Google crawl specific sites/links for its search engine by submitting the URL to Google:

[google.co.uk...]

Entering your site here will get Google to crawl all the links it can find on that site.

goodroi

12:15 pm on Mar 18, 2010 (gmt 0)

WebmasterWorld Administrator goodroi is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



Every situation is different. I have personally used robots.txt, meta robots, and a few others. The method I use depends on the scale of the project and the pre-existing structure.
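For reference, the two named mechanisms look like this (the paths and directive choice are placeholders). robots.txt stops compliant bots crawling whole URL paths, while a meta robots tag lets a page be fetched but tells engines not to index it or follow its links:

    # robots.txt -- block crawling of duplicate/faceted paths
    User-agent: *
    Disallow: /search/
    Disallow: /compare/

    <!-- meta robots, in the <head> of an individual page -->
    <meta name="robots" content="noindex, nofollow">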
 
