Forum Moderators: goodroi


Spidering of my code pages

Wrong pages being spidered and losing referals


djfelip

9:53 pm on May 19, 2005 (gmt 0)

10+ Year Member



I recently took over the web operations for a rather large travel site. Through my marketing tools I have noticed that a good majority of my referrals from Yahoo and Google are coming in on my code-processing pages, so the visitors are just closing the browser and moving on. I think I made the necessary edits to my robots.txt, BUT is there a way that I can test this to make sure? I would hate to have to wait for the engines to respider and find out I didn't do this correctly; it is costing us a lot of valuable traffic.

Reid

10:47 am on May 21, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



referrals from Yahoo and Google are coming in on my code-processing pages, so the visitors are just closing the browser and moving on.

Can you translate this a little? Do you mean intro pages, browser query-type pages? Are they CGI, PHP?
What code are they processing? What did you put in robots.txt? What are you trying to disallow?

djfelip

3:13 pm on May 25, 2005 (gmt 0)

10+ Year Member



The site is an ASP site that lists vacation homes. Some of my processing pages, the one that runs a property search for instance, are being indexed, so when someone comes in on one of those pages they get a "page cannot be displayed" error.

Span

3:48 pm on May 25, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



You can use any user agent string to test your robots.txt at wannaBrowser [wannabrowser.com]
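If you'd rather check locally instead of through a web service, Python's standard-library `urllib.robotparser` can tell you whether a given user agent is allowed to fetch a URL under your rules. A minimal sketch follows; the rules and paths are hypothetical stand-ins for the real site's processing scripts, not taken from this thread:

```python
# Sketch: locally validating robots.txt rules with urllib.robotparser.
# The Disallow paths below are made-up examples, not the actual site's.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /propertysearch.asp
Disallow: /process/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Processing pages should be blocked for any crawler,
# while normal pages stay crawlable.
print(parser.can_fetch("Googlebot", "http://www.example.com/propertysearch.asp"))  # False
print(parser.can_fetch("Slurp", "http://www.example.com/process/book.asp"))        # False
print(parser.can_fetch("Googlebot", "http://www.example.com/index.asp"))           # True
```

Note that Disallow rules are prefix matches against the URL path, so `Disallow: /process/` covers everything under that directory.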

Reid

11:18 pm on May 25, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Just disallow those pages in robots.txt. This will prevent bots from listing them.
To remove them from Google, you need to validate your robots.txt and then submit it to Google's URL removal tool. Any pages in the Google index which are disallowed will be removed.
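A minimal robots.txt along those lines might look like this (the script names are hypothetical placeholders; substitute the site's actual processing-page paths):

```
User-agent: *
Disallow: /propertysearch.asp
Disallow: /process/
```

The file must sit at the site root (e.g. `/robots.txt`) for crawlers to find it.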