Forum Moderators: goodroi
I recently got dropped in Google's SERPs, and I'm currently investigating possible problems with the pages Google has in its cache.
My main site is domain.co.uk, and under
site:www.domain.co.uk I have 138 pages indexed.
Of those, all the URLs are clean, but there are two issues. First, my guestbook is being spidered, so all my comments show up in Google under /guestbook/comment.php? followed by query strings. What I've decided to do is put a Disallow: /guestbook/ rule in robots.txt.
Second, I have some pages that redirect to certain pages using a little SQL script driven by a PHP page, so I've also decided to block Googlebot from accessing that file so those pages aren't cached. (See the robots.txt sketch below.)
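Here's roughly what I'm planning to add; the /redirect.php filename is just a placeholder for my actual script. One thing to watch: once a Googlebot-specific group exists, Googlebot ignores the * group entirely, so the guestbook rule has to be repeated inside it:

User-agent: *
Disallow: /guestbook/

User-agent: Googlebot
Disallow: /guestbook/
Disallow: /redirect.php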
I have also just updated my .htaccess file so that all requests are 301-redirected to the one canonical hostname:
php_flag session.use_trans_sid off
Options +FollowSymlinks
RewriteEngine on
RewriteBase /
RewriteCond %{HTTP_HOST} !^www\.domainname\.co\.uk$ [NC]
RewriteRule ^(.*)$ http://www.domainname.co.uk/$1 [R=permanent,L]
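A quick way to confirm the redirect is behaving (assuming curl is available) is to request the non-www host and check the headers:

curl -I http://domainname.co.uk/
# expect a 301 status with Location: http://www.domainname.co.uk/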
With all this in place, I then wanted to check some competitors' robots.txt files, and noticed that when I go to their domain's /robots.txt, nothing exists.
How can this be, and how do they hold top spots in Google?
Also, to point out: when I did a site:www.domain.com, I had 24 pages indexed there as well. Do you think this "duplicate content" issue is affecting my SERPs?
Any advice on the points above would be appreciated...
da95649
Also, it's not always as simple as just looking for the file in the root.
Look here:
[webmasterworld.com...]
And read what Brett is saying. This is a good example of how sophisticated you can get with this file. He is feeding different files based on user agent.
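If you want to experiment with something similar, here's a minimal mod_rewrite sketch of the idea; the robots-google.txt filename is my own placeholder, not necessarily how Brett actually does it:

# Serve Googlebot its own copy of robots.txt via an internal rewrite
RewriteEngine on
RewriteCond %{HTTP_USER_AGENT} Googlebot [NC]
RewriteRule ^robots\.txt$ /robots-google.txt [L]

Every other user agent still gets the plain /robots.txt in the root, since the rewrite only fires when the condition matches.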