Forum Moderators: goodroi
I changed the robots.txt as advised, but I am now seeing weird results in Google's SERPs. I would like to know whether the advice I was given was correct and whether this robots.txt will work properly for Google. I would also like to be sure that I have not disallowed any of the forum topics.
Secondly, can anyone explain what is happening with my Google site: search results, as described below?
site:example.co.uk returns around 500 pages.
site:example.co.uk/forum returns around 25,000 pages, including ones that Webmaster Tools says are disallowed.
However, searches for non-competitive keywords from any of these 25,000 pages show no results in Google.
Before changing the robots.txt I used to rank well for non-competitive local keywords on about 1,000 forum pages.
I'm pulling my hair out, so any help would really be appreciated.
[edited by: goodroi at 1:44 pm (utc) on Jan. 7, 2009]
[edit reason] please no specific urls [/edit]
Yes, I have Webmaster Tools, and it shows lists of pages that are disallowed. However, many of those disallowed pages are showing among the 25,000 in the site:domain/forum check.
Would any of you guys be willing to cast an eye over my robots.txt, just to put my mind at rest that all is well and I haven't blocked any of the site's topics by mistake?
Are you able to explain the discrepancy between site:domain, which shows far fewer than 1,000 pages, and site:domain/forum, which shows 25,000?
If you have used Google's robots.txt tool and it verified that your robots.txt correctly allows and disallows what you want, then you just need to wait. While you are waiting, work on your internal and external links; that will help your search rankings and will probably also increase the speed at which Google indexes your pages.
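If you want a quick local sanity check on top of Google's tool, Python's standard library ships a robots.txt parser you can feed your rules into. A minimal sketch with a hypothetical rule set (note that Google additionally honours `*` wildcards in paths, which this basic parser does not, so it only checks plain prefix rules):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules, for illustration only.
rules = [
    "User-agent: *",
    "Disallow: /forum/wap2/",
]

rp = RobotFileParser()
rp.parse(rules)

# Normal forum topics stay crawlable...
print(rp.can_fetch("*", "http://example.co.uk/forum/topic-123"))       # True
# ...while the disallowed section is blocked.
print(rp.can_fetch("*", "http://example.co.uk/forum/wap2/topic-123"))  # False
```

Run it against every URL pattern you care about; if a topic URL ever comes back False, a Disallow line is catching more than you intended.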
Thanks for your reply. I am still a little confused, though. I'll try to explain why.
I closely monitor my site stats, Webmaster Tools, Analytics, and HitTail, and I regularly check how many pages I have listed in Google. In the four years my site has been running, the maximum number of pages I've had listed is 3,500, made up of around 500 pages from the main domain and 3,000 from the forum. More recently I had roughly 1,500 pages listed from the whole forum, and they all ranked very well for local, low-competition keyword searches.
The thing that has me alarmed at the moment is the figure of 25,000 pages listed for the forum. Since I disallowed the WAP2 pages I expected them to be deindexed and replaced with normal topics. The figure of 25,000 has really thrown me, and the fact that they are not being returned in searches has me even more concerned.
Also, just to put my mind at rest, would you possibly take a few minutes to look at my robots.txt and check there are no glaring errors?
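For reference, blocking the lightweight WAP2 views usually only takes a line or two. A hypothetical sketch (the exact URL pattern depends on your forum software, and the `*` wildcard in the path is a Google extension rather than part of the original robots.txt standard):

```
User-agent: *
Disallow: /forum/*wap2
```

Anything more sweeping, such as `Disallow: /forum/`, would block the topic pages themselves, which is the mistake worth double-checking for.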
the pages at those urls are already indexed.
someone might actually be linking to those urls.
you have to decide what signal you really want to send the bot.
for example, you can tell the bot "go away" (Disallow) or "there's nothing here" (404) or "go somewhere else" (301/302) or "take this" (200).
there are other response options, but Disallow doesn't mean "Forget about it".
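Those four signals can be illustrated with plain HTTP responses. A minimal sketch using Python's standard library, with hypothetical paths standing in for real forum URLs:

```python
# Each branch below sends the bot one of the signals described above.
# The paths are hypothetical examples, not real URLs.
from http.server import BaseHTTPRequestHandler

class SignalHandler(BaseHTTPRequestHandler):
    GONE = {"/forum/dead-topic"}                      # "there's nothing here" -> 404
    MOVED = {"/forum/old-topic": "/forum/new-topic"}  # "go somewhere else"   -> 301

    def do_GET(self):
        if self.path in self.GONE:
            self.send_error(404)
        elif self.path in self.MOVED:
            self.send_response(301)
            self.send_header("Location", self.MOVED[self.path])
            self.end_headers()
        else:
            # everything else: "take this" -> 200
            self.send_response(200)
            self.send_header("Content-Type", "text/plain")
            self.end_headers()
            self.wfile.write(b"topic content")

    def log_message(self, *args):
        pass  # silence per-request logging in this example
```

A robots.txt Disallow, by contrast, is not an HTTP response at all: the bot simply stops requesting the URL, so an already-indexed page can linger in the index until a 404 or 301 tells Google what actually happened to it.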
It appears my pages are allowed.
Just one more question with regard to specific forum pages: is it normal for forum pages to be viewed as a directory?
The analysis tool says :
Detected as a directory; specific files may have different restrictions
[edited by: goodroi at 2:05 pm (utc) on Jan. 8, 2009]
[edit reason] examplified [/edit]