2) robots.txt must be placed in the root of your domain.
3) You won't have to do a thing beyond that; once a crawler finds it, it will crawl your site if it wants to.
I created a robots.txt containing this:
User-agent: *
Disallow: /forum/posting.php
Disallow: /forum/admin
Disallow: /forum/images
Disallow: /forum/privmsg.php
Disallow: /forum/profile.php
Disallow: /forum/memberlist.php
Is there anything else I need to add?
I've read some stuff about having to disable session IDs for *G* to effectively spider the forum pages. Does this need to be done?
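Before uploading a robots.txt like the one above, it can be worth sanity-checking that the rules actually block what you think they do. A quick sketch using Python's standard urllib.robotparser module (the paths tested are just examples from a stock phpBB install):

```python
# Sanity-check the robots.txt rules locally with Python's
# standard-library robots.txt parser.
from urllib import robotparser

rules = """\
User-agent: *
Disallow: /forum/posting.php
Disallow: /forum/admin
Disallow: /forum/images
Disallow: /forum/privmsg.php
Disallow: /forum/profile.php
Disallow: /forum/memberlist.php
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

# Disallowed paths should come back False; everything else True.
print(rp.can_fetch("*", "/forum/profile.php"))     # False
print(rp.can_fetch("*", "/forum/admin/index.php")) # False (prefix match)
print(rp.can_fetch("*", "/forum/viewtopic.php"))   # True
```

Note that each Disallow line is a prefix match, so `Disallow: /forum/admin` covers everything under that directory as well.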
You'll need to mod_rewrite the URLs on phpBB so that Google can crawl the pages easily.
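As a rough illustration only, this is the kind of .htaccess rewrite such a mod produces; the exact URL patterns here are hypothetical and depend on which phpBB rewrite mod you install:

```apache
# Hypothetical sketch: map static-looking URLs back to phpBB's real
# scripts, so crawlers see clean URLs without session IDs.
RewriteEngine On
RewriteRule ^forum/topic([0-9]+)\.html$ /forum/viewtopic.php?t=$1 [L,QSA]
RewriteRule ^forum/forum([0-9]+)\.html$ /forum/viewforum.php?f=$1 [L,QSA]
```

The mod also has to make phpBB emit those clean URLs in its own links; rewriting alone only handles the incoming requests.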
There are lots of bots you might want to block in your robots.txt file. The only bots you really want to let in are the search engines whose indexes you want to appear in.
Anything else coming in is just a waste of your bandwidth and server load.
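Blocking a specific crawler is just another User-agent section in the same file; the bot name below is a placeholder, not a real crawler:

```
User-agent: SomeBadBot
Disallow: /

User-agent: *
Disallow: /forum/posting.php
```

Keep in mind that robots.txt only stops well-behaved bots; scrapers that ignore it have to be blocked at the server level (.htaccess deny rules or a firewall) to actually save bandwidth.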
TJ