Forum Moderators: open
I have fixed URLs, so they look like www.example.com/page/news.
Each page follows that pattern and contains plenty of content on its subject. I have also included these statements in the meta tags:
<META NAME="revisit-after" CONTENT="4 days">
<META NAME="robots" CONTENT="FOLLOW,INDEX">
What else might I do? My site has been up for 6 months and still isn't showing properly in Google's index.
ps. Apologies for posting my website's link before.
Read the forum charter, read the FAQ and then read some of the thousands of posts that have been written here about the subject. Use the search function when you are wondering about something that isn't included in the FAQ - chances are high that your question has already been answered in a previous thread. :)
A good place to start when you've read the FAQ is this terrific thread started by Brett:
[webmasterworld.com...]
The most important thing for getting a bot to crawl your entire site is providing spiderable links to your interior pages.
Any links in on-page Javascript, such as for rollover menus, can give googlebot problems (though it is supposed to be getting better at following these). And G-bot will not follow any links in external Javascript files.
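To illustrate the difference (hypothetical markup, with made-up paths), a JavaScript-only "link" versus a plain, spiderable anchor might look like this:

```html
<!-- Hypothetical: a JavaScript-only link that a spider may not follow -->
<span onclick="window.location='/page/news'">News</span>

<!-- A plain HTML anchor that any spider can follow -->
<a href="/page/news">News</a>
```

If you need the rollover effect, you can usually keep the plain anchor and attach the behavior to it, so the href is still there for the bot.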
A good site map is one way to be sure spiders can find all of your pages. Link to it at least from the main page or in a footer or header common to all pages. Depending on the size of the site you might want to split it into multiple pages (categories, locations, etc) to make it easy for visitors to use.
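A rough sketch of that advice (hypothetical filenames and paths), with a common footer link plus a site map page of plain anchors:

```html
<!-- Hypothetical: footer link included on every page -->
<a href="/sitemap.html">Site Map</a>

<!-- sitemap.html: plain anchors to the interior pages, grouped by category -->
<h2>News</h2>
<ul>
  <li><a href="/page/news">News</a></li>
  <li><a href="/page/articles">Articles</a></li>
</ul>
```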
Jim
Thank you.
I will add three things:
Get more links from outside.
Add new content every day.
Try to keep all your pages within a folder structure no more than 3 levels deep.
<added> We try to avoid specifics as much as possible. More on image maps later...</added>
[webmasterworld.com...]
Googlebot won't have problems crawling it. You can improve link text by adding title="keyword" to the <a> tag, but, as far as I know, this won't improve deep crawlability.
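For example (hypothetical keyword and path), the title attribute goes on the anchor like this:

```html
<!-- Hypothetical: title attribute added alongside the visible link text -->
<a href="/page/news" title="keyword">News</a>
```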