Welcome to WebmasterWorld
I currently have a site that has a large number of stories (1400+) that I would like indexed.
However, it uses a typical pagination scheme, and I am not quite sure how google will be able to see all the stories.
It goes something like this:
story.php?page=****
where **** will display a page containing 20 stories at a time. At the bottom of each page it has a typical 'PREV 1 2 3 4 5 NEXT' kind of setup. The way I figure, though, google would have to crawl something like 14 pages deep (1400 / 20 / 5) in order to see all of the content.
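That depth arithmetic can be sketched quickly (the story count and page size are from the post; treating each click on NEXT as exposing one new window of 5 page links is my assumption):

```python
import math

stories = 1400   # total stories on the site (from the post)
per_page = 20    # stories shown per listing page
window = 5       # page links visible in the 'PREV 1 2 3 4 5 NEXT' bar

total_pages = math.ceil(stories / per_page)    # 70 listing pages
# Each NEXT click reveals roughly one new window of page links,
# so the last listing page sits about total_pages / window clicks deep.
crawl_depth = math.ceil(total_pages / window)  # ~14 clicks from page 1
print(total_pages, crawl_depth)
```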
Each story on those pages is linked to showstory.php?a=**** where **** is the story name. (I was cautious to avoid using 'id'.)
How can I best set up these pages so google can access every story? I was considering adding a separate page that just lists all of the 'pages', which would only be about 70 or so (it would just have a list of links like story.php?page=1, story.php?page=2 ... all the way to the last page). This way each page listing would only be about 2 links deep.
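A minimal sketch of that "all pages" index, assuming the story.php?page=N URLs quoted above and plain anchor tags (the link text and separator are my invention):

```python
stories = 1400
per_page = 20
total_pages = -(-stories // per_page)  # ceiling division -> 70 listing pages

# One flat list of links, so every listing page is at most
# two clicks away from this index page.
links = [f'<a href="story.php?page={n}">Page {n}</a>'
         for n in range(1, total_pages + 1)]
html = "<br>\n".join(links)
print(len(links))
```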
How do you guys handle this?
My experience (with asp) has been that Googlebot's better at hitting "NEXT" than many users. :) :(
>>I was considering adding a separate page that just listed all of the 'pages', which would only be about 70 or so.
That should speed things up. Make it palatable for users too: call it a 'site map' and link to it from your home page.
There are lots of other ways to break a site map up (alphabetically, by topic, etc.), so I'd look for a solution that's intuitive for users and that at the same time has pretty high fanout (you don't want your site map to be 7 levels deep; 1-3 levels deep is much better), while keeping individual pages under 100K or so.
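The fanout-versus-depth trade-off above can be illustrated with a small helper (a hypothetical function, not anything from the thread):

```python
def levels_needed(pages: int, fanout: int) -> int:
    """How many levels a site map needs if every index page links
    to at most `fanout` children."""
    levels, capacity = 1, fanout
    while capacity < pages:
        levels += 1
        capacity *= fanout
    return levels

print(levels_needed(70, 100))     # the poster's 70 listing pages fit in 1 level
print(levels_needed(10000, 100))  # even 10,000 pages need only 2 levels
```

High fanout is why the flat index works: one page of ~70 links puts every listing a single hop away.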