I just noticed that Googlebot has been crawling pages in a category that do not exist. Here is an example link:
http://www.mysite.com/category.php?id=Blue-Widgets&page=276
This category only goes to page 11, so page 276 just shows the site header and some summary info about the category, but no items.
I do not want visitors or Google surfing pages that do not exist, so obviously I need to check my max page number before displaying the page.
I guess I could display a 404 since the info doesn't exist, but that isn't very nice for a visitor who happens to stumble onto this page somehow (not sure how Google did, since I don't link to actual pages in categories). One other thing: the number of items on a page is configurable if a user is logged in, so they could show 5 items per page or 20, which would affect the total number of pages. What is an acceptable way to handle this?
1- Just show the last page of the category, even for a non-existent page
2- Return a 404 error
3- Do a 301 redirect to the real last page of the category
4- ?
What would be the correct way to handle this in Google's eyes while not showing the user some browser error page? I would think number 3, except the page never really existed, though it could in the future if more items get added to the category.
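For what it's worth, option 3 could be sketched roughly like this (all names here are placeholders, not my actual code; the real item count would come from the database, and the items-per-page value from the logged-in user's settings):

```python
import math

def resolve_page(requested_page: int, item_count: int, items_per_page: int):
    """Decide how to serve a requested category page.

    Returns ("redirect", page) when a 301 to the real last page is
    needed, or ("render", page) when the request is in range.
    """
    # Total pages depends on the per-user items-per-page setting,
    # so it must be recomputed on every request.
    total_pages = max(1, math.ceil(item_count / items_per_page))
    if requested_page < 1 or requested_page > total_pages:
        # Out of range: 301 to the last real page instead of a 404,
        # so the visitor still lands on useful content.
        return ("redirect", total_pages)
    return ("render", requested_page)
```

So with 101 items at 10 per page, `resolve_page(276, 101, 10)` would redirect page 276 to page 11, while the same URL for a user showing 20 per page would redirect to page 6 instead.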
Thanks for any replies.
[edited by: Robert_Charlton at 10:44 pm (utc) on Jan 3, 2013]
[edit reason] delinked sample url [/edit]