| 10:59 pm on Jan 17, 2008 (gmt 0)|
Yes, that is a common Duplicate Content issue.
Each of those pages should have only one URL.
| 11:03 pm on Jan 17, 2008 (gmt 0)|
Yes, it's an issue. Either change your navigation to not include the Offset if it's 0, or add &offset=0 to all the other links on other pages linking to that page.
| 11:05 pm on Jan 17, 2008 (gmt 0)|
Go with the shortest URL as being the canonical version if you can.
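If you can edit the page code, one way to enforce the short URL as the canonical version is to 301-redirect any request that arrives with offset=0 back to the bare URL. A rough classic ASP sketch (untested, and it assumes offset is the only querystring parameter you need to worry about):

```
If Request.QueryString("offset") = "0" Then
    ' permanently redirect the duplicate URL to the bare script name
    Response.Status = "301 Moved Permanently"
    Response.AddHeader "Location", Request.ServerVariables("SCRIPT_NAME")
    Response.End
End If
```

A 301 tells the engines to consolidate everything onto the short URL, so even links you don't control stop creating duplicates.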
| 3:50 am on Jan 18, 2008 (gmt 0)|
I just don't know how I could change the navigation to exclude &offset=0
so would this work in robots.txt?
| 4:42 am on Jan 18, 2008 (gmt 0)|
Somewhere in the code there is a variable that tells you which page you are on; let's call it "page".
All you have to do is add some code like this (not tested):
If page - 1 = 0 Then
    Response.Write "<a href=""" & Request.ServerVariables("SCRIPT_NAME") & """>Previous Page</a>"
Else
    Response.Write "<a href=""" & Request.ServerVariables("SCRIPT_NAME") & "?offset=" & (page - 1) & """>Previous Page</a>"
End If
This works with a page that has "Next" and "Previous" type pagination, but it should be simple to adapt it to 1 / 2 / 3 type
| 9:33 pm on Jan 18, 2008 (gmt 0)|
You can put that disallow statement in your robots.txt file, but the links would still exist on your site.
There are two negative effects from that:
1. They waste PageRank flow within your site; PageRank that could have been channeled elsewhere.
2. The URLs are still visible to visitors, who may well cut and paste those URLs into page content on other sites, thereby creating links to your site that are useless to you for PageRank.
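For what it's worth, a wildcard pattern like this would keep the offset=0 duplicates out of the index for crawlers that honor it (Googlebot supports the * wildcard, but it is a nonstandard extension to robots.txt, so other bots may ignore it):

```
User-agent: Googlebot
Disallow: /*offset=0
```

But as noted above, blocking the URLs doesn't recover the PageRank flowing into them; fixing the links (or redirecting) is the better cure.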