Forum Moderators: Robert Charlton & goodroi
The first, and oldest, uses a query string, like /index.php?section=forum&topic=123
The second uses a somewhat more search-engine-friendly format like /forum/topic/123, and my site is mainly listed in Google under this format.
Recently I've been looking more into SE optimization, and I want to change the URL format to a third style like /forums/sub-forum-title/topic-title. I would also set up the other two formats to do a 302 redirect to the new format.
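If the site runs on Apache, the redirect wiring could be sketched with mod_rewrite along these lines. Note this is only a sketch: the patterns assume the two old URL formats described above, and since the new title-based URLs need a database lookup, old URLs are routed into a hypothetical redirect.php that would look up the topic title and issue the actual redirect (typically a 301 for a permanent move):

```apache
# Sketch, assuming Apache + mod_rewrite. redirect.php is a hypothetical
# script that maps a topic ID to its new title-based URL and sends the
# redirect header itself.
RewriteEngine On

# Old query-string format: /index.php?section=forum&topic=123
RewriteCond %{QUERY_STRING} ^section=forum&topic=(\d+)$
RewriteRule ^index\.php$ /redirect.php?topic=%1 [L]

# Old path format: /forum/topic/123
RewriteRule ^forum/topic/(\d+)$ /redirect.php?topic=$1 [L]
```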
By changing the URL, I assume my site will lose its PR. However, the PR is only 1 since it doesn't have many inbound links (yet).
Does anyone know of any other things that might get affected by changing this? I currently get around 2000 unique visitors a month coming directly from Google, and want to keep it that way.
Things I've taken into consideration:
Cons:
- loss of PR (willing to sacrifice that)
- breaking current inbound links (although 302'd to new location)
Pros:
+ (imho) more professional-looking URLs (does Google take that into account? I think the average human does, though...)
Does anyone else know of things I must consider when changing the format? Or any opinions on the likely results of this makeover?
Cheers,
Tom
Keywords in URL, and a folder-like structure, will not necessarily give you a boost in rankings, so you might want to think this through a bit more.
Most forums are badly indexed because they have multiple URLs that can access the same content, the same title and meta description on multiple pages, and many pages that never need to be indexed showing up in search results (like "new post", "new reply", "send PM", and "edit profile" pages), all carrying the same "Error. You are not logged in." message. Additionally, get all the "print friendly" pages out of the index too. They are yet more duplicates.
Get everything apart from thread listings, and threads, out of the search engine indexes, then look closely at what is left. If you are using phpBB, vBulletin, or a similar package, then you will see that each post on your forum has at least 12 different URLs that can access it. Work on getting all of the duplicates out of the index, either by using robots.txt or the robots noindex tag on the pages themselves (test the URL that was requested, and add the noindex meta tag for all but one version).
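The "noindex all but one version" test could be sketched like this (Python for illustration; the URL patterns here are hypothetical placeholders, so adapt the regex to whatever your forum package actually generates):

```python
import re

# The one URL shape we want indexed: a plain thread view like /forum/topic/123,
# optionally paginated. Everything else gets noindex. (Hypothetical pattern --
# substitute your forum's real canonical thread URL format.)
CANONICAL = re.compile(r"^/forum/topic/\d+(\?page=\d+)?$")

def robots_meta(requested_url: str) -> str:
    """Return the robots meta tag to emit for the URL that was requested."""
    if CANONICAL.match(requested_url):
        return '<meta name="robots" content="index,follow">'
    # Every other route to the same content (post anchors, highlight links,
    # printer-friendly views, etc.) is marked noindex so only one copy
    # of each thread ends up in the index.
    return '<meta name="robots" content="noindex,follow">'
```

The follow part is deliberate: the duplicate pages still pass their links on, they just stay out of the index themselves.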
One major design fault in those forum packages is the URL generated from the "nextnewest" and the "nextoldest" links on every thread on the site. They are one of the biggest causes of duplicate content on tens of thousands of sites.
I made a much longer post, or two, on this topic just a month or so ago, if you care to search this forum for that too.
As for duplicate content, I didn't really look at it that way. I found a post stating that a 301 will remove the old URL from the index, and add the new one. (I also found this info on Google's website.)
I'll also be looking into the 20-URLs-for-the-same-page problem. I'm using a custom-written forum, but it does indeed have such a 'feature'. I can't really call it a bug, because it is used to link directly to the correct page a post is on, without having to know the page number (which is quite useful in large discussions, since page numbers can change when posts get deleted).
I'll be adding a meta robots noindex,follow to the individual post links, and only have a robots index on the actual pages that need to be indexed. The rest of the 'useless' pages I'll block with robots.txt.
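For the robots.txt side, the 'useless' pages can be kept out with Disallow rules; the paths below are just placeholders for whatever a forum actually uses for reply, PM, profile, and print-friendly pages:

```
User-agent: *
Disallow: /forum/newreply
Disallow: /forum/newpost
Disallow: /forum/sendpm
Disallow: /forum/profile
Disallow: /forum/print
```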
Anyway, thanks a lot for the useful info.
Cheers,
Tom