
Google SEO News and Discussion Forum

    
Need Advice on Handling Similar Category Pages on Site

Coleman123 · 1:26 pm on Oct 19, 2013 (gmt 0)

I have a website that provides listings throughout the US. Back when I was first building the site about three years ago, I created separate category pages for categories that were very similar to each other.

So, for every city and zip code in the US we provide listings under five different categories, and three of those are very similar. The three categories are essentially the same; only the term used varies, depending on the region of the country you're from.

Up until recently we were getting a high volume of traffic from each category. But now I think these three similar categories are being treated as duplicate content, or at minimum are cannibalizing each other in the Google SERPs.

I believe this because, besides the major loss of traffic, when I now search for any of the three category keywords, Google highlights the other two if they appear in the title or description of any result displayed.

My question is: what's the best strategy for removing two of the categories from the index and setting the third as the version to index? I still want to keep the two removed categories available on the site, to maintain the user experience.

I'm thinking the best strategy is to set a canonical tag on the two categories being removed, pointing to the one category remaining in the index.
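For illustration, the canonical link element in the <head> of each of the two duplicate category pages might look something like this (the domain and paths here are made-up placeholders, not the site's real URLs):

    <!-- On /area-listings/90210/ and /local-listings/90210/, both pointing
         at the one category page chosen to stay in the index -->
    <link rel="canonical" href="http://www.example.com/listings/90210/">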

Would I also update the robots meta tag to NOINDEX or 'NOINDEX, NOFOLLOW'?
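For reference, those two robots meta tag variants would be written as:

    <meta name="robots" content="noindex">
    <meta name="robots" content="noindex, nofollow">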


Also, there are about 70,000 pages on the site per category, so this update would mean no-indexing/manipulating roughly 140,000 pages (two categories' worth). With my site already down in the SERPs, it's vital that I don't screw this up!

Would you perform this on a small test section first (I'm thinking yes, even as I ask this)?

 

Coleman123 · 4:27 am on Oct 20, 2013 (gmt 0)

Any thoughts here?

aakk9999 · 1:29 pm on Oct 20, 2013 (gmt 0)

I agree with your plan to implement a canonical link element on the two city/zip category pages, pointing to the third category that you choose to have indexed.

I would not set a robots meta noindex. Implementing the canonical already has that effect: pages whose canonical points to a different URL will be dropped from the index anyway.

I would also agree with your thought of testing this on one small section of the website first, but I would choose a selection of zip codes/cities that get crawled more often, and I would monitor my logs for the chosen URLs to confirm they have actually been re-crawled before drawing any conclusions.

The reason: with such a big number of pages, if you implement this on pages that are not crawled often, you may have to wait a long time to see results; and if you do not check whether Google has re-crawled the pages, you may end up drawing the wrong conclusion.
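As a rough sketch of that log check (assuming combined-format Apache access logs; the log path and URL prefixes below are hypothetical placeholders, not anything from this thread):

    # Count Googlebot requests to the chosen test URLs in an access log.
    import re
    from collections import Counter

    LOG_FILE = "/var/log/apache2/access.log"  # hypothetical path
    TEST_PREFIXES = ("/category-a/90210", "/category-b/90210")  # hypothetical test URLs

    # Pull the request path out of a combined-log-format line.
    request_re = re.compile(r'"(?:GET|HEAD) (\S+) HTTP/[^"]*"')

    hits = Counter()
    with open(LOG_FILE, encoding="utf-8", errors="replace") as log:
        for line in log:
            # Crude user-agent match; verify via reverse DNS if spoofing matters.
            if "Googlebot" not in line:
                continue
            match = request_re.search(line)
            if match and match.group(1).startswith(TEST_PREFIXES):
                hits[match.group(1)] += 1

    if hits:
        for url, count in hits.most_common():
            print(count, url)
    else:
        print("No Googlebot requests to the test URLs yet - keep waiting.")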

goodroi · 7:25 pm on Oct 23, 2013 (gmt 0)

It sounds like you have massive amounts of questionable-quality and near-duplicate pages. You can fix this one case (and I agree with what aakk9999 said), but you might want to take a step back and look at what unique value you provide, because you are probably going to have many more issues with Google traffic.

bumpski · 12:30 pm on Mar 20, 2014 (gmt 0)

Coleman123: How did you make out on this?

Coleman123 wrote:
"I still want to keep the two removed categories available on the site, to maintain the user experience."

goodroi wrote:
"It sounds like you have massive amounts of questionable-quality and near-duplicate pages. You can fix this one case (and I agree with what aakk9999 said), but you might want to take a step back and look at what unique value you provide, because you are probably going to have many more issues with Google traffic."
This appears to be yet another example of Google failing, yet the recommendation is to change the site to suit Google's tastes rather than to serve the visitor.
aakk9999's advice appears to be the best approach from an SED perspective (Search Engine Degradation), but again, it shouldn't even be required.

Many sites need multiple topically organized indexes to be easily navigated by a visitor without Google's aid, or in spite of it.
It's like a library: author, title, and subject indexes will of course have very similar file names, file paths, titles, descriptions, etc. But should one simply acquiesce to Google's poor-quality algorithm?
As stated by the OP, the pages are needed for the best user experience, and the evidence appears to indicate Google has intentionally penalized these pages.
