|Are almost-identical geo-targeted pages problematic?|
I'd bet someone here has experience with a question I have about multiple pages that contain repeated content. I'm worried about the problems that could come my way once they are indexed and ranked.
My website has a series of pages for data that is categorized by state. I'll use "price of a gallon of milk" as an example.
That's 50 state pages plus one for DC. Each page is about 750 words long, but only about 20 words are unique: the state name, of course, and the pricing info, which is numeric.
I usually hate cookie-cutter pages, but there is honestly no way I can think of to present this data without repeating the descriptive text it requires.
To make matters worse, I also have 51 pages each for "price of bread", "price of juice", and "price of sugar". That makes 204 pages of mostly repeated content.
Is my site going to be seen as a low-quality cookie-cutter operation? Do you have any ideas how to improve on this predicament?
Thanks to the smart people here in advance.
If you were Google would you be happy with pages that are 99% identical? I sure would not be happy.
Try to add relevant content that will be different per state. Think about what users would also like to see on the page to make your site more attractive to them and more unique to Google. Some examples: local coupons or special offers, reviews from local residents, recipes for local favorites, local store locations/descriptions, historical price change for the state.
I believe this has caused me a problem.
I have zoomable maps - click on a pixel in a map and it takes you to a zoomed in version of the map and lists geo targeted info for the new map. The problem is that adjacent pixels are likely to produce pages with identical text.
Clearly this would not work well with Google, and I am now noindexing those pages.
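For reference, I'm blocking them with a robots meta tag in the head of each zoomed-in page (the page itself is just an example of the pattern, not my real markup):

```html
<!-- in the <head> of each zoomed-in map page that should stay out of the index -->
<!-- "noindex, follow" keeps the page out of results but still lets crawlers follow its links -->
<meta name="robots" content="noindex, follow">
```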
After the first map, there are two further levels of zoomed map, each with 81,250 clickable pixels. I think that makes around 6.6 billion pages (81,250 × 81,250 ≈ 6.6 × 10⁹), many of which would be identical!
Thank you goodroi and denisi.
Goodroi: I should reveal a little more about the topic. It's health care services. I've gotten pretty creative in an overall sense - there's plenty of good pricing and comparison info about the health topic. I think it exactly answers questions everyone facing these issues asks.
The problem is, I have this data on 51 state pages for 4 topics.
I get your meaning, and I can't see how to go in your suggested direction. I believe I am screwed on my own content.
JD: Thanks. I thought <link rel=alternate> was pretty much reserved for 100% identical pages in different languages.
The 5% of my content that is unique to each page is what makes it relevant to someone from any particular state. That's the cost for a health service in that state.
I'm trying to remove all introductory and general text. That gets the word count down by 25% and raises the unique-content percentage by about a third.
I am also wondering about presenting the data on a mega page that would contain a table for all the states, with links for additional info to my state pages, which I would declare NOINDEX and perhaps give a <link rel="canonical"> pointing to the mega page.
|I thought <link rel=alternate> was pretty much reserved for 100% identical pages in different languages. |
I understand, and I think that's what most people think, since that's the example Google gives, but the actual definition of "alternate" according to w3.org (in the context where it's used without an hreflang or type attribute) is:
|The keyword creates a hyperlink referencing an alternate representation of the current document. |
I use it quite a bit in situations similar to yours, because that's exactly what some of my pages on sites are. They're "alternate representations of the current document" with "small differences" based on the specifics of what someone is looking for.
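As a rough sketch of what that looks like (the URLs here are made up, not anyone's real pages), each state version can point to the others as alternates:

```html
<!-- hypothetical: in the <head> of /milk-price/alabama -->
<link rel="alternate" href="https://example.com/milk-price/alaska">
<link rel="alternate" href="https://example.com/milk-price/arizona">
<!-- one link per state version of the same underlying document -->
```

With no hreflang or type attribute, each link simply declares the target to be an alternate representation of the current document, per the definition quoted above.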
|I'm trying to remove all introductory and general text. That gets the word count down by 25%, and raises the unique content % by (something?). |
Last I heard, you need a page to be about 85% unique for it to be considered unique, so I'd stick with alternates for the current location-based versions of the same page and not "fix the text" if it's not broken.
|I am also wondering about presenting the data in a mega page that would contain a table for all the states, with links for additional info to my state pages, which I would declare NOINDEX and perhaps use a <link rel="canonical"> on them pointing to the mega page. |
I like this idea too... It won't apply in some of the situations I'm in, but if you can do this somehow, then I'd look into it and maybe use "fragment links" to navigate to the specific info for different states, e.g. <a href="/the-page.html#alabama"> with a <section id="alabama"></section> wrapped around the info specific to Alabama.
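Putting the pieces together, a sketch of what you described (all paths and names here are placeholders, not real URLs):

```html
<!-- hypothetical: in the <head> of a state page like /milk-price/alabama -->
<meta name="robots" content="noindex, follow">
<link rel="canonical" href="https://example.com/milk-prices">

<!-- hypothetical: on the mega page /milk-prices, one section per state -->
<section id="alabama">
  <h2>Alabama</h2>
  <!-- the Alabama-specific pricing table goes here -->
</section>

<!-- fragment link that jumps straight to the Alabama section -->
<a href="/milk-prices#alabama">Alabama milk prices</a>
```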
I have a similar situation with events, and not all of them have extensive descriptions, so I have to distinguish the pages with images, video, maps, weather information, donation information, back story, commentary, whatever I can scare up. Plus I have comments and reviews by the users.
If you really can't find a way to somehow expand your niche or serve it up in a different way where you can combine similar states or price points or health categories, then yes, you're probably going to have problems.
Some niches just ain't built for Google. And vice versa.
Thanks again Jd. Good points.
Netmeg: I was thinking of your site when I wrote this, that your event pages must be pretty challenging.
I'm leaning toward the mega page and <link rel="canonical"> strategy. Perhaps on a new domain, so that I don't bring new problems to my original site.
The users help too, they bring me stuff I can post that's unique to the event or the location.