Hi Everyone, I would really appreciate some of your views on the best way to do this.
We have a site specialising in offering certain “day out” offers.
The way the site is structured, you can find an offer for one of these days out in a number of location-based ways:
1. Drilling down from large cross-sections of the country (e.g. East, North East, London). Underneath this you have counties, and underneath those you have towns/cities. Each level is clickable, and you then see the offers in that area. This results in URLs such as:
http://example.com/venues/east/cambridgeshire/cambourne/
2. Once you reach a destination, you can also specify a radius (25 miles, 50 miles, etc.). This results in URLs such as:
http://example.com/venues/east/cambridgeshire/cambourne/~rad:20
3. You are also able to find offers by searching for a postcode. This results in a URL like:
http://example.com/venues/cb2/
The problem is that the current sitemap and index include every one of these possibilities. Keep in mind that although the final "offer" or product may be the same across different location levels, the combination of offers on each page will differ. For instance, in the example above,
http://example.com/venues/cb2/ might have four offers, two of which are the same as on:
http://example.com/venues/east/cambridgeshire/cambourne/
Hope that makes sense.
My Gut Instinct:
Having so many pages indexed by Google with what is essentially a variation/combination of the same listings could get us in trouble with Panda (if it hasn't already). If that is the case, it's not just the sitemap we need to recreate; we'd also need to noindex a lot of the existing pages. Can dropping such a large number of pages have a negative effect on the site?
I am tempted to NOINDEX every page that has a radius or postcode, as above, and leave only the cleaner location URLs indexed.
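To make the noindex rule concrete, here is a minimal sketch of how the two URL patterns could be detected. The `~rad:` marker comes from the example URLs above; the postcode regex (a UK outward code like "cb2") is an assumption and would need adjusting to the site's actual routing:

```python
import re

# "~rad:" marker taken from the example radius URLs in the post.
RADIUS_MARKER = "~rad:"
# Assumed pattern for postcode pages: /venues/<outward code>/ (e.g. /venues/cb2/).
POSTCODE_RE = re.compile(r"^/venues/[a-z]{1,2}\d{1,2}[a-z]?/?$", re.IGNORECASE)

def should_noindex(path: str) -> bool:
    """Return True for radius and postcode pages; False for clean location URLs."""
    return RADIUS_MARKER in path or bool(POSTCODE_RE.match(path))

# Examples from the post:
print(should_noindex("/venues/east/cambridgeshire/cambourne/"))         # False
print(should_noindex("/venues/east/cambridgeshire/cambourne/~rad:20"))  # True
print(should_noindex("/venues/cb2/"))                                   # True
```

Pages for which this returns True would get a `<meta name="robots" content="noindex, follow">` tag, leaving the clean location hierarchy as the only indexed entry points.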
Then create a new sitemap to reflect the above change.
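The new sitemap would then list only the clean location URLs. A rough sketch of generating it (the URL list here is hypothetical; in practice it would be built from the site's region/county/town hierarchy):

```python
import xml.etree.ElementTree as ET

# Hypothetical sample of clean location URLs from the hierarchy.
CLEAN_URLS = [
    "http://example.com/venues/east/",
    "http://example.com/venues/east/cambridgeshire/",
    "http://example.com/venues/east/cambridgeshire/cambourne/",
]

def build_sitemap(urls):
    """Serialise a list of URLs into sitemap-protocol XML."""
    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for url in urls:
        loc = ET.SubElement(ET.SubElement(urlset, "url"), "loc")
        loc.text = url
    return ET.tostring(urlset, encoding="unicode")

print(build_sitemap(CLEAN_URLS))
```

Radius and postcode URLs are simply never added to the list, so the sitemap and the noindex rule stay consistent with each other.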
Thoughts?