| 3:39 am on Sep 3, 2011 (gmt 0)|
There is a flood of coupon sites these days. I think you're going to need a major innovation to get Google to rank you well. Otherwise, yes - your site will bang smack into a ranking problem.
And I don't think a noindex meta tag on the coupon page alone is going to give you the edge you want. If I had an innovative idea for a coupon site, I'd probably be building it myself.
| 4:27 am on Sep 3, 2011 (gmt 0)|
Kind of off subject, but this site WILL do well, guaranteed.
The problem with most coupon sites, I believe, is the lack of content. I have yet to see (besides mine) a coupon site with content-filled category pages....
It seems like the made-for-AdSense coupon "blogs" (that don't offer coupons) are always at the top of the SERPs.... Probably due to the content.
| 5:00 am on Sep 3, 2011 (gmt 0)|
You might be able to augment coupons with product reviews. That would provide content as well as an additional service to your visitors. But tedster is right, there is a flood of coupon sites.
| 12:10 am on Sep 4, 2011 (gmt 0)|
With all due respect the thread isn't about my abilities to create a successful coupon site. (Send me a PM on your stats for a successful coupon site and I will PM you when I get there.)
My main question is: will Google devalue the site if 90% of the pages are noindexed?
| 1:11 am on Sep 4, 2011 (gmt 0)|
Google could very well devalue your site if your business model is poor, or you're jumping into an already crowded niche without adding significantly to the user experience. That's what people are telling you. It's not the answer you asked for, but it's the answer you got.
| 1:15 am on Sep 4, 2011 (gmt 0)|
And I'd answer "yes" to your question. 90% of your URLs being noindex also sounds like a potential devaluation to me. I can't swear to this as 100% gospel, but I certainly don't know of any counterexample either.
Given the way Panda works, and Google's interest in rating the entire site by its quality, I can only assume it would be an issue.
| 7:11 am on Sep 4, 2011 (gmt 0)|
To get traction with a coupon site you're going to need something that sets the site apart from the army of coupon sites being built right now. Backlinks and web browser hits that Google can see and say "hey, lots of people visit that site and they seem to like it more than the others" are your priority, fast.
Coupons are overdone.
As for your original question: I would not add noindex to the individual pages right away. I'd wait to see how the search engines react first, and if they completely ignore the pages, by all means fix or remove them later. My favorite site is full of articles that everyone who enjoys the topic loves to read, BUT there is also a shop section tucked away via a navigational link that Google seems to think is much more important than the articles. Let search engines guide you with their responses to your changes.
| 2:06 pm on Sep 4, 2011 (gmt 0)|
[Sorry I am going off topic here - I will get back on topic in a moment, promise]
On the other hand, I was searching for some coupon sites the other day to find some usable coupons and noticed there was a dearth of any good quality sites. They all seemed like typical "Give us your email so we can spam you... er.... we mean, send you valuable coupons" type sites.
I think the world could use a really good manufacturers / stores coupon site. One that actually gives out coupons without trying to harvest emails.
Maybe they are out there somewhere. I would love to know about one or two of them. When I started a thread in the Foo forum asking if anybody knew of any good coupon sites, nobody replied.
Back to being on topic:
If I read you correctly, it sounds like you will set up your site so that each individual coupon will have its own page, is that right?
Would it not be better to have pages where the coupons are sorted by group (such as cereal coupons, produce coupons, or coffee coupons), or to set up pages where the coupons are grouped by manufacturer or store?
I think if you had those pages as your main indexed pages with a lot of content on them, and either noindexed the individual coupon pages, or put the individual coupon pages in a directory and robots disallow that directory, that should be fine.
It might be similar to how lots of forums I see operate. They have lots of posts that rank well. Yet they link out to lots of profile pages of forum members that are disallowed via robots.txt (so as to avoid people spamming with profiles). That doesn't seem to hurt the ability of those forum posts to rank well.
The webmaster world forum pages are actually a good example of this.
It's not a perfect analogy, but the similarity is that each of those forum pages that DO rank well will have several links to (member profile) pages that are NOT INDEXED / blocked via robots.txt, and that doesn't seem to hurt the ability of the forums to rank well.
so a coupon CATEGORY page could rank well even if it links out to dozens of individual coupon pages that are noindexed or are disallowed via robots.txt
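For what it's worth, here's a minimal sketch of the two blocking approaches described above (the /coupon/ directory name is hypothetical; the important part is keeping every individual coupon page under one path):

```
# robots.txt at the site root -- blocks crawling of the individual
# coupon pages while leaving the content-rich category pages crawlable
User-agent: *
Disallow: /coupon/
```

The noindex alternative is a `<meta name="robots" content="noindex, follow">` tag in the head of each individual coupon page. Unlike a robots.txt disallow, it lets Google crawl the page (and follow its links) while keeping it out of the index.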
| 3:18 pm on Sep 4, 2011 (gmt 0)|
The problem with most coupon sites now is that they don't offer any real coupons. Search the term "Dog Food Coupons" and let me know what you guys find... If you think it's going to be hard to beat out these guys who don't even offer coupons, I feel bad for you. (Most niches are the same way.)
Thanks for your reply Planet13....
I am doing exactly as you stated.
As of right now I have about 20 main category pages, each filled with a nice amount of content (300-500 words). From those main category pages I have linked to about 300 subcategories, all of which have content-filled pages (300-500 words).
So as of right now the site is filled with some nice content.
The problem I was seeing is that the site is built on WordPress, and every time I add a new coupon, that coupon gets its own post (thus an indexed page). These posts have just a couple of sentences describing the coupon, so my first thought was of Panda and thin content.
I didn't want to noindex the coupon pages only to have Google, after I've added 5,000 coupons over the course of the next year, turn around and slap me on the hand for having too many noindexed pages.
I plan on ranking my category pages, not so much the individual coupon pages.
| 3:24 pm on Sep 4, 2011 (gmt 0)|
Make sure you have a system for removing expired coupons.
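A system like that could be as simple as a daily job that filters on an expiry date. Here's a rough sketch in Python (the coupon records and field names are made up for illustration; a real site would pull these from its database):

```python
from datetime import date

# Hypothetical coupon records: each has a code and an expiry date.
coupons = [
    {"code": "SAVE10", "expires": date(2011, 9, 1)},
    {"code": "DOGFOOD5", "expires": date(2011, 12, 31)},
]

def active_coupons(coupons, today=None):
    """Return only the coupons whose expiry date has not passed."""
    today = today or date.today()
    return [c for c in coupons if c["expires"] >= today]

# Run once a day (e.g. from cron) and unpublish anything not returned.
live = active_coupons(coupons, today=date(2011, 9, 4))
print([c["code"] for c in live])  # SAVE10 expired Sep 1, so only DOGFOOD5 remains
```

Whether you delete, redirect, or mark expired coupons as "expired" on-page is a separate decision, but the filtering step is the part worth automating.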
| 5:09 pm on Sep 4, 2011 (gmt 0)|
People get the question wrong, all the time.
The question isn't "Will Google like/value my site?".
The question is "Will people like/value my site?".
Google will eventually be irrelevant if it doesn't focus on the latter.
At one time links represented sufficient/valid social voting, genuine human intelligence = "people must really value this site". Then links from "certain sites" represented such a vote - authoritative sites, etc. - but even that model became subject to manipulation as links increasingly became a currency of exchange.
Google gets that it faces irrelevancy in the discovery process if it fails to take into account social votes, endorsements, social signals, personal/behavioral signals, etc. Ergo the effort to use social factors to help rank websites.
Why would anyone vote for your site? Okay, they're your sisters, so you got 3 votes.
So, what else you got? Who else?
| 6:50 pm on Sep 4, 2011 (gmt 0)|
|my first thought was of Panda and thin content |
Panda does not go after "thin content". It tries to identify "shallow content" - and that is a very different issue. These two phrases were invented by Google spokespeople and many people get their meanings crossed.
1. Thin content is content provided by an affiliate feed and published with no value added by the site where it appears. This kind of affiliate site was being demoted long before Panda.
2. Shallow content can be 1000 words of completely original writing. The thing that makes it "shallow" is that after you read it, you've essentially learned nothing at all.
Webwork gave the best advice: The question isn't "Will Google like/value my site?". The question is "Will people like/value my site?" It sounds like you are giving this angle some good attention, but just got concerned about possible Panda troubles.
I'd say forget about Panda troubles. No one understands that algorithm well enough to predict anything at all. As long as you're not intentionally pumping out shallow content, forget about the Panda-monium.