I'm here for one reason: sheer desperation. My situation is this. Imagine you have 100 websites. You wake up, look at your Google Webmaster Tools, and discover that every site has received a penalty. Yes, that's where I'm at.
I don't spam. I've resubmitted many times, testing the waters, and I'm still getting rejected. My penalty spans unrelated domains and keywords. I'm not a dishonest webmaster. I was using a template which pulled in a lot of content via scripts that I really think are great, including an RSS parser. I think they may have led to a duplicate content issue, but it seems like something way more serious is at work here. Most of my sites were single pages, simply as a first step of launching, and not for some sneaky reason. Some are selling things via Amazon affiliates, some are not. Yes, they are for the most part lean of content.
I've been slowly removing my head content, increasing original content, and then resubmitting to Google. I'm at the point now where my last attempt is to remove every last piece of head content (all my scripts), leaving me with a bare page. Of course I shudder at the thought of being rejected, because beyond this, I can't see anything else I can do.
I'm completely desperate here. I'm not a newb. I am simply hammered by an apparent Google god. I will save my rant for a later date. Surely to god I can have an ugly, one page website and still be allowed on the damn internet.
I'm not sure what you mean by no duplicate content penalty. I assumed a duplicate content penalty because mine applies to 100 of my sites, including completely unrelated subjects/URLs. Perhaps you're saying that a template alone is reason enough for this much of a penalty?
I did have another thought. Every resubmission attempt so far has failed. I've been trying on about 7 sites out of the 100. I keep removing page content and trying again and again. Like I said, the next step is to have none of the template head content whatsoever. My template then would only consist of a table layout!
Is it possible Google looks at all 100 sites, even though I've only resubmitted 1 site? In other words, they won't consider my request until the other 99 sites are fixed. Do they consider each site a separate entity, or am I boxed in, having to approach this as a whole and not as parts? I mean, I can only imagine that anyone replying would be completely guessing. I doubt that many people, if any, have been nuked in such a way. Now think about it...
You wake up tomorrow to find 95% of your websites penalized and removed from Google completely. I can't really put into words how this situation is making me feel right now. I appreciate any thoughts. Perhaps you can feel my pain.
Such sites are of course "allowed" on the internet - there's nothing illegal here - but Google does not want them on their top search rankings. They are quite aggressive in weeding them out of their results, and have a trained army of thousands of human reviewers who spot and report this kind of site or network.
Here's the Google Help reference that covers Google's point of view: Little or no original content [google.com].
So you can publish this kind of site, but you shouldn't depend on Google Search for traffic. Original content and unique value compared to other affiliates in the same niche is what wins the day.
The thing I'm struggling with is that I have a few sites with no affiliate or money-making aspects, still being penalized after 2 or 3 resubmission requests. Each time I am removing more and more template or duplicate-type content. One site is practically bare; the only thing left on it is some of the head content from my template. For this reason, I'm completely baffled at this point. If I remove that remaining head content, resubmit, and still remain penalized, well, I just don't think my nerves can handle it. I will be in an asylum at that point.
You have 100 sites, obviously small ones, since 100 is a big number. 100 sites you admit are "lean of content". You are also very concerned that Google has penalized you, so these 100 sites are likely completely dependent on Google traffic, and all are working together to hoist each other up in the rankings.
Now, why would Google want 100 "lean content" sites, all owned by the same person, at the top of the rankings? They do not. Google tries to get original, quality sites to the top of its rankings; if they feel someone is there who shouldn't be, they penalize.
I don't know how much money you were making with this plan but the luck train may have just pulled into the station for good.
Creating content on these sites via an RSS parser feed seems like a likely cause. In addition, having Amazon affiliate RSS feeds could be another reason. Having little original content is another likely factor, as is having a template with a big chunk of script code on each of the sites. However, even though I've removed these elements from the sites I've submitted for reconsideration, they remain penalized and out of the Google index. I have received many notices that Google has reviewed my reconsideration requests, but no penalties have been lifted.
An opinion only: That could be it. If you're using an automated content generator which is also being used by spammers, it probably has a 'footprint'.
Things like duplicate content, tons of subdomains, unnatural interlinking and backlinking don't help either.
I think Google dumps refried SERPs, also. If these are in a sub-directory, then no problem. If your whole site is re-fried SERPs, then the site is gone.
"Some are selling things via amazon affiliates, some are not. Yes they are for the most part lean of content. "
As others have mentioned, that statement alone sums it all up. Google wants unique content, not stub landing pages - especially ones thin on content with affiliate link ads.
The comment about putting all your sites into Webmaster Tools is interesting. I wonder the same thing. Does it now mean that I, or you, can be audited by Google? Backlink audits? Content audits? It makes me wonder a few things, and seems a bit unsettling.
I will summarize what happened here with my penalties.
-my strategy was to create many websites using a template.
-I used an rss parser for a few feeds from places like google blog search, bloglines.com, etc.
-these sites were lean on content.
-a majority had Amazon affiliate feeds.
-some did not have any form of affiliate ads or feeds but were also penalized.
-the majority of sites were one page.
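To make the duplicate-content risk in this setup concrete, here is a minimal sketch (not the poster's actual parser; the feed and function names are invented for illustration) of how an RSS-to-HTML script republishes feed items verbatim. Every site running a script like this against the same feeds ends up carrying identical text.

```python
# Hypothetical illustration of an RSS-republishing page generator.
# Any two sites fed the same feed produce byte-identical snippets,
# which is the kind of shared footprint discussed in this thread.
import xml.etree.ElementTree as ET

SAMPLE_FEED = """<?xml version="1.0"?>
<rss version="2.0"><channel>
  <title>Widget News</title>
  <item><title>Widget prices fall</title>
        <description>Widget prices fell 10% this quarter.</description></item>
  <item><title>New widget released</title>
        <description>Acme shipped a new widget today.</description></item>
</channel></rss>"""

def feed_to_html(feed_xml: str) -> str:
    """Turn each <item> into an HTML snippet, copied verbatim from the feed."""
    root = ET.fromstring(feed_xml)
    parts = []
    for item in root.iter("item"):
        title = item.findtext("title", default="")
        desc = item.findtext("description", default="")
        parts.append(f"<h2>{title}</h2><p>{desc}</p>")
    return "\n".join(parts)

print(feed_to_html(SAMPLE_FEED))
```

Nothing in the output originates on the site itself, which is why pages built this way read as "lean of content" no matter how many feeds they pull.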
My strategy was to launch many websites in phase one. In phase two, I was going to add more content and cosmetics. Before phase two really took place, I was penalized. Understand one thing: I was overly aggressive in my approach. The question remains, however: do people still have the freedom on the internet to launch websites in the manner that I did? I do have a plan, and that plan was never to have a bunch of one-page RSS-feed websites; it was to have an initial launch phase, like I said. Now, I suppose you can say that if I have Amazon links, then I'm a spammer. Well, if my website URL is buywidgets.com, is it unreasonable to offer widget sales? I mean, people would understand before they came to my site that I'm all about selling widgets. I don't get the spam argument if my site is what people are expecting and want.
The real crux of the issue for me now is: does Google not allow a one-page website in the index with lean original content? I suppose some people would argue that it's great they shut me out. Others might say, hang on, is Google acting as quality police and filtering the internet?
Google is the best thing that happened to the internet. I'm dirt without being indexed by them. It just really makes me wonder right now how I'm viewed by them. I say this because the site that I'm trying to get reconsidered doesn't have issues or grounds for still having a penalty. It's ugly and one page, but is that grounds for keeping my penalty? At this point, yes, it seems to be.
I suppose I'm posting this now, just to clarify with you folks what's going on. Perhaps if anything, a reminder to be careful. It seems like for me, I'm being judged as a whole and not as a part by Google. It seems I've done something terribly wrong and I'm paying the ultimate price. Unfortunately, I'm only left guessing what the issues are.
[edited by: tedster at 3:41 pm (utc) on Aug. 7, 2009]
[edit reason] widget-ize the copy [/edit]
The answer to that question is no - it is not unreasonable.
But did your site offer widgets for sale, or did it simply have affiliate links to other sites selling widgets?
I think there is a big difference, and Google does not want to send people to a page with no content, only affiliate links to other sites. I also think they make that quite clear in their guidelines and you were bound to have problems.
However, I, like you, would be interested to know if your position would ever improve after you have created a real website.
[edited by: tedster at 3:42 pm (utc) on Aug. 7, 2009]
[edit reason] widget-ize [/edit]
I would also start "adding" quality content to the sites you want to save instead of putting them on a diet, as the problem is likely two-fold: thin content and too many one-page sites.
It's certainly clear to me that I was in way too deep, with way too many domains to keep up. My mistake was thinking that because I was targeting different, yet similar, keywords and keyword phrases, I was not doing anything wrong. Wow, that was a huge mistake. Add to that the lean content, affiliate links, RSS feeds, and one-page sites, and I'm labeled a big-time loser. I was in too deep and didn't realize how this would all be viewed, even though this setup was by no means my end game. Spammers set up scams; I was setting up websites to work on and build upon. Affiliate links were only temporary while I was setting up my company. Ironically, now that I have my company, I don't have a website to sell on. Isn't that great?
My mistake was thinking that because I was targeting different, yet similar keywords and keyword phrases, that I was not doing anything wrong.
Well said. Using separate domains to "target keywords" is almost by definition working to manipulate the search engines. One domain can rank well on many different keywords -- it doesn't take a network of one domain per keyword.
Multiple websites can most definitely target separate MARKETS, tailoring their content specifically for different demographic niches or slices, even if it is the same product. But in this scenario, one domain's content would not be duplicated on another domain. To do well, both domains need to be rich in usefulness for their respective target niches.
The most difficult thing for me to deal with, because I'm just entering ecommerce, is this: I didn't realize that a general site covering my product would in fact cover the keywords and phrases that people would be searching for. That's a big, big, big realization for me. People with more knowledge of Google or search engines may realize this, but it wasn't clear to me, not until now.
I'm still struggling with what does and doesn't fall into the "manipulating search engines" category. I'm speaking in terms of how my domains would ultimately be viewed: would I be considered a spammer or not? I think people use the word spamming a bit too loosely these days, but I'll save that for a different day.
I keep coming back to this. If a search engine other than Google gives more value to having the keywords in your domain, and you have domains with those keyword phrases, does it make smart business sense to tear up those domains and throw them in the garbage?

I'm really struggling with the concept of one product, one website, per person. That is what I seem to be hearing, and what I seem to understand is how this works. If I push the limit and make a couple or a few different websites selling the same type of product, then I'm spamming or manipulating the index. I think in the real world of business, people or corporations are not limited in what they can or cannot do. However, it seems that in the internet world, businesses are. It seems like essentially one website, per product/subject, per person. I have to accept this, but as an ignorant ecommerce newbie, I'm having a tough time swallowing the concept.

I realize this is likely the best way for a search engine to fetch results. I mean, on any given subject/product, do you think there are two sites from the same person/company? If there are, I think we can claim they are doing something to manipulate search results. Am I going around in circles on this?
If you want them to send FREE traffic to help your business, then you've got to know what their "rules" are. As a parallel, if you were running a physically based company, then you would need to adapt every new location you opened to the local laws and building codes.
I work with many situations where one company has many websites, and they do so very openly. A main website and a company blog is one almost trivial example. Or a car manufacturer might have a separate website for their financing side. The variations on this theme are many.
If you spend some time browsing around many major companies' web presences today, you will see how they do it. And in many cases, the many domains are all registered under one WebmasterTools account.
I'm really struggling with the concept of one product, one website, per person.
No...if your 100 websites were all designed differently, didn't use the same exact descriptions, offered different products, and were more than a few pages each, then you would have been fine. I think you're struggling with the concept of having to have unique product descriptions and a unique layout per website.
Wal*Mart doesn't go and open a Mart*Wal right next door. It's the exact same store, just a different name. It doesn't help anyone who goes to Wal*Mart and doesn't find a product to go next door to Mart*Wal. That's what you're doing. Who would want a world with hundreds of stores that are all the same with different names? Why would Google want to link someone to both Wal*Mart, Mart*Wal, Mart*Wal Online, Wal*Wal*Mart, etc. if they all do the exact same thing? What an awful search engine that would be.
Wal*Mart instead opens a different kind of retail store that offers better deals to members, encourages bulk purchases, and in no way resembles the look and feel of Wal*Mart. It's called Sam's Club. That's what you can do if you want to make more than one site. You probably can't do it with 100 sites.
You have no choice now but to pick a few of those domains and then either sell or 301 redirect the rest. There's nothing left to grasp here. You can't have duplicate sites. You can't have duplicate sites. You can have as many domain names as you want. You can't have duplicate sites.
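For anyone unsure what the 301 option above looks like in practice, here is a minimal sketch of an `.htaccess` for one of the domains being retired (assuming Apache with mod_alias enabled; `keepdomain.example.com` is a placeholder for whichever site you decide to keep):

```apache
# Hypothetical .htaccess on a retired domain.
# "keepdomain.example.com" is a placeholder, not a real site.
# Redirect 301 preserves the request path, so old URLs map onto
# the corresponding paths of the surviving domain.
Redirect 301 / http://www.keepdomain.example.com/
```

This consolidates whatever link value the retired domains have into the sites you keep, rather than leaving dozens of penalized duplicates in the index.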
Google wants unique content, not stub landing pages - especially ones thin on content with affiliate link ads.
Nothing much has changed here over the last 4-5 years. Feed-generated, aggregate content that lots of folks share definitely doesn't do well, and I've seen sites damaged that participate on both sides of the aggregation. But I am also observing sites where multiple content feeds are sliced and diced in such a way that they provide a unique, quality user experience -- those sites are holding and growing good SERPs.
My guess is that many of your sites, even though they don't link to each other, rely on pretty much the same sources for their inbound links. If you've got 100 sites in the same market area, I don't see how you would have avoided this. As such, this adds one more factor to the already clear picture that these are not independent sites.
Nor do I see any problem in having sites on onetypeofwidget.com and anothertypeofwidget.com and so on. It should only become an issue when there is little useful content on each.
Perhaps you can develop a handful of the sites and see if they recover?