This account really surprises me. If this affiliate script was creating most of your pages, then their content should be duplicated across many other sites that run the same script, right?
If that's the case, then Google's stated intention is NOT to rank the site - it would be what they call a "thin affiliate". In that case, even replacing the script and whatever content it generates should eventually result in no rankings anyway.
Have I misunderstood what this script was doing?
Thanks for the reply Tedster. The store was a proprietary script that set up an amazon.com associates store.
I can't remember the exact rankings, but if I remember right, it also ranked fairly well for a number of other keywords that were in the site's articles, even though there was only a small amount of unique content.
Also, another thing I forgot to mention: the person who sold me the site said that in the previous month it dropped off the first page for its main keyword for 2 weeks, but then came back at an even higher position. I'm not sure, but I think the script was set up some time before that fluctuation occurred.
|300+ “Not found” crawl errors for the affiliate store's urls |
Is there some easy pattern with those urls that would let you create one simple disallow rule in robots.txt? If those urls once resolved, then Google will "remember" them for a long time - but a robots.txt rule will let you handle the issue more simply.
|Is there some easy pattern with those urls that would let you create one simple disallow rule in robots.txt? If those urls once resolved, then Google will "remember" them for a long time - but a robots.txt rule will let you handle the issue more simply. |
Thanks for the suggestion. I didn't even think to use to a robots.txt rule. (Shows my level of inexperience.) I've now disallowed access to the subdirectory that the script was installed under.
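For anyone following along, a disallow rule for a whole subdirectory is a one-liner. A minimal sketch (the directory name /astore/ here is hypothetical - the thread doesn't name the actual path the script was installed under, so substitute your own):

```
User-agent: *
Disallow: /astore/
```

Note that Disallow matches by url prefix, so this blocks crawling of everything under that directory. It doesn't instantly remove already-indexed urls, but it does stop Googlebot from re-requesting all those 404 pages.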
I think I may have another possible reason why the site dropped from the SERPs. I only just now remembered that the site's meta keywords and description were all messed up, which I fixed.
Here's what it looked like on every page:
|<meta name="keyword1, keyword2, keyword3" content="" />
<meta name="A descriptive sentence about the page." content="" /> |
I changed every page to:
|<meta name="keywords" content="keyword1, keyword2, keyword3" />
<meta name="description" content="A descriptive sentence about the page." /> |
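That kind of swap (values stuffed into name= with an empty content=) is easy to flag programmatically if you want to audit the rest of the site. A minimal sketch using Python's standard-library HTML parser - the MetaChecker class and the EXPECTED_META_NAMES list are my own illustration, not part of any store script:

```python
from html.parser import HTMLParser

# Meta names we expect to see; anything else (like a stuffed keyword list)
# gets flagged, as does any meta tag with an empty content attribute.
EXPECTED_META_NAMES = {"keywords", "description", "robots", "viewport"}

class MetaChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.problems = []

    def handle_starttag(self, tag, attrs):
        # Also reached for self-closing tags like <meta ... />, because
        # HTMLParser's default handle_startendtag delegates here.
        if tag != "meta":
            return
        d = dict(attrs)
        name = d.get("name")
        if name is not None and name.lower() not in EXPECTED_META_NAMES:
            self.problems.append(f'suspicious meta name="{name}"')
        if name is not None and not d.get("content"):
            self.problems.append(f'empty content for meta name="{name}"')

# The broken form from the post above triggers both checks.
checker = MetaChecker()
checker.feed('<meta name="keyword1, keyword2, keyword3" content="" />')
print(checker.problems)
```

Feeding the corrected form (`name="keywords"` with the keywords in `content=`) produces an empty problems list.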
Could the shock of making all those meta changes at once have caused the current problems?
[edited by: tedster at 4:48 pm (utc) on July 2, 2009]
I'm going to give you advice that many "SEOs" on here are going to yell and scream about, but oh well...
Put the script back!
Forget everything you heard about
"duplicate content" blah blah blah
You've been lied to, and unfortunately I don't have the time or patience to explain why - not with all the other people on this forum who are going to tell you otherwise (because they've been lied to as well and now believe it).
So just put the script back.
Add some more content, get backlinks, and enjoy the returned rankings and income.
The sooner you do this, the better.
No yelling and screaming here. Depending on your overall business situation (you did say there was cost involved, for example), I could suggest the same thing. That would especially include the "develop more content" part. Especially for future-proofing, you may well need that as a kind of safety barrier. Rankings might vanish some day, but in the meantime you have the traffic and revenue.
If rankings ever do vanish, then this would be the first area I'd investigate. But it would probably take a manual review for that to happen. As you just discovered, your thin affiliate site was not caught by any algo. And if you've added more unique value to the site, a human reviewer could still give you a thumbs up.
Was anyone linking directly to these pages or to the store?
I suspect the drop in rankings is not related to the content in your store script, but to the fact that Google found that hundreds of pages suddenly disappeared (not a good signal) and that your site is left with only 6 content pages.
A 6-page site and hundreds of 404s seem like good reasons to me for Google to reconsider the overall quality of your site.
I have been looking lately at this as a factor in the recent demise of a couple of my sites.
|the fact that Google found that hundreds of pages suddenly disappeared (not a good signal) |
Since adding content and removing the outdated urls did not seem to be having any effect, I changed the DNS to point back to the previous site owner's webhost. I had asked her to leave the site files up for a while in case of problems.
Literally within a matter of hours, the site regained its placement in the SERPs. While I was unable to transfer the script to my host, I have installed a free "lite" version of the script on my server and transferred the DNS back.
So for now the problem seems to be solved. Thanks for all the help everyone!
glad you updated us and everything worked out.
Go forth and prosper...lol (im such a geek)