Frost_Angel - 10:24 pm on May 21, 2013 (gmt 0)
Do any of you think this could be a good thing? Because the crawl errors I am seeing are from a script I had running on my site that allowed people to do job searches. I had no idea the pages were being cached by Google and put into the serps, because the script was supposed to not allow this. (So said the developer! Anything for a sale.... ugh...)
Anyway... it ends up 80,000+ pages of aggregated (THIN) content from other job sites were created and thrown into the serps under my domain name - basically my domain looked like a huge web spammer's - when really I was just naive, didn't understand, not technically brilliant...., whatever you want to call it.
I immediately - IMMEDIATELY - when I figured out what was happening, removed the script and 404'd everything coming from that database folder, and over a period of 6 months they quietly went away -- maybe one or two showed up here and there....
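For anyone else cleaning up something similar, here's a minimal sketch of how blocking a whole folder could look in an Apache .htaccess file - the folder name /jobsearch/ is made up for illustration, swap in whatever directory your script wrote to:

```apache
# Hypothetical example - /jobsearch/ stands in for the folder the script generated.
# Returning 410 (Gone) instead of 404 tells Google the pages were removed on
# purpose, which can get them de-indexed a bit faster than a plain 404.
RedirectMatch 410 ^/jobsearch/.*$
```

A plain 404 works too, it just tends to take longer for the index to clear out. Either way the status code is what does the real work - a robots.txt Disallow alone would stop crawling but wouldn't get already-indexed pages removed.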
But now... I have 6,730 of them!
It's really freaking me out.
What I am wondering....
Is whether *G is checking to see if I removed all these thin pages - or double-checking.... because maybe this is what hurt me Panda-wise in the first place.
If *G sees I really have removed the junk -maybe they will stop hating me. LOL
Just seems weird that after such a long period of time they start showing up again. I know everyone is saying a major Penguin and Panda update that will rock the internet is "coming...." or has started... and I just wonder if this isn't a "pre-check" or "pre-update" before the update. Like priming a well - just making sure all is running clear before they crank it on.