| 4:20 am on Feb 20, 2012 (gmt 0)|
It's your business and your choice. I have heard from webmasters who made this choice (this was before Panda) when it seemed like Google was just not going to forgive whatever they hadn't liked. In at least one case, moving on did bring some relief and focusing on other traffic sources returned them to profitability.
| 2:06 pm on Feb 20, 2012 (gmt 0)|
|seoskunk wrote: |
What do members think of this idea?
Is this an attempt to clean up those 404s or an emotional lashing out at Google?
If it's the former—and someone else with more experience than me can confirm or deny this—blocking Googlebot will pretty much guarantee that those 410s won't be seen.
If it's the latter, I don't see how blocking Googlebot in robots.txt can be beneficial in any way other than to offer emotional closure on a broken relationship (like throwing out everything that reminds you of your ex-girlfriend).
If you're giving up on Google, there's no reason you can't leave your site open to Googlebot, focus on Bing/Yahoo! and just take whatever traffic Google sends your way.
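To make the mechanics above concrete: a blanket robots.txt block like the following (a hypothetical example, not quoted from anyone in this thread) stops Googlebot from crawling the site at all, which is exactly why it would never fetch the removed URLs and see their 410 responses:

```
User-agent: Googlebot
Disallow: /
```

Note that robots.txt controls crawling, not indexing, so previously known URLs can linger in the index as uncrawled entries even while the block is in place.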
| 2:29 pm on Feb 20, 2012 (gmt 0)|
Yahoo/Bing is different from Google, not necessarily better or easier. If you have a good handle on their algo, then focusing on Yahoo/Bing can be profitable.
I know some arrogant webmasters who thought they were SEO masters because they knew a few short-term tricks in Google. They assumed ranking in Yahoo/Bing would be no problem, and they quickly found out otherwise when they ran into ranking troubles there.
Make sure you are basing your decision on facts & not emotion.
| 4:18 pm on Feb 20, 2012 (gmt 0)|
Hi there, seoskunk:
I have to ask this. First, why do you think you were pandalized in the first place? Could you describe the general nature of your site?
You mentioned you removed 16 thousand pages, which I am assuming were thin content. Could you describe what type of content was on those pages? What type of content now remains on your site?
I know this isn't an answer to your question on whether to block googlebot or not, but hopefully this will give us all a bit more detail and then we can make a better suggestion about whether to block googlebot or not.
| 7:27 pm on Feb 20, 2012 (gmt 0)|
Hey Planet13, I think the site was pandalized because of:
Too many links (250K)
The content was auto-generated text from an SQL database. The content on the site is still a bit thin, but not so bad.
I was thinking of blocking Googlebot in robots.txt for a few months to clear the junk out of Google, then resubmitting a new site.
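For the 410 side of this, if the site runs on Apache (an assumption, seoskunk doesn't say), the removed pages can be served 410 Gone with mod_alias `Redirect gone` rules. A minimal Python sketch that turns a list of removed paths into an .htaccess fragment (the paths are made-up placeholders):

```python
# Sketch: generate Apache "Redirect gone" rules (HTTP 410) for removed URLs.
# The example paths are hypothetical placeholders, not real site URLs.
removed_paths = [
    "/widgets/old-page-1.html",
    "/widgets/old-page-2.html",
]

def gone_rules(paths):
    """Return .htaccess lines that serve 410 Gone for each removed path."""
    return "\n".join(f"Redirect gone {p}" for p in paths)

print(gone_rules(removed_paths))
```

With 16K URLs a generated rule file gets unwieldy, so a RewriteRule pattern or a 410 handler in the application itself may be the more practical route; the sketch just shows the simplest per-URL form.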
| 8:36 pm on Feb 20, 2012 (gmt 0)|
|I am now left with 16K worth of 404 pages despite serving 410 in response |
Apparently g### always does this. It doesn't mean anything.
Is your site so popular and well-known that if humans can't find it in g### they will try a different search engine? "Dammit, I know that site's around here somewhere..." Obviously not an issue if you don't rely on search-engine traffic at all-- but if so, you wouldn't even need to ask.
| 9:04 pm on Feb 20, 2012 (gmt 0)|
|Is your site so popular and well-known that if humans can't find it in g### they will try a different search engine? |
The site is deader than that parrot in Monty Python, so there is nothing to lose. It can't live under the penalty, so I'm really thinking of ways to get it off the hook, and blocking then resubmitting seems a feasible idea. It does OK on Bing, but that's as quiet as MySpace at the moment. From a business point of view there would be no loss from a few months out of the Google index, especially if it meant the site would get a clean bill of health.
| 7:02 pm on Feb 27, 2012 (gmt 0)|
We are at a disadvantage because there is not (yet) any search engine nearly as strong as G.; you could say they have a monopoly on our business visibility.
But I think that will have to change one day. Based on history, a monopoly is bad for business of any kind, and it always comes to an end.
| 7:52 pm on Feb 27, 2012 (gmt 0)|
|The content was auto generated text out of an sql database. Content on the site is still a bit thin but not so bad |
Were the 16,000 pages you already cut just the same kind of auto-generated content? Was anything on the site expert-written subject matter, or was it all just a mix of auto-generated stuff?
| 8:15 pm on Feb 27, 2012 (gmt 0)|
I don't think 16K pages are such a big issue for Google. I 410'd around 400K pages just last week and Google has already deleted a big part of them from its index, around 70%. It will take some time until everything stabilizes.