
Forum Moderators: Robert Charlton & aakk9999 & andy langton & goodroi


Considering Disallow:googlebot for Pandalized site

     
7:40 pm on Feb 19, 2012 (gmt 0)

Senior Member

WebmasterWorld Senior Member Top Contributors Of The Month

joined:Sept 14, 2011
posts:736
votes: 20


I am very frustrated at having a Pandalized site. I have cleaned the site up and removed thin content, but I am now left with 16K pages reported as 404 errors despite serving 410 responses. I am thinking of cutting off Google traffic by blocking it in robots.txt and concentrating on Yahoo/Bing, and maybe waiting six months until Google is making any sense. What do members think of this idea?
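For reference, blocking only Google while leaving other engines alone would be a robots.txt along these lines (just a sketch of the idea, not a recommendation):

```
User-agent: Googlebot
Disallow: /

User-agent: *
Disallow:
```

Bear in mind a robots.txt block only stops crawling; URLs Google has already indexed can linger in the index for some time afterwards.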
4:20 am on Feb 20, 2012 (gmt 0)

Senior Member

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:May 26, 2000
posts:37301
votes: 0


It's your business and your choice. I have heard from webmasters who made this choice (this was before Panda) when it seemed like Google was just not going to forgive whatever they hadn't liked. In at least one case, moving on did bring some relief and focusing on other traffic sources returned them to profitability.
2:06 pm on Feb 20, 2012 (gmt 0)

Full Member

5+ Year Member

joined:Mar 22, 2011
posts:339
votes: 0


seoskunk wrote:
What do members think of this idea?

Is this an attempt to clean up those 404s or an emotional lashing out at Google?

If it's the former—and someone else with more experience than me can confirm or deny this—blocking Googlebot will pretty much guarantee that those 410s won't be seen.
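That behavior can be illustrated with Python's standard-library robots.txt parser (the robots.txt rules and example.com URLs here are hypothetical):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt: block Googlebot entirely, allow everyone else
robots_lines = [
    "User-agent: Googlebot",
    "Disallow: /",
    "",
    "User-agent: *",
    "Disallow:",
]

rp = RobotFileParser()
rp.parse(robots_lines)

# Googlebot is barred from every URL, so it would never re-request the
# removed pages and therefore never see their 410 responses.
print(rp.can_fetch("Googlebot", "http://example.com/removed-page"))  # False
# Other crawlers (e.g. Bingbot) are unaffected by the first rule block.
print(rp.can_fetch("Bingbot", "http://example.com/removed-page"))    # True
```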

If it's the latter, I don't see how blocking Googlebot in robots.txt can be beneficial in any way other than to offer emotional closure on a broken relationship (like throwing out everything that reminds you of your ex-girlfriend).

If you're giving up on Google, there's no reason you can't leave your site open to Googlebot, focus on Bing/Yahoo! and just take whatever traffic Google sends your way.

--
Ryan
2:29 pm on Feb 20, 2012 (gmt 0)

Administrator from US 

WebmasterWorld Administrator goodroi is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:June 21, 2004
posts:3100
votes: 85


Yahoo/Bing is different from Google, not necessarily better or easier. If you have a good handle on their algo, then focusing on Yahoo/Bing can be profitable.

I know some arrogant webmasters who thought they were SEO masters since they knew a few short term tricks in Google. They falsely thought ranking in Yahoo/Bing would be no problem. Those webmasters quickly realized they were not SEO masters and had ranking troubles in Yahoo/Bing.

Make sure you are basing your decision on facts & not emotion.
4:18 pm on Feb 20, 2012 (gmt 0)

Senior Member

WebmasterWorld Senior Member planet13 is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:June 16, 2010
posts:3813
votes: 29


Hi there, seoskunk:

I have to ask this. First, why do you think you were pandalized in the first place? Could you describe the general nature of your site?

You mentioned you removed 16 thousand pages, which I am assuming was thin content. Could you describe what type of content was on those pages? What type of content now remains on your site?

I know this isn't an answer to your question about whether to block googlebot, but hopefully it will give us all a bit more detail so we can make a better suggestion about whether to block googlebot or not.
7:27 pm on Feb 20, 2012 (gmt 0)

Senior Member

WebmasterWorld Senior Member Top Contributors Of The Month

joined:Sept 14, 2011
posts:736
votes: 20


Hey Planet13, I think the site was pandalized because of:

Thin Content
Too many links (250K)
Paid Links

The content was auto-generated text from an SQL database. Content on the site is still a bit thin, but not so bad

I was thinking of blocking in robots.txt for a few months to clear the junk out of Google, then resubmitting a new site
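An alternative to blocking, for what it's worth: if the junk pages share a recognizable URL pattern, Apache's mod_alias can answer with the 410 directly. This is only a sketch; the /autogen/ prefix is made up and would need to match the site's actual URL structure:

```
# Hypothetical: the auto-generated pages all lived under /autogen/
# mod_alias returns "410 Gone" for anything matching the pattern
RedirectMatch gone ^/autogen/
```

With something like that in place, Googlebot can still crawl and actually see the 410s, which it cannot do if it is shut out via robots.txt.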
8:36 pm on Feb 20, 2012 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member lucy24 is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:Apr 9, 2011
posts:13064
votes: 306


I am now left with 16K worth of 404 pages despite serving 410 in response

Apparently g### always does this. It doesn't mean anything.

Is your site so popular and well-known that if humans can't find it in g### they will try a different search engine? "Dammit, I know that site's around here somewhere..." Obviously not an issue if you don't rely on search-engine traffic at all, but if so, you wouldn't even need to ask.
9:04 pm on Feb 20, 2012 (gmt 0)

Senior Member

WebmasterWorld Senior Member Top Contributors Of The Month

joined:Sept 14, 2011
posts:736
votes: 20


Is your site so popular and well-known that if humans can't find it in g### they will try a different search engine?


The site is deader than that parrot in Monty Python, so there is nothing to lose. It can't live under the penalty, so I'm really thinking of ways to get it off the hook, and blocking then resubmitting seems a feasible idea. It does OK on Bing, but that's as quiet as MySpace at the moment. From a business point of view there would be no loss from a few months out of the Google index, especially if it meant the site would get a clean bill of health
7:02 pm on Feb 27, 2012 (gmt 0)

New User

5+ Year Member

joined:Dec 7, 2010
posts:33
votes: 0


We are at a disadvantage because there is no search engine (yet) nearly as strong as G.; you could say they have a monopoly on our business visibility.
But I think that will have to change one day. Based on history, a monopoly is bad for business of any kind, and it always comes to an end.
7:52 pm on Feb 27, 2012 (gmt 0)

Preferred Member

5+ Year Member

joined:June 14, 2010
posts: 585
votes: 0


The content was auto-generated text from an SQL database. Content on the site is still a bit thin, but not so bad


Were the 16,000 pages you already cut just the same kind of auto-generated content? Was anything on the site expert-written subject matter, or was it all just a mix of auto-generated stuff?
8:15 pm on Feb 27, 2012 (gmt 0)

Junior Member

joined:Nov 27, 2011
posts:44
votes: 0


I don't think 16K pages are such a big issue for Google. I served 410 for around 400K pages just last week and Google has already deleted a big part of them from its index... around 70%. It will take some time until everything gets stable.