
Forum Moderators: Robert Charlton & goodroi


Decided to block Googlebot after Sep/12th traffic plunge

     
6:52 pm on Oct 10, 2014 (gmt 0)

New User from CA 

joined:Oct 6, 2014
posts: 13
votes: 0


Has anyone else experienced a sharp traffic reduction (in Canada) from the 12th to the 13th of Sep/2014? My 14-year-old site has been hit by several Panda updates since Feb/2011, but so were lots of other sites. However, as much as I have checked around, Sep/12-13 doesn't seem to coincide with any known or named update that I could find. Nothing has changed on my site, and there were no warnings or messages in GWT.

Google visits plunged by 60-70% overnight, while Bing and Yahoo stayed the same, so my traffic is down about 95% since Feb/2011.
Since I don't rely on Google for a living (it's a research/info site), I decided to return the favor and block Googlebot from spidering my site now. I'm hoping to get out of the Google index to a point where my site (11,400 hits right now) will not come up any longer through a Google search, which is probably not entirely possible.
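The crawl block itself is typically just a two-line robots.txt entry (a sketch; note this stops crawling, but URLs Google already knows about can still surface in results, which is why a full de-indexing is harder):

```
# robots.txt at the site root: ask Google's crawler to stay out entirely
User-agent: Googlebot
Disallow: /
```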

Again, just curious if the Sep/12th traffic plunge was experienced by anyone else (similar type of site?), or if it was due to a random algo change.
7:32 pm on Oct 10, 2014 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member lucy24 is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:Apr 9, 2011
posts:15812
votes: 848


which is probably not entirely possible

Sure it is. You just have to grit your teeth and let them continue crawling, but attach a robots meta tag with "noindex" to all pages (or send an X-Robots-Tag: noindex header with all responses, whichever is more convenient).
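On Apache, for example, the header version is a one-liner (a sketch, assuming mod_headers is enabled):

```apache
# Send a noindex directive with every response, site-wide
Header set X-Robots-Tag "noindex"
```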

Don't see why you'd bother, though. If you don't rely that much on search engines, then having bad luck in serps should have no effect one way or the other.
8:08 pm on Oct 10, 2014 (gmt 0)

Senior Member

WebmasterWorld Senior Member editorialguy is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:June 28, 2013
posts:3467
votes: 778


Since I don't rely on Google for a living (it's a research/info site), I decided to return the favor and block Googlebot from spidering my site now. I'm hoping to get out of the Google index to a point where my site (11,400 hits right now) will not come up any longer through a Google search, which is probably not entirely possible.


It seems to me that you're cutting off your nose to spite your face. And what about searchers who might find your research/info useful? Is it worth hiding your content from the majority of Web searchers just because you want to "return the favor" to Google?
8:22 pm on Oct 10, 2014 (gmt 0)

Senior Member

WebmasterWorld Senior Member planet13 is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:June 16, 2010
posts: 3828
votes: 31


"When the elephants fight, it's the grass that gets trampled."

- Thai Proverb
9:05 pm on Oct 10, 2014 (gmt 0)

Senior Member

WebmasterWorld Senior Member aristotle is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Aug 4, 2008
posts:3652
votes: 369


One way to do it (as Lucy mentioned) would be to put the following line into the head of every page you want to take out of Google's search results:
<meta name="googlebot" content="noindex">

That should affect only Google, not the other search engines.
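If editing every page's head is impractical, the same Google-only directive can reportedly be sent as a response header instead (a sketch, assuming an Apache server with mod_headers; the "googlebot:" prefix scopes the directive to Google's crawler):

```apache
# Equivalent to the googlebot meta tag above, but applied server-wide
Header set X-Robots-Tag "googlebot: noindex"
```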

One suggestion: Leave a few pages in Google's results, so that you can watch them for significant rankings changes.
9:23 pm on Oct 10, 2014 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Aug 29, 2006
posts:1378
votes: 18


Welcome to WebmasterWorld.

Try a search on a distinctive and unique sentence from your site, within quotation marks.

If a site other than yours comes up it could be the cause of the plunge.

...
9:27 pm on Oct 10, 2014 (gmt 0)

New User from CA 

joined:Oct 6, 2014
posts: 13
votes: 0


It's not really a "spite" thing. It's accepting the fact that Google traffic is history, and cutting the cord finally eliminates any other ongoing or future algo headaches. I'm getting a decent number of return visitors and reasonable traffic from Bing and Yahoo, and unlike a lot of Google-dependent individuals, I'm enjoying a good night's rest without worrying about any new additions to the Google zoo.

If enough webmasters who were affected by severe (Google) traffic cuts would ban Googlebot as well, the word would eventually trickle down to the average user that for some queries, search engines other than Google give you much better results - or a greater variety of results, because they provide information that Google is no longer allowed to index.
10:45 pm on Oct 10, 2014 (gmt 0)

New User from CA 

joined:Oct 6, 2014
posts: 13
votes: 0


Welcome to WebmasterWorld.

Thanks...

Try a search on a distinctive and unique sentence from your site, within quotation marks.
If a site other than yours comes up it could be the cause of the plunge.

Yes, duplicate content could be a potential cause, but there comes a time when you tire of adjusting your entire life around a search engine's indexing flaws. I'll leave that to those whose livelihood depends on Google. So far, Bing and Yahoo don't have a problem with my site. The mantra is to build websites for people, not search engines, right?
11:17 pm on Oct 10, 2014 (gmt 0)

Senior Member

WebmasterWorld Senior Member 5+ Year Member Top Contributors Of The Month

joined:Sept 14, 2011
posts:1045
votes: 132


Interesting idea. I believe if enough people with tanked domains followed suit it could create a problem, as Google is designed to index everything it finds. I reckon it will still crawl your site, just not index it publicly.
11:21 pm on Oct 10, 2014 (gmt 0)

Senior Member

WebmasterWorld Senior Member 5+ Year Member Top Contributors Of The Month

joined:July 19, 2013
posts:1097
votes: 0


I reckon it will still crawl your site, just not index it publicly.

Definitely, unless, rather than using noindex to keep the site out of the index, acutrician goes with a 403 Forbidden for gBot:

# Serve 403 Forbidden to any request whose user-agent contains "googlebot"
RewriteEngine on
RewriteCond %{HTTP_USER_AGENT} googlebot [NC]
RewriteRule .? - [F]
11:28 pm on Oct 10, 2014 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Feb 12, 2006
posts:2709
votes: 116


They still index pages that they know about, even if you don't let them crawl those pages. They can rank pages through things like backlinks, which you have no control over, and work out the subject of a page through anchor text. It's probably impossible to get your site out of their index completely, so you may as well just leave it in.
11:28 pm on Oct 10, 2014 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Aug 29, 2006
posts:1378
votes: 18


The mantra is to build websites for people, not search engines, right?

Google apparently has an approximate 85% market share in Canada.

So your remaining 30-40% of Google traffic would still dwarf Bing and Yahoo traffic combined.

And that traffic consists of people, whatever the mantra.

...
11:34 pm on Oct 10, 2014 (gmt 0)

Senior Member

WebmasterWorld Senior Member 5+ Year Member Top Contributors Of The Month

joined:July 19, 2013
posts:1097
votes: 0


They still index pages that they know about, even if you don't let them crawl it.

Correct when the block is done with robots.txt, but they drop 403s, 404s, 410s, etc. from the index regularly. Plus, if a 403 is served they don't get the content of the page; they get, basically, "Forbidden, you do not have permission to access this page."

If anyone has had a 403 or 410 indexed (other than the time it takes to have the page dropped from the index) I'd be interested to hear about it.
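For a 410 instead of a 403, the rewrite sketch above only needs its flag changed; [G] is mod_rewrite's shorthand for 410 Gone:

```apache
# Respond 410 Gone to any request whose user-agent contains "googlebot"
RewriteEngine on
RewriteCond %{HTTP_USER_AGENT} googlebot [NC]
RewriteRule .? - [G]
```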
11:39 pm on Oct 10, 2014 (gmt 0)

Senior Member

WebmasterWorld Senior Member 5+ Year Member Top Contributors Of The Month

joined:July 19, 2013
posts:1097
votes: 0


I'm really surprised at how someone who doesn't need Google is being discouraged from taking a stand here at WebmasterWorld -- It really makes no sense when "Don't depend on Google.", "Let's all get together and block Google", etc. has been posted about and even supported repeatedly.

Now someone, besides Brett_Tabke who blocked gBot from this very site, is taking a stand and "kicking the Google habit" and they're getting blow-back for it? WTF?!

The OP made a decision about "letting go of Google" and making sure they don't ever have to worry about what "curve ball" Google throws next or what new animal Google's coding for a future implementation, yet the OP is being "corrected" for putting their foot down and basically saying, "Enough! If Google's not going to send me significant traffic, I'm not going to send Google content." Again, WTF? and How Sad.
11:51 pm on Oct 10, 2014 (gmt 0)

Senior Member

WebmasterWorld Senior Member 5+ Year Member Top Contributors Of The Month

joined:Sept 14, 2011
posts:1045
votes: 132


JD, that's very cool. I'm thinking of doing this on one of my sites that seems to have an Athenian verdict on it from Google.
11:56 pm on Oct 10, 2014 (gmt 0)

Senior Member

WebmasterWorld Senior Member 5+ Year Member Top Contributors Of The Month

joined:Sept 14, 2011
posts:1045
votes: 132


Samizdata, enough! If Google treats our sites like #*$!, then it's time to reciprocate, but via JD's 403, not robots.txt or a meta tag.
9:52 am on Oct 11, 2014 (gmt 0)

Junior Member

5+ Year Member

joined:Oct 20, 2013
posts:67
votes: 0


Well done, you've got a lot of respect from me! It seems Google isn't the place to find decent sites... and searchers are missing out on so much. I couldn't find any good sites yesterday despite searching all day; I had to look on a site that recommended the top sites in the particular niche I was searching, and they were nowhere to be found on Google. Hopefully one day people will realize.
10:02 am on Oct 11, 2014 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Aug 29, 2006
posts:1378
votes: 18


Samizdata enough! if google treat our sites like #*$! then time to reciprocate

Webmasters are entitled to run their sites as they see fit.

That includes taking no action against scrapers violating their copyright.

It also includes sticking it to the Google man, who will doubtless be quaking in his boots.

It even includes building websites for people and then excluding the majority of them.

I have no problem with them doing any of the above.

...
10:56 am on Oct 11, 2014 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member 5+ Year Member Top Contributors Of The Month

joined:Oct 5, 2012
posts:951
votes: 200


building websites for people and then excluding the majority of them.


well, excluding people from finding the website via a third-party leech (google).

In April 2012 we lost almost all our google traffic. People still found us, and now we get more traffic than ever, almost none of it from google. Searches for our brand name on google have gone up 1,000-fold.

google tried to kill us, they couldn't kill us by taking away our organic traffic, they couldn't kill us by investing in our competition. Why? Because we are the best in our niche, period.

We're good, we're popular, if people can't find us on google, who looks bad? Mostly google.

Will we ever block google? Business is business, no option is off the table.
2:10 pm on Oct 11, 2014 (gmt 0)

Senior Member

WebmasterWorld Senior Member editorialguy is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:June 28, 2013
posts:3467
votes: 778


Webmasters are entitled to run their sites as they see fit.


Yes, and search engines are entitled to index and rank sites as they see fit. What's good for the goose is good for the gander.

I'm getting a decent number of return visitors and reasonable traffic from Bing and Yahoo, and contrary to a lot of Google-depending individuals, I'm enjoying a good night's rest without worrying about any new additions to the Google zoo.


So? Don't worry, just be happy if Google's algorithm changes in your favor, as it did for BillyS. See his post near the bottom of page 8 in this thread:

[webmasterworld.com...]

Even if you've managed to get enough traffic from other sources to keep your audience from drying up, why would you want to eliminate the possibility of getting traffic from the world's largest and most influential search engine? Think of Google referrals as "found traffic," and you won't need to lose sleep.

Some people here seem to think that blocking Googlebot is like sending a message to Google that says "Up yours." In reality, it's just saving Google the trouble of indexing and ranking sites that its algorithm has already deemed less than stellar. The implicit message is "OK, Google, your judgment is correct. No matter how much your algorithm improves, you aren't going to like my site, so I might as well lock the door and draw the blinds when Mr. Googlebot stops by with his checklist."
2:27 pm on Oct 11, 2014 (gmt 0)

Junior Member

5+ Year Member

joined:Oct 20, 2013
posts:67
votes: 0


If your visitors cannot find you on Google and have to go direct, it can mean they don't "search" for you and get bombarded by Google ads; instead they get into the habit of putting your address straight into the address bar, which is BRILLIANT!
2:53 pm on Oct 11, 2014 (gmt 0)

Senior Member

WebmasterWorld Senior Member editorialguy is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:June 28, 2013
posts:3467
votes: 778


If your visitors cannot find you on Google and have to go direct, it can mean they don't "search" for you and get bombarded by Google ads; instead they get into the habit of putting your address straight into the address bar, which is BRILLIANT!


If they can't find you on Google, they'll probably go to another site unless they've memorized or bookmarked one of your URLs.

Besides, how many site owners want to limit themselves to repeat visitors?
3:16 pm on Oct 11, 2014 (gmt 0)

Senior Member

WebmasterWorld Senior Member 5+ Year Member Top Contributors Of The Month

joined:Sept 14, 2011
posts:1045
votes: 132


Ah, I doubt I'll be blocking Googlebot, tbh. I'm just very frustrated that sometimes the more you improve a site, the worse its rankings get. But it is free traffic, so you've got to be thankful for what they send your way, I suppose. Maybe things will turn around.
4:31 pm on Oct 11, 2014 (gmt 0)

New User from CA 

joined:Oct 6, 2014
posts: 13
votes: 0


why would you want to eliminate the possibility of getting traffic from the world's largest and most influential search engine?

The AVERAGE searcher will only use "the world's largest and most influential search engine" as long as they find what they are looking for. Once they start having problems with the results, they will look elsewhere on their own, or, often, on the advice of a friend.
My website represents just one data point, but if other webmasters in a similar situation were to follow suit, Google's reputation for NOT BEING ALLOWED to index a growing number of sites would eventually become a known fact of web browsing, and people would routinely try other search engines knowing that they WILL have the results Google is no longer allowed to provide.

If you think I have a beef with Google, you are wrong. Google exercises its right to rank certain sites over others, and I too have the right to choose who should, or should not index the copyrighted content of my site. It's a mutual business decision. I still use Google as one of several search engines, and respect the people in the plex for what they are, and have accomplished.
What I'm basically trying to do is lay the groundwork for (hopefully) getting more traffic from other search engines, and make people aware that finding certain results will mean having to switch.

You can imagine how quickly this transition would take place if someone were motivated enough to run an ad campaign to that effect. However, this is not possible until enough websites are out of the Google index that a search result comparison can be convincingly demonstrated. Now you get the gist of why I'm trying to no longer be found in the Google index.
7:54 pm on Oct 11, 2014 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Feb 12, 2006
posts:2709
votes: 116


good luck to you my friend, but nothing will change. realistically, google will still be around for years and years, by which time you will have wasted thousands and thousands of visits, and all the customers and backlinks that might have come your way. no one at google even knows that your site exists, i am guessing. all you are doing is having a beef with their emotionless computer. you're just hurting yourself.

but i reckon 90% of webmasters are on your side! my rankings are a lot better in bing and duckduckgo, and i would love them to overtake google
7:56 pm on Oct 11, 2014 (gmt 0)

Senior Member

WebmasterWorld Senior Member editorialguy is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:June 28, 2013
posts:3467
votes: 778


The AVERAGE searcher will only use "the world's largest and most influential search engine" as long as they find things they are looking for.


Yes, and they will. Or do you think poorly-ranked site owners have a monopoly on information?
8:27 pm on Oct 11, 2014 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member 5+ Year Member Top Contributors Of The Month

joined:Oct 5, 2012
posts:951
votes: 200


Even if you've managed to get enough traffic from other sources...


That's adorable EG. It's a big wide world outside of google. We get exponentially more traffic now from other sources than we ever did from google. And we're not the only ones. And that's a trend that is going to continue to grow.

I can't speak for poorly ranking informational sites, but when talking about our penalized ecom site, google is doing its users a disservice. I don't blame google; it's just business. They made a mistake investing in our competitor and they're riding that mistake into the ground. They use their position to promote their own interests regardless of what's best for the user. The company they invested in is floundering and we continue to thrive despite google. We're winning, and we will continue, because we're the best at what we do.

As for blocking google, who knows. At the end of the day I don't want people to find us via google. I want people to think of us when they need our product before they think of searching google for our product.
12:32 am on Oct 12, 2014 (gmt 0)

Senior Member

WebmasterWorld Senior Member 5+ Year Member Top Contributors Of The Month

joined:July 19, 2013
posts:1097
votes: 0


Besides, how many site owners want to limit themselves to repeat visitors?

Do people really think sites are limited to only having repeat visitors without Google? Just sticking with "household names" below...

The tipping point for Twitter's popularity was the 2007 South by Southwest Interactive (SXSWi) conference. During the event, Twitter usage increased from 20,000 tweets per day to 60,000.

[en.wikipedia.org...]
A high-school version of the site was launched in September 2005, which Zuckerberg called the next logical step. (At the time, high-school networks required an invitation to join.)

[en.wikipedia.org...]
Jeff Bezos incorporated the company (as Cadabra) on July 5, 1994 and the site went online as Amazon.com in 1995.

[en.wikipedia.org...]
Development of Pinterest began in December 2009, and the site launched as a closed beta in March 2010. The site proceeded to operate in invitation-only open beta.

Silbermann said he personally wrote to the site's first 5,000 users offering his personal phone number and even meeting with some of its users.

Nine months after the launch the website had 10,000 users.

[en.wikipedia.org...]
Amazon predates Google.
Twitter tripled in use at a conference.
Facebook grew rapidly without Google.
Pinterest doubled its user base in less than a year without Google.

Some people here seem to think that blocking Googlebot is like sending a message to Google that says "Up yours." In reality, it's just saving Google the trouble of indexing and ranking sites that its algorithm has already deemed less than stellar.

Growing without Google is *not* indicative of Google's algo being right, of a site being "less than stellar", or of Google showing people the sites they want to find. A site growing without Google is indicative of that site being high enough quality, in the eyes and opinion of *real people*, that they want to find it, talk about it, share it, visit it, join it, and use it, without Google's algo needing to be involved.

Every site people actually want to find that is not listed in Google, or not ranked highly enough for people to find, is a *loss* for Google and indicative of Google's algo being "less than stellar", rather than there being a problem with the site people like and want to visit but can't find using Google.
4:06 am on Oct 12, 2014 (gmt 0)

New User from CA 

joined:Oct 6, 2014
posts:13
votes: 0


all you are doing is having a beef with their emotionless computer. you're just hurting yourself.

No one is having a beef with anyone - it's an amicable separation.

Looking at my logs, I don't see a single Google referral all day, yet my daily page views are identical to the ones from a month ago when I experienced the sharp decline in Google traffic. So it doesn't look like I'm hurting myself.

But then, my main traffic is unaffected and continues to come from forums, blogs, newspaper articles, facebook, pinterest, links from other websites, or info sites such as ehow, livestrong, etc., which list my site as a reference. Things may change (as they can for anyone), but right now these traffic sources look more solid than those from any search engine.

[edited by: acutrician at 4:21 am (utc) on Oct 12, 2014]

4:09 am on Oct 12, 2014 (gmt 0)

New User from CA 

joined:Oct 6, 2014
posts:13
votes: 0


Or do you think poorly-ranked site owners have a monopoly on information?

Anyone can have a monopoly on information if they offer unique, copyrighted content that cannot be (legally) published anywhere else - or at least they have a monopoly until it is (illegally) scraped by the rest of the pack.

Ranking is a relative decision process.
This 62 message thread spans 3 pages.
 
