A real Google conundrum
Established site losing all its Google traffic
diamondgrl




msg:769984
 1:07 am on Mar 27, 2005 (gmt 0)

I have a well-established site that over the past few days has lost almost all its Google referrals. I think I know what's wrong but have no idea how to fix it.

First some background. The site is a well-established, deep-information site with many, many thousands of pages and a PR 6 on the home page. While we have attempted to get some links to us, most of the hundreds of links to us are spontaneous from a variety of professionals who find our content useful. Therefore, we're not at all dependent on the "latest" SEO tricks - totally white hat.

Up until this week, we got >15,000 Google referrals a day. We are not dependent on ranking for "blue widgets" or any other single identifiable term - our referrals come from thousands of different keywords a day, which reflects the diversity of our content. Therefore, only a massive drop in the SERPs across the board could cause the >90% drop in referrals we are seeing.

We still are in the index with the same number of pages and our backlinks don't seem changed. We still have the same PR showing throughout the site (for whatever that's worth since if there are changes, they probably wouldn't show immediately anyway).

Here's the kicker: Another site we own, let's call it widgetville.com, is showing up ahead of our real site, widgetville.org, in the SERPs when you search Google for "Widgetville". The higher-ranking widgetville.com listing is shown without title or description. Widgetville.com has been 301 redirected to widgetville.org. Widgetville.com does have a backlink or two out in the world, but not the hundreds that the real site, widgetville.org, has, so I don't understand the higher ranking.

If you search for "a bunch of widget words that you find on the front page", three other websites that quote our mission statement appear on the page and our page doesn't. However, if you click on the link to show "omitted" results, we are listed as the omitted page.

In a way, it seems almost like our home page has been hijacked by our own non-functioning site. It also seems like the whole canonical root problem that trips up some site owners, except in our case it is between two domains - not a matter of Google getting confused between widgetville.com and widgetville.com/index.html.

We've had this problem before - a year ago - and I queried Google about it. I was told that it was a problem on their end, not mine, and that they would fix it. The widgetville.com listing was removed, and within weeks my traffic grew from a trickle to where I started hiring people to deal with the blossoming new customer base. Now all of that is threatened.

So, anyone want to take a crack at explaining this or giving advice on how to handle it?

I have taken one step to see what happens. I've removed the 301 redirect from widgetville.com and put up a simple page that says to click through to widgetville.org. I did this to disassociate widgetville.com from widgetville.org in case Google was somehow seeing duplicate content because of the 301. I'm not sure how that would happen exactly, since a 301 is the preferred method of dealing with pages that are no longer valid, but this whole thing throws me for a loop.
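As an aside for anyone setting this up themselves: the usual way to point a spare domain at the real one is a host-wide 301. This is only a rough sketch, assuming Apache with mod_rewrite and using the placeholder domains from this post, not my actual file:

RewriteEngine On
# send every request on widgetville.com (www or not) to the
# same path on widgetville.org with a permanent redirect
RewriteCond %{HTTP_HOST} ^(www\.)?widgetville\.com$ [NC]
RewriteRule ^(.*)$ http://widgetville.org/$1 [R=301,L]

A catch-all like this answers every request on the old host with the redirect, including /robots.txt - a detail that turns out to matter later in this thread.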

 

Brett_Tabke




msg:769985
 2:30 pm on Apr 5, 2005 (gmt 0)

> First some background. The site is a well-established,
> deep-information site with many,
> many thousands of pages and a PR 6 on the home page.

What to do?

NOTHING. Go back to building content, finding links, and then wait, wait, wait. You will make a comeback.

The worst thing you could do would be to go make a bunch of changes. Just let the algo work itself out.

oddsod




msg:769986
 3:27 pm on Apr 5, 2005 (gmt 0)

I know Brett's advice doesn't offer any consolation, but it's probably the best option. It does involve the nasty-tasting medicine of taking a hit on traffic for an undefined period of time.

Considering all that Google is playing with [webmasterworld.com], it was only a matter of time before sites like yours and EFV's took a hit. Given the usual assumptions - good content, Google-compliant on-site and off-site practices, etc. - Google should sort itself out and put your site back where it was.

"Clean" content sites that took a hit on March 23 are now making a gradual comeback.

walkman




msg:769987
 3:31 pm on Apr 5, 2005 (gmt 0)

I made the mistake of making many changes after taking the hit; many months later, I'm nowhere to be found.

diddlydazz




msg:769988
 3:53 pm on Apr 5, 2005 (gmt 0)

hi diamondgrl

Brett's advice is the best IMHO.

But here's what we did while traffic was bad:

moved a bunch of sites to dedicated IPs
checked/tested that all sites had a 301 from domain.com to www.domain.com
checked reverse DNS
added a timestamp to the pages

some sites are back and have since seen an increase (more than before) in traffic, and some are making a gradual comeback.

(the traffic improvements have nothing to do with the changes we made, although they may help in the long run)

Doing the things above was simply to take my mind off what was going on. I know it isn't easy, but I found that, and creating content etc., more constructive than worrying about the SERPs.

I know it's hard to take, but it really is a matter of waiting for Google to sort out whatever mess they created.

All the best

Dazz

ncgimaker




msg:769989
 3:56 pm on Apr 5, 2005 (gmt 0)

Try this for a laugh. Search for:

In another post Google as a Black Box Giacomo proposed that we talk too much theory

In Google, webmasterworld.com is nowhere to be found for one of Brett's most famous posts.

[webmasterworld.com ]

Now do the same in Yahoo, Brett's post is the top result.

I think Brett owes us an apology for his very very bad SEO. He should go back and reword those pages immediately in order to rank better in Google.... ;)

(Edited for clarity)

taps




msg:769990
 4:57 pm on Apr 5, 2005 (gmt 0)

I used the time after Allegra to do some cleanup. I set up proper 301 redirects to consolidate all the different domains that were listed in Google and other search engines.

After that I excluded some files in robots.txt. I did this because I have some duplicate content, which can be reached via www.mysite.com/show?id=1000 as well as via www.mysite.com/this_is_about_widgets_1000.html.
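In case it helps anyone, the exclusion itself is only a couple of lines. A minimal sketch, using the example paths above (block the dynamic duplicates, keep the friendly .html versions crawlable):

User-agent: *
# block the dynamic duplicate URLs; the static
# _widgets_ .html versions of the same content stay crawlable
Disallow: /show

Plain robots.txt matching is by path prefix, so Disallow: /show covers /show?id=1000 (and anything else whose path starts with /show) while leaving the static pages alone.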

In my opinion our SERPs can't get any worse, so this is the best time to do some real cleanup.

Like diamondgrl's site, ours has a lot of varied content and has been hit very hard by Allegra. Now it is time to wait for a comeback. But it is very hard to stay patient.

And, yes, we still keep adding content :-)

wiseapple




msg:769991
 6:14 pm on Apr 5, 2005 (gmt 0)

Sounds like the same situation for us... Thousands of pages of clean content across multiple topics. No fancy SEO. Prior to Feb 2nd, great traffic from Google. Post Feb 2nd, very little traffic from Google. In fact, MSN, Yahoo, and Ask now deliver more than anything we get from Google.

Here is a theory -

Are you caught in the Google Glue? No way to escape the glue. Check back in a year or so.

Or are you caught in the Google Goo? Still a few trickle through from Google but not as much as before. Might be months before the goo can be removed.

ShantiShunn




msg:769992
 6:54 pm on Apr 5, 2005 (gmt 0)

Everyone's suggestions are pretty much right on. Don't get sucked into the Google fever; stay calm and continue to work on your site. Keep adding new content, clean up old content, and primarily work on getting newer content up.

One minor thing I would work on is cleaning up your meta tags, as it sounds like there may be a problem there, just so your optimized site description shows up again.

Something else to look into in the meantime, if this persists and really begins to hurt your bottom line, would be to expand any PPC advertising you are doing in order to offset the initial loss in Google traffic.

Primarily though, work on building out the additional content, it can only help in the long-run.

diamondgrl




msg:769993
 7:04 pm on Apr 5, 2005 (gmt 0)

Actually, the widgetville.com site is now gone. It turns out I didn't follow Brett's advice (it took a week for this thread to get approved, so I didn't have the benefit of seeing it). I did something about it, and it seems to have helped. But it didn't solve the traffic conundrum, or at least not yet.

It turns out that even though there was a 301 on widgetville.com, there was a robots.txt file that excluded all robots. So Google couldn't see the 301. I finally figured that out after checking the .com logs and realizing Google was only checking robots.txt, not the index page.
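(For the record, "excluded all robots" means the robots.txt was presumably the standard blanket block, something like:

User-agent: *
Disallow: /

With that in place on widgetville.com, Googlebot would fetch robots.txt, see the site-wide Disallow, and never request the index page where the 301 lived.)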

Anyway, so now the .com is gone, thanks to my removing the robots.txt file so that Google can finally see the 301.

Meanwhile, the SERPs have been doing all kinds of nutty things in relation to the widgetville.org site. If I search for "Widget Ville" on Google, my site sometimes comes up fourth, even though it is a completely unique name and my site has, since day one, always been #1 in the SERPs for a search on the business name.

I'm not sure if somehow this domain confusion sucked all the PR temporarily out of widgetville.org and transferred it to widgetville.com, in which case it will slowly be regained now that the .com listing no longer exists. I'm not sure how, or if, that's possible. I'm just grasping at straws.

Of course, the .com confusion might have been completely independent of the drop in referrals, and the drop might be better explained by the broader earthquake in the SERPs that others are facing. But I have a couple of other data points indicating that sites very similar to this one did just fine in the recent shakeup, and that's why I think it's related to the domain confusion.

Time may - or may not - tell.

walkman




msg:769994
 7:17 pm on Apr 5, 2005 (gmt 0)

"I'm not sure if somehow this domain confusion sucked all the PR temporarily out of widgetville.org and transfered it to widgetville.com"

That page/site is WORTHLESS to Google right now; that's why I think it doesn't rank anywhere. If I search for "mydomain.com" (in quotes), I'm not even in the top 300 - every other stupid directory or scraper site is ahead of me. And there's only one "mydomain.com", and I've owned it for 8+ years.

GuinnessGuy




msg:769995
 12:39 am on Apr 6, 2005 (gmt 0)

diamondgrl,

How could widgetville.com have been ranking at all if it had a robots.txt that excluded all bots?

GuinnessGuy

Marval




msg:769996
 1:13 am on Apr 6, 2005 (gmt 0)

Sounds like a common problem that many have experienced over the last year. It has to do with Google splitting a site across its "vanity domains," where some pages get associated with the .com version and some with the .org version. It's happening across the board for many people I know who have always kept either an index page or a redirect to the real page on their "other domains." The easiest way to check whether this has happened is to do a simple site: command for each of your domains and look at which pages are being associated with each. In most cases I have looked at for old, established sites, the domains get a 50-50 split - or, where people own the .net, .com.au, etc., the pages get split across all the domains.

As for a fix? I've tried three different approaches, including the emails to GoogleGuy with the special "topic" in the subject line that he requested a year ago for this problem, and quite honestly I have yet to find a workable solution. I've tried the 301 route, the URL removal route, and even moved domains to completely different servers on a totally different IP range in a different country - so far not one has worked.

sasha




msg:769997
 2:31 am on Apr 6, 2005 (gmt 0)

Marval, I may have a similar situation to the one you are describing.

Is there a way to handle this problem with a robots.txt disallow for the second (duplicate) domain? I am just grasping at straws here...

incrediBILL




msg:769998
 2:42 am on Apr 6, 2005 (gmt 0)

> Anyway, so now the .com is gone, thanks to my removing the robots.txt file so that Google can finally see the 301.

Doesn't quite pass the sniff test.

If your 301 redirect was in .htaccess, vhosts.conf, or somewhere similar, the web server itself should've issued the redirect before there was any processing of the request for robots.txt. I would assume this was more a problem with how your 301 redirect was set up than with the presence of the robots.txt file.

incrediBILL




msg:769999
 2:46 am on Apr 6, 2005 (gmt 0)

> the domains get a 50-50 split - or, where people own the .net, .com.au, etc., the pages get split across all the domains

I had that problem with a .net and .com until I did a 301 redirect from .net to the .com.

I redirected domain.net, www.domain.net, and domain.com all to www.domain.com based on suggestions here on WW, and about a month later it seemed to have cleaned itself up; my SERPs appear to have gotten a small bump out of the change.
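A rough sketch of what I mean, assuming Apache with mod_rewrite and the placeholder names above - anything not already on www.domain.com gets sent there:

RewriteEngine On
# domain.com, domain.net and www.domain.net all 301
# to the one canonical host, preserving the path
RewriteCond %{HTTP_HOST} !^www\.domain\.com$ [NC]
RewriteRule ^(.*)$ http://www.domain.com/$1 [R=301,L]

(This assumes all the hostnames resolve to the same server and docroot; if the .net is hosted elsewhere, it needs its own copy of the rule.)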

nuevojefe




msg:770000
 7:24 am on Apr 6, 2005 (gmt 0)

> How could widgetville.com have been ranking at all if it had a robots.txt that excluded all bots?

Because it probably got indexed before the robots.txt excluded all bots. That would also explain the URL-only listings - those are hard to get to go away.

diamondgrl




msg:770001
 9:14 pm on Apr 6, 2005 (gmt 0)

incrediBILL,

It does pass the smell test, because the .htaccess file was 301ing only the index page, not every page on the domain. So Google could still fetch the robots.txt file, and since robots.txt told it not to spider, it never requested the index page and never saw the 301.
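To picture the difference: the redirect in question was a single page-level rule, something along these lines (a reconstruction with the placeholder domains, not the literal file):

# redirects only this one URL; /robots.txt and every other
# path on widgetville.com are still served normally
Redirect 301 /index.html http://widgetville.org/

whereas a host-wide catch-all (like the RewriteRule sketch earlier in the thread) would have answered even the robots.txt request with the 301, which is the situation incrediBILL was assuming.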

And how did widgetville.com get into the index in recent days if there was a robots.txt exclusion? It seems to have been resurrected from the bowels of some Google datacenter, since at one point, many months ago, there was content on the page and no robots.txt or 301 telling Google to go away.

Marval




msg:770002
 10:12 pm on Apr 6, 2005 (gmt 0)

diamondgrl - I've found things like extremely old links (circa five years ago) turning up as the referrer for Googlebot into internal pages, which then link to the index page somewhere. That lets the bots in question bypass the whole 301/.htaccess issue. In my case it was actually tracked back to some very old pages that were in the Yahoo database when they dropped Google last year - of course the pages are no longer in Yahoo now, but they were part of the original Yahoo database shown in the natural results, and they ended up getting botted by Google through Yahoo's listings.

Romeo




msg:770003
 10:48 pm on Apr 6, 2005 (gmt 0)

Furthermore, I found that Google seems to treat a 301 differently than it should (which reminds me of the 302 mess).

Here is the story:

When checking site:example.com, I found some old ^example.com URIs which I wanted to clean up and get removed from the index, so as not to get into any dupe trouble. I don't know where these URIs came from -- perhaps sloppy external links.
I have a long-standing 301 rewrite [URI permanently no longer in use; all requests should use the new URI] for everything on ^example.com to www.example.com.

So I tried the removal tool to get rid of those old ^example.com URIs manually, and was baffled: my server sent a 301 to the removal-check bot, the bot then followed that 301 and fetched the new URI (all seen in the server's log), and finally the removal tool said the "page" couldn't be removed, as it was still found to be up.

While there are two URIs involved in a 301 -- an old, abandoned one and a new one -- Google just does not see it this way: it sees one "page" (whatever that is), confuses the old abandoned URI with the new one, and consequently refuses to just forget about the old URI!

I finally solved this mess by inserting another rewrite rule into the .htaccess, just before the 301 redirect section, to send the Google removal bot a plain 410 Gone -- some sort of rewrite cloaking?
RewriteCond %{HTTP_HOST} ^example\.com
RewriteCond %{REMOTE_HOST} \.googlebot\.com$ [OR]
RewriteCond %{HTTP_USER_AGENT} googlebot-urlconsole
RewriteRule ^(.*)$ - [G,L]

This worked perfectly: the bot got its 410 and was happy. Google finally believed that these old pages were gone, and they have since been removed from the index. Other users still get redirected as they should.

So whenever your 301s seem to go unhonoured, you may try this harder approach as a last resort.

Regards,
R.

rjohara




msg:770004
 11:04 pm on Apr 6, 2005 (gmt 0)

I also saw a huge drop, beginning in early February, in Google referrals to a long-established, content-heavy site where I hadn't made any notable changes. I'm just hoping, as Brett says, that the algo will eventually come around again and recognize that this site should be returned to its former standing. It's happened before, so I have to expect it will happen again.

Fairla




msg:770005
 3:44 am on Apr 7, 2005 (gmt 0)

Same here. Actually this happened to me a few months ago, and after a while the traffic from Google went back up, and now it's dropped again.

I also have a content site with lots of links from other content sites (which I didn't have to request), and I've always ranked well in Google. It seems strange to suddenly be getting more traffic from Yahoo than Google. I can only hope Google will reverse whatever detrimental changes it made and my traffic will recover.

cline




msg:770006
 1:23 am on Apr 8, 2005 (gmt 0)

If multiple people edit the site, don't forget the possibility that something *was* done that caused the ranking change. I had this happen recently to a site: a casual observer wouldn't notice any change to the homepage, but somebody (and no one is owning up to it) made several little, mostly non-obvious changes detrimental to SEO. I restored the page to how it used to be, and the site went right back to its former ranking.

walkman




msg:770007
 2:59 am on Apr 8, 2005 (gmt 0)

"caused the ranking change"
cline,
we're not talking about moving from #2 to #4, or even #10.

diamondgrl




msg:770008
 5:05 am on Apr 8, 2005 (gmt 0)

Yeah, there is a huge difference between a narrow, keyword-focused site and a broad site such as WebmasterWorld.com. In our case, we are not keyword-focused, so there is a relative degree of stability. We are not really affected by competitors. Sure, Google algo changes can have relatively large effects, but things usually vary by +/- 20%, not 90%.

Spine




msg:770009
 5:11 am on Apr 8, 2005 (gmt 0)

The last time Google screwed up like this (and I do think it's their fault in most of these cases) our site was at 25% of normal traffic for about 3 months.

I'm hoping for a faster fix this time, but with each day I get a little more depressed.

Reid




msg:770010
 5:54 am on Apr 8, 2005 (gmt 0)

This thread seems to confirm my suspicion that these rapid and confusing database changes are Google's attempt to deal with the 302 problem. Seeing 301s being affected could indicate algo changes in how 301s and 302s are interpreted.

Is anyone using status codes other than 3xx or 2xx and noticing strange behavior with them?

Spine




msg:770011
 8:01 am on Apr 8, 2005 (gmt 0)

How do you think 301s are involved? I'd be curious to know, as I have a 301 from my 'non-www' URL to the URL with www.

The site did fine for years without that redirect in .htaccess, but I put it in when Google went weird on me in the fall.

Cheers

JuniorOptimizer




msg:770012
 10:50 am on Apr 8, 2005 (gmt 0)

The idea of "waiting for Google to get it right" sort of sucks. They seem incapable of getting it right.

reseller




msg:770013
 11:06 am on Apr 8, 2005 (gmt 0)

JuniorOptimizer

<The idea of "waiting for Google to get it right" sort of sucks. They seem incapable of getting it right.>

Exactly. That's why I keep writing that it's about time publishers start thinking about SOLUTIONS and MEASURES. Any suggestions on how to deal with the current situation?
