|traffic from google has dropped|
Has anyone seen a drop in traffic from Google?
I am seeing traffic drop nearly 50% in a day, while the number of pages in the index is the same and the SERPs have not changed much either.
I am talking about traffic of nearly 7k from Google every day, so it's a sizable decrease.
Looking for early answers on how we could check into this.
My summary is that my 70% loss of traffic for one of my bigger sites is Google's problem and not mine.
My peers would not know that my site is suffering, because it still retains its positions for the main targeted kws. I am not comfortable with the "dampening of the internal link" theory, as it's my internal pages that are ranking, and they do not have a lot of outside links to them. But I have lost all traffic for the minor kw phrases that typically make up 70% of the searches.
The weird thing is that when I type in a paragraph of unique text from any of my pages, they do not show up in the first 100 results, yet when I break the paragraph down into lots of 2 words I have top positions.
So I am hanging out for Google to correct itself, and I see that I am slowly regaining some traffic.
Just an update on my site: I have seen a small recovery in my SERPs, though not nearly enough to compare to last month's traffic. Looks like they reverted to the earlier backlink update for some reason, but there's no PR update on any pages yet.
The theory about the trouble with calculating the homepage is interesting, though it doesn't apply to my site directly. I am listed on other homepages, however, so if they dropped it is possible it affected my site.
Of course, determining this without any PR update is almost impossible, so I can only wonder how Google figures out what they do. Do they have a special toolbar that shows the real PR?
Anyway, it looks like we are stuck with this drop for now...
I think it's the chickens coming home to roost.
Google exhibited clear signs of being broken in April and May, 2003. This was when the monthly crawl, PageRank calculation, and update dance was ended. They threw out an entire crawl and reverted to the previous month's index. Since then they have never returned to the old monthly cycle that had served them so well for about three years.
In June there was what I believe to be an insider leak of what happened. You can still find this on WebmasterWorld here [webmasterworld.com].
Ever since June 2003, no one has convincingly challenged that interpretation by re5earcher.
It was very clear to Google by then that Yahoo was gearing up to compete. Yahoo had already bought Inktomi, and Overture, Alltheweb, and Altavista were acquired in mid-2003. Microsoft was already crawling the web on an experimental basis.
In November we had Florida. Many sites were dropped. Google was forced to turn back the knob on Florida because of all the screaming.
Since Florida, I've been watching a few nonprofit sites with thousands of pages. The number of pages that are indexed by Google, as opposed to merely having the URL listed in Google, has declined across the board. Traffic from Google is at an all-time low on one nonprofit site I've been watching. Most of the pages are listed as URL-only. Dot-org sites like this were not affected by Florida, but they've been squeezed slowly for the last seven months.
My theory is that when Google was confronted with the 4-byte integer problem, they looked at several factors:
1. Microsoft and Yahoo were knocking on the door.
2. Google's profits from ads were extremely hot, and climbing, and Google had a window of opportunity to get really rich before the competition had time to kick in the door.
Google could either put their resources into their organic algorithms, which would take a year of planning and effort to expand to a 5-byte docID, or they could put the organic results on hold and maximize their profits from ads.
With the IPO always a possibility, Google decided to go for the ads, and put the organic results on hold. The result is that for the last year or so, in order to get a page into Google, another page had to come out. The old method of calculating PageRank was abandoned in favor of "guessing" PageRank based on the parent directory, or on a small sampling instead of a recursive calculation involving the entire web. The fresh bot, which had been working since August 2002, was expanded so that it almost replaced the old monthly crawl cycle.
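To make the contrast concrete, here is a minimal sketch of the kind of recursive, whole-web calculation the post says Google abandoned: PageRank computed by power iteration over the full link graph. This is purely illustrative (based on the published PageRank formulation, not Google's actual code); the toy `web` graph and the 0.85 damping factor are assumptions for the example.

```python
# Illustrative power-iteration PageRank over a full link graph.
# `links` maps each page to the pages it links to.
def pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        # Base share from the random-jump term.
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                # Dangling page: spread its rank evenly over all pages.
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

# Tiny example graph: a -> b, c; b -> c; c -> a
web = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
ranks = pagerank(web)
```

The point of the sketch is simply that every page's rank depends on every other page's rank, so the calculation has to iterate over the entire graph until it converges; sampling a few links or inheriting rank from a parent directory, as the post describes, is a fundamentally cheaper shortcut.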
Google has basically abandoned the main index in favor of cashing out. I think it's time for folks to stop trying to second-guess the logic behind which sites drop and which don't. Given the fact that the web is growing at a good clip, and Google is required to appear fresh or face ridicule, and given the fact that they are limited by their 4-byte docID problem, the bottom line is that innocent pages will get dropped all over the place. This is true even if you assume that Google is making a good-faith effort to restrict the dropped pages to those spammy ones that might deserve to be dropped.
Google's priorities are with ads. They've abandoned pure search. The fact that they can claim they're doing a good job on keeping spam out of the main index means that they've completely lost perspective about what's going on with the main index. Or, their priorities are clear in their own minds, but they have to keep spinning the myth of excellence in algorithmic search, which is how Google got on the map in the first place.
PLAYBOY [referring to ecommerce spam in the main index]: Playing cat and mouse like this, how can you be sure to stop them?
PAGE: We have a lot of people devoted to stopping them. We do a good job.
BRIN: People try new things all the time. By now, the people who succeed have to be very sophisticated. All the obvious or trivial things one might think of have been done many times, and we've dealt with them.
PAGE: It's going to get harder and harder to do these things. However, the benefits are obviously large, so some people will try to manipulate the results. Ultimately, it's not worth it. If you're spending time, trouble and money promoting your results, why not just buy advertising? We sell it, and it's effective. Use that instead. Advertising is more predictable and probably more effective.
cabbie, so you are saying that searches of more than 2 words are not picking up any of your pages when they should, as in the example of the entire paragraph?
This is interesting. It explains the great loss of traffic and in some cases sales. I bet those obscure, longer searches are the most profitable ones too.
Perhaps Google just decided to provide "certain" results for those profitable searches, that is, only AdWords-matching results, by eliminating the possibility of "contamination" from otherwise great free SERPs. If they can somehow steer the best traffic towards AdWords clients they will be maximizing their overall gross, in the end.
|no one has convincingly challenged that interpretation by re5earcher. |
Scarecrow, good points, and fairly self-evident given the completely static total-pages-indexed count for almost the last year now. I read that thread with great interest, and took special note of the strongest arguments for and against the thesis, as well as who put them forward. The original posting satisfied a basic axiom of research theory, being the simplest and most comprehensive model put forward to explain a wide range of behaviors, and it still is. Sandbox, incomplete large-site indexing, dropped sites, etc., ad nauseam: why look for complex explanations when simple ones do fine?
The biggest mystery to me is not the points you raised, but why google doesn't just pretend to increase the indexed page count on the home search page, since no one would ever know the difference anyway.
Does anyone remember this thread [webmasterworld.com] from 16 months ago? If so, what do you think?
It's just that in June 2003 Google stated it had: 4,294,967,296 pages
and now, August 2004 it says it has 4,285,199,774
Not an inconsiderable number of pages have dropped out, then!
Yes, but how many new pages has Google indexed from June 2003 to August 2004?
A LARGE number of pages disappeared then?
|its just that in June 2003 Google stated it had: 4,294,967,296 pages |
Google never said this. That number is 2 to the 32nd power, which is the maximum count for a 32-bit unsigned integer.
Google's numbers have been screwy for almost a year now. They say I have more pages in particular directories than have ever existed in those directories. Not a few more, but usually about 50 to 100 percent more.
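The arithmetic behind the 32-bit point above is easy to check. This small sketch (my own illustration, using the counts quoted earlier in the thread) shows the ceiling a 4-byte unsigned docID imposes and how close the August 2004 figure sits to it.

```python
# Ceiling of a 32-bit unsigned docID space.
max_docids = 2 ** 32                  # 4,294,967,296 -- the June 2003 "count"
reported_aug_2004 = 4_285_199_774     # figure quoted above in this thread

# Headroom left before the index hits the 32-bit wall.
headroom = max_docids - reported_aug_2004
print(max_docids, headroom)           # 4294967296 9767522
```

In other words, the June 2003 figure is exactly the largest value a 32-bit unsigned integer can hold, and the August 2004 figure leaves under ten million docIDs of headroom, which is consistent with the "one page in, one page out" behavior described above.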
At the risk of causing nightmares for those for whom this board is intended, wouldn't it be interesting if Google just cut all this garbage out and simply found a dozen radically different combinations of algorithm settings which produced 'reasonable' results when no one is trying to game them, and then rotated them randomly, say on a weekly, daily, or even hourly basis? Or perhaps on a regional basis, so that doing the same search on one server produces different results elsewhere. Maybe add new ones now and then and drop old ones occasionally. Sort of a perpetual G-dance. That would render any form of conventional SE gaming useless, as what worked a minute ago won't necessarily work now or later. With all the gamers exhausted, frustrated and financially eliminated, and avoiding them no longer a factor at all, all that would be left to index would be the non-gamers who didn't really care in the first place whether they were on top or not, and who focused all along on meaningful, non-commercially-dependent content.
...Returning now to my little Utopian world :)
Meanwhile, whatever it is, the results for searches are sucky right now, even for things I am not interested in personally.
Answer to reply to my post (463)
|Many people have sites that have seen no impact whatsoever from this change were not even aware of an update. |
I believe it is because they have good inbound links and did not rely on their on-site optimization (cross-links inside their sites). Then, naturally, their rank did not change. I can see it in my own example. My site is rather old and has different topics (directories). The pages that had good inbound links (independent of me) have not changed their PR. The pages that got PR mainly from other pages on my site and from download sites got PR0. But among the pages that got PR0, one that has good outbound links to independent sites now has PR2.
|If they simply de-valuate these techniques, i don't really care. As of yet I have not heard really any reliable theories on what is going on with Google, so I am sitting tight. |
Of course, only Google really knows. But I believe that they are clever and reasonable, so they probably did indeed just devalue some techniques.
|googleguy himself just explained in another thread that google's algorithms try to determine the home page |
|but then wont ALL index pages have something nice to say about themselves and commmon, you ought to. |
I did not say that Google has a special low rank for home pages. Simply, if Google now relies more on what other sites say (inbound links with good PR), the index page should *in general* have a much lower rank than other site pages. For example, when I link to other sites I try to link to the page with specific information relevant to the topic of my page. And (surprise!) as a rule it is not an index page. Others seem to do the same. When I now try to find the new part of my site in Google, I first find numerous download sites that point to me, and (you guessed right!) my download page, not the index page, after them all.
|Links from other domains have only a very small higher likelihood of being unrelated than links within a domain. Google is not going to ignore the reality of the web, let alone play so obviously into the hands of template/duplicate spammers. |
|devaluing internal links and boosting external links, that will only play into the >hands of people with a large network of sites. |
I believe PR is involved here. I also believe that Google tries to detect spammers and sites that by their nature just duplicate content (e.g. download sites). For template/duplicate spammers it is very difficult to have much unique content, so they will in general have low PageRank. And if one site tries to spread its good PR to unusually many sites (a network), that is probably rather easily detected. This may be the reason why PR dropped for some SEOs and for some older sites. If an older site tries to spread its PR to unusually many dependent sites, it indeed looks like spam.
My site snipped got totally creamed in a 4-hour period on Monday, August 16, 2004. Coincidentally, this occurred while my site was disabled (from 8-14 to 8-16-04) due to problems experienced by my hosting company. The site was #1 for several hundred geo-targeted keywords; the rankings continued even while the site was still down over this several-day period. When the site was restored on the server on 8-16-04, all of my rankings were gone. Not coincidentally, my phone stopped ringing.
One theory is that perhaps Google went to crawl my site, and when it returned a dead link perhaps they took me out of the index. On the Google FAQ page they list web hosting problems as a potential factor for a site disappearing. Anyone out there with any other thoughts/suggestions?
[edited by: DaveAtIFG at 5:32 am (utc) on Aug. 18, 2004]
[edit reason] No URLs please [/edit]
I've seen about a 30% increase in traffic since the change discussed in this thread. I had been working on adding inbound links and I had just assumed this traffic increase was a result of that.
"This may be the reason why PR dropped for some SEOs and for oldest sites."
Displayed PR hasn't updated in seven weeks, so I don't see how that can be applied to any discussion about August happenings.
because there may be a difference between "calculated PR" and "displayed PR"
I agree with you. Google should just come up with a bunch of good but very different algos and rotate them weekly, daily, or hourly, as you suggested in message #489. Share the wealth, and stop the excessive, obsessive, compulsive SEO and spamming. Finally, a SUGGESTION and not just whining or complaining. Just do it, Google! It seems like a simple and yet effective solution to this ongoing problem and cat-and-mouse game.
|Google should just come up with a bunch of good, but very different algos and just rotate them, weekly, daily, or hourly |
Instead of Google SERPs being a contest, it would be a charity handout. Nothing wrong with that in principle, but it means Google wouldn't be a search engine any more; Yahoo etc. would take all the business and the status quo would resume.
Alta Vista thought this was a good idea a few years ago... Does anyone remember Alta Vista? ;)
|Google should just come up with a bunch of good, but very different algos and just rotate them, weekly, daily, or hourly |