| 12:47 pm on Feb 11, 2005 (gmt 0)|
I am in India, and I can see the results from the 13 DCs on google.com or the regional googles. They are being used, definitely. Though in what order and where, I have no clue.
In fact, I have been seeing them off and on for the last 3-4 days.
| 12:49 pm on Feb 11, 2005 (gmt 0)|
Update: On mcdar.net, I can now see the 13 DC results on only 2 datacenters.
| 3:10 pm on Feb 11, 2005 (gmt 0)|
USA searches are still changing by the minute between 2 sets of results, while Google in Europe shows those same 2 plus an additional set of results that I've not seen here. Am I correct that in past updates the final results were first seen overseas?
Where is Googleguy? This has got to be the longest update in their history. If you are listening, can you shed some light on what's happening, Googleguy?
| 3:38 pm on Feb 11, 2005 (gmt 0)|
This really is a mess: good sites have disappeared and have been replaced by random mentions of the search words in PDF documents! I have never seen worse Google SERPs for some of my key searches.
Surely this is a technical problem and not an intentional final result?
| 3:38 pm on Feb 11, 2005 (gmt 0)|
re: 13 DC's in non-USA SERPs
That explains why our traffic is down 60% and not 95%, since we got bounced. It also explains why the latest registrants to our forum have all been International (we're in the USA).
Back to turtle racing. You don't have to SEO a turtle, hell you can't even teach them. It doesn't work and you only annoy the turtle.
| 3:52 pm on Feb 11, 2005 (gmt 0)|
|This has got to be the longest update in history |
No kidding. I first noticed my site drop on Feb 4th, and they've been up and down all over the serps ever since. I have followed the movement of my 1000+ page site on a daily basis, sometimes twice a day for terms that were fairly solid in the past, and the results are inconsistent on an hourly basis.
I wonder when the huge ranking swings will end. I am not changing anything till this settles down.
| 3:59 pm on Feb 11, 2005 (gmt 0)|
My site was hit by a Google change in mid-December. It has always ranked well for its keywords and keyphrases, mostly because it has unique content that isn't repeated anywhere else on the Internet. The few sites that do have similar content don't have anywhere near the coverage of that content that I do. In most cases, these sites have lower PR than mine, and fewer links pointing to them.
Unique pages on my site that have always ranked at the top are now buried when I search for them. Other sites that mention my site are listed first, then a bunch of sites that have the keywords spread out on the page; when you view one, you find the page has nothing to do with the search query, and the content was taken out of context by Google!
Then I find my page down at #97 or #102 or something like that! I've done nothing sneaky to try to get better rankings, yet my site seems to have a penalty on it for some reason. I am seeing lots of directory-type sites linking to me, but from what I can tell they are just straight links, with no redirects. They are no help whatsoever to the person looking for information; they just tell them where to find it, which is what GOOGLE SHOULD BE DOING! What a waste of time to search Google for something, only to find site after site of links telling you where it is. Google should be doing that, not sending people to a directory.
It really seems like Google is broken, I know everyone thinks their site is better than the rest, but when you are the only site with this content, it just doesn't make sense that it would be buried in the SERPs. I keep hoping Google will fix this, but it seems they're taking a long time to do so.
Google Images is sending twice the traffic to my site that Google is, so apparently my images are worthwhile in Google's opinion, but the pages they're on aren't. My Google search traffic is down 90% since the mid-December glitch, but my rankings in Yahoo and MSN are very good. I'm getting 10 times more traffic from Yahoo right now, and MSN is sending me 5 times more than Google.
| 4:06 pm on Feb 11, 2005 (gmt 0)|
Here's a question for all the nimble minds...
Can anyone see the major differences in the two algos in terms of why sites ranking well in the 13s get filtered in the 0s? The SERPs seem better in the 13s to me, but I'm loaded down with bias because I'm doing so well in them.
| 4:08 pm on Feb 11, 2005 (gmt 0)|
Use the tool in my profile. It's NOT my site; it just shows all the DCs in a line and how many of them you don't rank on.
| 4:09 pm on Feb 11, 2005 (gmt 0)|
> What a waste of time to search Google for something, only to find site after site of links telling you where it is.
How true. On the other hand I experienced just the opposite in my little niche: Many of these directories vanished and the results were much more satisfying for the user than before.
I'd guess Google is trying to get rid of these directories, but they follow two opposing sets of SEO techniques.
| 4:13 pm on Feb 11, 2005 (gmt 0)|
At least in the industries I follow it looks like the relational filter hasn't been applied for the 13's.
| 4:22 pm on Feb 11, 2005 (gmt 0)|
I just tried a search for "young teens" (without double quotes) on google with default moderate filtering on. Then compared it with yahoo and msn serps. The "quality" of google serps seems poor.
| 4:48 pm on Feb 11, 2005 (gmt 0)|
Usually I think people are exaggerating when they complain about the Google SERPs after an update; but I must admit, this time I came to WebmasterWorld specifically _because_ the Google results were so bad and I wondered what was up.
My own site is positioned just the same as it ever was, as are all the other large educational sites; but when I try to use Google to pull up new sites to add to the ODP categories, suddenly I'm getting nothing but crap for a lot of them. I've actually found better sites using Yahoo this week. Naturally most of the complaints on here are going to be about commercial sites, but esoteric and hobby sites have taken a real beating this time, especially those on Angelfire and Geocities. I search for sites on obscure authors and the one substantive fansite dedicated to an author is buried down on page three under sites that have lists of college alumni one of whom has the author's first name and another of whom has the author's last name. It's really quite ludicrous; I hope it doesn't stay this way.
| 5:19 pm on Feb 11, 2005 (gmt 0)|
I was going to withhold this next point because our hobbyist site took a beating with Allegra, and people "write off" laments from those who are on the losing side of an update, but your post has motivated me to at least say this:
This update, plus the Google IPO, may have sounded the final bell for the end of the party for the small or hobbyist webmaster; at least as far as Google's SERPs are concerned. There'd be no way to get a good population sampling so I won't post a thread with this question, but it would be interesting to see how many of the "little guys" still rank well in Google's SERPs with this update vs. previous ones, vs the "big guys".
The problem is due to the keyword battles between Google and the webmaster population. In the past, only the most super-competitive keyword sectors (travel, health, etc.) were war zones for SERPs combat. But now, since you can make money with little or no effort with pay-per-text programs, all traffic is valuable. It's kind of ironic. Even though AdSense wasn't the trigger, it seems to have been the atomic bomb that ignited the keyword wars to the level that has caused Google to take such aggressive anti-SEO action that the baby is being thrown out with the bathwater. The baby in this case is the small webmasters who don't fit the profile favored by the current algorithm. AdSense, by fully monetizing the web and being so easy to implement, has made it profitable to attack virtually every keyword sector of the web.
Can you still rank well in Google if you don't exert major efforts in linking campaigns, aggressive site SEO, etc.? The big guys can form relationships with the "authority sites". Even if they don't buy text links, they can still influence the writers and reviewers on the big sites to write about their products or sites, and also wage publicity campaigns which get you links from external writers covering certain industries. The small webmaster can't afford these things, either monetarily or time-wise. The last refuge of the small webmaster was being the content provider for the long keyword phrases being searched; but automatic page generation has done severe damage here.
Only time will tell. I hope I'm wrong. Obviously creating a truly viral idea or killer app (web site or product) will always work, but those are really hard to dream up. Kudos to those of you that are good at it.
| 5:51 pm on Feb 11, 2005 (gmt 0)|
Seriously, I think chasing this update right now would be like chasing a dog that's chasing its own tail.
| 5:59 pm on Feb 11, 2005 (gmt 0)|
|Can you still rank well in Google if you don't exert major efforts in linking campaigns, aggressive site SEO, etc? |
Yes, though it may take a while in some cases because of the Sandbox.
| 6:02 pm on Feb 11, 2005 (gmt 0)|
"Even though AdSense wasn't the trigger, it seems to have been the atomic bomb that ignited the keyword wars "
Totally agree, they have created a monster and now are having to deal with it.
The last six months have seen an avalanche of made for adsense sites.
The web gets scarier by the day.
| 6:13 pm on Feb 11, 2005 (gmt 0)|
"Can you still rank well in Google if you don't exert major efforts in linking campaigns, aggressive site SEO, etc? "
depends who your competition is.
| 6:19 pm on Feb 11, 2005 (gmt 0)|
The last straw I am clutching at is that, at the moment, Google is returning poor SERPs. By 'poor', I don't just mean from my perspective; the SERPs I am seeing are literally chock-full of extraneous rubbish, most of which is coming from portals in the form of press releases (referencing the companies whose sites are being ignored), PDFs, and sources which are far from the 'best pick' with regards to relevance.
Nothing against those who haven't been affected, but this means that if it stays that way for much longer Google will become less used - not by 'us', not by savvy people who can break and confound the algo, but by the average user who is just trying to find good info.
With the competition for 'searchers', and budgets being spent to get people to use a particular search engine, Google will have to do something or lose credibility and usage.
This is what I am hoping anyway...
| 6:47 pm on Feb 11, 2005 (gmt 0)|
Yes, a large portion of the latest update seems to be related to "favoring" indirect references to a site over direct references. Obviously this is one way to make it harder to SEO; it's harder to hit a rebound shot than a straight shot. But I think the negative complications of such a technique are very significant. IMHO.
| 7:18 pm on Feb 11, 2005 (gmt 0)|
|"favoring" indirect references to a site over direct references |
They also seem to favor an indirect page over the direct page.
For a search on abc, visitors do not land on abc.html; on my site they land on abd.html or my index.html, which link to abc.html.
| 7:25 pm on Feb 11, 2005 (gmt 0)|
|...a large portion of the latest update seems to be related to "favoring" indirect references to a site over direct references |
Interesting way of putting it, and it describes what I've been seeing. I've been using the tool in walkman's profile to look at a couple of nonprofit sites. I'm using keywords that used to always bring up each site's home page at number one, because these sites have almost no competition for that keyword. One is the name of an individual with an unusual name, and it's his personal site with his name in the title and in big letters on the home page. That gives you an idea of why it deserves to be at number one. The other site is my favorite, s-c-r-o-o-g-l-e dot o-r-g, for the most obvious keyword you can think of to bring up that site at number one. Yahoo says it has over 400 backlinks.
My favorite site has been showing half of the IP addresses at number 1 and the other half at over 60, consistently for the last two days. It's not consistent with respect to which IPs are doing what, but rather it's consistent in that the ratio between old and new data stays within about 20 percent of the total number of IPs tested, which is 55 IPs.
The other test, on a person's name, has been showing half of the IP addresses at number 1 and the other half at zero (meaning higher than a ranking of 100, since that's as far as the tool measures) -- and it's exactly the same story, with about a 20 percent variation over the last two days.
It appears that there is no possibility of Google merging the two data sets at this point, because I see no evidence that such a trend is in the works.
That gives Google three basic choices: 1) roll back to the old data; 2) commit to the new data across all IP addresses; 3) continue with this all-crazy blend of half old and half new.
I'd say that this is already a major embarrassment for Google, no matter which of the three choices prevails. At this point I don't think Google has sufficient talent or resources to merge them sensibly.
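The half-old, half-new split described above can be tallied with a short script. This is only a toy sketch over hypothetical per-datacenter rank numbers (using 0 to mean "not in the top 100", as the tool reportedly does), not the actual tool mentioned in the thread:

```python
# Classify each datacenter's reported rank as "old data" (still ranking at
# the top) or "new data" (buried past 60 or out of the top 100 entirely),
# then report the split across all tested DCs.
def split_old_new(ranks, top=1, buried=60):
    """ranks: dict mapping datacenter IP -> rank (0 = not in top 100)."""
    old = sum(1 for r in ranks.values() if 0 < r <= top)
    new = sum(1 for r in ranks.values() if r == 0 or r > buried)
    other = len(ranks) - old - new
    return {"old": old, "new": new, "other": other,
            "old_share": old / len(ranks)}

# Hypothetical sample: 4 DCs still on old data (rank 1), 4 on new (buried).
sample = {"64.233.161.1": 1, "64.233.161.2": 1, "64.233.161.3": 1,
          "64.233.161.4": 1, "216.239.37.1": 0, "216.239.37.2": 97,
          "216.239.37.3": 0, "216.239.37.4": 102}
print(split_old_new(sample))
```

With the sample above this reports a 50/50 split; watching `old_share` drift over time would show whether new data is actually taking over the DCs.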
| 7:27 pm on Feb 11, 2005 (gmt 0)|
Personally, I think all of the SEs would do well to get rid of the directory-type sites. Many of them are a haven for spam, but even the ones that aren't - how useless are they? You go to a search engine, type in a search for "blue widgets" and it's going to return directories of links about sites on blue widgets!
Why not just return the links in the first place? Isn't that what a search engine is SUPPOSED to do? If I'd wanted directories of links, I would have searched for "blue widget link directory"!
I really don't see the need for all those directories in the first place, they clutter up the SERPs, make it harder to find the actual sites with blue widgets for sale, or information about blue widgets. Plus, it's like adding another layer between the searcher and what they're searching for. Just my opinion, but I find them useless...
| 7:54 pm on Feb 11, 2005 (gmt 0)|
Any update still fluctuating this wildly after this long cannot, IMO, be working as intended. So the question is: who signed off on rolling out this update to the live system without a workable backout plan in place? And does he/she have their CV up to date?
| 7:59 pm on Feb 11, 2005 (gmt 0)|
Here's a weird thought -- shades of Florida!
Could there be a hitlist of mostly commercial terms that gets sites bounced down? This one could be a list that got integrated into the rankings at an early stage for this "new data" that we're seeing, because I see no evidence of a real-time filter. That would explain why Google seems to lack the capacity to trim back the effect.
My speculation is that many strong sites (however Google chooses to define this) would be exempted from this treatment. Otherwise, it would be a bloodbath that even Wall Street might notice. And as regards my two nonprofit examples above -- well, the "unusual name" I mentioned is someone who writes for The Register, and he is not a Google fan. His name could have made such a list quite easily.
| 8:06 pm on Feb 11, 2005 (gmt 0)|
"Any update still fluctuating this wildly after this long can not, IMO, be working as intended."
I agree, but I think they're seeing some benefits because the SERPs (statistically speaking) must've gotten better, yet so many sites are hurt for no reason. They're trying to figure out how to solve the filtering and still keep the "good" SERPs. That's why on s-c-r-o-o-g-l-e at least half of the DCs carry SERPs where many of us do much better.
So far they have solved the spam problem the way the cops could react to a surge in rape cases: by arresting all males 18-55 years old. Not the way to go, Google. When in doubt, don't filter sites out.
| 8:42 pm on Feb 11, 2005 (gmt 0)|
|It appears that there is no possibility of Google merging the two data sets at this point, because I see no evidence that such a trend is in the works ... continue with this all-crazy blend of half old and half new. |
Maybe old is 32-bit and new is a 64-bit server farm upgrade...
| 8:54 pm on Feb 11, 2005 (gmt 0)|
I am sorry to say but I think that the filtered results are starting to take over. On more and more DCs I am seeing the filtered results.
| 8:58 pm on Feb 11, 2005 (gmt 0)|
People are claiming that it's half and half for the datacenters, which would explain why my Google traffic remains high despite being bumped to oblivion post-Allegra.
However, I only see the new results from Canada. Is Google only serving from post-Allegra datacenters outside the USA? Can anybody report the contrary?
| 9:02 pm on Feb 11, 2005 (gmt 0)|
I concur with the previous few messages precisely.
Just did a search on Google, for my own personal benefit, for a leading, big name furniture retailer in the UK. Searched by their well-known company name.
The SERPs came back with approx. 5 press releases from 'news portal' sites on the company and 4 directory listings (really local, low-quality sites); it wasn't until page 3 that I found the company's home page.
Searching 'in these shoes', I was an average searcher looking for a big name in furniture. Google never even got near to helping me with my search.
End result: average searcher gets p~~~sed off with Google and searches at Yahoo. They MUST fix this.
| 9:05 pm on Feb 11, 2005 (gmt 0)|
Just in the last couple of hours, I've seen new movement in the direction of accepting new data on more IPs.
Just to throw this out -- I think the term "SEO" got hit quite hard in the new data. A couple of very high-ranking SEO sites look like they're taking a dive in the new data. They're both optimized for "SEO" even in their domain names, and they seem to be hit regardless of the keywords you use.
Maybe there is a sliding scale of mild to severe on the hitlist. If so, "SEO" must be at the severe end.