Google results are so bad it would not surprise me if I had some sort of spyware on my PC.
Unfortunately I have not!
Wouldn’t it be simpler to hire 20 people to hand-edit those spam sites out? I am sure that if you cut 10 of a spammer’s sites in one day, he would rethink his strategy. In one day I am sure I could find about 1000 spam sites; it would be that simple.
I’ve searched today for a prescription drug home page – the mother site – and I did not find it. The site was banned or sandboxed or whatever, along with all the others. I found it with Yahoo in the first position. The site was ranking #1 on a 66.x DC in Google…
removed - wrong board
|Wouldn’t it be simpler to hire 20 people to hand-edit those spam sites out? |
This is off topic but you are right on the money. Manual editing is the only real answer to this problem.
has very nice results...
|Google results are so bad it would not surprise me if I had some sort of spyware on my PC. Unfortunately I have not! |
That's a good one.
Listen, I am a small fish. But I think you have to look to the small fish as a harbinger of what's to come.
I was a huge Google fan. But lately I'm really getting put off by what they're doing/serving up.
I have noticed a change.
I think they're stretching themselves way too thin and getting away from their bread and butter.
Google was fast and accurate at one point...not so today.
There is something really wrong with the Google search, the results are wacky.
Many of those sites have never been on the first page and are not really what you would expect to find there.
As I look at it, the first site's description is in Chinese!
My site was #1, now it's gone, not even in the top 1000.
It seems that putting up good, relevant content isn't good enough. Rather, you have to do the dance with Google, their way.
Seems a bit ridiculous.
I'm frustrated with Google as a webmaster, and as a member of the public searching for stuff.
This is not a good sign.
I may be full of it but that's my 2 cents.
IMHO whatever is going on (and no-one can deny there are strange things going on) needs to be or is being addressed.
We have lost and gained in this *update*, so it's swings and roundabouts for us really.
When I look at things from an objective angle, I really wonder how long it will be until word gets out that something is wrong/strange/whatever at Google.
IMO those who say "oh shut up, changing your homepage search from G to MSN/Yahoo/whatever won't affect Google" are maybe underestimating the power the internet has over people's perceptions (how do you think Google got where it is?).
I am not attacking Google with this post; I still feature (and think of) Google as the number 1 SE on many sites, but the problems that others (senior members and mods included) have reported, and that I myself have seen (and still see), shouldn't be dismissed as easily as they have been.
Serving two sets of SERPs with up to a 50% difference in the number of results from one minute to the next is not a good strategy for an SE that is supposed to be reliable.
Maybe those who are dismissing these quirks know something I don't :o)
This is one of the most competitive markets around at the moment, and it looks like it will get more so in the next year or two.
Today I heard the first complaint from a user (quite experienced, but not a webmaster/seo/sem/etc). This is not a figment of the imagination; it is a real issue. I thought WebmasterWorld would be the perfect place to have a professional debate about how this may affect Google, but it just seems to get dismissed as if it happens all the time.
I think the folks at Google are probably addressing this issue right now. Search is what makes Google what it is; without it, Google will become another AltaVista.
Just my thoughts. I know others disagree, but can't we at least address the issue from a professional perspective and from a user's perspective?
I for one do not want to see Google suffer any sort of downfall. I think they started off with a refreshing view of SE policy, and I wish them all the best for the future.
Come on Google, prove us all wrong and sort out those strange SERPs :o)
<edit>typos, grammar, you name it!</edit>
Generally I think this index is an improvement on what we have seen before with a lot of spam removed, although as always some sites which most would regard as good sites, on topic and free of any deceptive techniques are hit as a consequence of filter / algo changes.
One site in particular I follow, a relatively new employment website (less than 12 months old) that had been ranking as you would expect for a new site (not too prominently), has been replaced in the index by a site showing a cached version of it.
Previously, when you searched for the company name, the site appeared; now it doesn't, and it has been replaced by a site (actual domain name omitted) with a URL like www.website.com.au/cached?id=zm14gddd
Within the search results this URL returns the page title and snippet, as it is in effect a copy of the site in question's homepage.
Perhaps the reason for the serious drop in ranking is a duplicate content filter triggered by this cached version on another domain?
"Feb 12 11:32 (Chicago time):
terms: "domain.com" page: domain.com
number of zeros: 45
average of nonzeros: 1.0"
Out of 55 DCs, I'm in 10 that, according to my logs, don't serve any results to the public. Essentially nothing changed from yesterday. If anything is going to change, Google usually starts making changes around the 20th: September 20th, December 17th.
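The quoted log summary above ("number of zeros", "average of nonzeros") can be sketched in a few lines. This is a hypothetical reconstruction of how such a DC-monitoring script might summarize its data; the function name, the sample IPs, and the rank values are all invented for illustration, and the actual querying of each datacenter is omitted.

```python
# Hypothetical sketch: given the rank a site holds on each Google
# datacenter IP (0 = not listed at all), report how many DCs return
# nothing and the average position on the DCs where it does appear.
def summarize_dc_ranks(ranks_by_dc):
    ranks = list(ranks_by_dc.values())
    zeros = sum(1 for r in ranks if r == 0)       # DCs where the site is gone
    nonzeros = [r for r in ranks if r != 0]       # DCs where it still ranks
    average = sum(nonzeros) / len(nonzeros) if nonzeros else 0.0
    return zeros, average

# Made-up sample data, not real measurements.
sample = {"216.239.53.99": 1, "66.102.7.99": 1, "64.233.161.99": 0}
zeros, avg = summarize_dc_ranks(sample)
print(f"number of zeros: {zeros}")
print(f"average of nonzeros: {avg}")
```

A result like "45 zeros, average 1.0" would then mean: invisible on 45 DCs, ranked #1 everywhere it still shows.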
For some reason, if your site gets caught by Google's algo they don't ban you; they just take away every single ranking benefit. Not even in the top 1000.
On three sites I checked, it was exactly the same pattern. I think Google is semi-offline with the old index on these IPs so that engineers can run comparison tests.
These three sites showed number one positions on two Class C blocks, 216.239.53.X and 66.102.7.X and zero on all other IPs.
Now the question is, if your site shows a high rank on those two Class C blocks, and zero everywhere else, do you see any Google referrals in your logs? My guess is no. That means Google isn't using those Class C blocks to serve results to searchers.
I have one site that follows this pattern, except that instead of all zeros on the other IPs, it shows a rank of about 63. My referral logs aren't helpful in providing evidence one way or the other for this theory. How about yours?
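One way to check the referral question from your own logs: scan the access log for hits whose referrer is a Google results page. This sketch assumes the common Apache "combined" log format; the helper name and the sample lines are fabricated for illustration.

```python
import re

# Matches a quoted referrer field pointing at a Google results page,
# e.g. "http://www.google.com/search?q=widgets" (any google.* TLD).
REFERRER_RE = re.compile(r'"https?://(?:www\.)?google\.[^/"]+/search[^"]*"')

def count_google_referrals(log_lines):
    """Count access-log lines whose referrer is a Google search page."""
    return sum(1 for line in log_lines if REFERRER_RE.search(line))

# Fabricated sample lines in Apache "combined" format.
sample_log = [
    '1.2.3.4 - - [12/Feb/2005:10:00:01 -0600] "GET / HTTP/1.1" 200 512 '
    '"http://www.google.com/search?q=widgets" "Mozilla/4.0"',
    '5.6.7.8 - - [12/Feb/2005:10:00:05 -0600] "GET / HTTP/1.1" 200 512 '
    '"-" "Mozilla/4.0"',
]
print(count_google_referrals(sample_log))
```

If a site ranks only on those two Class C blocks and this count stays at zero, that would support the theory that those blocks aren't serving the public.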
|I think Google is semi-offline with the old index on these IPs so that engineers can run comparison tests |
If this is true, then it's very unprofessional of them to put the results live before comparison.
Not likely... GG himself said they do extensive tests before such changes..
Then what other explanation exists? The results are not stabilising, which could mean they meant it that way: two indexes, rotating with different results. Hehe, it actually is funny; I don't think they will do that.
Then what? An extremely looooonng update?
Google has always spidered my sites (based on the reported date and time in the cache) at about 04:00 UTC. The last time they did this was Feb 10th. As of Feb 11th it has moved to 23:00 UTC instead.
I also used to see the new Fresh Dates appearing in the hour leading up to midnight UTC. As of today that process also moved at least several hours earlier. They were out well before 21:00 UTC tonight.
"Then what? An extremely looooonng update? "
No. The update is over. I think Google realizes that one of their filters has caught many sites for no reason, and is working on solving that...for the next mini-update.
Don't fix something that ain't broken... Until a while ago, I'd have said this phrase was stupid, but apparently Google tried to "fix" themselves and just made themselves worse off.
I think the real problem at googleplex right now is a large group of SERPs engineers with huge inflated salary packages and nothing better to do but to try and fix something that is not broken.
Google's everlasting attempt to fully automate the spam-catching mechanism is the real problem. Fact of life: some things (like site quality reviews) cannot be fully automated and must be done manually by a human editor. Maybe the large number of surplus engineers can be put to better use. Mathematics has its limitations.
|Then what? An extremely looooonng update? |
Yep, my guess is that we are seeing the emerging new index (64-bit) running beneath the old 32-bit indexes. So the old ones are still up for reasons like rollback availability. The new 64-bit index will need a lot of time, as we know that age is very important to the Google algo. But if it is started from scratch, then, for example, there is no sandbox in it yet, as there is no link age (or whatever the sandbox relies on) available yet.
|I think the real problem at googleplex right now is a large group of SERPs engineers with huge inflated salary packages and nothing better to do but to try and fix something that is not broken. |
Hmmmm....and what do you consider the spam problem to be - nonexistent?
The only way to keep webmasters from gaming the SERPs with sites of questionable content is to keep altering the formula. While some good sites certainly get caught in the crossfire, I think many webmasters would find (as some here can attest) that established sites that have "disappeared" from Google will eventually return as they are re-spidered and/or Google engineers continue to "tweak" the algo.
I think it's not.
99% of people here think that results from this DC are much better. I hope the Google guys think so too.
This idea was born a year ago I think while talking to someone here or maybe developed from my own dwelling on updates. I have logged patterns though and this has always been on my mind since.
Be assured that while we are all checking our positions for those competitive keywords/phrases in the SERPs, Google, at each DC, is logging the following information:
> User session IPs, proxy, and other platform data
> Each phrase/keyword searched, matched against the IP/proxy/user data collected – the result indicates a possible SEO/SEM/spammer (or what have you) checking their results from a specific geographical location.
> User data + keyword/phrase + site tweaking = FF (Flagged and Filtered)
In sum: based on the data collected, Google understands that one of the sites being tweaked for that term has an active SEO behind it, and the site is flagged and due to be filtered!
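For what it's worth, the speculative flag-and-filter theory above could be sketched roughly like this. The threshold, function name, and data are pure invention for illustration, not anything Google is known to do.

```python
from collections import Counter

def flag_rank_checkers(search_events, threshold=5):
    """Flag IPs that repeatedly search the exact same phrase.

    search_events: iterable of (ip, phrase) pairs, one per query logged.
    An IP hammering one competitive phrase looks like a rank-checker,
    per the theory in the post above. Threshold is an arbitrary guess.
    """
    counts = Counter(search_events)
    return {ip for (ip, phrase), n in counts.items() if n >= threshold}

# Invented sample: one IP checks "blue widgets" six times in a session.
events = [("10.0.0.9", "blue widgets")] * 6 + [("10.0.0.2", "red widgets")]
print(flag_rank_checkers(events))
```

Even this toy version shows why the theory is plausible in principle: the data needed (IP, query, frequency) is already in any search engine's query logs.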
Ha! There is my over the edge theory (or is it?)
I just had to toss that out there… WebmasterWorld is a great place to discuss strategy and sort out algo changes, but I hear many people here spilling their guts daily about how they monitored their keyword/phrase on each DC. Google has got to know that with each update everyone here is going to spill it! We run all our checks on each DC and comment on how positions have been lost, gained, etc… What a resource for Google to monitor how well their algo/filter (or what have you) is working or not…
(note: I hope not to get spanked for saying that. WebmasterWorld is a great place - perhaps topics should be more filtered though.)
Perhaps I am thinking too much (most likely) it would seem too logical though that Google would play off this information.
[edited for correction]
[edited by: Slone at 2:26 am (utc) on Feb. 13, 2005]
|...will eventually return... |
Which is unacceptable. I constantly hear from mom and pop ecommerce folks who do not have the luxury of "...will eventually..." Google is rich enough to set up an appeal process of live humans with integrity, who are semi-independent of the occasional mood swings of a Larry or Sergey, and let these reasonable people handle webmaster inquiries and tell the engineers what they should or should not be doing.
Google has always claimed that PageRank "relies on the uniquely democratic nature of the web." Show me a democracy where innocents are "disappeared" without any recourse whatsoever. That's what Google does now on a regular basis. Just because it's done by Google's bots does not make it excusable.
At one point in the Microsoft antitrust trial, Bill Gates was asked to confirm that he wrote an email that came from his email account. "The computer wrote it," he testified.
During the Gmail privacy issue last April, Google kept saying that machines were reading the email, not humans.
It's the same old story -- you've got to do something about people like Bill, Larry, and Sergey before they become evil billionaires. Because by then it might be too late.
|99% of people here think that results from this DC are much better. |
Looks like an old set of results to me (and possibly a case of wishful thinking?)
IT'S OVER. DONE. KAPUT. FINISHED.
...At least until they make another change that brings the end of the world as we know it ;-)
|Which is unacceptable. I constantly hear from mom and pop ecommerce folks who do not have the luxury of "...will eventually..." |
Guess what? I'M a "mom and pop" ecommerce biz, and I think it's laughable that ANY business, be it a one-man operation or a mega-corporation, should be given any kind of special assistance in maintaining its profitability simply because its business model is weak. If they based their business on a search engine delivering customers to their door for free, and that free ride has been taken away (albeit, IMO, probably temporarily), then those people NEED a wake-up call.
Further, while I DO think google needs to employ actual humans to screen sites that receive multiple user reports of guideline violations, why on earth should they expend serious resources to help ANYONE (including myself, whose site was in limbo for the last year or so) make a buck on the back of the system they have invested millions upon millions to develop?
|Show me a democracy where innocents are "disappeared" without any recourse whatsoever. |
Show me an automated, hardware/software-based system that runs 24/7 WITHOUT a single glitch or error. I think people sometimes fail to take into account the enormous job that crawler-based search engines and their designers are trying to do. Properly indexing an information resource that grows exponentially, and ranking that information by relevance, all the while trying to develop an automated system to combat the spam that is (mostly) generated by OTHER automated systems, is a task that is herculean in nature. Expecting 100% accuracy and infallibility is not only asking the impossible, it's unreasonable at best.
Do you like this update and these results?
Maybe you would like that.
[edited by: Newman at 3:05 am (utc) on Feb. 13, 2005]
Yahoo employs humans, that's why we see similar SERPS for weight loss and weightloss.
WebFusion... most 24/7 hardware/software systems that I'm aware of usually have more glitches caused by wetware than by either the hardware or the software.
A software glitch, in almost all cases, is actually a wetware cranial blowout.
Major hardware systems that run 24/7 critical applications are usually highly redundant.
NOT old results.
I checked again earlier today (Feb 12); some pages had a Feb 11 cache, and rankings are different than pre-Allegra. The number of results on the variety of 3-word search terms I use is always slightly greater than pre-Allegra.
Google.com, however, is returning SERPs where the number of results on some of the same search terms is 40 to 50% LESS than pre-Allegra. I'm talking 250,000 pages MISSING for one of my 3-word search terms.
This update is not yet over; it is broken-in-progress.
Google has been updating on a daily basis for months. Saying "the update is over" is just nonsensical. They pushed out some data and are now fiddling with it, continually updating as they have been.
But as for what was called "Allegra", that burp of data has not yet been integrated into one basically consistent result set.
As much as I'm bothering to watch I just keep seeing two sets. One or the other.
Anyone seeing sites rotate between more than two positions? I've been seeing these same two sets (only) for 3-4 days.
I too have seen the same two sets of results for the past 2 days in the US. However there is a third set that I've only seen on Google in Europe.
I think Google needs a "rate these SERPs" group of buttons at the bottom of the search results page.
Let the public decide.
Not the "Dissatisfied?" link at the bottom of the current SERPs; 99.99% of the people out there won't take the time to do that.
Just 5 radio buttons at the bottom: very bad, bad, OK, good, very good.
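A minimal sketch of what the tallying behind such a five-button widget might look like; the button labels and the 1–5 scoring are assumptions, not anything Google ships.

```python
# Map the five suggested radio buttons to a 1-5 score and average the
# votes collected for one query's result set. Purely illustrative.
SCORES = {"very bad": 1, "bad": 2, "OK": 3, "good": 4, "very good": 5}

def average_rating(votes):
    """votes: list of button labels clicked for one query's results."""
    if not votes:
        return None  # no public feedback yet for this query
    return sum(SCORES[v] for v in votes) / len(votes)

print(average_rating(["good", "very good", "OK"]))  # (4 + 5 + 3) / 3 = 4.0
```

Aggregated per query, a score like this would give the engineers exactly the "let the public decide" signal the post asks for.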