| 3:38 pm on Dec 8, 2003 (gmt 0)|
I think that has already been done loud and clear, and to good effect. NY Times, BBC and many other places - not least in here. I suspect that Google underestimated the influence of (their words, in a live interview) "Mom and Pop" sites on this occasion. The roll back is underway not necessarily because of what Google thinks of the results, but because of the influence Webmasterworld's biggest members have had on the worldwide press machine (and other forums too, I am sure, but the best people in the other forums are in this one as well).
It is impressive to see where the "authority" is on this issue.
| 4:22 pm on Dec 8, 2003 (gmt 0)|
I wouldn't jump to conclusions. These recent changes may have nothing to do with the recent press. It may simply be another step in whatever Google is planning.
Whatever the case... the timing of these changes could not have been worse.
| 4:51 pm on Dec 8, 2003 (gmt 0)|
>> wouldn't jump to conclusions. These recent changes may have nothing to do with the recent press. It may simply be another step in whatever Google is planning.
My thoughts exactly. We are here thinking that Google is listening, which might not be the case. This would probably have been the update they were planning anyway; it just rolled out a little slower. Otherwise, they would have restored everything immediately and not waited 3 weeks (the update pattern) to change it.
| 4:56 pm on Dec 8, 2003 (gmt 0)|
I wouldn't hang my hat on a "rollback"; it's abundantly clear where Google is going: if not today, it will be soon, very soon.
Stay away from the money, google intends to keep it.
| 5:17 pm on Dec 8, 2003 (gmt 0)|
Lucky English-speaking people,
we Germans don't even have that (the page exists [google.com] but Google hasn't linked it on the main page).
Maybe that's one of the many reasons why Google Germany is having huge problems with spam (even Google admitted as much at the SES Munich).
[edited by: viggen at 5:18 pm (utc) on Dec. 8, 2003]
| 5:17 pm on Dec 8, 2003 (gmt 0)|
>but because of the influence Webmasterworld's biggest members have had on the worldwide press machine..
>It is impressive to see where the "authority" is on this issue.
That would border on illusion, in the eyes of a realist :)
| 6:07 pm on Dec 8, 2003 (gmt 0)|
>> let’s start making some noise
That form is a great service, although it should not be seen as a place to file an error report that will be acted upon instantly. Rather as a "suggestion box" for well-motivated and well-informed specific suggestions. I've used it only a few times (all pre-Florida, even pre-pre-pre...) and i'd hate to lose that opportunity because of spam.
Agree, Chndru. Imho, the authority on google serps is to be found inside the googleplex, not in the press. (And specifically not in a SEO forum, there's nothing but uninformed mom&pop enterprises at those places anyway, it seems).
| 10:41 pm on Dec 8, 2003 (gmt 0)|
Jeezus, anyone who thinks the recent re-addition of sites was due to complaining on webmasterworld needs to get a life.
GoogleGuy has repeatedly urged people to send in feedback about serps, and those definitely have an effect.
The sites re-added were done so via Google crawling the web. That's it. People clinging to long discredited fantasies really should move on now. Google has changed the way it ranks sites. It has good and bad ramifications (choose your degree). Tell them about the bad, and the good too if you want. But don't think that it isn't full steam ahead with Florida. It is.
| 1:48 am on Dec 9, 2003 (gmt 0)|
>The roll back is underway not necessarily because of what Google thinks of the results, but because of the influence Webmasterworld's biggest members have had on the worldwide press machine
Whatever it is you're smoking, put down the pipe, dude. The worldwide press machine doesn't care much about Webmasterworld. And even if it did, remember what Andy Warhol predicted: if someone here gets their 15 minutes of world fame, they will be quickly forgotten, and Google will go on.
| 2:10 am on Dec 9, 2003 (gmt 0)|
|the influence Webmasterworld's biggest members have had on the worldwide press machine |
That explains it... I knew it was big, but whoah...
What should we do next? World hunger? AIDS? Getting me a PR7?
Brett, next it's world hunger. This is serious stuff here at WW.
| 3:28 am on Dec 9, 2003 (gmt 0)|
It doesn't really matter whether the current ongoing changes in the SERPS constitute a rollback or not...
I doubt it. From where I'm sitting, it looks like a second or third pass and still part of Florida...
But the recent spate of articles on major English language press organs will not have been dismissed at the 'plex, I suspect.
Mountain View is undeniably massive now. It may not touch as many lives and businesses as Redmond but it's big enough that any changes it makes cannot be made quietly.
Think about how few ripples were made by the big update last year (September 2002, was it?). Were there many comments even in very specific industry journals about that?
By contrast, just over a year later, the company is under pressure to absolutely sit on any adverse publicity if it is to maximise the success of its impending IPO.
I haven't seen Google's PR machine put any sort of positive spin on recent developments...
I seriously doubt there will be any more major shakeups between now and February.
| 4:01 am on Dec 9, 2003 (gmt 0)|
>People clinging to long discredited fantasies really should move on now. Google has changed the way it ranks sites. It has good and bad ramifications (choose your degree). Tell them about the bad, and the good too if you want. But don't think that it isn't full steam ahead with Florida. It is.
This just about covers it. End of subject.
| 4:32 am on Dec 9, 2003 (gmt 0)|
>>Google doesn’t have any obligation to us as webmasters<<
Who buys their ads?
>> but they are certainly dependant upon their users.<<
So are the webmasters who buy their ads.
>>Reporting bad search results<<
I’ve said it before and I’ll say it again. It is sad that people have to report errors to any search engine. Especially if we are talking about 404’s. It is the same as inspection, and you cannot inspect quality into any product. It must be designed in. 100% inspection is only 40% to 60% effective. If a search engine has any kind of ‘quality control’, they need to make sure that they have a big enough sample size, based on acceptable standards, to have a 99% confidence level in their results, and they should not go live with it until they do.
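The sample-size point above can be made concrete. As a rough illustration of my own (not anything Google has published), here is the standard formula for how many URLs you would have to spot-check to estimate a defect rate - say, the fraction of dead links in an index - at a 99% confidence level:

```python
# Sample size for estimating a proportion at a given confidence level.
# Illustrative sketch only; the parameters are assumptions, not Google's.
import math

def sample_size(confidence_z: float = 2.576, margin: float = 0.01,
                p: float = 0.5) -> int:
    """Required sample size n = z^2 * p(1-p) / E^2.

    confidence_z: z-score (2.576 corresponds to ~99% confidence)
    margin:       acceptable margin of error (0.01 = +/- 1 point)
    p:            assumed defect proportion (0.5 is the worst case)
    """
    return math.ceil(confidence_z ** 2 * p * (1 - p) / margin ** 2)

print(sample_size())             # worst case, +/-1%: 16590 URLs
print(sample_size(margin=0.05))  # looser +/-5% margin: 664 URLs
```

The point being: a meaningful spot-check of index quality is in the tens of thousands of URLs, not a handful of user reports.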
| 4:36 am on Dec 9, 2003 (gmt 0)|
Really, Google is not as good as it used to be; you get a lot of .org sites that are less helpful.
Hope they hear the truth and make it better.
| 5:00 am on Dec 9, 2003 (gmt 0)|
>>GoogleGuy has repeatedly urged people to send in feedback about serps, and those definitely have an effect.
No doubt they have an effect, in that they're probably taken into consideration and taken seriously. But if I'm supposed to believe that even legit, accurate feedback on the quality of search results will override higher-priority company concerns right now, someone will have to pass me the pipe first.
It's been awful tempting a couple of times this week to give feedback on something I've caught because it's so transparent and admittedly interesting. This is not a "spam report" type thing, it's completely within Google's guidelines.
It looks to be a gaping hole, a vulnerability that could easily be exploited and is barely perceptible; you have to look twice, blink and look again to see it. I don't think they can fix everything with adjusting the algo, and this looks to be one of the things that falls into that group. I don't think the algo could catch it; it looks like it's playing right into it.
Ordinarily I'd give feedback if I catch something symptomatic like this, and quite honestly it's being exploited to where it's taking up the majority of top spots in certain lucrative markets, but at this point I'm really thinking twice, just on principle.
|Tell your friends, relatives, anybody who uses Google to speak up next time they get irrelevant search results. Perhaps if we use some other channels and cry out with enough voices we might be heard. |
cherryo, to be honest this post kind of stopped me dead in my tracks and got me thinking hard. Google's done a lot to nurture its relationship with the webmaster community. Like many others I'm a person of strong, long-lasting loyalties, on a personal level and beyond.
Many of us value trust highly, and that's an issue that's personally hitting on a few levels at this point. Once trust starts to erode something has to happen for it to be restored. This is an impersonal, relatively unimportant type of thing compared to an issue or two on a personal level that deeply parallel some core issues, but I'm inclined to take a wait and see attitude.
Hitting the button about problem results is basically an act done in good faith and with helpful intentions, and is a simple, insignificant thing that only takes a few minutes. But some things I'd ordinarily do without hesitation are just things I'm not inclined to do right now.
It's broader than causes and companies, sometimes we just have to pause and take time to introspect, look around us with eyes wide open, and do a reality check.
| 5:32 am on Dec 9, 2003 (gmt 0)|
Very well said, Marcia. I agree whole heartedly.
| 9:47 am on Dec 9, 2003 (gmt 0)|
|Hitting the button about problem results is basically an act done in good faith and with helpful intentions |
Hmm, not always. I would imagine most Webmasters that use the 'button' do so to try and boost their own rankings.
| 1:33 pm on Dec 9, 2003 (gmt 0)|
>> something I've caught because it's so transparent and admittedly interesting
Marcia, that post puzzles me - not so much what you write as the fact that you do it. I understand that you don't want to mention specifics, and that's okay - algo's change and new things move into focus, that's just the way things are, no problem. (besides, other threads have already mentioned most new issues i think)
Still, let's say pre-Florida that you discovered that somebody had a lot of incoming links with the same anchor text that was indeed the targeted keywords of the site. There's a quite big "hole" there for people that are able to get a lot of links placed around the web in their chosen format. I can see that.
Otoh, targeted anchor text is common practice around the web in a lot of cases, it's even providing good usability. The only "risk" is that these "holes" will (in due time) become common knowledge, and perhaps even lead to lower quality. If this happens, then there are clearly patterns to it, and the advantages can be removed just as easily as they were established. I really see no problem here. (Apart from this, i see no significant change in "anchor text is king". Also the emphasis on descriptive anchor text in the SEO community is definitely not a "bad" for the internet as a whole, imho.)
So, my opinion is that these things you observe are not "holes" in the algo. They are things that Google has quite consciously decided to give some weight, and as such they are more likely to be part of the algo than holes in it. One might say that Google is working with incentives (as opposed to penalties), but i guess it will take some time for most people here to realize that (as i do see why some people are currently tempted to think otherwise).
>> 404, page for sale
As for the 404, Googlebot might not see a 404. I'm not suggesting cloaking, perhaps the server just doesn't send out the right headers. If they're not sending out the right headers, then this is one such case where you have to be human and "in the know" to see it. If it's cloaking, then you'll have to decide for yourself what to do.
Pages for sale, parked domains, hijacked domains, PPC ad-filled previously valuable pages... these are not "Florida" issues. They've been around for a long time, and Google really should be able to catch them if Gbot were able to properly identify specific patterns in content. It's getting better all the time, but we're still waiting for that. Humans see that kind of thing easily, but bots don't.
I'd personally use that form in cases where my search did not give me the results that should be expected, considering the query i made. There are some things that a Search Engine just doesn't know; issues that you'll have to be an industry (as in "some keyword combo") insider to know about. These cases are the ones i have personally reported: they're neither abuse of the algo, nor spam, nor the algo not working properly, just things that are not as obvious to people that are not "in the know" (or bots that are not people). Personally, i'd think these were good examples to report, but feel free to disagree.
| 3:40 pm on Dec 9, 2003 (gmt 0)|
In my case with the 404’s, I believe it is because some bot, that shall remain nameless, was having a problem with 301’s. Instead of just going to the index.html, or whatever, and following the links, they keep some kind of cache of where they think the pages are and what the URL is. When that cache is really outdated, plus the problem with the 301’s, it has created a mess for me. No doubt it was just my luck to make these changes at the wrong time, but how was I to know it was the wrong time? Which makes my point about NOT going live until you, blah, blah, blah.
This could be one reason why there are now 404's showing up in the searches. And since the last update of the nameless bot, it is now returning these pages instead of the pages that should be returned based on what the webmaster expected the nameless search engine to return. Before the update, I never saw a 404.
My server does return the proper header; I have checked. This is clearly a problem ONLY with the nameless bot, due to the fact that all the other bots have done it correctly. There is only one place to lay the blame. Well, unless it is my fault for changing the names of some of my pages for better file management reasons. The theory is that the more complex something is, the higher the probability of an error, and case studies performed by major universities indicate that the error rate is exponential. In the case of a search engine, you have 2 pieces of code running: the part that fetches URLs and the part that handles the human query. So the probability of error is more than doubled.
claus, are you saying that the nameless search engine is not susceptible to errors in the coding?
| 3:46 pm on Dec 9, 2003 (gmt 0)|
Webmasters *are* Google users - no website -no Google.
It is as important to be found as to find, otherwise what is the point?
| 4:13 pm on Dec 9, 2003 (gmt 0)|
>Webmasters *are* Google users..
Of course. What is implicitly assumed is that the vested interests of webmasters and of general users in search engines (e.g. Google) are the same.
In many cases they are not necessarily the same.
| 4:56 pm on Dec 9, 2003 (gmt 0)|
cherryo, nice post. Receptional, brave post ;-)
There's a battle going on here, and while I'm the first to admit that I don't always see the subtleties of algo tweaks, I'm a little more well versed in business and marketing issues.
Brands are complex, existing primarily in the minds of users. It is rare that one single event, no matter how momentous, will start the long-term decline of a brand. Typically it's a series of events or misjudgements - resulting in deterioration of product quality as seen through the eyes of the user - that starts a decline period.
One thing is almost certain: the majority of experts won't believe the decline has begun when it already has.
Common elements seen at the beginning of the decline in a brand's strength can include:
--Heavy users becoming frustrated or dissatisfied (usually less than 20% of total users).
--Early adopters and experts/authorities becoming dissatisfied.
--Increasing numbers of less than positive press blurbs.
--Major blunders by the company itself that contribute to perceptions of problems (e.g., Michael Jackson dangling a baby off a balcony... and please don't anyone tell me he's not a brand).
Those who think that WW members in force can dictate the general direction of any company are of course incorrect, but then again, I don't think I heard anyone in here say that.
Those who don't fully appreciate that *when a chorus of senior WW members unite in noting their frustration* it can have a profound effect that extends far beyond the walls of WW, probably don't have much experience with brands.
I'm not saying G is dead. Not by a long shot. I'm just talking history right now. Time will tell as far as G is concerned. Certainly the game is far from over.
| 5:11 pm on Dec 9, 2003 (gmt 0)|
>> claus, are you saying that the nameless search engine is not susceptible to errors in the coding?
Nope. And there's a lot of factors, as you mentioned. As for real 404's, these should really not be found in SERPS or index.
| 6:30 pm on Dec 9, 2003 (gmt 0)|
With regard to the topic of this thread, "Reporting bad search results": yesterday I reported the number 1 site in our main search results, as this site has been showing just one page that says "this site is temporarily unavailable" for 6 months and yet continues to rank #1, both pre-Florida and post-Florida. Even the Google description shows the "not available" phrase.
If content is king, then how can this happen? It looks far more like link text is king.
Any words of wisdom from the experts?
| 2:38 am on Dec 10, 2003 (gmt 0)|
|If content is king then how can this happen |
The reason "content is king" is that it means more keywords and phrases in meta tags and in the text on pages. The more pages, the more keywords and phrases, and the more real estate you have in Google's 3.5 billion pages. It also spreads the risk a lot: one page may drop while another rises.
Why the page is #1 is impossible to say without seeing the URL.
| 11:18 am on Dec 10, 2003 (gmt 0)|
This query: [google.com...]
...reveals 20,300 such pages. If you look at a few of them, you will see that they do not return any of these server headers:
404 Not Found
301 Moved Permanently
302 Moved Temporarily
...i.e. they are real HTML pages. As they also have incoming links (some even from dmoz), and even content on them, they do technically qualify for being in the index. A bot will not "understand" the text of such a page, it will only read it, and because of this such a page will be ranked according to the same ranking criteria as any other page. These would, imho, be good examples of pages that i would use that form for, as a bot would never be able to tell the difference but a human would.
| 11:25 am on Dec 10, 2003 (gmt 0)|
BBC News UK: just a few days ago a woman cried in an interview about the new Google results. For the UK webmasters, just remember last week's coconut oil case.
| 2:27 pm on Dec 10, 2003 (gmt 0)|
|Just a few days ago a woman cried in an interview about the new Google results |
If -in propagates, I won't need an interview to start crying.
| 4:38 pm on Dec 10, 2003 (gmt 0)|
>> a woman cried
Sorry for being cynical, but this is ridiculous; first mom&pop's and now a crying woman... what's next - poor homeless kids?
Hey, everyone... the above line as well as the post that caused it is a very good example of being Off Topic. I know it's sometimes hard, but please do try to stay on topic, which is:
Reporting bad search results
...that little link down on the bottom of the search page “Dissatisfied with your search results? Help us improve.”
Sincere apologies to everyone who might have been offended, it's just easier to see how discussions can easily drift off topic when trolling-like posts are given as examples. There are a lot of trolls [google.com] out right now, please don't feed them.