| 12:16 pm on Jun 8, 2003 (gmt 0)|
So, will they ultimately read the same? If not, why are they different?
| 12:22 pm on Jun 8, 2003 (gmt 0)|
Usually after an update the data centers are (almost) the same, but since Dominic it hasn't worked that way. It could be different data or a different algo. Remember that GoogleGuy wrote that after the initial data was spread from sj and fi (some 2 weeks ago), spam filters would be brought in, along with new backlinks and new data. The second phase would take anywhere from weeks to months.
| 12:38 pm on Jun 8, 2003 (gmt 0)|
>So, will they ultimately read the same? If not, why are they different?
LOL benflux, that is a very good question!
Google is the best search engine on the planet. It achieves this goal by dishing up one of many different sets of results every time you use it. All depends upon which datacenter you happen to get attached to.
No one can ever argue Google results aren't correct, because if you try the same search again 10 seconds later on a different PC the chances are they will be different.
Most companies prepare the data to be published in the background, and then let it become live and public. Google prefers to display the data while it is still in the cooking pot and leave the question of whether it is ready to eat up to the person looking at it.
You can chase your tail all day with this conundrum...all you end up with is a dizzy feeling :)
| 12:39 pm on Jun 8, 2003 (gmt 0)|
GG also made a comment that he wouldn't be surprised if Google did one more traditional update. Much of what GG has written has been cryptic. My take is that Google is in transition to some new system, but at the moment we are in some sort of in-between state, which explains the somewhat botched results.
| 9:31 am on Jun 8, 2003 (gmt 0)|
One of our domains was first listed on Google in November 2002. Since then we've worked hard to get it from 50th place to 7th place on our most important single word search. It remained in 7th place during February, March, and April.
During May, we couldn't find it on the first 15 pages of Google using the same search.
Today, we are in 3rd place using the same search, and so Google must be on the move again. We've never been as high as 3rd, even during the dance, and only have 2 recorded back links with Google, the same as before.
Hypothesis 1 - Google is starting to repair the problems of the middle of May. He's got to be.
Hypothesis 2 - Google is starting to review the weight it assigns to backlinks in its general algorithm, since others that were above us before had many more backlinks. (Very controversial, and outright wrong if after this escapade we are back in 7th... but I like posing the hypothesis anyway.)
| 7:49 am on Jun 8, 2003 (gmt 0)|
...this just came to my mind... Google made it clear that they don't want to see any PageRank deals/dealers out there, and they also don't want to see SEOs optimizing their sites based on all kinds of Google algo cheats/tweaks/tricks rather than on sites' usability and content.
Google tries to take action against them, but at Google's scale they can't afford case-by-case methods, especially when they have to defend themselves in court from time to time for it. It also seems pretty difficult to create automated filters to penalize offenders, because Google can only analyze text/code, not minds.
so... my guess is that Google found a way! Tired of PageRank being commercialized, they decided to ruin the business models of PR dealers, and they found the best way to avoid lawsuits..... a MESSY and LOOOOOONG UPDATE! Imagine how many SK-like “traders” have to look for a new job now...
Does it make sense?
| 7:54 am on Jun 8, 2003 (gmt 0)|
Does it make sense?
To the conspiracy theorist - yes
To the cock-up theorist - no
| 12:53 pm on Jun 8, 2003 (gmt 0)|
I agree with Webgurilla: freshbot is working fine gathering new content on our "established" sites with good PR (6 and above). But our new site is hurting badly. It was doing really well right before the SJ-Dominic deal settled, but now it is suffering. Just patiently waiting for things to settle in, and telling our clients who keep calling "HEY, WHAT'S UP WITH MY RANKINGS?" that everyone needs to be patient.
I do want to give Google KUDOS for the spam filters that are slowly getting rid of folks using hidden text. Also, GG (and/or someone at Google) read a spam report I submitted and caught the guy who had slipped through, within 45 minutes. Keep sending the spam reports to help them out through the transition!
I think all of us "fair" SEO folks will be very happy in a month or two when the "bad guys" are banned and our quality/relevant sites are ranking very well!
Best to all!
| 1:44 pm on Jun 8, 2003 (gmt 0)|
<<Google is the best search engine on the planet. It achieves this goal by dishing up one of many different sets of results every time you use it.>>
| 1:53 pm on Jun 8, 2003 (gmt 0)|
mfishy, glad you appreciated that ;)
"Where do you want to go today"....becomes...."where do you want to be positioned in the next 10 minutes"...It's all good....it must be, Google says it is!
| 1:55 pm on Jun 8, 2003 (gmt 0)|
About the Dominic fiasco:
On a competitive term, number 4 result for a site that has no meta description or meta keywords tag.
That is ridiculous and disgusting!
Even using 3-month-old deepcrawls, that does not meet acceptable standards for the world's premier engine...
I know, I know-- wait, wait, wait,...go broke, go broke, go broke
| 2:10 pm on Jun 8, 2003 (gmt 0)|
>On a competitive term, number 4 result for a site that has no meta description or meta keywords tag.
I was just looking for people who sell "widgets in Kansas". I eventually found one at position 37; the first 36 were complete garbage. The same search on MSN, Altavista, and ATW returned 10 to 15 relevant, on-topic results at the top.
Needless to say I ended up buying my widget from a site suggested by MSN.
Google may argue it has more pages indexed, something I personally doubt (I think ATW has more), but it can never argue its results are more relevant. For common search terms even Yahoo (which shares Google's data) is more relevant. The other SEs are definitely doing a better job.
Google, size is not everything, as my wife says how you use it is much more important....LMAO:)
| 2:16 pm on Jun 8, 2003 (gmt 0)|
And to think I was actually looking forward to that last update knowing that things were in place to break into the top 5 after taking 9 months to get into top 10.
If Google had any clue as to what this update was going to be--why did they even bother doing this one?
Not like they worry about how behind they are to begin with...all this Google publicity over the past year and they take center stage in the general public's mind and usage and give them this...
It feels like a pretty good sports game and then a tie result...
It feels like that awful kiss on the cheek from your fat aunt when you were a kid...
Oh well, guess we can wait, wait, wait while the spammers figure things out before the new filters are even applied.
Sorry to be so blunt today, but this is my living income and I did nothing to deserve dropping off the face of the search planet.
Can Google throw me a life preserver or something please.
| 2:17 pm on Jun 8, 2003 (gmt 0)|
>> On a competitive term, number 4 result for a site that has no meta description or meta keywords tag. <<
We know that Google does not use the keywords tag; and, so what if the site doesn't have a description tag? If you are going to penalise webmasters who don't have a description tag, then I hope they start penalising sites with code that is not well-formed, or who use marquee, blink, or embedded midi sound.
Is the site relevant to the search you asked Google to perform?
Does the site have useful and usable content?
If it has then what is the problem for the user?
[edited by: g1smd at 2:23 pm (utc) on June 8, 2003]
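For context, the tags under discussion look something like this (the site name and tag values here are purely illustrative):

```html
<head>
  <!-- The title is visible to users in the browser and in search listings -->
  <title>Example Widget Shop</title>
  <!-- The description tag may be shown as the snippet by some engines -->
  <meta name="description" content="Hand-made widgets, shipped worldwide.">
  <!-- The keywords tag is invisible to users and trivial to stuff,
       which is why Google ignores it for ranking -->
  <meta name="keywords" content="widgets, cheap widgets, mickey mouse">
</head>
```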
| 2:20 pm on Jun 8, 2003 (gmt 0)|
"Is the site relevant to the search you asked Google to perform?
Does the site have useful and usable content?"
The site can be argued relevant... however, it is so 'relevant' that it was at 160 last update.
It is fine with me if Google were to just cut out using meta tags and judge a page 100% on its contents...
WOW, what a concept...content driven results!
| 2:34 pm on Jun 8, 2003 (gmt 0)|
>WOW, what a concept...content driven results!
LMAO... Google persists with its "off page" algo; actual site content is becoming a thing of the past. I assume Google is playing this strategy to combat SEOs..... Google, wake up and smell the coffee..... you are playing into SEO hands!
Of course, whichever way Google goes, SEOs will grab it by the short and curlies. I guess if its popularity declined to 2% we might leave it alone ;) Google would be better off ignoring SEOs; after all, most of them only want convertible (i.e. relevant) traffic in the first place.
Google is trying to jump through hoops to combat the very people that are trying to help it produce more relevant results!
| 2:34 pm on Jun 8, 2003 (gmt 0)|
|...at the moment we are in some sort of in between state. Which explains the somewhat botched results. |
Explains, yes - justifies, no.
| 3:24 pm on Jun 8, 2003 (gmt 0)|
<<After reading your examples, I searched on "metropolitan museum of new york" and found the Metropolitan Museum of Art in position #1>>
Also comes up #1 on:
It really doesn't say much if Google can accurately return results for a very simple query.
I agree that on most searches users will not find a problem. I don't agree though, that long term, users will not see less relevant results creep into the index if new ranking data is not brought in soon.
| 3:41 pm on Jun 8, 2003 (gmt 0)|
Google is alright.
| 3:46 pm on Jun 8, 2003 (gmt 0)|
Following Altavista and Teoma making a large update today, alltheweb making an update about a week ago, and Inktomi making an update about a month or so ago, Google results (excluding freshbot) are starting to look more like WiseNut :)
Lol - I love you really Google
[edited by: Dayo_UK at 3:49 pm (utc) on June 8, 2003]
| 3:48 pm on Jun 8, 2003 (gmt 0)|
Any fool who's put all their eggs in one basket deserves to be wringing out their crying towel. There needs to be a forum for whiners and losers so they can cry on each other's shoulders. This thread reads more like a "Dear Abby" column than one discussing professional webmaster issues.
The most pathetic posts are from those who cry that Google is costing them their living. These people need to look into the mirror to see who's costing them their living. I'd hate like hell to be a family member depending upon these losers to provide for me.
| 3:50 pm on Jun 8, 2003 (gmt 0)|
If Google is moving towards continuous updates and measuring relevancy by the words on the page (compliments of Applied Semantics), it would make sense that they need to hold the "rated" results constant and let the freshbot pool fill up so that they can measure the relevancy of the "new algo". Ergo, if all that is true, then even Google doesn't have better than an approximation of when they can roll in "official" updates.
note the word if
| 3:53 pm on Jun 8, 2003 (gmt 0)|
"It really doesn't say much if Google can accurately return results for a very simple query."
Exactly. It's amazing what sort of arguments pass here as soon as people think they have to defend something which is clearly out of order at the moment.
Searching for a particular name is peanuts. There will be no other websites claiming to be the same entity.
But if, in a highly competitive keyword segment, 404 sites and even advertisement banners start showing in the top results, then one would be insane to claim nothing's wrong.
| 4:01 pm on Jun 8, 2003 (gmt 0)|
|For almost every search Europeforvisitors mentioned, you can have commercial sites that are relevant that can go into adwords, and that's where google makes their money right? The main index attracts them, adwords provides good commercial sites where they can buy in the query area. |
I'm seeing this with Google AdWords banners, which my ad network is running on my travel-planning "content site." Those text banners contain AdWords for escorted tours, rail passes, vacation rentals, etc. Google is willing to buy CPM text banners (rather than CPC text banners) because it knows that people who want information on my topic are likely to be interested in related products and services.
There's nothing new about the concept of targeted advertising in editorial media; it's why hotel chains, cruise lines, and tour packagers advertise in THE NEW YORK TIMES Travel Section, and it's why Dell, Gateway, and Microsoft advertise in PC MAGAZINE. A lot of Webmaster World members don't seem to understand that information sites can be a platform for successful targeted advertising and direct marketing--and that some buyers (not all buyers, but a significant number) prefer to buy from advertisers on a legitimate-looking editorial site instead of just typing a search term in Google and placing an order or booking with the first keyword-keyword2-keyword3.com affiliate site that comes up in the search results.
I don't believe for a minute that Google is intentionally "breaking" its index to encourage AdWords sales. Still, if Google's current unpredictability leads some owners of commercial sites to try AdWords instead of relying completely on free traffic, that may not be an altogether bad thing. Those site owners may find that AdWords are a useful source of traffic and revenue.
| 4:17 pm on Jun 8, 2003 (gmt 0)|
|<<After reading your examples, I searched on "metropolitan museum of new york" and found the Metropolitan Museum of Art in position #1>> |
It really doesn't say much if Google can accurately return results for a very simple query.
Okay, try "airlines," "art museums," "churches," "nudists," "apartments," "horses," "cars," or "peanuts." The only mildly questionable listing in the Google SERP's #1 position is the "Peanuts" comic strip instead of the legume (and it's questionable only if you prefer goober peas to comic strips).
I suspect that the results for the terms above are about the same as you would have found if you'd searched Google before Dominic made its debut. It's certainly possible that the current update has more questionable results than earlier updates did, but again: for most queries and most users, Google is working just fine.
| 4:48 pm on Jun 8, 2003 (gmt 0)|
Meta tags are useless because what's to stop someone from filling them with MICKEY MOUSE on an adult website? Then when a child hits the web and searches for his/her favorite character, they hit an adult site.
I agree with Google's VISIBLE CONTENT rating system 110%! Meta tags, other than Title (which is visible to users), are becoming (if not already) a thing of the past.
I love it when one of our clients calls us and asks "Why don't we have our meta tags set up for all of my products?" I give them the above statement and they then conclude that this is a true and SMART concept...
| 4:48 pm on Jun 8, 2003 (gmt 0)|
Isn't the idea of continual update just what Fast does already on a weekly basis? or is it going to be a daily update using an extended Freshbot idea?
| 5:05 pm on Jun 8, 2003 (gmt 0)|
>Meta tags are useless because what's to stop someone from filling them with MICKEY MOUSE on an Adult website? Then when a child hits the web and searches for his/her favorite character, they hit an adult site.
What is there to stop those adult sites from cramming the visible text with MICKEY MOUSE? Some of those models or their body parts could be named MICKEY MOUSE and referred to many times on the page.
| 5:19 pm on Jun 8, 2003 (gmt 0)|
< Meta tags are useless because whats to stop someone from filling them with MICKEY MOUSE on an Adult website? Then when a child hits the web and searches for his/her favorite character, they hit an adult site >
Children don't have credit cards. Owners of these websites will try to appeal to their target audience like we all do, by selecting relevant keywords, and I'm sure Mickey Mouse isn't one of them.
| 5:27 pm on Jun 8, 2003 (gmt 0)|
>> .... as people think they have to defend something which is clearly out of order at the moment <<
That's the feeling I sort of have, oraqref. Whenever someone points out the dangers of having an ever-ageing and out-of-date core, a handful jump up and down citing irrelevant examples and claiming Joe Public won't notice. Except lots of Joe Publics have noticed, and more will as time goes on.
Everyone knows the bottom line by now, even those in denial really know it. The quality is down, and will remain down until a new database appears. Sorry, but FACT.
Accept it and focus on the questions being raised, like the one in this thread: Will there ever be another monthly update?
It's broken. No it's perfect. It's broken. No, it's perfect.... will wreck every thread if it continues. The core IS out of date and IS affecting quality. Accept it and move on to discuss something more fruitful.
| 6:02 pm on Jun 8, 2003 (gmt 0)|
<<Whenever someone points out the dangers of having an ever ageing and out of date core, a handful jump up and down citing irrelevant examples and claiming Joe Public won't notice>>
This is quite true.
My question to all who feel that an older core of data does not negatively impact the SERPs: why would Google have updated their backlink/ranking data about every month if they did not think it was important in determining SERPs for users? It seems like a huge waste of resources if the users don't even notice.
It seems to me that it is unlikely they were happy about having to use an old index this time around, and that the transition period has been a bit longer than expected.
If ANYONE thinks fresh results are currently an adequate replacement for an update, I would love to hear your thoughts.