"Any clue as to the possible role greater reliance on semantics is playing in your never ending quest for more relevant results?"
I'd say that's inevitable over time. The goal of a good search engine should be both to understand what a document is really about, and to understand (from a very short query) what a user really wants. And then match those things as well as possible. :) Better semantic understanding helps with both those prerequisites and makes the matching easier.
So a good example is stemming. Stemming is basically SEO-neutral, because spammers can create doorway pages with word variants almost as easily as they can optimize for a single phrase (maybe it's a bit harder to fake realistic doorways now, come to think of it). But webmasters who never think about search engines don't bother to include word variants--they just write whatever natural text they would normally write. Stemming allows us to pull in more good documents that are near-matches. The example I like is [cert advisory]. We can give more weight to www.cert.org/advisories/ because the page has both "advisory" and "advisories" on the page, and "advisories" in the URL. Standard stemming isn't necessarily a win for quality, so we took a while and found a way to do it better.
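For anyone who wants a concrete feel for what stemming-based matching means, here is a minimal Python sketch. This is purely illustrative: the suffix-stripping rules and the scoring are invented for this example and bear no relation to Google's actual algorithms.

```python
# Toy illustration of stemming-based matching (NOT Google's algorithm):
# reduce word variants to a shared stem so "advisory" and "advisories"
# count as the same term when scoring a document against a query.

def crude_stem(word):
    """Very crude suffix stripper, for illustration only."""
    word = word.lower()
    for suffix in ("ies", "es", "s"):
        if word.endswith(suffix) and len(word) > len(suffix) + 2:
            # "advisories" -> "advisory"; "admins" -> "admin"
            return word[: -len(suffix)] + ("y" if suffix == "ies" else "")
    return word

def stemmed_match_count(query, document):
    """Count query terms whose stem appears among the document's word stems."""
    doc_stems = {crude_stem(w) for w in document.split()}
    return sum(crude_stem(q) in doc_stems for q in query.split())

page = "CERT advisories provide security advisory information"
# Both query terms match: "advisory" stems to itself, and the page's
# "advisories" stems to "advisory" as well.
print(stemmed_match_count("cert advisory", page))  # -> 2
```

With exact-match scoring the query [cert advisory] would only credit the page for "advisory"; with stemming, "advisories" counts too, which is the extra weight GoogleGuy describes for www.cert.org/advisories/.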
So yes, I think semantics and document/query understanding will be more important in the future. pavlin, I hope that partly answers the second of the two questions that you posted way up near the start of this thread. If not, please ask it again in case I didn't understand it correctly the first time. :)
Google needs to step back and look at what they were producing and what they are now producing.
With MSN and Yahoo coming soon, they'd better concentrate on what is important: retaining searchers, not driving them away with mediocre results.
[edited by: Marcia at 5:43 pm (utc) on Feb. 16, 2004]
just a thought ..
Under the 64** and GG's way of thinking, "yahoo" would be considered by "google" to better deserve the #1 slot in a search return on "google" for the term "search engine". It should even get #1 for the search term "google" (one's got loads of information-rich text on its start page, plus links back to the search term and the URL of "google"; the other is basically just a logo). ;)
I wouldn't feel so bad if I knew that while shooting me down you shot yourselves in the foot, guys ;))
I run a number of small sites (15 to 30 pages), and pre-Austin the interior pages of these sites always did well. We rode the Florida update reasonably well, but Austin was hard for certain industries.
With Brandy it looks like we're back in for our main search phrases site-wide, but only the index page is being presented when an interior page would be more relevant. Not as good for the user, I would have thought?
GG, any idea when those 64.xxx results will roll-out to the UK, or are they not going to now?
Please - no brown-nosing about making reports, and no specific search terms or URLs.
Like it says on the front page, let's stay with this:
"...discussion for the independent web professional."
C'mon guys, we've got work to do! :)
If anyone wants to whine about spam instead, you know exactly where the spam report is.
Could we declare a moratorium on whining about drops in search rankings, too? :-)
For example, keyword1 was searched, but Google highlights keyword1 plus another keyword that is either the plural or singular version of it, or a word related to the original search.
On 64.**** you can see them highlight many new words that weren't searched for.
If you search for a two-word phrase where the second word has two possible suffixes, and version-1 is what's on the page while version-2 is what you searched for, the page will come up with the version-1 phrase (what's on the page) highlighted just as though it were version-2.
It's much easier to see on less competitive terms.
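What the posters are describing could be sketched roughly like this: a snippet word gets highlighted if its stem matches the stem of any query term. This is a toy illustration, not Google's highlighter; `crude_stem` and `highlight` are invented names for this example.

```python
import re

def crude_stem(word):
    """Very crude suffix stripper, for illustration only."""
    word = word.lower()
    for suffix in ("ies", "es", "s"):
        if word.endswith(suffix) and len(word) > len(suffix) + 2:
            return word[: -len(suffix)] + ("y" if suffix == "ies" else "")
    return word

def highlight(query, snippet):
    """Bold any snippet word whose stem matches a query-term stem."""
    query_stems = {crude_stem(q) for q in query.split()}
    def mark(m):
        w = m.group(0)
        return f"<b>{w}</b>" if crude_stem(w) in query_stems else w
    return re.sub(r"[A-Za-z]+", mark, snippet)

# The query contains "advisory", but the snippet's "advisories" is
# highlighted anyway, because both reduce to the same stem.
print(highlight("cert advisory", "CERT advisories for admins"))
# -> <b>CERT</b> <b>advisories</b> for admins
```

That would explain why, on the 64.**** results, words that were never typed into the search box show up highlighted: they share a stem with a query term.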
216.**** still showing in southern California. Have noticed slight shuffling in the results, but nothing of any consequence.
Dates no longer appear in front of the cached link, but our page is date stamped, and the Google cached dates for our site appear as follows:
64.xxx - Sat, 14 Feb 2004 0:47:10 CST
216.xxx - Sat, 14 Feb 2004 0:47:10 CST
Live California Google - Sat, 14 Feb 2004 0:47:10 CST
Hope this has not offended anyone - GG, moderators, etc.
& now, a Reader's Digest version:
Patience, Grasshopper, patience.
In the past, there was a lot of worry about transferring domains, because a day or two of being off the net could result in your site being gone from google.com for a month or more.
We have a site (main site, highest traffic site, authority hub site) which was down hard for five full days at the beginning of the month. Fresh dates went away but the site was not dropped. Rank in the SERPs of the main page appears to be unaffected but a few of the sub pages have dropped a couple of places.
Is this the kinder, gentler, googlebot?
Has anyone else seen this happening?
... or ... is this just an "update is not done" issue and there is more to come?
It looks to me like the freshness of the index is a major consideration but that Google will refrain from dropping the main "authority" site in an industry, even if it is missing in action for an extended period.