
Update Brandy Part 3

GoogleGuy

7:41 pm on Feb 15, 2004 (gmt 0)

WebmasterWorld Senior Member googleguy is a WebmasterWorld Top Contributor of All Time 10+ Year Member



Continued From: [webmasterworld.com...]

"Any clue as to the possible role greater reliance on semantics is playing in your never ending quest for more relevant results?"

I'd say that's inevitable over time. The goal of a good search engine should be both to understand what a document is really about, and to understand (from a very short query) what a user really wants. And then match those things as well as possible. :) Better semantic understanding helps with both those prerequisites and makes the matching easier.

So a good example is stemming. Stemming is basically SEO-neutral, because spammers can create doorway pages with word variants almost as easily as they can to optimize for a single phrase (maybe it's a bit harder to fake realistic doorways now, come to think of it). But webmasters who never think about search engines don't bother to include word variants--they just write whatever natural text they would normally write. Stemming allows us to pull in more good documents that are near-matches. The example I like is [cert advisory]. We can give more weight to www.cert.org/advisories/ because the page has both "advisory" and "advisories" on the page, and "advisories" in the url. Standard stemming isn't necessarily a win for quality, so we took a while and found a way to do it better.
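
(A toy illustration of the idea in Python - the stemmer below is a deliberately naive suffix-stripper with nothing to do with Google's actual implementation; it just shows how a document containing both "advisory" and "advisories" can pick up extra weight, as in the [cert advisory] example.)

import re

def naive_stem(word):
    """Deliberately crude stemmer: maps 'advisories' and 'advisory'
    to the same stem. Real systems are far more careful."""
    if word.endswith("ies"):
        return word[:-3] + "y"   # advisories -> advisory
    if word.endswith("es"):
        return word[:-2]
    if word.endswith("s"):
        return word[:-1]
    return word

def variant_score(query, document):
    """Score one point per query stem found in the document, plus a
    small bonus for each extra surface variant of that stem."""
    query_stems = {naive_stem(t) for t in query.lower().split()}
    matches = {}
    for token in re.findall(r"[a-z]+", document.lower()):
        stem = naive_stem(token)
        if stem in query_stems:
            matches.setdefault(stem, set()).add(token)
    return sum(1 + 0.5 * (len(forms) - 1) for forms in matches.values())

doc = "CERT advisories: read the latest advisory at cert.org"
print(variant_score("cert advisory", doc))  # 2.5: both variants counted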

So yes, I think semantics and document/query understanding will be more important in the future. pavlin, I hope that partly answers the second of the two questions that you posted way up near the start of this thread. If not, please ask it again in case I didn't understand it correctly the first time. :)

drewls

10:46 pm on Feb 16, 2004 (gmt 0)

10+ Year Member




Well, obviously if some places are seeing 64 on www then it hasn't moved zero, it has moved some, just not in your area...

Sorry, but that's not how it works at all. They've taken the datacenter dns away to lend credence to arguments like the one you just made, which any senior member of this forum will tell you hold no water.

digitsix

10:50 pm on Feb 16, 2004 (gmt 0)

10+ Year Member



^--- interesting, however I'm not totally convinced. If GG says that it's going to be rolled out, I'll take his word for it... for now anyway. Also, I have seen 64 results at my house for two days straight now. No flux at all, which conflicts with the statement about load balancing and getting redirected to random datacenters... either way, this is a never-ending discussion, so let's just wait and see what happens ;)

kevinpate

10:53 pm on Feb 16, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



drewls,

Scroll back to msg. 147 in this thread (quoting GG from Brandy2), where the forecast was changed and the advice was to anticipate a rollout over multiple days.

Comparing the 64 index to pre-Florida

For my fav kw1 kw2 combo, 64.x.x is most definitely 'not' pre-Florida SERPs, but that's fine. Looking around with a user hat on, I'm liking 64 just fine so far, so in the spirit of the month, Laissez les bons temps rouler.

drewls

10:53 pm on Feb 16, 2004 (gmt 0)

10+ Year Member



True, but like I said, I took his word the last time and it didn't go like he said it would. Now this time is looking similar to last time.

The datacenter dns being removed, in a way, removes a lot of accountability for them. They can now say one thing and do another with most people being none the wiser.

skippy

10:53 pm on Feb 16, 2004 (gmt 0)

10+ Year Member



#1 for a two-word keyword from 64.****...

"This site is currently under reconstruction. Please do
not be surprised to read out-dated information".

Tee hee, the page is offline.

Robert123

11:00 pm on Feb 16, 2004 (gmt 0)

10+ Year Member



Please clarify if I am wrong, but this is the first time I have ever seen GoogleGuy say that the results of the new update could be viewed at a specific data center.

He said later it will take longer to roll out.

Everyone needs to go easy on their refresh buttons.

Dumb_guy

11:04 pm on Feb 16, 2004 (gmt 0)

10+ Year Member



Anyone have any take on why 216.xxx is fluctuating in results? I am not noticing any fluctuation in 64.xxx; it seems to have been the same for better than 48 hours now. But 216.xxx is kicking back different results about every 6-12 hours.

Anyone?

Marcia

11:05 pm on Feb 16, 2004 (gmt 0)

WebmasterWorld Senior Member marcia is a WebmasterWorld Top Contributor of All Time 10+ Year Member



Hold on, stop the music!

Let's take off our soft shoes, folks; we're not doing the data center shuffle any more. We've been told it'll be 64. If and when we're told it'll be something other than 64, that's one thing, but as of right now things are shifting, and we'll not be dancing back and forth from one datacenter to the other.

We have 64 to look at. When that's been on www all over for maybe 12 hours steady, it'll be a wrap, but until then we're not doing data centers.

Let's back off and give some air time to the people here who want to look at 64 and have some serious analysis discussion.

This brings up the old issue of "site" vs "page". Historically Google has ignored the concept of site and used PageRank.

Is Google now embracing the concept of "site" directly as opposed to indirectly via linking structures & PR? Do we have any solid evidence either way?

Not sure I'd call it solid, but it seems a lot of shopping searches have been populated by pages from sites like Amazon, and I'd sure call those shopping "sites."
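
(For readers who want the "indirectly via linking structures & PR" side of that question made concrete, here is a toy PageRank power iteration in Python - a textbook sketch over an invented three-page graph with the classic 0.85 damping factor from the original paper, not anything specific to Google's production system.)

def pagerank(links, d=0.85, iters=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        new = {p: (1 - d) / n for p in pages}
        for p, outlinks in links.items():
            if not outlinks:                      # dangling page: share evenly
                for q in pages:
                    new[q] += d * rank[p] / n
            else:
                for q in outlinks:
                    new[q] += d * rank[p] / len(outlinks)
        rank = new
    return rank

graph = {
    "home": ["about", "product"],
    "about": ["home"],
    "product": ["home", "about"],
}
print(pagerank(graph))  # "home" accumulates the most rank via its inlinks

Note that rank flows page to page along links; no notion of "site" appears anywhere in the computation, which is exactly the historical behavior being contrasted here.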

quotations

11:15 pm on Feb 16, 2004 (gmt 0)

10+ Year Member



>So let's have some thoughtful discussion on
>what it is we're seeing - not numbers, but
>some serious evaluation, which is what a lot
>of our people are hopefully here for.

In the SERPs I watch, I am seeing quite a reduction in this update in the number of sites that have two pages on the front page.

The authority site would typically have held SERP positions one and two, but now has #1 and #11, or #1 and #14.

I see a lot fewer of what I consider doorway pages, but still see too many pages generated by running a search at a random competing search engine and publishing the results.

In the major technical area I monitor, things have improved greatly. The 89 pages which were missing from the top 100 during Florida have not all returned, but about half of them are now showing up, and several which truly are the authority sites have moved from about #50 pre-Florida to #25 post-Austin and now appear to be up around #12-15 in Brandy.

This still does not return all of the important technical information which an engineer or scientist will be looking for, but it does give them pointers to a different way of finding that information.

Perhaps there has been a lot of work done in the area of the algo which deals with route optimization?

Bobby

11:24 pm on Feb 16, 2004 (gmt 0)

10+ Year Member



The whole idea of LSI and applied semantics is in determining the meaning of a "document" (a term we've heard before in this very thread) which suggests an entire site.

A couple of years ago there was a lot of talk about search engines moving to "themes" in order to deal with the load of doorway pages and spam that was arriving back then (AltaVista was nearly buried in it). Now it looks like Google has taken the idea up again and is using applied semantics to better understand the "theme" of a site.
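
(Textbook LSI boils down to a truncated SVD of a term-document matrix; the Python sketch below, with made-up documents and k=2 latent dimensions, only illustrates that general technique - whether and how Google applies anything like it is exactly what's being speculated about here.)

import numpy as np

docs = [
    "widget sale widget price",
    "widget discount price",
    "quantum physics lecture",
    "physics lecture notes",
]

# Term-document count matrix (rows = terms, columns = documents).
vocab = sorted({w for d in docs for w in d.split()})
A = np.array([[d.split().count(w) for d in docs] for w in vocab], dtype=float)

# Truncated SVD: keep k latent "theme" dimensions.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 2
doc_vecs = (np.diag(s[:k]) @ Vt[:k, :]).T  # one row per document, in theme space

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Documents on the same theme land close together even with little
# exact-word overlap:
print(cosine(doc_vecs[0], doc_vecs[1]))  # high: both shopping/widget docs
print(cosine(doc_vecs[0], doc_vecs[2]))  # near zero: different theme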

Marcia

11:45 pm on Feb 16, 2004 (gmt 0)

WebmasterWorld Senior Member marcia is a WebmasterWorld Top Contributor of All Time 10+ Year Member



>>Perhaps there has been a lot of work done in the area of the algo which deals with route optimization?

quotations, what are you defining as "route optimization?" Does it have anything to do with local search?

quotations

11:47 pm on Feb 16, 2004 (gmt 0)

10+ Year Member




Route optimization is the mechanism by which pages and documents are ranked relative to their ability to provide the most efficient overall path to the definition of a complete body of knowledge.

For example, the "Systems Engineering Body of Knowledge" document (SEBOK) of the International Council on Systems Engineering (INCOSE) would rank very high on that scale due to the "perceived" fact that it contains pointers directly to the entire SE Body of Knowledge.

By providing the link to the SEBOK, all other important documents/web pages related to Systems Engineering could be found by the shortest possible route. A link to the Institute of Physics (IOP) page about Distributed Systems Engineering, on the other hand, would be expected to provide some useful information, but much of that would already be reflected in the SEBOK and the IOP page would therefore have a lower route optimization score.

The ability to recognize the optimal size and contents of a body of knowledge and to optimize the route or paths which must be followed to acquire the entire contents of that body of knowledge is a rudimentary exercise in modeling but not a trivial undertaking with a data set the size of the entire Internet.
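
(Nobody outside Google can say whether anything like this exists, but the idea is easy to sketch as a graph measure in Python. The page and topic names below are invented stand-ins for the SEBOK/IOP example; the score simply rewards pages that can reach the whole "body of knowledge" in few link hops.)

from collections import deque

def route_score(links, start, knowledge):
    """Inverse of the total link-distance (BFS hops) from `start` to every
    page in `knowledge`; any unreachable page drops the score to zero."""
    dist = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for nxt in links.get(page, []):
            if nxt not in dist:
                dist[nxt] = dist[page] + 1
                queue.append(nxt)
    if any(p not in dist for p in knowledge):
        return 0.0
    return 1.0 / (1 + sum(dist[p] for p in knowledge))

links = {
    "sebok": ["topic_a", "topic_b", "topic_c"],   # links out to the whole body
    "iop_page": ["topic_a"],                      # covers only a fragment
}
body = ["topic_a", "topic_b", "topic_c"]
print(route_score(links, "sebok", body))     # 0.25: short routes to everything
print(route_score(links, "iop_page", body))  # 0.0: most of the body unreachable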

More Traffic Please

11:49 pm on Feb 16, 2004 (gmt 0)

10+ Year Member



It is almost like a handful of the old top ten sites (which would rank in the top if there was no filter) were mixed into the Austin filtered results.

I completely agree. I've done a number of "city" real estate type searches across the US, and it seems like there is a trend of higher-PR real estate sites with pretty good on-page optimization leading the pack, followed closely by the Florida/Austin-type directories. It appears makemetop may have a good point that the threshold for being considered an authority site has been lowered.

The devoted real estate sites are far better optimized for the "city" real estate query. Now, assuming the authority threshold has been reduced, they can compete.

valeyard

12:06 am on Feb 17, 2004 (gmt 0)

10+ Year Member



Route optimization is the mechanism by which pages and documents are ranked relative to their ability to provide the most efficient overall path to the definition of a complete body of knowledge.

But does the user want the "complete body of knowledge"?

Say I'm thinking of buying the Widgetco Widgetmaster 3000. Which of the following would be most relevant:

- A Widget superstore with hundreds of pages about all sorts of Widgetco products - except the Widgetmaster 3000, which they don't stock.

- A Mom & Pop outfit selling all sorts of Widgets, Gizmos and Ubiqs, but which has ten years' experience installing the Widgetmaster 3000.

- A blog that - amongst the soporific rubbish - has a detailed description of the user's bad experiences with the Widgetmaster 3000, explaining why they'd never buy one again and suggesting several alternatives.

If Google wants to drop all deep links and only return index pages, then site analysis makes sense. Otherwise it simply hides potentially useful stuff.

IITian

12:07 am on Feb 17, 2004 (gmt 0)

10+ Year Member



I have a feeling that GG provided the data center explicitly so that our feedback can be used to make the webmasters happy. They are going to take their time to make sure they have got it right this time, because a lot might be riding on it (the IPO, that is), and they surely don't want an Austin before that, because it would spoil their dreams of early retirement on some isolated privately-owned islands. ;)

vplaza

12:07 am on Feb 17, 2004 (gmt 0)

10+ Year Member



I am seeing "64" on Yahoo.

farberama

12:40 am on Feb 17, 2004 (gmt 0)

10+ Year Member



Would anyone have an idea why the SERPs from Google Viewer would be different from 64, 216, or www? My site doesn't show up at all for my keywords in 216 or www, shows up #54 in 64, and #29 in Google Viewer results.

<<pulling out what's left of my hair!>>

Chicago

12:48 am on Feb 17, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Scroll down to the bottom of your results page and see if the results say "provided by Google". If not, this is not 64; this is the new Y! index.

We are not seeing 64 on Yahoo here. We are seeing Yink.

Kirby

12:51 am on Feb 17, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



allan, what I'm seeing goes along with what steveb wrote. A mix of city sites and relevant directories/authorities and newspaper sites that are specific to the industry and city. Some cities are less impacted by directory/authority types, and I can't explain the discrepancy.

I still think this is the evolution of Florida and Austin, but I don't think stemming/semantics plays that much of an effective role. I see many results where pages, including my own, are ranking well based on <title> and anchor text, not content. I'll sticky you an example.

quotations

12:52 am on Feb 17, 2004 (gmt 0)

10+ Year Member



>But does the user want the "complete body
>of knowledge"?

They may or may not. It is hard to tell from a single simple search (another hint), but route optimization, like PR, anchor text, proximity, stemming, LocalRank, and about 100 other things, plays only a small role overall in the algo, yet it can have a huge effect.

Under this theory, whoever is most efficient in providing direct or nearly direct access to the largest volume of the most important and most relevant information, without negatively impacting the other factors, should and does get a bump in rank.
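
(A purely illustrative Python sketch of that "many small signals" framing - the signal names and weights here are invented for the example, not leaked Google numbers. It shows how a low-weight signal can still flip the ordering of two otherwise similar pages.)

def combined_score(signals, weights):
    """Weighted sum of per-page ranking signals."""
    return sum(weights[name] * value for name, value in signals.items())

weights = {"pr": 0.30, "anchor_text": 0.25, "proximity": 0.15,
           "stemming": 0.10, "localrank": 0.10, "route_opt": 0.10}

page_a = {"pr": 0.8, "anchor_text": 0.7, "proximity": 0.6,
          "stemming": 0.5, "localrank": 0.4, "route_opt": 0.9}
page_b = dict(page_a, route_opt=0.1)  # identical except route optimization

# A small-weight signal still decides the ordering between two
# otherwise similar pages:
print(combined_score(page_a, weights))  # 0.685
print(combined_score(page_b, weights))  # 0.605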

quotations

1:03 am on Feb 17, 2004 (gmt 0)

10+ Year Member



These Inktomi results on Yahoo are almost embarrassing.

On Google, we have #4, 5, 9, 10, 17;
on Yahoo, we have #1, 2, 3, 4, 5, 6, 8, 10, 11, 13, 15, 16, 17, 18, 20, 21, 22, 23, 24 ...

Of course, both are excellent results.

;-)

vplaza

1:04 am on Feb 17, 2004 (gmt 0)

10+ Year Member



Yes, I apologize; the results appear to be INK, and not "64" Google.
Interesting SERPs, though!

Leosghost

1:11 am on Feb 17, 2004 (gmt 0)

WebmasterWorld Senior Member leosghost is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



Can someone say which "Yahoo" we're supposed to be looking at?
I just checked for my KW #1 + KW #2 and I'm still #1,
but at the bottom of the page it says "Google".
Pass the generic tranquiliser, close my eyes, and hope it stays that way, or what? =:o)

< sorry I was typing when you said that >

James_Dale

1:57 am on Feb 17, 2004 (gmt 0)

10+ Year Member




Route optimization is the mechanism by which pages and documents are ranked relative to their ability to provide the most efficient overall path to the definition of a complete body of knowledge. [...]

Hm, yes, but to add to/clarify some of these points:

Route, optimization, mechanisms, cover, (by default) documentative abilities when ranked according to their relationship; (num root) with the efficient paths. This is the means by which PDF documents and their overall internet presence (sic) are established via semi-autocratic knowledge mechanisms. A prime example of this is that the systems engineering (INCOSE) rankings on the forefront, partially-peaked and indexed according to the traditional dampening factor, as yet perceived, focuses on the 64.x index, which is high enough on the paradigm scale for containment of relevancy pointers.

;)

[edited by: James_Dale at 2:07 am (utc) on Feb. 17, 2004]

steveb

2:05 am on Feb 17, 2004 (gmt 0)

WebmasterWorld Senior Member steveb is a WebmasterWorld Top Contributor of All Time 10+ Year Member



farberama, the Google Viewer normally shows the same junk results as an allinanchor: search.

mbauser2

4:28 am on Feb 17, 2004 (gmt 0)

10+ Year Member



The whole idea of LSI and applied semantics is in determining the meaning of a "document" (a term we've heard before in this very thread) which suggests an entire site.

I fail to see any logic whatsoever in that assertion. You're just spreading misinformation.

Google has always used the word "document" to refer to an independent file. If you don't believe me, search Google for "The Anatomy of a Large-Scale Hypertextual Web Search Engine".

metrostang

4:34 am on Feb 17, 2004 (gmt 0)

10+ Year Member



I was really starting to enjoy this thread again until we started comparing datacenter results. I think we should take the moderator's advice and wait until we see things settle down.

I can find 7 different IP addresses coming out of 216 with three different search result possibilities, one of which was mentioned above. Three of those are identical to those from 64. I think this just means it's taking some time.

I don't think we will know it's over until all IP addresses from all datacenters show the same results. Until then, let's go on the assumption that GoogleGuy was being straight with us and discuss the effects of the update.

Marcia

4:50 am on Feb 17, 2004 (gmt 0)

WebmasterWorld Senior Member marcia is a WebmasterWorld Top Contributor of All Time 10+ Year Member



I was really starting to enjoy this thread again until we started comparing datacenter results.

I think we should take the moderator's advice and wait until we see things settle down.

Exactly, and thank you. We'll not be comparing datacenter results.

All this '216's on www from the UK' and 'it's 64 from here' is meaningless white noise. That is just the normal cycling of the datacenters.

Exactly, and we'll not be doing any more reporting of data center results in this discussion, which is about the update.

Marcia

5:00 am on Feb 17, 2004 (gmt 0)

WebmasterWorld Senior Member marcia is a WebmasterWorld Top Contributor of All Time 10+ Year Member



The whole idea of LSI and applied semantics is in determining the meaning of a "document" (a term we've heard before in this very thread) which suggests an entire site.

One portion of the LSI paper that relates nicely to this concept is IDF - Inverse Document Frequency. While "document" refers to a single document, consider that some folks believe increasing the breadth of a site, as well as the vocabulary used, can help with rankings.

Is there a possibility that the frequency factor could be taken into account across an entire site as well as on individual pages within the site?
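
(For reference, the standard formula from the IR literature is idf(t) = log(N / df(t)), where N is the number of documents and df(t) is how many of them contain term t. The Python sketch below computes it at page level within one made-up site and then, speculatively, at site level with each site flattened into one "document" - the site-level variant is pure conjecture, not a documented Google signal.)

import math

def idf(term, documents):
    """idf(t) = log(N / df(t)); returns 0 if the term appears nowhere."""
    n = len(documents)
    df = sum(1 for d in documents if term in d.split())
    return math.log(n / df) if df else 0.0

pages_site_a = [
    "widget prices and widget reviews",
    "widget repair guide",
    "widget accessories catalog",
]

# Page-level IDF within one site: "widget" is on every page, so IDF = 0.
print(idf("widget", pages_site_a))

# Hypothetical site-level IDF: flatten each site into one "document".
sites = [
    " ".join(pages_site_a),
    "quantum physics lectures",
    "cooking recipes and menus",
]
print(idf("widget", sites))  # log(3/1): rare across sites, so high IDF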

Chicago

5:03 am on Feb 17, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



To all: Do you see specific examples of some kw phrases doing well on some pages and some kw phrases *significantly* downgraded on other pages, when both pages are contained within the same site with the same level of optimization, on page and off?

If so, are the phrases that are downgraded more important from a search volume standpoint?

Yes, yes across many sites for me.
