Google News Archive Forum

Google Observation
Ok, everyone else is talking Google but did we miss this
paynt




msg:70473
 5:35 am on May 20, 2003 (gmt 0)

Brett is going to bop me for these quotes but I can't help it…

I know, we've all said it dozens of times and there's nothing new here, but maybe we should take another listen because seriously I don't think most of our esteemed membership get this. You all want to make it some big deal, a conspiracy theory or a huge change in the algorithms or penalties, that's a popular theory, to get around the idea of what Google wants. What feeds Google? Isn't that what we all want to know? Isn't that the key to our success these days?

My advice, you want my advice on Google? Take a great big step back and look at the big picture.

worry less about PR on the toolbar and more about rankings. – GoogleGuy

And less about rankings for high-profile phrases and more about overall rankings. – GoogleGuy

And less about rankings and more about traffic. – GoogleGuy

And less about traffic and more about conversions. – GoogleGuy

Can I use that GoogleGuy, on my website?

Did everyone note this comment?

Sorry to once again draw attention to your comments GoogleGuy, I really try to cut you slack, but when something as profound as this comes through I can only hope that drawing a bit of attention to it will in turn help us all.

Two clients asked me, knowing in advance that I hadn't been following the update, what I felt about this update, if I'd peeked in and followed what was going on. For me it's not the updates that get my fire burning but I certainly understand why it burns in the core of many of our members. I think the core of what I found to work with Google is today as successful as it was when I first discovered it 3½ years ago. The problem is that what I discovered isn't very exciting.

I find success with Google when I diversify.

 

dcheney




msg:70503
 12:54 pm on May 20, 2003 (gmt 0)

My site, by its nature, has tons of keywords (specific names of typically famous-only-locally folks). I'm always amazed when looking at the keywords folks have used to get to my site. For example, out of 1500 SE referrals a day, they use around 1000 different keyphrases!

Perhaps this model is what the "new" google algo is trying to boost.
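
A quick way to see this kind of long-tail spread in your own logs is to count distinct search phrases in the referrer strings. Here is a minimal sketch in Python, assuming an Apache-style combined log and the classic q= query parameter in Google referrer URLs; the file name and field layout are assumptions, so adjust them for your own setup.

import re
from collections import Counter
from urllib.parse import urlparse, parse_qs

LOG_FILE = "access.log"  # hypothetical path; point this at your own log

# In a combined log the referrer is the second-to-last quoted field.
REFERRER_RE = re.compile(r'"([^"]*)" "[^"]*"$')

phrase_counts = Counter()

with open(LOG_FILE, encoding="utf-8", errors="replace") as log:
    for line in log:
        match = REFERRER_RE.search(line.strip())
        if not match:
            continue
        referrer = match.group(1)
        if "google." not in referrer:
            continue
        # Pull the search phrase out of the q= parameter, if present.
        params = parse_qs(urlparse(referrer).query)
        phrase = params.get("q", [""])[0].strip().lower()
        if phrase:
            phrase_counts[phrase] += 1

total = sum(phrase_counts.values())
print(f"{total} Google referrals, {len(phrase_counts)} distinct phrases")
for phrase, count in phrase_counts.most_common(20):
    print(f"{count:5d}  {phrase}")

On a site like the one described above you would expect the distinct-phrase count to be a large fraction of the total referrals, rather than a handful of phrases dominating.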

IITian




msg:70504
 1:55 pm on May 20, 2003 (gmt 0)

seo-sherpa
Google acquired Applied Semantics.

I would hope that theming is what's happening in the current update. However, how they could do it so soon after the acquisition of Applied Semantics is a puzzle to me.

I was checking AllTheWeb, and was pleasantly surprised to see web pages from the same site classified into a few clusters, and in a simplified way these clusters made sense as I had intended them to.

I think that you are right in stating that websites, rather than single web pages, will become more important in the future.

wackmaster




msg:70505
 2:25 pm on May 20, 2003 (gmt 0)

chiyo - message #14 - the best post I've seen yet on Dominic and related matters. Outstanding.

I do have one question about this potential evolution G seems to be moving towards however. It involves this quote:

If you do the page per phrase system, it could easily backfire for you at some point.

Let's suppose that you run a very deep and detailed site on tulips and growing tulips. So we're already talking about a very narrow topic.

By the time you get to a subpage 3 or 4 level, you might easily see pages that, for good reason, display only a limited number of keywords or phrases. (I say 'you might easily' because of course we do have such pages - not a flower site). Example of a subpage 4: "where to buy Dutch tulips."

These pages might look to an SE as too SEO'ed, but in fact they are not; they were written by professionals in the field, with no knowledge of SEO, to appeal to users, not machines.

For those who are smarter than I about this area of technology, how do the SE's avoid somehow discounting such pages?

Mozart




msg:70506
 2:34 pm on May 20, 2003 (gmt 0)

Paynt: I always enjoy your common-sense, cut-through-the-cryptic-words type of posts.

Here is my interpretation of a few of GoogleGuy's posts - not sooo cryptic to me.

GG says: worry less about PR on the toolbar and more about rankings.
GG means: we changed the algo and PR is decreasing in relevance, although it is still a factor.
GG says: And less about rankings for high-profile phrases and more about overall rankings.
GG means: we changed the algo and a single word doesn't mean much anymore; it is the multi-word phrases that will get a lot better!
GG says: And less about rankings and more about traffic.
GG means: You will notice traffic coming from keyword combinations that you don't think of right now. As we bought Applied Semantics we also bought a lot of knowledge about how to improve the theme-finding part of the algo, which now plays a larger role.
GG says: And less about traffic and more about conversions.
GG means: You may get less traffic, because some of the traffic you used to get was for keywords or phrases that were not really relevant to your site. Now you get traffic that really is looking for your site or its information. In other words, as Google gets better you will get less but more targeted traffic. How can you counteract that? You shouldn't, because Google wants the most relevant search results. So create new pages with new content relevant to new things and you get more traffic on those pages. Read and understand Paynt's posts and you'll know.

GG also said that this current flux in results will last about one update cycle.
GG means: Once all datacenters have the "new" index (which was especially prepared for the change in update procedure that has now happened) we will apply the new algo, which has time-dependent variables in it. That is why we can't just get the most up-to-date index out there but need to step back two months. Because a big part of the new algo will be the changes over time brought in by freshbot. This part is something you simply cannot speed up, although the past three months of data will slowly get applied as we read! And so... this was the last time you ever saw a dance...

It's funny, but over the last few months so many people have repeatedly asked when the Google Dance will happen more often or be replaced by continuous updating in a "sticky freshbot" style. And now that it is happening (in my opinion), everybody is totally upset because of a few weeks of turmoil necessary for this to happen.

rfgdxm1: I think this is what happened to your drop in the rankings: You optimised your site heavily towards a single keyword. The algo has changed and demoted single keywords, but upped multi-word phrases. You are upset because of the perceived (and real) drop on that one keyword. On multi-word phrases you said yourself you gained. This is how Google got better. And as GG said many times: it is not all done yet anyway! Per aspera ad astra.

steveb: I read many statistics telling me that single keyword searches are decreasing. Whenever I search it is with a minimum of 2 words, never a single keyword. Therefore I doubt they still represent huge traffic, and I very much doubt that you should care about such non-targeted traffic. As GG says: And less about traffic and more about conversions.

Now, you can howl at the moon and hope it'll change a thing, or just get on with creating a great information resource for your visitors! I'd go for option two!

Chris_D




msg:70507
 3:02 pm on May 20, 2003 (gmt 0)

Mozart - brilliant post!

And I believe that you are onto something with the decreased 'emphasis' on the single word.

Thanks for sharing that info.

Best regards

Chris_D

GoogleGuy




msg:70508
 3:56 pm on May 20, 2003 (gmt 0)

paynt, feel free to use that. I think I'm paraphrasing something Danny Sullivan would say when I talk like that. :) Just the idea that paying more attention to users, what they're finding and what they want, rather than gunning for #1 on that "major keyword" is one thing that demonstrates a wise SEO. :)

digitalghost




msg:70509
 4:07 pm on May 20, 2003 (gmt 0)

>> what they're finding and what they want, rather than gunning for #1 on that "major keyword" is one thing that demonstrates a wise SEO.

It's also something that has been repeated hundreds of times here with little impact. The people that understand it practice it; those that don't continue to track 5-10 "key" phrases. They will continue to claim that their site has a narrow focus without doing anything to expand that focus, usually under the guise of "not diluting" a theme.

"My site is nowhere to be found now for my major keyphrase, I'm doomed". Well yes, you are; you were doomed as soon as you designed a site around an "all-important" keyphrase.

[edited by: digitalghost at 4:09 pm (utc) on May 20, 2003]

BigDave




msg:70510
 4:09 pm on May 20, 2003 (gmt 0)

I think we all need to keep in mind chiyo's bit of wisdom as we muse over what is happening.

People are analysing the pieces and of course coming to premature conclusions.

All we are doing is taking some guesses based on early data. There are always other possibilities that can cause similar results.

While I think there are already some big changes in there, one of the things that I get from GoogleGuy's posts is that it is a new system that will allow new pieces to be fitted in more easily.

Skier




msg:70511
 4:15 pm on May 20, 2003 (gmt 0)

After wading through the endless flood of threads and opinions (yawn), finally some ideas that are worth thinking about. Nice work paynt!

The convergence of GG comments, big-picture trends in Google objectives, data from my logs, etc - it makes a convincing case.

When I do a search as a user, the results list contains mostly sites that are not "exactly" what I was looking for. The list looks like a shotgun blast of the broadest range of possible answers, in hope that one might hit the target.

This is a long way from the computers of science fiction movies, the ones that answer your question directly, or ask for clarification. The ones that the SE's must surely be designing toward.

Google may well be taking a step in that direction if it is trying to "understand" the content of the sites in its index. That's exciting! Hope it's true.

SEO has been a game of trying to get your site into the "blast zone" of the results shotgun. Maybe now it will become a question of who can best package information?

BigDave




msg:70512
 5:22 pm on May 20, 2003 (gmt 0)

Skier, (interesting name given my example, LOL)

I think that there will still be something of an intentional shotgun blast approach. It is still the best way to make sure that everyone gets *something* that they want.

It might even force more of a variety. On a search for "mountain", which could mean all sorts of different things, it might come up with only one ski resort at the top instead of 10. The SERPs might seem less relevant to the skier, but they would be more relevant to the mountain climber or mountain biker.

Of course you will then have all those resort owners going nuts about their drop in ranking.
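
The "variety" idea above is essentially result diversification: cap how many results any one interpretation of an ambiguous query can occupy near the top. Below is a toy sketch of that, assuming results have already been scored and tagged with an interpretation; the URLs, tags and scores are invented for illustration and this is not a description of what Google actually does.

from collections import defaultdict

# (url, interpretation, relevance score) for the ambiguous query "mountain"
results = [
    ("skiresort-a.example.com", "ski resort", 0.97),
    ("skiresort-b.example.com", "ski resort", 0.95),
    ("climbing-guide.example.com", "mountain climbing", 0.91),
    ("skiresort-c.example.com", "ski resort", 0.90),
    ("mtb-trails.example.com", "mountain biking", 0.88),
    ("geology-info.example.com", "geography", 0.85),
]

def diversify(ranked, per_topic_cap=1, top_n=5):
    """Keep at most per_topic_cap results per interpretation,
    then fill the remaining slots with the best leftovers."""
    picked, leftovers = [], []
    seen = defaultdict(int)
    for item in ranked:  # ranked is assumed to be sorted by score already
        _, topic, _ = item
        if seen[topic] < per_topic_cap:
            picked.append(item)
            seen[topic] += 1
        else:
            leftovers.append(item)
    picked.extend(leftovers[: max(0, top_n - len(picked))])
    return picked[:top_n]

for url, topic, score in diversify(results):
    print(f"{score:.2f}  [{topic}]  {url}")

With a cap of one per interpretation, only one ski resort makes the top of the list, which is exactly the trade-off described above: worse for the skier, better for the climber and the biker.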

Skier




msg:70513
 8:10 pm on May 20, 2003 (gmt 0)

There are two sides to this equation, good results and good questions. The SE's are dealing with what they can control, the answers. Having spent a few years in advertising, I know that changing people's (searchers') behavior is a "mountain" of a challenge.

I have a problem with your example. "Mountain" is a very poorly constructed test question to ask - an SE, or anyone else. How can one judge the validity of the answers?

If Google starts giving more precise answers to multi-word questions, perhaps the searchers will learn to be more specific. It is not that far from there to questions in "plain English" and answers that reflect the precision that language is capable of reaching. (apologies to non-English speakers - "plain French" etc too)

Keyword matches by single pages is a pretty primitive tool. Content matches by whole sites is an entirely different gizmo.

I can't believe that one and two word searches are natural to the public, they learned that somewhere.

MyWifeSays




msg:70514
 8:18 pm on May 20, 2003 (gmt 0)

Interesting theories.

Way above my skill level though. Single keyword SERPS always seemed to be the domain of the high PR, high incoming link pages to me. Although there has always been the odd result in the SERPS that has had me baffled (they have never seemed to hang around too long though).

Shame we can't point out examples on here. I'm not seeing much difference with these SERPS that can't be explained by missing inward links/PR.

There must be something though otherwise GG wouldn't be making all these comments. Or would he?

BigDave




msg:70515
 8:27 pm on May 20, 2003 (gmt 0)

Obviously mountain is a lousy example. The trick around here, when "widgets" will not do, is to come up with something that you can use to make your point that will not get edited for being too specific.

It is also a far too simplified example. It was intended to show a direction that things might be going.

Who outside of Google knows IF they are doing this or what parameters they are using? They might be doing it by site, or by page. They may be doing something totally different that just happens to make this look reasonable.

As for the natural language queries, I think there is room for both types of search engines. Just as there is room for search engines that stem and those that don't. If Ask gets its act together and also reduces its ads to a reasonable level, it would most likely become the default SE in libraries.

But searching on one or two word key phrases is a natural way of doing things for many people. It is how dictionaries work. Learning to narrow the results by expanding on that is less natural.

steveb




msg:70516
 8:51 pm on May 20, 2003 (gmt 0)

"Certainly that would suggest it is more 'topical' yes"

chiyo think about what you are saying. Fresher certainly does not imply topical. I'm sorry but that is utter nonsense. Fresh is new. Fresh is *unproven*. Fresh pages imply non-topical, non-proven "instant content".

All you are saying is fresher pages make for more keyword no-content spam.

Now this is exactly why these changes can be analyzed to some degree at this point. This index is profoundly anti-topical and anti-content, and Google has apparently made this change *deliberately* in favor of unproven *newness*.

Many of my quality content pages have plummeted in the past week, while some piece of piffle I threw up a few weeks ago is almost making up for all that lost traffic all by itself.

This index is thoroughly anti-topical, anti-theme (as being listed in dmoz and the Google directory seems to be attracting a severe penalty), and absolutely anti-content. That is what "fresh" is.

It's a sad change for those of us who provide topical content, even often fresh content, but it is heaven for spammers and crap-servers who can just create keyword sites that are utter gibberish and they will be valued for their "freshness".

It won't last though. The public doesn't like to eat manure. James Carville almost said "It's the content, stupid." It's not the date on the manure.

mrguy




msg:70517
 8:56 pm on May 20, 2003 (gmt 0)

A personal observation:

I use Google all the time for finding stuff that's not commerce related. Lately, the SERPs are filled with nothing but commerce sites, and finding any real data is getting harder.

For example, just now I looked for the site that shows the status of the North American routers for the network.

It took me until the second page to get past all the commerce sites and actually find it.

It used to not be like that!

IITian




msg:70518
 9:07 pm on May 20, 2003 (gmt 0)

It took me until the second page to get past all the commerce sites and actually find it.

mrguy, I believe that after a few years it will become like TV. If you want it free, you have to live with the commercials. Otherwise, pay (cable) fees.

Critter




msg:70519
 9:18 pm on May 20, 2003 (gmt 0)

Exactly...because we all know that cable TV doesn't have commercials...

Umm...

I mean...

Hmmm

Peter

IITian




msg:70520
 9:20 pm on May 20, 2003 (gmt 0)

Exactly...because we all know that cable TV doesn't have commercials...

Critter, I was referring to premium channels with fewer commercials. ;)

albert




msg:70521
 9:22 pm on May 20, 2003 (gmt 0)

... quality content pages have plummeted in the past week

If you want it free, you have to live with the commercials. Otherwise, pay (cable) fees.

It depends on your point of view. This thread gives some thoughts about other points of view.

There are two kinds of posters around. Blue or red pill :)

Ok it's still reading tea leaves. But some of us seem to be more confident because they try to look at the big picture.

[mumble]I stay optimistic.[/mumble]

chiyo




msg:70522
 9:25 pm on May 20, 2003 (gmt 0)

steveb wrote: >>chiyo think about what you are saying. Fresher certainly does not imply topical. I'm sorry but that is utter nonsense. Fresh is new. Fresh is *unproven*. Fresh pages imply non-topical, non-proven "instant content".<<

No need to be sorry Steve.

In an attempt to understand your statement that "fresh" means non-topical I revisited this.

So yesterday's page on the Moroccan bombing is less topical than last week's on the Riyadh bombing because it's fresh? That the info on the Moroccan bombing is instant content, non-proven and non-topical? That a page with the latest prices for a product is less proven and non-topical compared to an old "tested" page with last year's prices?

We may have different views due to the different types of sites we have or different perceptions of what Google is trying to do with fresh. Or maybe even different perceptions of the role of Google's "free" index SERPs.

I think Google's fresh is fairly intelligent. From what I see on our sites, our NEW pages or pages with NEW links are getting picked up, not so much changed or rotated content pages.

And I expect the fresh algo to continue to improve. I agree that at the moment it still seems a bit blunt.

Fresh pages may be "untested" but the reputation of the sites pointing to them "tests" them for quality.

I agree that fresh may not automatically mean "topical", but I cannot agree that "fresh" automatically means "untested", non-topical or "instant content". Freshing pages to me DOES increase the likelihood that users will see the latest content about current events; note I didn't suggest it was quality, as I agree it takes time to understand and analyse a topical event, but that's not the point. Look at CNN. Their first report of breaking news is "fresh". It's certainly topical. But it may not be quality. As they get more info and more sources, the quality improves. But they are still more topical than yesterday's morning newspaper.

However, part of what we are talking about is semantics. So it's a small part of the more significant points being made by others in this thread.

[edited by: chiyo at 9:36 pm (utc) on May 20, 2003]

steveb




msg:70523
 9:32 pm on May 20, 2003 (gmt 0)

I could put up a page with the words "Moroccan bombings" in the title, with a bunch of related keywords in non-sentences, and that would be fresh, and the new Google would love it.

Bleeech.

That is spam heaven. Old Google would need someone to link to that site, to prove its value via a deepcrawl in comparison to other sites that mention the Moroccan bombing.

A hundred years from now they might teach about this month, when Google made a change that led to the end of coherent sentences. Fresh words are nothing, certainly not topical.

chiyo




msg:70524
 9:43 pm on May 20, 2003 (gmt 0)

Steve. I addressed your argument on quality in my post above.

Google addresses quality in other ways, e.g. if your page about the Moroccan bombing was written on a low PR site, or was spam, there is less chance that anyone would see it, fresh or not. And your site would bit by bit lose PR or other "merits" due to your spam and bad user experiences. Agreed, Google needs to better target pages worthy of freshing, perhaps by theming, but the concept of freshing by itself certainly does not imply low quality. As always, Google is a work in progress.
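
One way to picture "freshness assessed in combination with other factors" is a toy scoring function where a freshness boost decays with age and is scaled by the reputation of the pages linking in. Every weight and signal name below is invented purely for illustration; nobody outside Google knows how, or whether, the real algo combines things this way.

import math

def toy_score(text_relevance, inbound_pr_scores, age_days,
              half_life_days=14.0, freshness_weight=0.3):
    """Toy ranking score: base text relevance plus a freshness boost that
    decays with page age and is scaled by the best inbound link reputation.
    All numbers are made up for illustration."""
    # Reputation of the strongest page linking in (0.0 if nothing links here yet).
    link_reputation = max(inbound_pr_scores, default=0.0)

    # Freshness is worth 1.0 on day zero and halves every half_life_days.
    freshness = math.exp(-math.log(2) * age_days / half_life_days)

    # A brand-new page with no reputable inbound links gets almost no boost;
    # a brand-new page linked from a strong page gets a real one.
    return text_relevance + freshness_weight * freshness * link_reputation

# Fresh keyword-stuffed page nobody links to vs. fresh news page linked from a strong hub.
print(toy_score(text_relevance=0.6, inbound_pr_scores=[], age_days=1))     # 0.60
print(toy_score(text_relevance=0.6, inbound_pr_scores=[0.9], age_days=1))  # ~0.86

Under this kind of combination, "fresh" on its own buys nothing, which is the point being argued here: the value comes from freshness plus the reputation of what links to the fresh page.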

peterdaly




msg:70525
 9:47 pm on May 20, 2003 (gmt 0)

The non-single keyword idea may have something to it. I have a site with tens of thousands of pages in Google. There is only one page I specifically track in Google ranking for a (2 word) search phrase. Other pages are "passively" optimized, and not tracked for specific "high volume" phrases or words.

In the past couple of weeks, a few things have happened:
1. My site is nowhere to be seen for the keyword phrase I track, "mytopic widgets". I used to be on page 1-2 depending (I think) on Freshie's week.

2. Conversion rate is up, with non-proportional traffic increase. I am getting better targeted users who are both viewing more pages per visit, and purchasing more items. I have not figured out if this is all Google referred traffic, but the majority of my traffic comes from Google.

As much of my content comes from other sources (with permission!), we will see how things go once new spam filters come into play. So far so good. Mostly detailed product descriptions, features, etc.

BTW - Most links to my site which I don't solicit myself are deep links, not to the keyphrase tracked page. New algo may take this into account differently.

PR 5.
Alexa ranking around 100k.

-Pete
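
The conversion observation above is easy to check from your own data by splitting sessions by referrer and comparing conversion rate and pages per visit. Below is a minimal sketch with made-up session records; the field names and what counts as a "conversion" are assumptions about your own tracking.

from collections import defaultdict

# Hypothetical session records: (referring source, pages viewed, converted?)
sessions = [
    ("google", 6, True),
    ("google", 3, False),
    ("google", 8, True),
    ("other-se", 2, False),
    ("other-se", 1, False),
    ("direct", 4, True),
]

stats = defaultdict(lambda: {"visits": 0, "pages": 0, "conversions": 0})
for source, pages, converted in sessions:
    stats[source]["visits"] += 1
    stats[source]["pages"] += pages
    stats[source]["conversions"] += int(converted)

for source, s in sorted(stats.items()):
    rate = s["conversions"] / s["visits"]
    pages_per_visit = s["pages"] / s["visits"]
    print(f"{source:9s} visits={s['visits']:3d}  "
          f"pages/visit={pages_per_visit:.1f}  conversion rate={rate:.0%}")

If the Google-referred segment shows higher pages per visit and a higher conversion rate while total visits stay flat, that matches the "less traffic, better traffic" reading of GoogleGuy's comments.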

paynt




msg:70526
 10:24 pm on May 20, 2003 (gmt 0)

I sometimes see that an outcome with Google comes not from what I can figure out as an obviously engineered plan for the algorithm, but from what appears to be a side effect. Does it matter then that the final result is because a) Google planned for that or b) it happens in spite of Google?

steveb




msg:70527
 10:38 pm on May 20, 2003 (gmt 0)

"but the concept of freshing by itself certainly does not imply low quality."

You seem in denial or something. Freshness does not imply low quality or high quality... freshness implies *no* quality. And it doesn't just imply it, it screams it.

There is no judgement of quality, and your statement about "sites" is mystifying. Google doesn't look at "sites".

jonrichd




msg:70528
 12:12 am on May 21, 2003 (gmt 0)

This is a brilliant discussion.

Using bombings as a search term brings out the use of fresh results to provide more reasonable SERPs. If someone used that one word term, a logical assumption might be that they had just seen information about the recent terrorist attacks on TV, and now want more information. Using fresh results brings the most recent information to the top. Of course, other results might cover the history of bombing, the atomic bomb, etc.

Where this algo breaks down is with search terms that don't have any currency to them. In a query for widgets, how will G tell the difference between the latest commercial page selling widgets and a page talking about the newest widget discovery?
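
One plausible heuristic for deciding whether a query "has currency" is to compare how many of the matching documents are brand new against the long-term baseline: a sudden burst of new matching pages looks newsy, a steady trickle looks evergreen. The sketch below is a made-up illustration of that idea, not a description of how Google actually decides; the threshold and the way document ages are supplied are both assumptions.

def query_deserves_freshness(matching_doc_ages, recent_days=7, total_days=365,
                             burst_threshold=3.0):
    """Toy heuristic: does this query look 'newsy'?

    matching_doc_ages: ages in days of the documents matching the query.
    Returns True when the recent rate of new matching documents is much
    higher than the long-term average rate. All numbers are invented."""
    recent = sum(1 for age in matching_doc_ages if age <= recent_days)
    total = len(matching_doc_ages)
    if total == 0:
        return False
    recent_rate = recent / recent_days      # new matches per day, last week
    baseline_rate = total / total_days      # average matches per day, last year
    return recent_rate > burst_threshold * baseline_rate

# A bombing-style query: a flood of brand-new pages on top of a small back catalogue.
print(query_deserves_freshness([1, 1, 2, 3, 3, 5, 40, 200, 300]))    # True
# A "widgets"-style query: a steady trickle of pages all year round.
print(query_deserves_freshness([10, 60, 90, 150, 220, 300, 360]))    # False

For the evergreen case the heuristic would leave the ranking to the usual signals, which is one way the breakdown described above could be avoided.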

chiyo




msg:70529
 5:15 am on May 22, 2003 (gmt 0)

>>You seem in denial or something.<<

What is it that I'm denying, Steve?

I don't understand at all the reasoning behind your statement - "freshness implies *no* quality." Perhaps you need to explain it more.

I have already tried to explain clearly to you how, when assessed in combination with other factors, it does add value/quality. E.g. unless you have high PR pages from which links are freshed, your new pages don't turn up anyway, or are reduced in ranking. Almost every algo factor in Google by *itself* is almost useless for creating good SERPs (e.g. PR, text analysis factors, link pop factors), either because it is too narrow or too easy to spam; it is only when they are combined with a few others that they start becoming useful.

steveb said >>your statement about "sites" is mystifying. Google doesn't look at "sites".<<

Let me de-mystify it.

1. Yes, it would have been better to say "page" rather than "site". Consider the mistake mine, as I should have been more clear.

I was more referring to the situation where a site has lots of high PR pages and has many of these freshed, as opposed to a site which has fewer and has very few, if any, "freshed". That's what I meant by a "high PR site".

2. Google does look at sites and may well do so more in the future. Certainly in terms of PR and many others its unit of analysis is pages, but I think there is plenty of evidence that Google "looks at" sites in several instances - determining spam or cross-linking are two where site-wide penalties can apply over and beyond certain pages.

The minute Google starts to apply theming, if it hasn't already, one possible unit of analysis will be the "site".
