
Google News Archive Forum

Update Brandy Part 3
GoogleGuy
msg:70899
 7:41 pm on Feb 15, 2004 (gmt 0)

Continued From: [webmasterworld.com...]

"Any clue as to the possible role greater reliance on semantics is playing in your never ending quest for more relevant results?"

I'd say that's inevitable over time. The goal of a good search engine should be both to understand what a document is really about, and to understand (from a very short query) what a user really wants. And then match those things as well as possible. :) Better semantic understanding helps with both those prerequisites and makes the matching easier.

So a good example is stemming. Stemming is basically SEO-neutral, because spammers can create doorway pages with word variants almost as easily as they can optimize for a single phrase (maybe it's a bit harder to fake realistic doorways now, come to think of it). But webmasters who never think about search engines don't bother to include word variants--they just write whatever natural text they would normally write. Stemming allows us to pull in more good documents that are near-matches. The example I like is [cert advisory]. We can give more weight to www.cert.org/advisories/ because the page has both "advisory" and "advisories" on the page, and "advisories" in the url. Standard stemming isn't necessarily a win for quality, so we took a while and found a way to do it better.

So yes, I think semantics and document/query understanding will be more important in the future. pavlin, I hope that partly answers the second of the two questions that you posted way up near the start of this thread. If not, please ask it again in case I didn't understand it correctly the first time. :)
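To make GoogleGuy's stemming example concrete, here is a toy Python sketch of how stemmed variants can broaden matching. The suffix rules and the half-weight for stemmed hits are illustrative assumptions, not Google's actual stemmer or scoring:

# Toy stemming-based matching; suffix rules and weights are invented.
def stem(word):
    # Naive suffix stripping: 'advisories' and 'advisory' share a root.
    for suffix in ("ies", "es", "s", "y"):
        if word.endswith(suffix) and len(word) > len(suffix) + 2:
            return word[:-len(suffix)]
    return word

def match_score(query, document):
    # Exact term hits count fully; stemmed variants count for less.
    doc_words = document.lower().split()
    doc_stems = {stem(w) for w in doc_words}
    score = 0.0
    for term in query.lower().split():
        if term in doc_words:
            score += 1.0
        elif stem(term) in doc_stems:
            score += 0.5
    return score

page = "cert advisory and advisories at www.cert.org"
print(match_score("cert advisory", page))                    # 2.0: both terms exact
print(match_score("cert advisories", "advisory from cert"))  # 1.5: stemmed credit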

 

wholland
msg:70959
 1:40 am on Feb 16, 2004 (gmt 0)

Google Guy,

Why does it take so long for Google to update their directory from DMOZ?

When will the next update be?

How much weight is given, if any, to sites in DMOZ versus sites not listed in DMOZ when it comes to ranking?

Thanks in advance,
Will

GoogleGuy
msg:70960
 1:48 am on Feb 16, 2004 (gmt 0)

"Why does it take so long for Google to update their directory from DMOZ? When will the next update be? How much weight is given, if any, to sites in DMOZ versus sites not listed in DMOZ when it comes to ranking."

When we updated once per month, we'd make sure that we downloaded the newest RDF dump of DMOZ before indexing started. I don't know when the next update will be. As far as the weight, a link from DMOZ is the same as a link from any other page on the web. DMOZ pages typically have more PageRank, so it helps in that respect, but there's no extra boost just because the link comes from the Open Directory.
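For anyone wondering what "downloading the newest RDF dump" involves in practice, here is a rough Python sketch that streams the ODP content dump and pulls out the listed URLs. The file name and the ExternalPage element follow the published ODP format; treat both as assumptions and verify against the actual dump:

# Rough sketch: stream listed URLs out of a DMOZ/ODP RDF content dump.
# File, element, and attribute names are assumptions based on the
# published ODP format; verify against the real dump before relying on it.
import gzip
import xml.etree.ElementTree as ET

DUMP = "content.rdf.u8.gz"  # downloaded from the ODP RDF area

with gzip.open(DUMP, "rb") as f:
    for _, elem in ET.iterparse(f):
        if elem.tag.endswith("ExternalPage"):
            # The listed site's URL lives in an RDF 'about' attribute.
            url = next((v for k, v in elem.attrib.items()
                        if k.endswith("about")), None)
            if url:
                print(url)
            elem.clear()  # keep memory flat while streaming a huge file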

wholland
msg:70961
 1:53 am on Feb 16, 2004 (gmt 0)

Thanks GG,

Does that hold true for sites once they are added to Google's directory as well... no extra weight because they are in Google's directory, same as any other link?

Will

MLHmptn
msg:70962
 1:54 am on Feb 16, 2004 (gmt 0)

Quite honestly, this new update is going to kill me! I am still finding some of my keywords, but for other keywords where I have major content I am nowhere to be found. I got indexed in Google during the Florida update and was still positioned quite well during the Austin update. I'm beginning to think Google is a joke of a search engine, simply because you never know what to expect of it. Inktomi's search sites, on the other hand, are reliable! I am paying over $2000/month on Google AdWords at the moment, and I think I'll have to increase this amount simply because Google is returning JUNK results! My assumption is that Google is trying to maximize its returns on AdWords. If they make their search results nothing but garbage, they can make people buy AdWords.

Seems everyone here is happy about the changes, but what about those of us who are getting hit by this Brandy update? Are sites that got indexed during the Florida and Austin updates being penalized in this update, or is it just my site that is being singled out? Have any of your sites been lost in this new update?

Any feedback would be greatly appreciated!

Ledfish
msg:70963
 1:55 am on Feb 16, 2004 (gmt 0)

Just wanted to mention that my current Google is serving the 216 datacenter results, and has been every time I checked today, versus yesterday, when it was switching back and forth between the 64 and 216 datacenters. The 216 results I'm seeing are not even close to the 64 results, so I assume that 216 hasn't been updated yet?

Kirby
msg:70964
 1:57 am on Feb 16, 2004 (gmt 0)

Thanks GG for confirming again what senior members here have been reiterating for quite some time.

What has me bewildered, though, is why 'city widgets' for some cities returns typical pre-Florida, city-widget-specific site pages, while other city searches still return directory-type pages.

Chicago
msg:70965
 2:19 am on Feb 16, 2004 (gmt 0)

>>What has me bewildered, though, is why 'city widgets' for some cities returns typical pre-Florida, city-widget-specific site pages, while other city searches still return directory-type pages.

Although I would have said it a little differently, Kirby, it's nice to see someone else noticing patterns in local search.

Do your results coincide with leading cities being hit the hardest? Put another way, are you fine in 80% of your cities but wiped out in the majors?

We started to see this activity pre-Florida, and it has persisted and grown through Austin and Brandy.

I am watching all this talk about building a quality site ~ all the while one page ranks well and another is gone ~ when both are part of the same site and both are optimized the same. Dozens of such examples. The only correlation is search volume.

I have asked GG directly, but the silence on this topic is deafening.

XtendScott
msg:70966
 2:29 am on Feb 16, 2004 (gmt 0)

<added>By the way, the front page is talking about "The Semantic Web". I was talking about plain old semantics--understanding documents better, for example. The "Semantic Web" is a different topic altogether--more about RDF and XML and (OWL?) and lots of other things.</added>

I believe this was a request to get back on the topic of semantics, as distinct from "The Semantic Web".

My understanding of semantics is that it uses, or looks for, other words to "validate" the meanings of words. So the "Red Widget Only Boutique" website would really have a difficult time "validating" the phrase "red widget", because it does not have the other validating words ("blue", "green", "olive", "fun", "silly", etc.).

This is why I believe many mom & pop webshops are no longer in the listings: they have limited (person) power and cannot offer the multiple product types.

I would hope this semantic algo is continually in a learning mode, and that it may still be learning, as results are improving in some areas.

GG,
Is this part of the reasoning for adding customer discussion/comment/user-content areas: to possibly increase the chance of getting validated? (Corrections welcome if the above is incorrect.)
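The "validation" theory above is easy to play with in code. A toy Python sketch, where the related-word list is hand-picked purely for illustration (a real system would learn such associations from a corpus):

# Toy version of the co-occurrence "validation" idea: a keyword counts
# as better validated when more related vocabulary appears around it.
# The RELATED word list is a hand-picked assumption for illustration.
RELATED = {"widget": {"red", "blue", "green", "olive", "fun", "silly"}}

def validation_score(keyword, page_text):
    words = set(page_text.lower().split())
    related = RELATED.get(keyword, set())
    return len(words & related) / len(related) if related else 0.0

boutique = "red widget boutique for red widgets only"
general = "widgets in red blue green and olive for fun or silly projects"
print(validation_score("widget", boutique))  # low: one supporting word
print(validation_score("widget", general))   # high: most supporting words present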

WebStart
msg:70967
 3:03 am on Feb 16, 2004 (gmt 0)

MLHmptn
>My assumption is that Google is trying to maximize its returns on AdWords. If they make their search results nothing but garbage, they can make people buy AdWords.<

I would be tempted to agree with you, but cannot. I buy AdWords only for those search terms where I do not normally show in Google, or where I show on page 2 or 3.

On every Google update I lose position right off the bat, dropping from #1 for a very popular two-word search term (xx widgets), where I was #1 for two years in the early days when Google was new, to #2 or #3 or #4 -- every time there is an update. For some reason, and I don't know why, I eventually come back to #1 within a month or two. But I never buy AdWords to compensate for the lost position. Guess I am just lucky, but it is strange.

peter andreas
msg:70968
 3:21 am on Feb 16, 2004 (gmt 0)

I read GG's reply on page 2 of this topic and am really, really encouraged, i.e. design a site which is for users first and don't pay too much attention to SEO. Round of standing, stomping applause!

That's what we have been doing on our site, i.e. just slowly plodding along, making the site better and better and increasing content. At times I've felt like an ostrich with my head in the sand, as I just can't keep up with what you have to do regarding SEO, so I have ignored it totally! Ignorance is bliss! I figure our very limited time and resources are better spent creating content, not trying to crack the latest algorithm (also, I am just not that clever and wouldn't know where to start, so hats off to you guys who understand this stuff!).

I am pleased to see that a lot of dodgy doorway sites have gone (one even used us to boost its ratings, but that's another story).

I would like to know, though, what the rules are. Say I go into a library and want to find a book: everyone knows the Dewey system, so why can't Google present its rules as clearly?

cbpayne
msg:70969
 3:27 am on Feb 16, 2004 (gmt 0)

Can anyone explain the cluster of DMOZ pages at the top of the Google results for these 'commercial' phrases?

Internal links with those words as anchor text; those words in the title of the page; those words at a reasonable keyword density; authority status of the page; pretty good PR... this sounds like the formula for a reasonably good ranking.
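Taken literally, that recipe is just a weighted sum of signals. A toy Python version, with the weights invented purely for illustration (no engine publishes its real ones):

# Toy weighted-sum ranking in the spirit of the list above.
# The weights are invented; real ranking functions are unpublished.
WEIGHTS = {
    "anchor_text_match": 3.0,
    "title_match": 2.0,
    "keyword_density": 1.0,
    "authority": 2.5,
    "pagerank": 2.0,
}

def rank_score(signals):
    # signals maps signal name -> strength in [0, 1]
    return sum(w * signals.get(name, 0.0) for name, w in WEIGHTS.items())

dmoz_category_page = {
    "anchor_text_match": 0.9,  # many internal links carry the query words
    "title_match": 1.0,
    "keyword_density": 0.6,
    "authority": 0.9,
    "pagerank": 0.8,
}
print(rank_score(dmoz_category_page))  # a DMOZ-style page scores well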

europeforvisitors
msg:70970
 3:29 am on Feb 16, 2004 (gmt 0)

I would like to know, though, what the rules are. Say I go into a library and want to find a book: everyone knows the Dewey system, so why can't Google present its rules as clearly?

See the Webmaster guidelines on Google's Web site.

As for the Dewey Decimal System, I don't see how that's relevant to what Google does. The Dewey Decimal System is a manual indexing system in which books are categorized by professional librarians--which is why, for example, guidebooks in the New York City travel section are easy to find instead of being hidden behind random stacks of Yellow Pages directories and third-party hotel brochures. :-)

mipapage
msg:70971
 3:39 am on Feb 16, 2004 (gmt 0)

I would like to know, though, what the rules are. Say I go into a library and want to find a book: everyone knows the Dewey system, so why can't Google present its rules as clearly?

How about this? Present your information clearly; make it easier for them (users, bots, etc.) to understand. Valid markup, thoughtful use of the tags that are made available to you...

Maybe your competitor won't. Maybe their page will be tag soup, while yours is 'application/xml' valid XHTML 1.1.

Okay, maybe a little extreme, but it's that simple: make an HTML document that has great content and is marked up according to the standards.

A great story that's easy to read and understand; a great base for a long-term internet marketing strategy ;-]

Lostin
msg:70972
 4:00 am on Feb 16, 2004 (gmt 0)

I just tried this: 'widget1 widget2 widget3 widget4', and page 1 #1 was a real site, fair enough! But #2 had the search term (in that order) in the browser title and a picture, with no text content on the page at all (except alt text, as above). Not even hidden content.

How can this make any sense?

By the way, I came in under this.

Maybe I've got lost again?

Sorry if this is off topic, but the thread leads me to think something is wrong, although the lead-in text is the term as above. Maybe a new way to go? Not.

So:

Title = 'widget1 widget2 widget3 widget4'
Lead-in = 'widget1 widget2 widget3 widget4'
Alt = 'widget1 widget2 widget3 widget4'
Content = a picture (very small)
= Page 1, #2

On a 4-word search.

My head spins!

P.S. I liked Betts anon.

trumble
msg:70973
 4:11 am on Feb 16, 2004 (gmt 0)

Devastating. That one word describes my site's experience of Brandy.

I've read people here observe they've lost a few points against competitors who are a little more spammy than themselves.

But my site (which is pure information) has seen Google referrals fall by 70%. Instead of hitting the information people are looking for, SERPs are now flogging astrology products, domain names and motivational tapes. Or just .coms with very little to say about the given topic. Very mid-90s-type results.

I've worked with keyword databases from search engines before, and it was very clear that internet users see the web as a library, not a shopping mall or late night TV.

Whose interests are served by this? Not Google's, surely. If you want to advertise, then pay for it. If you have useful information, then accommodate it.

[edited by: trumble at 4:20 am (utc) on Feb. 16, 2004]

WebStart
msg:70974
 4:17 am on Feb 16, 2004 (gmt 0)

Re all the posts on what GoogleGuy wrote:

"Searching for "high-quality" links before the site itself is high-quality is putting the cart before the horse, so to speak. That time would be better invested in enriching the site... etc."

And the various responses: "GG, your advice makes a lot of sense for information or content publishers, but how should widget marketers interpret that statement?", etc.

You will drive GG crazy trying to get answers to the specific questions raised since his very good broad advice, which is simple and says it all: enrich the site; make it the best site in its category for the person who searches for a site like that. If you do that, GG says, you will win the Google war. And he is very likely right.

peter andreas
msg:70975
 4:20 am on Feb 16, 2004 (gmt 0)

OK, the Dewey system wasn't a good comparison, but I suppose in a way Google is the librarian. When I said rules, maybe DON'T DOs are nearer the mark, as I don't know if we're innocently tripping a filter, as someone else said. E.g. what's the optimum page size? (Sorry, I know I'm going off topic.)

We try to produce clean code (CSS is brilliant for this) and use H1 tags for emphasising the content's subject, etc. We use image and alt tags too. As you say, we try to use the tags provided for us. We don't do any analysis of keyword density, etc., though. But is this considered SEO?

I tried some keywords in the Brandy update and we are better in some, worse in others, so I suppose it hasn't really affected us.

allanp73
msg:70976
 4:24 am on Feb 16, 2004 (gmt 0)

Kirby and Chicago,

I have been trying to get a response about this issue since the latest dance began. I noticed that on 64 that major city areas were showing pre-florida style rankings; however, smaller city or town areas were not. When Florida happened it was the large city terms that were hit first. Later, Austin update got all the areas and even hit secondary terms. I think Google is working in reverse now. Fix the problems where people would notice them first. Later smaller cities will be sorted out possibly in another update.

For the stevebs out there who don't believe there is a filter, and for GoogleGuy, who says there was only a change to the algo: I'm sorry, but you are mistaken.
It was obvious that only certain terms were targeted with the new algo (or filter). The reason I call it a filter is that this fits the more scientific definition, where the index is the same for all terms and a filter (a different algo) is then applied to certain terms. In the case of Florida it was the large-city terms, where the index was passed through a sieve: commercial, content-rich sites (like rocks) were filtered out, and directories (like sand) were allowed through. I studied hundreds of terms and saw the pattern over and over again. It overwhelmingly corresponded to the commercialness of the term. This is not a conspiracy theory, just the facts. Possibly Google had legitimate reasons to clean up these areas, because they were the hardest hit by spam, and the timing near Christmas was a coincidence.
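The sieve theory above is speculative (GoogleGuy describes Brandy as an algorithm change, not a filter), but the claimed mechanism is easy to state in code. A toy Python sketch, with the trigger list and page data invented for illustration:

# Toy model of the "sieve" theory: one shared index, then a reranking
# step applied only to selected commercial queries. Entirely speculative;
# the trigger list and page data are invented.
COMMERCIAL_TERMS = {"big city widgets"}  # hypothetical trigger list

def serp(query, index):
    results = sorted(index[query], key=lambda d: -d["score"])
    if query in COMMERCIAL_TERMS:
        # the sieve: directories (sand) pass through, commercial sites (rocks) sink
        results = ([d for d in results if d["type"] == "directory"]
                   + [d for d in results if d["type"] != "directory"])
    return [d["url"] for d in results]

index = {"big city widgets": [
    {"url": "widgetshop.example.com", "score": 0.9, "type": "commercial"},
    {"url": "dmoz.org/Widgets/", "score": 0.6, "type": "directory"},
]}
print(serp("big city widgets", index))  # the directory now outranks the shop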

GoogleGuy
msg:70977
 4:33 am on Feb 16, 2004 (gmt 0)

allanp73/Kirby/Chicago, it sounds like you guys are more familiar with the city widget niches than I am. If you shoot me emails or spam reports, I'd be glad to look at them, though. My guess is that Big City SERPs may have been more likely to have spam? Send me some specifics, though, and I'll take a look.

Kirby
msg:70978
 4:42 am on Feb 16, 2004 (gmt 0)

>I noticed that on 64 that major city areas were showing pre-florida style rankings; however, smaller city or town areas were not

allan, this is pretty close to what I'm seeing.

Chicago, I'm looking at several niche sites that are city/area-specific - 'domain.com/cityA' vs. 'domain.com/cityB', not one domain with several city pages.

As for the nudge to get back on the "it's all about semantics" theme, well, that isn't what many are seeing. For the moment I'm going with the "it's a process" theory and assuming the process isn't nearly complete yet. We'll see if the dichotomy that exists within certain industries is resolved over the next few days.

Kirby
msg:70979
 4:45 am on Feb 16, 2004 (gmt 0)

GG, I didn't see your post when I responded. It's not spam; it's directories, etc., showing for one area and targeted local sites for other areas.

GoogleGuy
msg:70980
 4:50 am on Feb 16, 2004 (gmt 0)
"We try and produce clean code (CSS is brilliant for this) and use H1 tags for emphasizing contents subject etc. Use image and alt tags too. As you say try and use the tags provided for us. We don't do any analysis of keyword density etc etc though. But is this considered seo?"

peter andreas, it may be SEO, but I wouldn't consider it spam.

Here's how I look at the subject. People think about how their pages will show up in search engines. That's SEO but perfectly normal. People tweak the words they use a little. Maybe they add a site map. Maybe they redo their site architecture so that the urls are static instead of dynamic. That's SEO, and no problem. Things like removing session IDs, using text links instead of Flash or frames or weird stuff that some search engines can't read can make a site more usable for users. It's easier to bookmark a page. It's easier for blind people, people with older browsers, or people using WebTV. All of the changes like this can improve a site for users and search engines alike. Is it SEO? Sure. Is it wrong? Of course not.

The part where I object is where SEO veers into spam. What is spam? Well, this is a pretty fair start:
http://www.google.com/webmasters/guidelines.html#quality
There's a concrete list there: things like cloaking, duplicate content like doorway pages, and things like hidden text.

More importantly, there's a list of principles above the concrete list. If you're doing something that a regular user would consider deceptive (or that a competitor would!), it's probably something to steer clear of. From that list of general principles, you can infer a lot of specifics. For example, on http://www.google.com/webmasters/seo.html we mention a few extra specifics, like operating multiple domains with multiple aliases or falsified WHOIS info. But given the principles and the concrete examples, you should be able to take a pretty fair guess at whether something would be considered spam.

My personal take would be that there's lots of people who practice good site design for users and search engines, and that Google has no objection to SEO if it reflects those ideas.

Different people have different definitions of SEO and spam, but that's a rough cut at my outlook.

Hope that helps,
GoogleGuy
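One of the SEO-but-not-spam changes GoogleGuy lists, removing session IDs, is mechanical enough to sketch. A minimal Python example; the parameter names are common conventions, not a canonical list:

# Minimal sketch: strip session parameters so crawlers and bookmarks
# see one stable URL. The parameter names are common examples only.
from urllib.parse import parse_qsl, urlencode, urlsplit, urlunsplit

SESSION_PARAMS = {"phpsessid", "sid", "sessionid", "jsessionid"}

def canonical_url(url):
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query)
            if k.lower() not in SESSION_PARAMS]
    return urlunsplit(parts._replace(query=urlencode(kept)))

print(canonical_url("http://example.com/page?id=7&PHPSESSID=abc123"))
# -> http://example.com/page?id=7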

rfgdxm1
msg:70981
 4:53 am on Feb 16, 2004 (gmt 0)

>Entering search term (snipped) as 34th result.

>Search for (snipped) as result number 25.

>I think Google is broken.

oohayoohay: Those are very uncompetitive search terms. And those examples of Google being "broken" are buried so far down in the SERPs that nobody is likely to ever spot them. A bad result buried deep in the SERPs here and there is normal for search engines. Now, if all of the top 10 results for those searches were irrelevant, THAT would be significant.

Chicago
msg:70982
 4:56 am on Feb 16, 2004 (gmt 0)

GG, a reiteration from Kirby: this is not about spam.

This is about a separate filter or algo being applied to major local search areas.

This pattern started many months ago and is still persistent today.

How can one page rank #1 while another ranks #600 with the same level of optimization within the same site? How can this same pattern be repeated across hundreds of pages on multiple domains, with the only correlate being the search volume within those cities? And please don't tell me it's competition.

Are you asserting that you know of no activity that uses keyword search volume and/or city type as a means to trigger a specific filter or algo?

idoc
msg:70983
 4:58 am on Feb 16, 2004 (gmt 0)

"It was obvious that only certain terms were targeted with the new algo (or filter)"

Thank you. call it LSI if you want... it works like a "candy coated" filter. Not wanting to dis GG or dwell in conspiracy either but just as I had noted earlier... informational pages are not necessarily the best converting pages... information retrieval for information retrieval's sake doesn't pay the bills either and is not something anybody would necessarily want stock in. The adwords value is climbing... I'm past that now. I would like to see a serious thread about how best to cope without necessarily caving in until that 70-80% search number is split up?

Also, yes people do use the www for information and it IS actually replacing brick and mortar libraries... but actually if you get technical about it more people search for celebrity pictures in a given day than search for all the great literary works combined. If the internet is going to evolve into anything greater than this piffle (and I think it will) it will be because of commerce.

peter andreas
msg:70984
 5:19 am on Feb 16, 2004 (gmt 0)

Thanks GG for replying to my questions. I suppose that's what we've been doing: you have to bias a page towards its subject in subtle ways. We never do any cloaking or whatever (we wouldn't know how to), but we do want people to look at our site, after all. I'll look at the links you suggested. We want to be in this long term too.

a_chameleon
msg:70985
 5:23 am on Feb 16, 2004 (gmt 0)

GG writes:

using text links instead of Flash or frames or weird stuff that some search engines can't read can make a site more usable for users. It's easier to bookmark a page. It's easier for blind people, people with older browsers, or people using WebTV.

GG - In one of my more important categories, sites using <title> tags are seemingly penalized, or at least poorly ranked. Same with alt tags - a common denominator in several 3-word categories for good relevance "grades" from Google (at least for now) is the complete absence of both.

Yet Google's info for webmasters tells us to use a Lynx browser, for example, to "test" our pages.

Aren't the two (use a Lynx browser, but don't use alt/title tags) conflicting in theory?

Particularly re: "easier for blind people" ..?


mipapage
msg:70986
 5:39 am on Feb 16, 2004 (gmt 0)

GG - In one of my more important categories, sites using <title> tags are seemingly penalized, or at least poorly ranked.

Don't think keywords in title tags - they are trying to go beyond that:

GG:

The goal of a good search engine should be both to understand what a document is really about, and to understand (from a very short query) what a user really wants. And then match those things as well as possible.

Time to think bigger than 'put some words here, put some words there'.

And with regards to the tags, again, fuhget-about-it ;-]

Think bigger!

At the very minimum, they won't penalize you for coding to standards, and I can show you 14,000,000 poorly ranked pages that have title tags.

Kirby
msg:70987
 5:47 am on Feb 16, 2004 (gmt 0)

GG,

I am starting to see some 'widget city' searches improve from earlier today. Hope this means the process is still ongoing.

a_chameleon
msg:70988
 5:56 am on Feb 16, 2004 (gmt 0)

GG writes:

.. We want quality sites to do well--ideally without worrying too much about SEO. ... I'd like to make sure that we keep looking at any issues with our scoring

I'm a little puzzled reading blanket terms like "quality" and new "signals of quality", GG... or anyone.

Is quality being defined as a) authoritative? b) unique, as you mentioned? c) backlink volumes?

I ask because I'm seeing sites in my category with hundreds of "votes" (per the "PageRank explained" page in G) that are only hundreds of messages to various newsgroups, with the owner's address in the signature, and little else in the way of backlinks with any reasonable PR... unless newsgroups in Yahoo inherit the Yahoo PR and pass it along to the thousands of messages posted within them? I can't get past that - but that's all there is.
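For reference, the "votes" being puzzled over here are just the PageRank recurrence: each page divides its rank among its outlinks. A toy power-iteration version in Python; the link graph is invented to mimic the many-newsgroup-messages-pointing-at-one-site pattern described above:

# Toy PageRank by power iteration; the link graph is invented to mimic
# many newsgroup messages all linking to one site.
def pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, outlinks in links.items():
            for target in outlinks:
                new[target] += damping * rank[page] / len(outlinks)
        rank = new
    return rank

graph = {
    "newsgroup_index": ["message_1", "message_2", "message_3"],
    "message_1": ["mysite"],
    "message_2": ["mysite"],
    "message_3": ["mysite"],
    "mysite": ["newsgroup_index"],
}
for page, r in sorted(pagerank(graph).items(), key=lambda kv: -kv[1]):
    print(page, round(r, 3))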

penfold25
msg:70989
 6:15 am on Feb 16, 2004 (gmt 0)

I think if anyone has problems with these spammy sites, I'm sure GG would like to get a spam report so he can investigate them.

From what I have noticed, I believe the aim of these updates is to get rid of those crappy spam sites while trying to keep the high-quality sites at the top of the SERPs.

[edited by: penfold25 at 6:18 am (utc) on Feb. 16, 2004]
