
Update Brandy Part 3

     
7:41 pm on Feb 15, 2004 (gmt 0)

Senior Member

WebmasterWorld Senior Member googleguy is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:Oct 8, 2001
posts:2882
votes: 0


Continued From: [webmasterworld.com...]

"Any clue as to the possible role greater reliance on semantics is playing in your never ending quest for more relevant results?"

I'd say that's inevitable over time. The goal of a good search engine should be both to understand what a document is really about, and to understand (from a very short query) what a user really wants. And then match those things as well as possible. :) Better semantic understanding helps with both those prerequisites and makes the matching easier.

So a good example is stemming. Stemming is basically SEO-neutral, because spammers can create doorway pages with word variants almost as easily as they can optimize for a single phrase (maybe it's a bit harder to fake realistic doorways now, come to think of it). But webmasters who never think about search engines don't bother to include word variants--they just write whatever natural text they would normally write. Stemming allows us to pull in more good documents that are near-matches. The example I like is [cert advisory]. We can give more weight to www.cert.org/advisories/ because the page has both "advisory" and "advisories" on the page, and "advisories" in the url. Standard stemming isn't necessarily a win for quality, so we took a while and found a way to do it better.
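To make the stemming idea concrete, here is a minimal sketch in Python (my own illustration, not Google's actual algorithm) of how reducing both query and document words to rough stems lets near-variants like "advisory" and "advisories" count as matches. The crude suffix stripping is purely for demonstration.

# Minimal illustration of stemming-based matching (not Google's algorithm):
# query and document words are reduced to a rough stem, so "advisory" and
# "advisories" count as the same term.

def crude_stem(word):
    """Very rough suffix stripping, for illustration only."""
    word = word.lower()
    for suffix in ("ies", "es", "s", "y"):
        if word.endswith(suffix) and len(word) > len(suffix) + 2:
            return word[:-len(suffix)]
    return word

def stemmed_match_count(query, document):
    """Count document words whose stem matches any query-term stem."""
    query_stems = {crude_stem(w) for w in query.split()}
    return sum(1 for w in document.split() if crude_stem(w) in query_stems)

print(stemmed_match_count("cert advisory",
                          "CERT advisories list for security advisory notices"))
# -> 3: "CERT", "advisories" and "advisory" all match after stemming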

So yes, I think semantics and document/query understanding will be more important in the future. pavlin, I hope that partly answers the second of the two questions that you posted way up near the start of this thread. If not, please ask it again in case I didn't understand it correctly the first time. :)

4:17 am on Feb 16, 2004 (gmt 0)

Full Member

10+ Year Member

joined:Sept 21, 2002
posts:227
votes: 0


< Re all posts on what GoogleGuy wrote:

Searching for "high-quality" links before the site itself is high-quality is putting the cart before the horse, so to speak. That time would be better invested in enriching the site... etc.

And various responses: GG, your advice makes a lot of sense for information or content publishers, but how should widget marketers interpret that statement... etc. >

You will drive GG crazy trying to get him to answer all the specific questions raised since he gave his very good broad advice, which is simple and says it all: enrich the site, and make it the best site in its category for the person who searches for a site like that. If you do that, GG says, you will win the Google war. And he is very likely right.

4:20 am on Feb 16, 2004 (gmt 0)

Junior Member

10+ Year Member

joined:Nov 21, 2003
posts:152
votes: 0


OK, the Dewey system wasn't a good comparison, but I suppose in a way Google is the librarian. When I said rules, maybe DON'T DOs are nearer the mark, as I don't know if we're innocently tripping a filter, as someone else said - e.g. what's the optimum page size? (Sorry, I know I'm going off topic.)

We try to produce clean code (CSS is brilliant for this) and use H1 tags for emphasizing the content's subject, etc. We use image and alt tags too. As you say, we try to use the tags provided for us. We don't do any analysis of keyword density, etc., though. But is this considered SEO?
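For what it's worth, here is a minimal sketch of what "keyword density analysis" usually means (my own illustration; nothing here is a Google formula): the share of a page's words that are a given keyword.

# Toy keyword-density calculation: fraction of page words equal to the keyword.
import re

def keyword_density(text, keyword):
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return hits / len(words)

page_text = "Blue widgets for sale. Our widgets are the best blue widgets around."
print(f"{keyword_density(page_text, 'widgets'):.1%}")   # -> 25.0%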

I tried some keywords in the Brandy update and we are better in some, worse in others, so I suppose it hasn't really affected us.

4:24 am on Feb 16, 2004 (gmt 0)

Preferred Member

10+ Year Member

joined:Sept 22, 2002
posts:646
votes: 0


Kirby and Chicago,

I have been trying to get a response about this issue since the latest dance began. I noticed that on 64, major city areas were showing pre-Florida style rankings; however, smaller city or town areas were not. When Florida happened, it was the large-city terms that were hit first. Later, the Austin update got all the areas and even hit secondary terms. I think Google is working in reverse now: fix the problems where people would notice them first. Smaller cities will be sorted out later, possibly in another update.

For the Stevebs out there who don't believe there is a filter, and for GoogleGuy, who says there was only a change to the algo: I'm sorry, but you are mistaken.
It was obvious that only certain terms were targeted with the new algo (or filter). The reason I call it a filter is that this fits the more scientific definition, where the index is the same for all terms and then a filter (a different algo) is applied to certain terms. In the case of Florida it was the large-city terms: the index was passed through a sieve where commercial-rich content sites (like rocks) were filtered out and directories were allowed to show through (like sand). I studied hundreds of terms and saw the pattern over and over again. It overwhelmingly corresponded to how commercial the term was. This is not a conspiracy theory, just the facts. Possibly Google had legitimate reasons to clean up these areas because they were the hardest hit by spam, and the timing near Christmas was coincidence.
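To make that hypothesis concrete, here is a toy sketch of the "sieve" idea (purely an illustration of the theory described above, not a confirmed Google mechanism); the flagged query list, scores, and commercial labels are all made up.

# Sketch of the hypothesized sieve: the base index ranks everything the same
# way, then for queries flagged as highly commercial an extra pass demotes
# results tagged as commercial-rich pages.

COMMERCIAL_QUERIES = {"miami real estate", "chicago hotels"}   # hypothetical flag list

def base_rank(results):
    return sorted(results, key=lambda r: r["score"], reverse=True)

def apply_commercial_filter(query, results):
    ranked = base_rank(results)
    if query not in COMMERCIAL_QUERIES:
        return ranked                      # small-town terms pass through untouched
    # "rocks" (commercial pages) get pushed behind "sand" (directories)
    return sorted(ranked, key=lambda r: r["is_commercial"])

results = [
    {"url": "widgets-miami.example",  "score": 9.0, "is_commercial": True},
    {"url": "miami-directory.example", "score": 7.5, "is_commercial": False},
]
print([r["url"] for r in apply_commercial_filter("miami real estate", results)])
# the directory outranks the commercial page only because the query is on the flag list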

4:33 am on Feb 16, 2004 (gmt 0)

Senior Member

WebmasterWorld Senior Member googleguy is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:Oct 8, 2001
posts:2882
votes: 0


allanp73/Kirby/Chicago, sounds like you guys are more familiar with the city widget niches than I am. If you shoot me emails or spam reports, I'd be glad to look at them though. My guess is that Big City SERPs may have been more likely to have spam? Send me some specifics though and I'll take a look..
4:42 am on Feb 16, 2004 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Mar 1, 2003
posts:1206
votes: 1


>I noticed that on 64 that major city areas were showing pre-florida style rankings; however, smaller city or town areas were not

allan, this is pretty close to what I'm seeing.

Chicago, I'm looking at several niche sites that are city/area specific - 'domain.com/cityA' vs. 'domain.com/cityB', not one domain with several city pages.

As for the nudge to get back on the "it's all about semantics" theme, well, that isn't what many are seeing. For the moment I'm going with the "it's a process" theory and assuming the process isn't nearly complete yet. We'll see if the dichotomy that exists within certain industries is resolved over the next few days.

4:45 am on Feb 16, 2004 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Mar 1, 2003
posts:1206
votes: 1


GG, didn't see your post when I responded. It's not spam; it's directories, etc., showing for one area and targeted local sites for other areas.
4:50 am on Feb 16, 2004 (gmt 0)

Senior Member

WebmasterWorld Senior Member googleguy is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:Oct 8, 2001
posts:2882
votes: 0

"We try and produce clean code (CSS is brilliant for this) and use H1 tags for emphasizing contents subject etc. Use image and alt tags too. As you say try and use the tags provided for us. We don't do any analysis of keyword density etc etc though. But is this considered seo?"

peter andreas, it may be SEO, but I wouldn't consider it spam.

Here's how I look at the subject. People think about how their pages will show up in search engines. That's SEO but perfectly normal. People tweak the words they use a little. Maybe they add a site map. Maybe they redo their site architecture so that the urls are static instead of dynamic. That's SEO, and no problem. Things like removing session IDs, using text links instead of Flash or frames or weird stuff that some search engines can't read can make a site more usable for users. It's easier to bookmark a page. It's easier for blind people, people with older browsers, or people using WebTV. All of the changes like this can improve a site for users and search engines alike. Is it SEO? Sure. Is it wrong? Of course not.
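As one concrete example of the kind of cleanup mentioned here, below is a short sketch of stripping session-ID parameters from URLs so crawlers and users see one stable address. The parameter names are common examples I'm assuming, not an official list.

# Strip session-ID style parameters so every visitor (and crawler) sees one
# stable URL. Parameter names below are assumed examples only.

from urllib.parse import urlparse, urlunparse, parse_qsl, urlencode

SESSION_PARAMS = {"sessionid", "sid", "phpsessid", "jsessionid"}   # assumed names

def strip_session_params(url):
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query)
            if k.lower() not in SESSION_PARAMS]
    return urlunparse(parts._replace(query=urlencode(kept)))

print(strip_session_params(
    "http://www.example.com/widgets.php?cat=blue&PHPSESSID=abc123"))
# -> http://www.example.com/widgets.php?cat=blue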

The part where I object is where SEO veers into spam. What is spam? Well, this is a pretty fair start:
http://www.google.com/webmasters/guidelines.html#quality
There's a concrete list there: things like cloaking, duplicate content such as doorway pages, and hidden text.

More importantly, there's a list of principles above the concrete list. If you're doing something that a regular user would consider deceptive (or that a competitor would!), it's probably something to steer clear of. From that list of general principles, you can infer a lot of specifics. For example, on http://www.google.com/webmasters/seo.html we mention a few extra specifics like operating multiple domains with multiple aliases or falsified WHOIS info. But given the principles and the concrete examples, you should be able to take a pretty fair guess at whether something would be considered spam.

My personal take would be that there's lots of people who practice good site design for users and search engines, and that Google has no objection to SEO if it reflects those ideas.

Different people have different definitions of SEO and spam, but that's a rough cut at my outlook.

Hope that helps,
GoogleGuy

4:53 am on Feb 16, 2004 (gmt 0)

Senior Member

WebmasterWorld Senior Member rfgdxm1 is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:May 12, 2002
posts:4479
votes: 0


>Entering search term (snipped) as 34th result.

>Search for (snipped) as result number 25.

>I think Google is broken.

oohayoohay: Those are very uncompetitive search terms. And those examples of Google being "broken" are buried so far down in the SERPs ain't nobody likely to ever spot them. A bad result buried in the SERPs here and there is normal for search engines. Now, if all of the top 10 for those searches were irrelevant, THAT would be significant.

4:56 am on Feb 16, 2004 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Sept 21, 2002
posts:1056
votes: 0


GG, a reiteration from Kirby. This is not about spam.

This is about a separate filter or algo being applied to major local search areas.

This pattern started many months ago, and is still persistent today.

How can one page earn a #1 rank while another earns #600 with the same level of optimization within the same site? How can this same pattern be repeated across hundreds of pages on multiple domains, with the only correlate being the search volume within those cities? And please don't tell me it's competition.

Are you asserting that you know of no activity that uses KW search volume and/or city type as a means to trigger a specific filter or algo?

4:58 am on Feb 16, 2004 (gmt 0)

Full Member

10+ Year Member

joined:Dec 20, 2003
posts:268
votes: 0


"It was obvious that only certain terms were targeted with the new algo (or filter)"

Thank you. Call it LSI if you want... it works like a "candy-coated" filter. Not wanting to dis GG or dwell in conspiracy either, but just as I noted earlier: informational pages are not necessarily the best-converting pages. Information retrieval for information retrieval's sake doesn't pay the bills, and it's not something anybody would necessarily want stock in. The AdWords value is climbing... I'm past that now. I would like to see a serious thread about how best to cope, without necessarily caving in, until that 70-80% search number is split up.

Also, yes, people do use the web for information, and it IS actually replacing brick-and-mortar libraries... but if you get technical about it, more people search for celebrity pictures in a given day than search for all the great literary works combined. If the internet is going to evolve into anything greater than this piffle (and I think it will), it will be because of commerce.

5:19 am on Feb 16, 2004 (gmt 0)

Junior Member

10+ Year Member

joined:Nov 21, 2003
posts:152
votes: 0


Thanks GG for replying to my questions. I suppose that's what we've been doing: you have to BIAS a page towards its subject in subtle ways. We never do any cloaking or whatever (wouldn't know how to), but we do want people to look at our site, after all. I'll look at the links you suggest. We want to be in this long term too.
5:23 am on Feb 16, 2004 (gmt 0)

Junior Member

10+ Year Member

joined:Oct 13, 2003
posts:147
votes: 0


GG writes:

using text links instead of Flash or frames or weird stuff that some search engines can't read can make a site more usable for users. It's easier to bookmark a page. It's easier for blind people, people with older browsers, or people using WebTV.

GG - In one of my more important categories, sites using <title> tags are seemingly penalized, or at least poorly ranked. Same with alt tags - a common denominator in several 3-word categories for good relevance "grades" from Google (at least for now) is the complete absence of both.

Yet Google's info for webmasters tells us to use a Lynx browser, for example, to "test" our pages.

Aren't the two (use a Lynx browser, but don't use alt/title tags) conflicting in theory?

Particularly re: "easier for blind people" ..?
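For what it's worth, here is a small sketch of the kind of text-browser-style check the Lynx advice implies (my own illustration, not a Google tool): flag <img> tags with no alt text, since a text browser or screen reader has nothing to show for them.

# Flag <img> tags that lack alt text, approximating what a text browser sees.
from html.parser import HTMLParser

class AltChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if not attrs.get("alt"):
                self.missing_alt.append(attrs.get("src", "(no src)"))

page = '<h1>Blue widgets</h1><img src="widget.jpg"><img src="logo.gif" alt="Acme logo">'
checker = AltChecker()
checker.feed(page)
print(checker.missing_alt)   # -> ['widget.jpg']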


5:39 am on Feb 16, 2004 (gmt 0)

Senior Member from ES 

WebmasterWorld Senior Member 10+ Year Member

joined:Mar 11, 2003
posts:1381
votes: 0


GG - In one of my more important categories, sites using <title> tags are seemingly penalized, or at least poorly ranked.

Don't think keywords in title tags - They are trying to go beyond that:

GG:

The goal of a good search engine should be both to understand what a document is really about, and to understand (from a very short query) what a user really wants. And then match those things as well as possible.

Time to think bigger than 'put some words here, put some words there'.

And with regards to the tags, again, fuhget-about-it ;-]

Think bigger!

At the very minimum, they won't penalize you for coding with standards, and I can show you 14,000,000 poorly ranked pages with title tags.

5:47 am on Feb 16, 2004 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Mar 1, 2003
posts:1206
votes: 1


GG,

I am starting to see some 'widget city' searches improve from earlier today. Hope this means the process is still ongoing.

5:56 am on Feb 16, 2004 (gmt 0)

Junior Member

10+ Year Member

joined:Oct 13, 2003
posts:147
votes: 0


GG writes..

.. We want quality sites to do well--ideally without worrying too much about SEO. ... I'd like to make sure that we keep looking at any issues with our scoring

I'm a little puzzled reading blanket terms like "quality" and new "signals of quality", GG... or anyone.

Quality is being defined as a) Authoritative? b) Unique, as you mentioned? c) Backlink volumes?

I ask because I'm seeing sites in my category with hundreds of "votes" (per the "PR explained" page in G) that are only hundreds of messages to various newsgroups, with the owner's address in the signature, and little else in the way of backlinks with any reasonable PR. Unless newsgroups in Yahoo inherit the Yahoo PR and pass it along to the thousands of messages posted within them? I can't get past that - but that's all there is.
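To ground the "votes" question, here is a toy version of the textbook PageRank iteration on a tiny made-up link graph (not Google's production scoring); it only shows how score funnels from a well-linked page through the pages it links to. The page names and link structure are invented for illustration.

# Textbook PageRank iteration on a hypothetical four-page link graph.
DAMPING = 0.85

links = {                         # who links to whom (invented graph)
    "portal":     ["message1", "message2"],
    "message1":   ["owner-site"],
    "message2":   ["owner-site"],
    "owner-site": ["portal"],
}

pages = list(links)
rank = {p: 1.0 / len(pages) for p in pages}

for _ in range(50):               # iterate until the scores settle
    new_rank = {p: (1 - DAMPING) / len(pages) for p in pages}
    for page, outlinks in links.items():
        share = DAMPING * rank[page] / len(outlinks)
        for target in outlinks:
            new_rank[target] += share
    rank = new_rank

for page, score in sorted(rank.items(), key=lambda kv: -kv[1]):
    print(f"{page:10s} {score:.3f}")
# the owner-site accumulates the votes funneled through the two messages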
