"Any clue as to the possible role greater reliance on semantics is playing in your never ending quest for more relevant results?"
I'd say that's inevitable over time. The goal of a good search engine should be both to understand what a document is really about, and to understand (from a very short query) what a user really wants. And then match those things as well as possible. :) Better semantic understanding helps with both those prerequisites and makes the matching easier.
So a good example is stemming. Stemming is basically SEO-neutral, because spammers can create doorway pages with word variants almost as easily as they can optimize for a single phrase (maybe it's a bit harder to fake realistic doorways now, come to think of it). But webmasters who never think about search engines don't bother to include word variants--they just write whatever natural text they would normally write. Stemming allows us to pull in more good documents that are near-matches. The example I like is [cert advisory]. We can give more weight to www.cert.org/advisories/ because it has both "advisory" and "advisories" on the page, and "advisories" in the url. Standard stemming isn't necessarily a win for quality, so we took a while and found a way to do it better.
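A toy illustration of the stemming idea (a crude suffix-stripper sketch only, not Google's actual method; Porter stemming is the classic published algorithm):

```python
# Crude suffix-stripping stemmer -- a sketch, to show how "advisory" and
# "advisories" can be folded to one root so near-match pages can score.
def crude_stem(word: str) -> str:
    word = word.lower()
    for suffix, replacement in (("ies", "y"), ("es", ""), ("s", "")):
        # Only strip when enough of a stem remains to be meaningful.
        if word.endswith(suffix) and len(word) > len(suffix) + 2:
            return word[: -len(suffix)] + replacement
    return word

print(crude_stem("advisories"))  # prints "advisory"
print(crude_stem("advisory"))    # prints "advisory"
```

With both forms reduced to the same root, a page containing either variant can match the query [cert advisory].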
So yes, I think semantics and document/query understanding will be more important in the future. pavlin, I hope that partly answers the second of the two questions that you posted way up near the start of this thread. If not, please ask it again in case I didn't understand it correctly the first time. :)
Searching for "high-quality" links before the site itself is high-quality is putting the cart before the horse, so to speak. That time would be better invested in enriching the site ...etc.
And various responses: GG, your advice makes a lot of sense for information or content publishers, but how should widget marketers interpret that statement? ...etc.
You will drive GG crazy trying to get answers to the specific questions raised since his very good broad advice, which is simple and says it all: enrich the site; make it the best site in its category for the person who searches for a site like that. If you do that, GG says, you will win the Google war. And he is very likely right.
We try to produce clean code (CSS is brilliant for this) and use H1 tags for emphasising the content's subject, etc. We use image and alt tags too. As you say, try to use the tags provided for us. We don't do any analysis of keyword density etc., though. But is this considered SEO?
I tried some keywords in the Brandy update and we are better in some, worse in others, so I suppose it hasn't really affected us.
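For what the poster means by "analysis of keyword density" -- a count some SEOs compute, not something this thread recommends chasing -- a minimal sketch (the function and sample text are hypothetical):

```python
import re
from collections import Counter

def keyword_density(text: str, keyword: str) -> float:
    """Fraction of words in `text` that are `keyword` (case-insensitive)."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    return Counter(words)[keyword.lower()] / len(words)

sample = "Widgets for sale. Our widgets are the best widgets around."
print(f"{keyword_density(sample, 'widgets'):.0%}")  # prints "30%" (3 of 10 words)
```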
I have been trying to get a response about this issue since the latest dance began. I noticed that on 64, major city areas were showing pre-Florida-style rankings; however, smaller city or town areas were not. When Florida happened, it was the large-city terms that were hit first. Later, the Austin update got all the areas and even hit secondary terms. I think Google is working in reverse now: fix the problems where people would notice them first. Smaller cities will be sorted out later, possibly in another update.
For the Stevebs out there who don't believe there is a filter, and for GoogleGuy, who says there was only a change to the algo: I'm sorry, but you are mistaken.
It was obvious that only certain terms were targeted with the new algo (or filter). The reason I call it a filter is that this fits the more scientific definition: the index is the same for all terms, and then a filter (a different algo) is applied to certain terms. In the case of Florida it was the large-city terms, where the index was passed through a sieve: commercial, content-rich sites (like rocks) were filtered out, and directories were allowed to show (like sand). I studied hundreds of terms and saw the pattern over and over again. It overwhelmingly corresponded to how commercial the term was. This is not a conspiracy theory, just the facts. Possibly Google had legitimate reasons to clean up these areas, because they were the hardest hit by spam, and the timing near Christmas was a coincidence.
allan, this is pretty close to what I'm seeing.
Chicago, I'm looking at several niche sites that are city/area specific - 'domain.com/cityA' vs. 'domain.com/cityB', not one domain with several city pages.
As for the nudge to get back on the 'it's all about semantics' theme, well, that isn't what many are seeing. For the moment I'm going with the "it's a process" theory and assuming the process isn't nearly complete yet. We'll see if the dichotomy that exists within certain industries is resolved over the next few days.
peter andreas, it may be SEO, but I wouldn't consider it spam.
Here's how I look at the subject. People think about how their pages will show up in search engines. That's SEO but perfectly normal. People tweak the words they use a little. Maybe they add a site map. Maybe they redo their site architecture so that the urls are static instead of dynamic. That's SEO, and no problem. Things like removing session IDs, using text links instead of Flash or frames or weird stuff that some search engines can't read can make a site more usable for users. It's easier to bookmark a page. It's easier for blind people, people with older browsers, or people using WebTV. All of the changes like this can improve a site for users and search engines alike. Is it SEO? Sure. Is it wrong? Of course not.
The part where I object is where SEO veers into spam. What is spam? Well, this is a pretty fair start:
There's a concrete list there: things like cloaking, duplicate content such as doorway pages, and hidden text.
More importantly, there's a list of principles above the concrete list. If you're doing something that a regular user would consider deceptive (or that a competitor would!), it's probably something to steer clear of. From that list of general principles, you can infer a lot of specifics. For example, on http://www.google.com/webmasters/seo.html we mention a few extra specifics, like operating multiple domains with multiple aliases or falsified WHOIS info. But given the principles and the concrete examples, you should be able to take a pretty fair guess at whether something would be considered spam.
My personal take would be that there are lots of people who practice good site design for users and search engines alike, and that Google has no objection to SEO if it reflects those ideas.
Different people have different definitions of SEO and spam, but that's a rough cut at my outlook.
Hope that helps,
>Search for (snipped) as result number 25.
>I think Google is broken.
oohayoohay: Those are very uncompetitive search terms. And those examples of Google being "broken" are buried so far down in the SERPs that ain't nobody likely to ever spot them. A bad result buried in the SERPs here and there is normal for search engines. Now, if all of the top 10 for those searches were irrelevant, THAT would be significant.
This is about a separate filter or algo being applied to major local search areas.
This pattern started many months ago and is still persistent today.
How can one page rank #1 while another ranks #600 with the same level of optimization within the same site? How can this same pattern be repeated across hundreds of pages on multiple domains, with the only correlation being the search volume within those cities? And please don't tell me it's competition.
Are you asserting that you know of no activity that uses KW search volume and/or city type as a means to trigger a specific filter or algo?
Thank you. Call it LSI if you want... it works like a "candy-coated" filter. Not wanting to dis GG or dwell in conspiracy either, but just as I noted earlier: informational pages are not necessarily the best-converting pages... information retrieval for information retrieval's sake doesn't pay the bills, and is not something anybody would necessarily want stock in. The AdWords value is climbing... I'm past that now. I would like to see a serious thread about how best to cope, without necessarily caving in, until that 70-80% search number is split up.
Also, yes, people do use the www for information and it IS actually replacing brick-and-mortar libraries... but if you get technical about it, more people search for celebrity pictures in a given day than search for all the great literary works combined. If the internet is going to evolve into anything greater than this piffle (and I think it will), it will be because of commerce.
>using text links instead of Flash or frames or weird stuff that some search engines can't read can make a site more usable for users. It's easier to bookmark a page. It's easier for blind people, people with older browsers, or people using WebTV.
GG - In one of my more important categories, sites using <title> tags are seemingly penalized, or at least poorly ranked. Same with alt tags: a common denominator for good relevance "grades" from Google in several three-word categories (at least for now) is the complete absence of both.
Yet Google's info for webmasters tells us to use a Lynx browser, for example, to "test" our pages.
Aren't the two (use a Lynx browser, but don't use alt/title tags) conflicting in theory?
Particularly re: "easier for blind people" ..?
>GG - In one of my more important categories, sites using <title> tags are seemingly penalized, or at least poorly ranked.
Don't think "keywords in title tags" -- they are trying to go beyond that:
The goal of a good search engine should be both to understand what a document is really about, and to understand (from a very short query) what a user really wants. And then match those things as well as possible.
Time to think bigger than 'put some words here, put some words there'.
And with regards to the tags, again, fuhget-about-it ;-]
At the very minimum, they won't penalize you for coding with standards, and I can show you 14,000,000 poorly ranked pages with title tags.
.. We want quality sites to do well--ideally without worrying too much about SEO. ... I'd like to make sure that we keep looking at any issues with our scoring
I'm a little puzzled reading blanket terms like "quality" / new "signals of quality", GG... or anyone.
Is quality being defined as a) authoritative? b) unique, as you mentioned? c) backlink volumes?
I ask because I'm seeing sites in my category with hundreds of "votes" (per the "PR explained" page in G) that are only hundreds of messages to various newsgroups, with the owner's address in the signature, and little else in the way of backlinks with any reasonable PR. Unless newsgroups in Yahoo inherit the Yahoo PR and pass it along to the thousands of messages posted within them? I can't get past that, but that's all there is.
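On the "votes" question: the model from the PR-explained page can be sketched as a small power iteration. This toy graph (page names are hypothetical, and it assumes every link target is a key in the graph) shows the mechanism by which many low-rank pages, like newsgroup messages, each pass along a small weighted vote:

```python
# Toy PageRank power iteration: each link is a vote weighted by the linking
# page's own rank divided by its number of outlinks. Hypothetical graph;
# this is a sketch of the published model, not Google's live scoring.
def pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / len(pages) for p in pages}
        for page, outlinks in links.items():
            share = damping * rank[page] / len(outlinks)
            for target in outlinks:
                new[target] += share
        rank = new
    return rank

graph = {
    "hub":  ["msg1", "msg2", "site"],  # one well-linked page
    "msg1": ["site"],                  # low-rank messages each cast a
    "msg2": ["site"],                  # small vote for "site"
    "site": ["hub"],
}
ranks = pagerank(graph)
# "site" ends up with the highest rank: many small votes add up.
```

Whether Yahoo's newsgroup pages actually pass meaningful PR this way is exactly the open question above; the sketch only shows the mechanism.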