
Update Brandy Part 3

     
7:41 pm on Feb 15, 2004 (gmt 0)

WebmasterWorld Senior Member googleguy is a WebmasterWorld Top Contributor of All Time 10+ Year Member



Continued From: [webmasterworld.com...]

"Any clue as to the possible role greater reliance on semantics is playing in your never ending quest for more relevant results?"

I'd say that's inevitable over time. The goal of a good search engine should be both to understand what a document is really about, and to understand (from a very short query) what a user really wants. And then match those things as well as possible. :) Better semantic understanding helps with both those prerequisites and makes the matching easier.

So a good example is stemming. Stemming is basically SEO-neutral, because spammers can create doorway pages with word variants almost as easily as they can optimize for a single phrase (maybe it's a bit harder to fake realistic doorways now, come to think of it). But webmasters who never think about search engines don't bother to include word variants--they just write whatever natural text they would normally write. Stemming allows us to pull in more good documents that are near-matches. The example I like is [cert advisory]. We can give more weight to www.cert.org/advisories/ because the page has both "advisory" and "advisories" on the page, and "advisories" in the url. Standard stemming isn't necessarily a win for quality, so we took a while and found a way to do it better.
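
A toy sketch of this kind of variant matching in Python; the naive_stem() rules and the scoring below are invented purely for illustration and are not Google's method (a real engine would use something closer to the Porter stemmer):

def naive_stem(word):
    """Crude suffix stripping; a stand-in for a real stemmer like Porter's."""
    word = word.lower()
    for suffix in ("ies", "es", "s", "y"):
        if word.endswith(suffix) and len(word) - len(suffix) >= 3:
            return word[: -len(suffix)]
    return word

def variant_score(query, page_text, url=""):
    """Credit a page once for each distinct surface form of a query stem."""
    stems = {naive_stem(t) for t in query.split()}
    words = page_text.lower().split() + url.lower().replace("/", " ").split()
    tokens = {w.strip(".,:;") for w in words}
    score = 0
    for stem in stems:
        variants = {t for t in tokens if naive_stem(t) == stem}
        score += len(variants)  # "advisory" + "advisories" beats one form twice
    return score

page = "CERT advisory archive: all advisories published this year"
print(variant_score("cert advisory", page, "www.cert.org/advisories/"))  # 3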

So yes, I think semantics and document/query understanding will be more important in the future. pavlin, I hope that partly answers the second of the two questions that you posted way up near the start of this thread. If not, please ask it again in case I didn't understand it correctly the first time. :)

7:51 pm on Feb 15, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>>and "advisories" in the url.

Interesting, GG

7:51 pm on Feb 15, 2004 (gmt 0)

10+ Year Member



Looks like the 66.x server went back to 216.x data. I'm getting the same Austin results on both. Not sure what this means.
7:53 pm on Feb 15, 2004 (gmt 0)

10+ Year Member




Ooooh, glad you pointed that out--I was about to post that, GoogleGuy.

In case none of you have noticed, for a keyword like

Rent

Google will pull out words like Rent, Rental, Rentals, Hire, etc.

This is the best part: Google can now find the web sites that didn't optimize for Hire but are still relevant to Rentals--do you follow me?
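
A minimal sketch of that kind of expansion; the VARIANTS table and the expand() helper are made up here for illustration--Google's real synonym data and method are not public:

VARIANTS = {
    "rent": ["rent", "rental", "rentals", "hire"],
}

def expand(query):
    """Swap each query word for its known variants, if any."""
    expanded = []
    for word in query.lower().split():
        expanded.extend(VARIANTS.get(word, [word]))
    return expanded

print(expand("rent"))  # ['rent', 'rental', 'rentals', 'hire']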

I for one love it, and it's the best way forward. The over-optimization filter may or may not exist, but if you optimize for all the keywords then maybe it's too much and your site could be seen as a doorway page.

I am not going to give any more away, otherwise I won't make the pennies or cents I need.

:)

7:54 pm on Feb 15, 2004 (gmt 0)

10+ Year Member



Hello GoogleGuy,

If semantics is going to play a bigger role, then one can only hope that the new guys in Zurich are part of a team that will tackle some of the languages other than English, as I think Google lacks document/query understanding in German (compared to English), for example.

[edited by: viggen at 7:55 pm (utc) on Feb. 15, 2004]

7:55 pm on Feb 15, 2004 (gmt 0)

WebmasterWorld Senior Member googleguy is a WebmasterWorld Top Contributor of All Time 10+ Year Member



"Has Google applied some sort of OOP or filter to the algorithm since the Florida update or was the drastic change in SERPs purely the result of new ranking criteria?"

It's the second one. People post around here about filters, blocking, penalties, etc. etc. A far better explanation is "things which used to work before don't receive the same amount of credit now." It's natural for people who are way out there with their linking strategies or their page-building strategies to think of a drop as an over-optimization penalty, but it's more realistic to conclude that Google is weighting criteria differently so that over-optimized sites just aren't doing as well now.

7:58 pm on Feb 15, 2004 (gmt 0)

WebmasterWorld Senior Member googleguy is a WebmasterWorld Top Contributor of All Time 10+ Year Member



viggen, we know that non-English languages are at least as important as English. You can assume we want to figure out more about documents and queries in every language. :)

Okay, I'm off to get some exercise. I'll check in again in a few hours.

8:03 pm on Feb 15, 2004 (gmt 0)

10+ Year Member



Speaking of the new "weighting criteria": I sent an email to Google the other day stating that the search query "<snip>" pulled up only 3(!) URLs of dentists out of the first 100 results.

Today it shows 6 listings out of 100.

[edited by: Brett_Tabke at 8:42 pm (utc) on Feb. 15, 2004]
[edit reason] no specific searches per the tos - please. [/edit]

8:07 pm on Feb 15, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



GoogleGuy, do you want any feedback on 216? I'm seeing some super spammers on it, using techniques I've never seen before to totally dominate some SERPs. The same spammer is on 64, just on page 2.
8:09 pm on Feb 15, 2004 (gmt 0)

WebmasterWorld Senior Member googleguy is a WebmasterWorld Top Contributor of All Time 10+ Year Member



I *always* want to hear about super spammers. :) sasha, when I check at either old or new data centers, I find dentists, but the new data appears to have even more dentists--certainly more than three?

Okay, now I really am off for a walk--gotta counteract all that time spent in front of a monitor. :)

8:11 pm on Feb 15, 2004 (gmt 0)

10+ Year Member



>>it's more realistic to conclude that Google is weighting criteria differently so that over-optimized sites just aren't doing as well now

Well, that settles that. Thanks GG.

I think your answer here will save a lot of webmasters a lot of time and point everyone in the right direction.
Instead of trying to figure out why our web sites were penalized, we can focus on how best to present our web sites now.

If Google is indeed seeking to rank web sites based on LSI (Applied Semantics' CIRCA technology) and traditional PR value, along with themes that can be applied site-wide and not just to one particular page, Google will be light-years ahead of the competition.

The trick is finding the right mix that satisfies the public's need for relevant results without making them plow through tons of spam on the way. Depending on how Google applies its algorithm and weighs semantics/themes against traditional methods of producing SERPs, it seems individual pages could only be relevant to a search if the site as a whole were in harmony with those pages.
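
For anyone curious what LSI actually does: it factors a term-document matrix with the SVD and compares documents in a low-rank "concept" space, so two documents can look related without sharing exact terms. A bare-bones sketch with an invented toy matrix (nothing here reflects Google's system):

import numpy as np

# rows = terms (rent, rental, hire, dentist); columns = documents 0..3
A = np.array([[2, 0, 1, 0],
              [1, 1, 0, 0],
              [0, 2, 1, 0],
              [0, 0, 0, 3]], dtype=float)

U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 2                                  # keep the top-k latent "concepts"
docs = (np.diag(s[:k]) @ Vt[:k]).T     # each row: one document in concept space

def cosine(i, j):
    a, b = docs[i], docs[j]
    return float(a @ b) / (np.linalg.norm(a) * np.linalg.norm(b))

# documents 0 and 1 share only one raw term, yet align closely in concept space
print(round(cosine(0, 1), 2))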

8:29 pm on Feb 15, 2004 (gmt 0)

10+ Year Member



How come Google can't filter out 302 redirects? I keep seeing a site rank 1-2 in the SERPs because it has a link from DMOZ, but the domain in question is just a redirect, so the company benefits twice: they also rank at 10 for the actual domain.
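
For anyone who wants to see what the crawler sees, here is one way to check a suspect domain's raw response without following the redirect; the hostname is a placeholder, and this uses only Python's standard library:

import http.client

# http.client never follows redirects, so the raw status code is visible.
conn = http.client.HTTPConnection("example-redirector.com")  # placeholder host
conn.request("HEAD", "/")
resp = conn.getresponse()
print(resp.status, resp.reason, resp.getheader("Location"))
# e.g. "302 Found http://actual-company-site.com/" for a temporary redirect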
9:02 pm on Feb 15, 2004 (gmt 0)

10+ Year Member



Am I the only one who is getting IDENTICAL results on both the 66.x and 216.x data centers?

This morning the SERPS were different, but right now it is the same.

I cleared the cache and history a bunch of times too. I am in San Francisco, if that helps.
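
One way to compare two datacenters directly is to send the identical query to each IP with the Host header set and diff the responses; the IPs and query path below are placeholders, not the actual machines being discussed:

import http.client

def fetch(ip, path="/search?q=rent"):
    """Fetch a results page straight from one datacenter IP (placeholder values)."""
    conn = http.client.HTTPConnection(ip)
    conn.request("GET", path, headers={"Host": "www.google.com"})
    return conn.getresponse().read()

a = fetch("66.0.0.1")    # stand-in for the 66.x datacenter
b = fetch("216.0.0.1")   # stand-in for the 216.x datacenter
print("identical" if a == b else "different")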

9:08 pm on Feb 15, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Some nice juicy details there. Time to find as many "authoritative" high-quality links as possible. That's difficult in some extremely competitive areas, for sure, without paying a ton of money.
9:09 pm on Feb 15, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Am I right in thinking, therefore, that some people's natural way of writing will be viewed more favourably than others'? I get the feeling that hearts and flowers are becoming the order of the day.