
Update Brandy Part 3

     

GoogleGuy

7:41 pm on Feb 15, 2004 (gmt 0)

WebmasterWorld Senior Member googleguy is a WebmasterWorld Top Contributor of All Time 10+ Year Member



Continued From: [webmasterworld.com...]

"Any clue as to the possible role greater reliance on semantics is playing in your never ending quest for more relevant results?"

I'd say that's inevitable over time. The goal of a good search engine should be both to understand what a document is really about, and to understand (from a very short query) what a user really wants. And then match those things as well as possible. :) Better semantic understanding helps with both those prerequisites and makes the matching easier.

So a good example is stemming. Stemming is basically SEO-neutral, because spammers can create doorway pages with word variants almost as easily as they can optimize for a single phrase (maybe it's a bit harder to fake realistic doorways now, come to think of it). But webmasters who never think about search engines don't bother to include word variants--they just write whatever natural text they would normally write. Stemming allows us to pull in more good documents that are near-matches. The example I like is [cert advisory]. We can give more weight to www.cert.org/advisories/ because it has both "advisory" and "advisories" on the page, and "advisories" in the URL. Standard stemming isn't necessarily a win for quality, so we took a while and found a way to do it better.

So yes, I think semantics and document/query understanding will be more important in the future. pavlin, I hope that partly answers the second of the two questions that you posted way up near the start of this thread. If not, please ask it again in case I didn't understand it correctly the first time. :)
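(For anyone who wants to see the stemming idea concretely, here is a minimal Python sketch. The crude suffix-stripping function and the sample document are invented for illustration only; real stemmers such as Porter's are far more careful, and nothing here is a claim about Google's actual implementation.)

def crude_stem(word):
    # Strip a few common English suffixes; just enough to show the idea.
    word = word.lower()
    for suffix in ("ies", "ing", "ed", "es", "s", "y"):
        if word.endswith(suffix) and len(word) - len(suffix) >= 3:
            return word[: -len(suffix)]
    return word

def stemmed_match(query, document):
    # True if every stemmed query term appears among the stemmed document terms.
    query_stems = {crude_stem(t) for t in query.split()}
    doc_stems = {crude_stem(t) for t in document.split()}
    return query_stems <= doc_stems

doc = "CERT advisories list recently reported vulnerabilities"
print(stemmed_match("cert advisory", doc))  # True: advisory/advisories both reduce to "advisor"
print("advisory" in doc.lower().split())    # False: an exact-word match would miss this page

(The point is only that query and document terms are compared after normalization rather than as exact strings, which is why a page with "advisories" in its text and URL can be pulled in for [cert advisory].)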

coconutz

7:51 pm on Feb 15, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>>and "advisories" in the url.

Interesting, GG

sasha

7:51 pm on Feb 15, 2004 (gmt 0)

10+ Year Member



Looks like the 66.x server went back to the 216.x data. I am getting the same Austin results on both. Not sure what this means.

lasko

7:53 pm on Feb 15, 2004 (gmt 0)

10+ Year Member




Ooooh, glad you pointed that out, GoogleGuy; I was about to post about it myself.

In case none of you have noticed, if you search for a keyword like

Rent

Google will pull out words like Rent, Rental, Rentals, Hire, etc.

Now this is the best thing: Google can now find the web sites that didn't optimize for Hire but are still relevant to Rentals. Do you follow me?

I for one love it, and it's the best way forward. The over-optimisation filter may or may not exist, but if you optimize for all the keyword variants then maybe it's too much and your site could be seen as a doorway page.

I am not going to give any more away, otherwise I won't make the pennies or cents I need.

:)

viggen

7:54 pm on Feb 15, 2004 (gmt 0)

10+ Year Member



hello googleguy,

If semantics is going to play a bigger role, then one can just hope that the new guys in Zurich are part of a team that will tackle some of the languages other than English, as I think Google lacks document/query understanding in German (compared to English), for example.

[edited by: viggen at 7:55 pm (utc) on Feb. 15, 2004]

GoogleGuy

7:55 pm on Feb 15, 2004 (gmt 0)

WebmasterWorld Senior Member googleguy is a WebmasterWorld Top Contributor of All Time 10+ Year Member



"Has Google applied some sort of OOP or filter to the algorithm since the Florida update or was the drastic change in SERPs purely the result of new ranking criteria?"

It's the second one. People post around here about filters, blocking, penalties, etc. etc. A far better explanation is "things which used to work before don't receive the same amount of credit now." It's natural for people who are way out there with their linking strategies or their page-building strategies to think of a drop as an over-optimization penalty, but it's more realistic to conclude that Google is weighting criteria differently so that over-optimized sites just aren't doing as well now.

GoogleGuy

7:58 pm on Feb 15, 2004 (gmt 0)

WebmasterWorld Senior Member googleguy is a WebmasterWorld Top Contributor of All Time 10+ Year Member



viggen, we know that non-English languages are at least as important as English. You can assume we want to figure out more about documents and queries in every language. :)

Okay, I'm off to get some exercise. I'll check in again in a few hours.

sasha

8:03 pm on Feb 15, 2004 (gmt 0)

10+ Year Member



Speaking of the new "weighting criteria", I sent an email to Google the other day stating that the search query "<snip>" pulls up only 3(!) URLs of dentists out of the first 100 results.

Today it shows 6 listings out of 100.

[edited by: Brett_Tabke at 8:42 pm (utc) on Feb. 15, 2004]
[edit reason] no specific searches per the tos - please. [/edit]

markus007

8:07 pm on Feb 15, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



GoogleGuy, do you want any feedback on 216? I'm seeing some super spammers on it, using techniques I've never seen before to totally dominate some SERPs. The same spammer is on 64, but only on page 2.

GoogleGuy

8:09 pm on Feb 15, 2004 (gmt 0)

WebmasterWorld Senior Member googleguy is a WebmasterWorld Top Contributor of All Time 10+ Year Member



I *always* want to hear about super spammers. :) sasha, when I check at either old or new data centers, I find dentists, but the new data appears to have even more dentists--certainly more than three?

Okay, now I really am off for a walk--gotta counteract all that time spent in front of a monitor. :)

Bobby

8:11 pm on Feb 15, 2004 (gmt 0)

10+ Year Member



it's more realistic to conclude that Google is weighting criteria differently so that over-optimized sites just aren't doing as well now

Well, that settles that. Thanks GG.

I think your answer here will save a lot of webmasters a lot of time and point everyone in the right direction.
Instead of trying to figure out why our web sites were penalized we can focus on how best to present our web sites now.

If Google is indeed seeking to rank web sites based on LSI (the CIRCA technology from Applied Semantics) and traditional PR value, along with themes that can be applied site-wide and not just to one particular page, Google will be light years ahead of the competition.

The trick is in finding the right mix, one that will satisfy the public's need for relevant results without having to plow through tons of spam on the way. Depending on how Google applies its algo and weighs semantics/themes against traditional methods of producing SERPs, it seems individual pages could only be relevant to a search if the whole site were in harmony with those pages.
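(Since LSI keeps coming up in this thread, here is a rough NumPy sketch of the underlying idea, using a tiny invented corpus. It illustrates the general technique only and says nothing about what Google actually computes.)

import numpy as np

docs = [
    "dentist teeth cleaning dental care",
    "dental implants dentist office",
    "car rental rates and car hire",
]
vocab = sorted({w for d in docs for w in d.split()})

# Term-document matrix of raw word counts.
A = np.array([[d.split().count(w) for d in docs] for w in vocab], dtype=float)

# Truncated SVD projects terms and documents into a small "concept" space.
U, S, Vt = np.linalg.svd(A, full_matrices=False)
k = 2
doc_vecs = Vt[:k].T  # one k-dimensional vector per document

def query_vec(query):
    # Fold the query into the same concept space: q_k = U_k^T q / S_k.
    counts = np.array([query.split().count(w) for w in vocab], dtype=float)
    return counts @ U[:, :k] / S[:k]

q = query_vec("dentist")
sims = doc_vecs @ q / (np.linalg.norm(doc_vecs, axis=1) * np.linalg.norm(q) + 1e-9)
print(sims)  # the two dental documents score well above the car-rental one

(The appeal Bobby is pointing at: documents about dentistry land near the "dentist" query in concept space even when their exact word choices differ, so relevance stops being a pure exact-match exercise.)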

GodLikeLotus

8:29 pm on Feb 15, 2004 (gmt 0)

10+ Year Member



How come Google can't filter out 302 redirects? I keep seeing a site rank 1-2 in the SERPs because it has a link from DMOZ, but the domain in question is just a redirect, so the company benefits twice, because they also rank at 10 for the actual domain.
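(A webmaster can at least verify this behaviour for themselves. A small Python sketch follows, assuming the third-party requests library is installed; the URL is a placeholder, not the site in question. It only shows how to detect a 302, not how Google treats one.)

import requests

def check_redirect(url):
    # Ask for the page without following redirects and report what comes back.
    resp = requests.get(url, allow_redirects=False, timeout=10)
    if resp.status_code in (301, 302, 303, 307, 308):
        print(url, "->", resp.status_code, "redirect to", resp.headers.get("Location"))
    else:
        print(url, "->", resp.status_code, "(no redirect)")

check_redirect("http://www.example.com/")  # placeholder domain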

sasha

9:02 pm on Feb 15, 2004 (gmt 0)

10+ Year Member



Am I the only one who is getting IDENTICAL results on both 66. and 216 data centers?

This morning the SERPS were different, but right now it is the same.

I cleared the cache and history a bunch of times too. I am in San Francisco, if that helps.

nutsandbolts

9:08 pm on Feb 15, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Some nice juicy details there. Time to find as many "authoritative" high quality links as possible. Difficult in some extremely competitive areas for sure without paying a ton of money.

JudgeJeffries

9:09 pm on Feb 15, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Am I right in thinking, therefore, that some people's natural way of writing will be viewed more favourably than others'? I get the feeling that hearts and flowers are becoming the order of the day.

GoogleGuy

9:17 pm on Feb 15, 2004 (gmt 0)

WebmasterWorld Senior Member googleguy is a WebmasterWorld Top Contributor of All Time 10+ Year Member



nutsandbolts, you're still thinking/posting like an SEO. That won't necessarily help you. The best advice I'd give is to make a site the sort of attraction that gives people a reason to like it on its own.

Searching for "high-quality" links before the site itself is high-quality is putting the cart before the horse, so to speak. That time would be better invested in enriching the site by adding good content and more reasons for people to like it on its own merits. Just trying to keep folks from going down a blind alley when there are lots of ways to spend that time improving the site itself. As always, Brett's guide to making a site is a great thread to go back and read again.

mipapage

9:18 pm on Feb 15, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Googleguy:
Better semantic understanding helps with both those prerequisites and makes the matching easier.

And what better way to make your page easy to understand than to strive for semantically rich markup...

uksports

9:19 pm on Feb 15, 2004 (gmt 0)

10+ Year Member



Sasha - "Am I the only one who is getting IDENTICAL results on both 66. and 216 data centers?
This morning the SERPS were different, but right now it is the same"

No, same here from the UK - what is disappointing is that the excellent results that were on 64.xx have gone.

The only difference I can see from the pre-Brandy results is that the total number of results returned for a random selection of search terms has increased, so something has changed, but it will be a BIG letdown if 64.xx is released as it currently stands.

BobbyN

9:23 pm on Feb 15, 2004 (gmt 0)

10+ Year Member



I'm not sure about this word 'semantics' and the ideas being mentioned here.

Would anybody care to write a quick paragraph explaining it, or point me to a recent thread?

Thanks

dazzlindonna

9:28 pm on Feb 15, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



what is disappointing is that the excellent results that were on 64.xx have gone.

Yes, I too am seeing the good 64.xx results disappearing. I sure hope this isn't really happening. This "feel-good" thread will turn ugly quickly.

Robert123

9:31 pm on Feb 15, 2004 (gmt 0)

10+ Year Member



I am not seeing the 64 results disappear at all. Go direct to the IP address for 64...

GoogleGuy already established it would take longer than this weekend and reaffirmed that the 64 results are the new ones. He has done this multiple times.

Relax, they are coming.

dazzlindonna

9:34 pm on Feb 15, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Yes, I am talking about going direct to 64. Searching directly from there is vastly different now than it was yesterday (at least for the things I'm looking at).

Net_Wizard

9:36 pm on Feb 15, 2004 (gmt 0)



Searching for "high-quality" links before the site itself is high-quality is putting the cart before the horse, so to speak. That time would be better invested in enriching the site by adding good content and more reasons for people to like it on its own merits. Just trying to keep folks from going down a blind alley when there's lot of ways to spend that time improving a site itself.

But that's exactly the problem here.

For competitive keywords, the ones at the top are doing the direct opposite of what you are saying.

The page(s) being fed to Googlebot (cloaked pages) are not even human readable... mixed-up nonsense words interspersed with the target keywords, and oftentimes the density of those keywords is just ridiculous... paragraphs and paragraphs of nonsense. Talk about OOP and user experience.

On top of that, the backlinks are not even coming from similar or related sites; they come from guestbooks, forums, blogs.

So the algo question is: how did a site like this get to the top?
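(For anyone unfamiliar with the term, "keyword density" here simply means the fraction of the words on a page that are the target keyword. A rough Python sketch, with an invented sample page; a real page would need its HTML stripped first.)

import re

def keyword_density(text, keyword):
    # Fraction of the words in the text that are exactly the keyword.
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

page = "cheap widgets buy widgets best widgets widgets online widgets"
print(keyword_density(page, "widgets"))  # about 0.56 (5 of 9 words), far beyond natural writing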

flobaby

9:36 pm on Feb 15, 2004 (gmt 0)

10+ Year Member



I'm using the Google dance tool and all four 64 datacenters show the good stuff still.

nutsandbolts

9:37 pm on Feb 15, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Okay GG, I get the drift :)

Net_Wizard: I see it too in many areas I look at. We are talking about 6,000+ backlinks for some of the top ranking sites - they rule the index because of backlinks they have paid for.

[edited by: nutsandbolts at 9:40 pm (utc) on Feb. 15, 2004]

merlin30

9:38 pm on Feb 15, 2004 (gmt 0)

10+ Year Member



I believe the 64.* results are different from yesterday's, but they still include sites ranked on the first page that were lost in Florida and aren't on 216.*

dazzlindonna

9:40 pm on Feb 15, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



they still include sites ranked on the first page that were lost in Florida and aren't on 216

I wish that were true for me. Mine returned yesterday and are now gone, gone, gone again.

Look out, kiddos... yours may be next. :-(

jocelynd

9:41 pm on Feb 15, 2004 (gmt 0)

10+ Year Member



I guess the soup is still cooking or it would be live.

a_chameleon

9:41 pm on Feb 15, 2004 (gmt 0)

10+ Year Member




On top of that, the backlinks are not even coming from similar or related sites; they come from guestbooks, forums, blogs.

Amen, Net_Wizard. In my arena, link popularity comes primarily from newsgroups and discussion groups, specifically from messages posted with signatures that include the poster's website address...?

:(

lasko

9:42 pm on Feb 15, 2004 (gmt 0)

10+ Year Member



Okay GG, I get the drift

Hmmmm, sounds like you have been converted into a webmaster.

I spend more time on the workings of my site these days than on SEO, looking for gaps or holes that leak my visitors away.

Building new pages and expanding my services, making the site fresher and more up to date.

It's no good just being at the top; the site needs to be attractive, informative, and functional.
