Forum Library, Charter, Moderators: Robert Charlton & aakk9999 & brotherhood of lan & goodroi

Google SEO News and Discussion Forum

This 35 message thread spans 2 pages.
Udi Manber blogs about Google Search
tedster




msg:3657580
 5:00 pm on May 23, 2008 (gmt 0)

I mentioned this post on Google's official blog in another discussion, but it deserves a dedicated thread.

Udi Manber is Google's VP for Search Quality. This week he made a blog post that gives an overview of how Google's Search teams operate.

[googleblog.blogspot.com...]

Some key points I noted:

We are, to be honest, quite secretive about what we do. There are two reasons for it: competition and abuse.

Does that remark and this entire blog post signal a little bit of a shift in the secrecy area? Time will tell.

we made significant changes to the PageRank algorithm in January.

Tantalizing - not changes to the overall ranking algo, but specifically significant changes to PageRank itself.

 

zuko105




msg:3657678
 7:01 pm on May 23, 2008 (gmt 0)

Extremely interesting. In addition to PageRank algo:

Other parts include language models.....query models (it's not just the language, it's how people use it today).....time models (some queries are best answered with a 30-minutes old page, and some are better answered with a page that stood the test of time), and personalized models (not all people want the same thing).

.....the goal is always the same: improve the user experience. This is not the main goal, it is the only goal.

With as many people trying to game the system, it never ceases to amaze me that the nuggets of information we get from Google always come back to the non-SEO approach, with quality coding standards... Just make a good site that is useful with clean code and you'll do well.

tedster




msg:3657745
 8:23 pm on May 23, 2008 (gmt 0)

improve the user experience. This is not the main goal, it is the only goal.

Yes, that one jumped out at me too. It's a strong statement, and it's not nearly the same as "organize all the world's information". Udi's statement does not imply serving every website or webmaster the way the "organize" mission does. In that sense I feel it is more grounded in reality. But that reality also includes "without going broke."

Dugger




msg:3657780
 8:53 pm on May 23, 2008 (gmt 0)


With as many people trying to game the system, it never ceases to amaze me that the nuggets of information we get from Google always come back to the non-SEO approach, with quality coding standards... Just make a good site that is useful with clean code and you'll do well.

With all due respect, that is not what I glean from Udi's statement. It seems to me he is not talking about "your" site - but about Google's goal of improving the user experience in Google by delivering more targeted results in the SERPs.

Bentler




msg:3657850
 10:36 pm on May 23, 2008 (gmt 0)

Nice file name on the post.

santapaws




msg:3657859
 10:47 pm on May 23, 2008 (gmt 0)

Just make a good site that is useful with clean code and you'll do well.

you can't be serious?

improve the user experience. This is not the main goal, it is the only goal

Problem is, it appears to be a self-defined user experience.

Personally my own experience longs for a WEBSITES ONLY and WEIGHTED FOR WORD PROXIMITY check button.

reseller




msg:3657865
 10:54 pm on May 23, 2008 (gmt 0)

Udi mentioned two points which I find very interesting:


In 2007, we launched more than 450 new improvements, about 9 per week on the average.

Possibly they are doing the same in 2008, which might explain the instability of the SERPs we currently see on the DCs.


There is a whole team that concentrates on fighting webspam and other types of abuse. That team works on variety of issues from hidden text to off-topic pages stuffed with gibberish keywords, plus many other schemes that people use in an attempt to rank higher in our search results. The team spots new spam trends and works to counter those trends in scalable ways; like all other teams, they do it internationally. The webspam group works closely with the Google Webmaster Central team, so they can share insights with everyone and also listen to site owners.

And that reminds us that Matt Cutts is the head of the Webspam Team, not the head of the Search Quality Team :-)

Miamacs




msg:3658077
 10:53 am on May 24, 2008 (gmt 0)

Their opening up is welcome.

The timing makes you wonder, though: the more detail Udi shares, the more movement there probably IS right now in the same area. So either buckle your seatbelts ... or learn to fly ( *me doing the latter* )

...

explain old things

not that I wouldn't be interested in them explaining even outdated technology...

And also, seeing non-marketing personnel posting even small crumbs of info ( let alone a comprehensive post ) gives Google/Search a much friendlier face. Tons better than the feeling of interacting with nothing but that background="#FFFFFF" homepage of theirs all the time.

With all due respect to Matt and Adam, we badly need more info and more faces to go with the work done behind that white bg...

potentialgeek




msg:3658346
 9:51 pm on May 24, 2008 (gmt 0)

why did GoogleGuy depart?

santapaws




msg:3658358
 10:14 pm on May 24, 2008 (gmt 0)

so he could write his own blog and not have to write within the WebmasterWorld guidelines....

tedster




msg:3658408
 11:49 pm on May 24, 2008 (gmt 0)

GoogleGuy may have a lower posting rate lately, but he hasn't departed. Most recent post: Feb 8, 2008 [webmasterworld.com]

Halfdeck




msg:3658480
 4:45 am on May 25, 2008 (gmt 0)

"Just make a good site that is useful with clean code and you'll do well."

He was talking about Google.

"the goal is always the same: improve the user experience. This is not the main goal, it is the only goal."

Every webmaster, at least those not working in the black hat arena, should hang those words on their wall. For many, the only goal is to make more money - which means maintaining a good ROI. To do that, you spend the least amount of time/money possible on content/optimization to yield maximum results - a clear road to failure.

[edited by: Halfdeck at 4:47 am (utc) on May 25, 2008]

Lorel




msg:3658649
 4:02 pm on May 25, 2008 (gmt 0)

"Just make a good site that is useful with clean code and you'll do well."

I'd like to see anyone try that without gathering links.

santapaws




msg:3658716
 6:50 pm on May 25, 2008 (gmt 0)

So 11 people make useful sites with clean code - what happens next? At least one of them lands on page two at best. So it is somewhat naive to suggest you just make a useful site with clean code. That would be fine if yours were the only one, and if Google somehow managed to find it, when we know they require external links to it before they rank it.

edd1




msg:3658816
 10:30 pm on May 25, 2008 (gmt 0)

Dear Udi

Please give us some examples of sites (not blogs) that are doing really well in highly competitive areas, with clean code and an awesome user experience, but less than a year old and with hardly any inbound links.

I'm not being cynical, and I've no doubt they exist, but I'd like to see some just to reassure me that you are being straight up and that our white hat efforts really will pay off.

CainIV




msg:3658822
 10:34 pm on May 25, 2008 (gmt 0)

But that reality also includes "without going broke."

Well said, tedster. Of course there is somewhat of a disparity between the two camps. Simply creating stellar content nowadays and expecting the website to promote itself is not effective. There are far too many websites, too much similar content, and too many people willing to copy content and pass it off as their own.

However, the core of what does work well is unique, great content and services being marketed and promoted properly.

There is always room for both when done right.

Genie




msg:3658840
 11:37 pm on May 25, 2008 (gmt 0)

Udi Manber did not say "Just make a good site that is useful with clean code and you'll do well."
That was zuko105 in post 2 above.

tedster




msg:3658841
 11:39 pm on May 25, 2008 (gmt 0)

Right - Udi was talking about Google's focus on THEIR end user's experience.

Does anyone have a sense of how Google changed the PageRank algorithm back in January? PageRank is the part of the total algo that is not dependent on the query - it's more like an abstract, non-semantic "strength" score for a url.

I'm about to go dumpster diving in some old records, but if anyone has picked up a hint, I'm all ears. I suspect it's got something to do with dampening the PR vote of "related" sites - but that's just where I'm going to start looking.
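[Editor's note: since the thread keeps returning to what PageRank actually computes, here is the classic recursion from the published Brin and Page paper, sketched as a power iteration. This is a textbook toy for illustration only - whatever Google changed in January is certainly far beyond this.]

```python
# Toy PageRank by power iteration (the published Brin & Page recursion).
# Illustrative only: Google's production algorithm is far more elaborate.

def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links out to."""
    pages = set(links) | {t for targets in links.values() for t in targets}
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}          # start with uniform scores
    for _ in range(iterations):
        # every page keeps a small base share, independent of any links
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, targets in links.items():
            if not targets:
                continue                        # dangling pages ignored in this sketch
            for t in targets:
                # a page splits its damped rank evenly across its outlinks
                new_rank[t] += damping * rank[page] / len(targets)
        rank = new_rank
    return rank

# A tiny three-page web: A and C both link to B, and B links back to A.
scores = pagerank({"A": ["B"], "B": ["A"], "C": ["B"]})
# B ends up with the highest score - it has two inbound links.
```

Note that nothing in this computation looks at a query or at page content - it is purely a link-graph score, which is why a "significant change to PageRank" is a different animal from a change to the overall ranking algo.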

CainIV




msg:3658945
 4:15 am on May 26, 2008 (gmt 0)

I suspect it's got something to do with dampening the PR vote of "related" sites

What makes you think that, tedster? If anything, you would think that by nature they would be moving in the direction of dampening the vote between non-related websites to combat spam.

tedster




msg:3658950
 4:38 am on May 26, 2008 (gmt 0)

I mean "related" in the sense that the domains have a common ownership, or some other apparent business relationship - creating a node on the webgraph that appears unnatural. Backlinks from independent sites with related topics make plenty of sense - that's not what I was trying to say.

Dugger




msg:3658953
 4:48 am on May 26, 2008 (gmt 0)

I have several dozen sites with common ownership linked together, and I did not see a change - so from my view the PR change did not affect the linking of sites with common ownership.

tedster




msg:3658979
 5:36 am on May 26, 2008 (gmt 0)

Thanks for that. I only noticed the effect in a few limited slices of the web, and I may not have the right analysis. We're talking about January right now - not present time.

So if Google changed something significant about PR calculation in January, it certainly should have had an effect on the SERPs. From our January 2008 SERPs Changes [webmasterworld.com] discussion, I picked out a few of the common themes:

1. The Position 6 Bug [webmasterworld.com] that showed up at the end of December was "fixed" after our members began reporting it in January.

2. Many webmasters reported a "major shakeup" in their keyword area. Traffic increases and decreases of up to 40% for various sites. For a change, we seemed to hear about more increases than decreases here. Changes seemed to roll out first on 64.233.171.**

3. Some rather significant changes were seen in both the link: and site: results. I don't read much into the link: changes, however.

4. There was a good bit of ongoing churn reported during the month - not at the level we see right now, but still noticeable. Was that new PR folding its way into the search results, or would "new PR" have hit all at once, I wonder.

5. It seemed that more urls were being filed as Supplemental during January. Could that have been the result of a revised PR formula?

6. The proxy url problem started to fade.

7. At least some spam sites targeting semantic variations of keywords stopped ranking.

8. Interesting quote from MLHmptn: "Google is flat out turning our SEO ideologies inside out. It's funny - nowadays you can even rank a given page for its keywords without any inbound link having targeted anchor text. Give it an authority link and WAM! - who cares if the anchor text is "Click here!" You're going to rank for title keywords without one inbound anchor text!"

9. A PR update rolled out in January [webmasterworld.com], about a month earlier than we expected.

Dugger




msg:3659123
 12:44 pm on May 26, 2008 (gmt 0)

On January 2 I made a note of 50 of my sites that were ranking in the top 10 for the competitive commercial keywords. Later in January the list changed only moderately - a few dropped out and a few dropped in. That list has remained pretty static since then.

All sites are optimized the same and have the same basic link pattern - which is primarily interlinking to a large degree.

The constant changes with the algo and SERPs fluctuations (mostly seen outside the top 10), which still go on today, would make it very hard to isolate one change in that ocean of fluctuation and say "That is the result of the PR changes". I can't do it, anyway :)

Dugger




msg:3659124
 12:48 pm on May 26, 2008 (gmt 0)

Having said that, I do have an old site with almost no links to it - some would say it's being used for doorway pages - that suddenly began to show up in the top 30 for lots of searches. The site is about 6 years old and has never ranked like this in Google before. In fact, I had pretty much forgotten about it.

Dugger




msg:3659136
 1:19 pm on May 26, 2008 (gmt 0)

If I just had one site, and it happened to be one of the sites on my "50" list that dropped out of the top 10 and completely out of the serps, I would be thinking there had been massive changes at Google, and I would be here every day posting in one of those multi-page threads. Looking at a wider view, however, there does not seem to be much change when I look at all of the sites together.

I see my sites dropping out and dropping in all the time and have for years. Most of the time I just wait and they eventually come back. Sometimes fixing a dead link or 2 on the main entry page causes them to come back faster....

mcneely




msg:3659635
 7:13 am on May 27, 2008 (gmt 0)

" ...Just make a good site that is useful with clean code and you'll do well... "

Precisely

We've some client sites that are mid-range services types on a local level, and they are like candy to Google - always placed tops for their terms and geos.

With respect to the little guys, I think Google likes the street addresses and local phones placed next to the toll free ones on a site.

Oh, and without links?

There are two clients here that have done that already.

One site is two years old and has never left Google's top 9. The other was online 2000-2001, left, and was reregistered in 2006, and it does the same as the other. Neither has ever had any "links", "resource pages" or "directories". The only external links they have ever put on, that I can recall, were when the rodeo or a festival of some sort came through, and that's it.

santapaws




msg:3659693
 8:31 am on May 27, 2008 (gmt 0)

You seem to be confusing inbound and outbound links. No offence, but your post sounds more like an SEO self-promo than useful info. Simply the fact that they are apparently employing you to improve their rankings suggests this has little to do with simply building a useful site with clean code.

mcneely




msg:3659756
 10:56 am on May 27, 2008 (gmt 0)

Inbound, outbound?

Sure, one has, at best, a page of (10) links (listed at link:websitedotcom/ in Google) pointing to it, but it isn't linked to any of them. The other might have 2 or 3 pointed to a few mates, and that's it.

No, ... no confusion about the links there mate. Sorry to disappoint you.

On the other hand, I could get on about the industry these might be in, only to get a snip snip from a mod ... not much help there either, I'm afraid.

I can say this however.

Google likes them honest. Plain and simple. These two don't have large amounts of cash to be throwing about for the nettie, you see, so we keep them good with the basics, and as a result their listings are enough to make guys like me want to buy a larger hat.
So yes, if I appear to be putting on a bit of a braggie, well then, great.

I can say that these two aren't large by any stretch. One is 260 pages, whilst the other has a short blog and about 155 pages ... all straight HTML, you see. Nothing fancy. It's a straight blow-and-go when Google visits. Everything gets read and everything gets listed in all of about, oh, let's say, 65 seconds. Google visits, and there's no question as to what these two mates are all about.

We don't have much of a problem with the little guy at all. The problems usually bring themselves to bear with our larger clients and all of the paint, spit, and finish they want to go with their sites (large menus, heavy scripting, flash, framesets, affiliates, directories and buttloads of incoming and outgoing links ... it just goes on and on) ... and this is where we see Google getting into the mix, and often.

Dugger




msg:3659835
 12:53 pm on May 27, 2008 (gmt 0)

As for PR changes in January, I have a site that was hit with one of the first PR0 penalties back in November 2001 and never came back - until January. It came back with a PR3 and began ranking again. It had not been findable for any keywords since the PR0 penalty more than six years ago - but it is now :)

Back in November 2001 a bunch of sites were hit with PR0 for no apparent reason - it was talked about here for weeks. Numerous re-inclusion requests over the years produced nothing - except once there was a vague statement leaving the impression that a technical issue was involved that might prevent them from reincluding my site. Perhaps they finally fixed that.

tedster




msg:3659967
 3:47 pm on May 27, 2008 (gmt 0)

About the "little guy" - Matt Cutts once blogged that Google takes steps to give a boost to Mom and Pop shops. I believe I see this in action, both in toolbar PR that well exceeds what you'd expect from the backlinks, and also directly in ranking.

As for PR changes in January, I have a site that was hit with one of the first PR0 penalties back in November 2001 and never came back - until January. It came back with a PR3 and began ranking again.

Dugger - thanks for a very interesting observation.

I'm wondering now if the new PageRank algorithm that Udi mentioned counts more than "links", at least at a dampened level. In other words, domain citations that are not linked, src attributes including "hotlinks" and maybe more. Enough of those popping up and a site might gain trust that it once lost.
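[Editor's note: that idea - citations passing a dampened vote - can at least be modeled by putting weights on the webgraph's edges: 1.0 for a real hyperlink, something smaller for a bare domain mention. To be clear, both the mechanism and the 0.3 weight below are pure conjecture for illustration; nothing public confirms Google counts unlinked citations.]

```python
# Hypothetical sketch: unlinked domain citations as dampened PageRank edges.
# The mechanism and the weights are conjecture, not a known Google feature.

def weighted_pagerank(edges, damping=0.85, iterations=50):
    """edges maps a page to {target: weight}; e.g. 1.0 for a hyperlink,
    0.3 (an invented number) for an unlinked mention of a domain."""
    pages = set(edges) | {t for targets in edges.values() for t in targets}
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, targets in edges.items():
            for t, w in targets.items():
                # dividing by the edge count (not the total weight) means a
                # low-weight citation passes only part of the vote; the rest
                # simply evaporates, acting as extra dampening
                new_rank[t] += damping * rank[page] * w / len(targets)
        rank = new_rank
    return rank

# "A" hyperlinks to B; "C" merely mentions B's domain without linking to it.
scores = weighted_pagerank({"A": {"B": 1.0}, "B": {"A": 1.0}, "C": {"B": 0.3}})
```

In this model, B's score with the 0.3 citation lands strictly between what it gets with no citation at all and with a full link - so a site could, in principle, regain some rank from mentions alone.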

All trademarks and copyrights held by respective owners. Member comments are owned by the poster.
WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved