Forum Moderators: Robert Charlton & goodroi
Udi Manber is Google's VP for Search Quality. This week he made a blog post that gives an overview of how Google's Search teams operate.
Some key points I noted:
We are, to be honest, quite secretive about what we do. There are two reasons for it: competition and abuse.
Does that remark and this entire blog post signal a little bit of a shift in the secrecy area? Time will tell.
we made significant changes to the PageRank algorithm in January.
Tantalizing - not changes to the overall ranking algorithm, but specifically significant changes to PageRank itself.
Other parts include language models.....query models (it's not just the language, it's how people use it today).....time models (some queries are best answered with a 30-minutes old page, and some are better answered with a page that stood the test of time), and personalized models (not all people want the same thing)......the goal is always the same: improve the user experience. This is not the main goal, it is the only goal.
With as many people trying to game the system, it never ceases to amaze me that the nuggets of information we get from Google always go back to the non-SEO approach with quality coding standards... Just make a good site that is useful, with clean code, and you'll do well.
improve the user experience. This is not the main goal, it is the only goal.
Yes, that one jumped out at me too. It's a strong statement, and it's not nearly the same as "organize all the world's information". Udi's statement does not imply serving every website or webmaster the way the "organize" mission does. In that, I feel it is more grounded in reality. But that reality also includes "without going broke."
With as many people trying to game the system, it never ceases to amaze me that the nuggets of information we get from Google always go back to the non-SEO approach with quality coding standards... Just make a good site that is useful, with clean code, and you'll do well.
With all due respect, that is not what I glean from Udi's statement. It seems to me he is not talking about "your" site, but about Google's goal of improving the user experience within Google by delivering more targeted results in the SERPs.
Just make a good site that is useful with clean code and you'll do well.
You can't be serious?
improve the user experience. This is not the main goal, it is the only goal
The problem is, it appears to be a self-defined user experience.
Personally, my own experience longs for a WEBSITES ONLY and WEIGHTED FOR WORD PROXIMITY checkbox.
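For what it's worth, the "weighted for word proximity" idea that poster wishes for is easy to sketch. This is a toy scoring function of my own invention (the function name, documents, and scoring formula are all hypothetical), not anything Google has confirmed using:

```python
import itertools

# Toy proximity score: documents where the query terms appear close
# together score higher. Purely illustrative - not Google's method.

def proximity_score(text, terms):
    words = text.lower().split()
    positions = [[i for i, w in enumerate(words) if w == t.lower()]
                 for t in terms]
    if any(not p for p in positions):
        return 0.0  # a query term is missing entirely
    # Smallest window containing one occurrence of each term (brute force).
    best_span = min(max(combo) - min(combo) + 1
                    for combo in itertools.product(*positions))
    return 1.0 / best_span  # tighter window -> higher score

doc1 = "cheap flights to paris from london"
doc2 = "cheap hotels and museums nearby while flights leave daily to paris"
print(proximity_score(doc1, ["cheap", "flights"]))  # 0.5  (adjacent words)
print(proximity_score(doc2, ["cheap", "flights"]))  # ~0.143 (6 words apart)
```

A real engine would work from an inverted index with stored term positions rather than re-scanning text, but the ranking intuition is the same: the tighter the window holding all query terms, the higher the score.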
In 2007, we launched more than 450 new improvements, about 9 per week on the average.
Possibly they are doing the same in 2008. And that might explain the instability of the serps on the DCs we see at present.
There is a whole team that concentrates on fighting webspam and other types of abuse. That team works on a variety of issues from hidden text to off-topic pages stuffed with gibberish keywords, plus many other schemes that people use in an attempt to rank higher in our search results. The team spots new spam trends and works to counter those trends in scalable ways; like all other teams, they do it internationally. The webspam group works closely with the Google Webmaster Central team, so they can share insights with everyone and also listen to site owners.
And that reminds us that Matt Cutts is the head of the Webspam team, not the head of the Search Quality team :-)
The timing makes you wonder, though: the more details Udi shares, the more movement there IS right now in the same area. So either buckle your seatbelts ... or learn to fly ( *me doing the latter* )
...
Explain old things? Not that I wouldn't be interested in them explaining even outdated technology...
And also, seeing non-marketing personnel posting even small crumbs of info (let alone a comprehensive post) gives Google/Search a much more friendly face. Tons better than the feeling of interacting with nothing but that background="#FFFFFF" homepage of theirs all the time.
With all due respect to Matt and Adam, we badly need more info, and faces to go with the work done behind that white bg...
He was talking about Google.
"the goal is always the same: improve the user experience. This is not the main goal, it is the only goal."
Every webmaster, at least those not working in the black hat arena, should hang those words on their wall. For many, the only goal is to make more money - which means maintaining a good ROI. To do that, you spend the least amount of time/money possible on content/optimization to yield maximum results - a clear road to failure.
Please give us some examples of sites (not blogs) doing really well in highly competitive areas with clean code and an awesome user experience, but less than a year old with hardly any inbound links.
I'm not being cynical, and I've no doubt they exist, but I'd like to see some just to reassure me that you are being straight up and that our white hat efforts really will pay off.
But that reality also includes "without going broke."
Well said, tedster. Of course there is somewhat of a disparity between the two camps. Simply creating stellar content nowadays and expecting the website to promote itself is not effective. There are far too many websites, too much similar content, and too many people willing to copy content and pass it off as their own.
However, the core of what does work well is unique, great content and services being marketed and promoted properly.
There is always room for both when done right.
Does anyone have a sense of how Google changed the PageRank algorithm back in January? PageRank is the part of the total algo that is not dependent on the query - it's more like an abstract, non-semantic "strength" score for a url.
I'm about to go dumpster diving in some old records, but if anyone has picked up a hint, I'm all ears. I suspect it's got something to do with dampening the PR vote of "related" sites - but that's just where I'm going to start looking.
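For readers who haven't seen the mechanics, the classic published PageRank iteration (the version from the original 1998 paper, not whatever Google actually changed in January) can be sketched in a few lines. The 0.85 damping factor and the example graph are assumptions from that paper, purely for illustration:

```python
# Toy power-iteration PageRank: the query-independent "strength" score
# discussed above. Illustrative only - not Google's 2008 formula.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = set(links) | {p for targets in links.values() for p in targets}
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with a uniform score

    for _ in range(iterations):
        # Every page gets a baseline (1 - d)/n, then link votes are added.
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, targets in links.items():
            if not targets:
                continue
            share = damping * rank[page] / len(targets)  # vote split evenly
            for t in targets:
                new_rank[t] += share
        rank = new_rank
    return rank

web = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
scores = pagerank(web)
print(max(scores, key=scores.get))  # "c" - it collects the most link weight
```

Note this is query-independent, exactly as described above: the score depends only on the link graph, and any "significant change" Google made would be a change to how these votes are computed or dampened, not to query matching.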
So if Google changed something significant about PR calculation in January, it certainly should have had an effect on the SERPs. From our January 2008 SERPs Changes [webmasterworld.com] discussion, I picked out a few of the common themes:
1. The Position 6 Bug [webmasterworld.com] that showed up at the end of December was "fixed" after our members began reporting it in January.
2. Many webmasters reported a "major shakeup" in their keyword area. Traffic increases and decreases of up to 40% for various sites. For a change, we seemed to hear about more increases than decreases here. Changes seemed to roll out first on 64.233.171.**
3. Some rather significant changes were seen in both the link: and site: results. I don't read much into the link: changes, however.
4. There was a good bit of ongoing churn reported during the month - not at the level we see right now, but still noticeable. Was that new PR folding its way into the search results, or would "new PR" have hit all at once, I wonder.
5. It seemed that more urls were being filed as Supplemental during January. Could that have been the result of a revised PR formula?
6. The proxy url problem started to fade.
7. At least some spam sites targeting semantic variations of keywords stopped ranking.
8. Interesting quote from MLHmptn: "Google is flat out turning our SEO ideologies inside out. It's funny; nowadays you can even rank a given page for its keywords without any inbound link having targeted anchor text. Give it an authority link and WAM! Who cares if the anchor text is "Click here!"? You're going to rank for title keywords without one inbound anchor text!"
9. A PR update rolled out in January [webmasterworld.com], about a month earlier than we expected.
All sites are optimized the same and have the same basic link pattern - which is primarily interlinking to a large degree.
The constant changes to the algo and SERPs fluctuations (mostly seen outside the top 10), which still go on today, would make it very hard to isolate one change in that ocean of fluctuation and say "That is the result of the PR changes". I can't do it, anyway :)
I see my sites dropping out and dropping in all the time and have for years. Most of the time I just wait and they eventually come back. Sometimes fixing a dead link or 2 on the main entry page causes them to come back faster....
Precisely
We've some client sites that are the mid-range services types on a local level, and they are like candy to Google. Always placed tops for their terms and geos.
With respect to the little guys, I think Google likes the street addresses and local phones placed next to the toll free ones on a site.
Oh, and without links?
There are two clients here that have done that already.
One site is two years old and has never left Google's top 9. The other was online 2000-2001, left, and re-registered in 2006, and it does the same as the other. Neither has ever had any "links", "resource pages" or "directories". The only external links they have ever put on that I can recall was when the rodeo or a festival of some sort came through, and that's it.
Sure, one has, at best, a page of (10) links (listed at link:websitedotcom in Google) pointing to it, but it isn't linked to any of them. The other might have 2 or 3 pointed to a few mates, and that's it.
No, ... no confusion about the links there mate. Sorry to disappoint you.
On the other hand, I could go on about the industry these might be in, only to get a snip-snip from a mod ... not much of a help there, either, I'm afraid.
I can say this however.
Google likes them honest. Plain and simple. These two don't have large amounts of cash to be throwing about for the nettie you see, so we keep them good with the basics, and, as a result, their listings are enough to make guys like me want to buy a larger hat.
So yes. If I appear to be putting on a bit of a braggie, well, then, great.
I can say that these two aren't large by any stretch. One is 260 pages, whilst the other has a short blog and about 155 pages ... all straight HTML, you see. Nothing fancy. It's a straight blow-and-go when Google visits. Everything gets read, and everything gets listed, in all of about, oh, let's say, 65 seconds. Google visits, and there's no question as to what these two mates are all about.
We don't have much of a problem with the little guy at all. The problems usually bring themselves to bear with our larger clients and all of the paint, spit, and finish they want to go with their sites (large menus, heavy scripting, flash, framesets, affiliates, directories and buttloads of incoming and outgoing links ... it just goes on and on) ... and this is where we see Google getting into the mix, and often.
Back in November 2001 a bunch of sites were hit with PR0 for no apparent reason - it was talked about here for weeks. Numerous re-inclusion requests over the years produced nothing - except once there was a vague statement leaving the impression that there was a technical issue involved that might prevent them from re-including my site. Perhaps they finally fixed that.
As for PR changes in January, I have a site that was hit with one of the first PR0 penalties back in November 2001 and never came back - until January. It came back with a PR3 and began ranking again.
Dugger - thanks for a very interesting observation.
I'm wondering now if the new PageRank algorithm that Udi mentioned counts more than "links", at least at a dampened level - in other words, domain citations that are not linked, src attributes including "hotlinks", and maybe more. Enough of those popping up, and a site might regain trust that it once lost.
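The "unlinked domain citation" idea above is at least mechanically simple to detect. Here's a rough sketch of how a crawler might spot domain mentions that aren't wrapped in an anchor tag; the regexes, function name, and example page are my own assumptions, and nothing here is confirmed Google behavior:

```python
import re

# Hypothetical sketch: find domain mentions in page text that are NOT
# inside an <a>...</a> tag - the "unlinked citations" speculated about
# above. A real crawler would use a proper HTML parser, not regexes.

DOMAIN = re.compile(r'\b(?:www\.)?[a-z0-9-]+\.(?:com|org|net)\b', re.I)
ANCHOR = re.compile(r'<a\b[^>]*>.*?</a>', re.I | re.S)

def unlinked_citations(html):
    # Remove everything inside anchor tags, then look for bare domains.
    text_without_links = ANCHOR.sub(' ', html)
    return DOMAIN.findall(text_without_links)

page = '<p>See <a href="http://example.com">here</a> or visit example.org</p>'
print(unlinked_citations(page))  # ['example.org']
```

Whether such plain-text mentions actually feed into PageRank-style trust is, of course, exactly the open question in this thread.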