| 8:05 pm on May 31, 2010 (gmt 0)|
I see what you're saying better now Reno, and I'm glad you're not complaining just because you aren't getting all the traffic any more... ;)
|That's what worries me -- if it's done |
I think the definition of the word 'done' is open to interpretation... By 'done' I think they mean it's present throughout the system, so the 'integration is complete on all datacenters', which I think is a good thing, because now the 'dial turning' can begin.
So, to me, them saying it's 'done' is a good thing because now it's an 'adjustment and fine tuning' situation rather than more implementation.
I think this is why we hear quite a bit of 'they can't keep doing this ... Google's broken' when they are implementing a new piece of the puzzle and right after they get 'done'. Cutts also says there are about 400 minor algo adjustments a year, and once a rollout is 'done' it can be tuned with those adjustments. So I think 'done' is subjective, and 'done' to Google may not mean exactly the same as 'done' would mean to the rest of us, because 'done' seems to imply stagnant, set, not changing, but what it really means IMO is 'there in the algo for G to adjust, edit, fine tune, update slightly, etc.'
| 8:28 pm on May 31, 2010 (gmt 0)|
Just another little thought I have sometimes on them fine-tuning the algo...
Does anyone else ever think about how many possible combinations of 'weight' they can apply and why it might take them a bit to integrate a new variable into the full algo?
If they have 200 variables and each has a setting from 1 to 10 for weight (forget about decimals), 10 being the most important and 1 the least for each variable, the number of possible settings they have is (roughly):
Uh, I'll let anyone who feels like it do the math, because I can tell just by looking at it that's a bunch...
| 8:59 pm on May 31, 2010 (gmt 0)|
200! (200 factorial) is near enough 8 x 10^374 (that's 8 with 374 zeroes after it).
But that's not the number of possible combinations. That would be 10^200 (ten independent choices for each of 200 variables), which, while astronomically large itself, is still about 175 orders of magnitude smaller than that huge number.
Of course, once you appreciate that there ARE decimals, that number gets pretty big, pretty quickly.
Added- I just noticed you weren't calculating 200!, but (200!)-(190!).
However, that makes no material difference to the number, 190! being 20-odd orders of magnitude smaller and thus a rounding error.
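For anyone curious, the two quantities are easy to compare with Python's arbitrary-precision integers. This is just a sketch of the hypothetical 200-variable, 10-level model from the earlier post, not anything about Google's actual algo:

```python
from math import factorial

# Hypothetical model from the thread: 200 ranking variables, each with
# an integer weight setting from 1 to 10 -- the choices multiply.
weight_settings = 10 ** 200   # 10 options per variable, 200 variables
orderings = factorial(200)    # 200!, for comparison

print(len(str(weight_settings)))  # 201 digits -> exactly 10^200
print(len(str(orderings)))        # 375 digits -> roughly 7.9 x 10^374
```

So the straight weight-combination count is 10^200, and 200! is about 175 orders of magnitude bigger still; either way, nobody is brute-forcing the dial settings.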
| 9:33 pm on May 31, 2010 (gmt 0)|
I monitor a few niches, some product based and others product/informational. Here are my observations. The solely product-based niches (the ones where people look to buy) are being front loaded with major brands on page 1 and low/mid e-commerce sites on page 2. I mentioned a week or so ago in the Mayday thread that my site and others were pushed back 1 page. Well, these e-commerce sites replaced our trophy KW positions on page 2. Which we all know is an insignificant ranking spot, because it just teases you with the potential of being on page 1.
No hard feelings because for belly fat Kws my site and others are ranking with the big boys. Ahead of most of them too on page 1.
The long tail KWs (loosely related to the niche) went to hubpages, squidoo, etc., and to websites more deserving of, or niched for, those loosely related KWs.
One particular niche I delved into, a tough one, info/product based. I have seen a website climb up in the rankings with only a few hundred links. This website obviously did not partake in the link craze. It is more than remarkable that such a site is amongst giants that have PR 8. How did he do it with such a low number of links and low PR? I studied his link profile and there is nothing special there: no super-duper .edu or .gov links, etc.!
| 9:47 pm on May 31, 2010 (gmt 0)|
Thanks for that quality input, sean.
|front loaded with major brands on page 1 |
That sounds familiar, doesn't it? Eric Schmidt and others have been talking about brands for a while, and we even had the Vince update [webmasterworld.com] last year to kick off the campaign.
So the question remains, what defines a brand in Google's eyes? What does it take to be seen as "a brand"? I wonder if that mystery site you mentioned at the end found a piece of that formula.
| 10:04 pm on May 31, 2010 (gmt 0)|
I don't know. It's certainly easier to identify a brand that sells something. Maybe the criteria that Google uses to determine 'brand' differ from, say, strictly ecommerce to informational/ecommerce sites.
| 10:48 pm on May 31, 2010 (gmt 0)|
Here's another reason I tend to fall into the "something is wrong" camp:
When I go to GWT and click "Your site on the web" > "Search queries", as you know a graph will come up at the top. I have a number of sites where the exact same number of impressions comes up every day for many days in a row. I have one site that shows an absolutely straight line since May 7th! When I put my mouse arrow over the dot representing each day, the number of impressions is the same. I have to think the probability of this happening by chance is vanishingly small, so to my eyes, something is still not right in paradise, no matter what MC's public relations dept is saying. I'd be curious to know if others are seeing the same thing -- it's easy enough to check.
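For what it's worth, here's a back-of-envelope sketch of how unlikely a perfectly flat line is. It assumes, purely for illustration, that daily impressions fluctuate like a Poisson count around a stable mean (the 1,000/day figure is made up):

```python
import math

# ASSUMPTION for illustration only: daily impressions ~ Poisson(mean).
def p_exact(mean: int) -> float:
    """Probability a Poisson(mean) day lands on exactly `mean` impressions."""
    # computed in log-space so large means don't overflow/underflow
    log_p = -mean + mean * math.log(mean) - math.lgamma(mean + 1)
    return math.exp(log_p)

one_day = p_exact(1000)   # even one day hitting the mean exactly: ~1.3%
flat = one_day ** 24      # 24 identical days in a row (May 7 - May 31)
print(f"{one_day:.4f}")   # ~0.0126
print(f"{flat:.1e}")      # on the order of 1e-46 -- never by chance
```

If real traffic really were fluctuating, dozens of identical daily readings would essentially never happen, so a flat line points to rounding, bucketing, or stale reporting in the tool rather than actual identical traffic.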
| 11:17 pm on May 31, 2010 (gmt 0)|
|So the question remains, what defines a brand in Google's eyes? What does it take to be seen as "a brand"? |
The (very) cynical side of me has sorta picked at the idea and wondered if publicly traded or investor-funded domains make a difference. Googlers/employees can ensure their investments do well in a variety of ways (turning dials on competing sites, throttling traffic, adjusting algos to benefit invested properties, getting loads of links in DMOZ, etc.). Really cynical stuff. They definitely have the power at their fingertips to make the money magic happen (for their personal gain)... the question is, do they actually do it, and if Google itself is against the practice, can a rogue Google employee (or two or three) touch dials in a way that the above scenario plays out? Is there any sort of in-company policing? I dunno.
I think search counts for site name could be a brand signal (widgets r us) and searches using keywords + brand name. Other than that, I'm out of ideas.
| 11:33 pm on May 31, 2010 (gmt 0)|
I don't often misquote, so I guess he was wrong. ;-)
| 11:53 pm on May 31, 2010 (gmt 0)|
I apologize for the confusion I created. Despite Brett's efforts to keep these two issues clearly separate [webmasterworld.com], I blurred them together and gave out bad information.
Brett did not say that WebmasterWorld lost long tail traffic with the Mayday update. He said we lost traffic with the new Google interface - and that final rollout happened on May 5, two days after the Mayday update was complete according to Matt Cutts.
| 12:11 am on Jun 1, 2010 (gmt 0)|
Ah OK... so my theory about relevance signals being lost with pagination / marginal meta titles, which I speculated was a problem with WMT, is probably not correct.
What's the story with Rand's SEOmoz loss of long tail, is that still valid? In terms of content quality this is a great site, but it's the reasons for the drop that are intriguing.
It would be good if we could pin down some specific examples to work on, to isolate the precise reasons for the long tail drop. I think we're close with your earlier post and the collection of useful comments around it.
| 3:26 am on Jun 1, 2010 (gmt 0)|
I still feel it has to do with backlinks and how they are passing link juice, or with G not knowing about some of the links.
| 4:34 am on Jun 1, 2010 (gmt 0)|
I just wonder if this clipping of longtail on established brand sites will bite Google Adwords by lowering their income. I had great organic results yet still spend $500+ per month on Adwords, but with recent loss of longtail, my conversions are down and hence my income, so I will be cutting back on my Adwords spending...and probably going out of business. Maybe Google thought we'd take out loans to buy more Adwords ads to stay afloat...they thunk wrong.
| 9:29 am on Jun 1, 2010 (gmt 0)|
Perhaps we're overlooking the obvious?
My stats provide information on too narrow a scope to be reliable, but they do suggest that Google has increased the value of OFFLINE quality signals in assigning value to a site. Basically: does this site have a registered physical business related to the search query? If so, slight boost ahead of online-only sites. It would explain why online-only businesses like SEOmoz and WW saw decreases.
| 3:24 pm on Jun 1, 2010 (gmt 0)|
MC’s video does not give much info on what factors they are now using to provide these supposedly better results.
| 3:29 pm on Jun 1, 2010 (gmt 0)|
LOL... You're right! I was thinking 'about the same odds as winning the lottery' and used the wrong calculation.
Thanks for pointing it out.
Let's just say it's: BIG!
| 3:42 pm on Jun 1, 2010 (gmt 0)|
Here are the facts on my site:
Bounce rate as a percentage has increased
Pages per visit has dropped
Average time on site has decreased
Pageviews per visitor have dropped
All of this suggests to me that my Google-referred visitors are not getting what they came looking for with this algo change.
I run a very original, content-rich site.
| 4:02 pm on Jun 1, 2010 (gmt 0)|
|You may notice that some queries (1-word and 2-word) almost never show a "transactional" result, or never show an "informational" result. Other queries always show a mix. On a very high-level, that's an indication of one kind of taxonomy kicking in, but the full taxonomy is much more granular. |
I can definitely corroborate that, in my niche very slight changes of phrasing are taken to indicate different user intent, and can lead to a completely different category of results being shown. It is quite clear that Google is picking the category first, and then ranking the results within that category, as there is no cross-contamination of categories even though they may use very similar wording.
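A toy sketch of that two-stage behavior may make it concrete. Everything here (queries, categories, domains) is hypothetical, just to illustrate "classify the query first, then rank only within that category":

```python
# Hypothetical two-stage SERP: the query's intent category is decided
# first, then results are ranked within that category only -- so very
# similar wording can still produce completely disjoint result sets.
QUERY_INTENT = {
    "green widgets": "transactional",
    "buy green widgets": "transactional",
    "how do green widgets work": "informational",
}

RANKED_BY_CATEGORY = {
    "transactional": ["widget-shop.example", "widget-store.example"],
    "informational": ["widget-wiki.example", "widget-howto.example"],
}

def serp(query: str) -> list[str]:
    intent = QUERY_INTENT.get(query, "informational")  # pick category first
    return RANKED_BY_CATEGORY[intent]                  # rank within it only

# Slightly different phrasing, completely different category of results:
print(serp("buy green widgets"))
print(serp("how do green widgets work"))
```

The key property being illustrated is the lack of cross-contamination: no blending or re-scoring happens across categories once the intent is chosen.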
| 4:17 pm on Jun 1, 2010 (gmt 0)|
Very strange, out of 1800 referrals so far (site has about 1500 pages) only a dozen referrals were identical to another.
For example only 2 referrals for "Green Widgets in Ca"
| 4:27 pm on Jun 1, 2010 (gmt 0)|
@Edge, are the changes you described just for Google search traffic, or across all types of traffic?
| 5:57 pm on Jun 1, 2010 (gmt 0)|
Tedster - just Google; Bing and Yahoo are business as usual.
| 6:01 pm on Jun 1, 2010 (gmt 0)|
Since May 18th, we have dropped back to sales levels of five years ago. We had very robust and reliable sales since 2005, but now it's down to a trickle. Strangely, site traffic looks like it did not change much -- I guess this is the "quality" traffic that MC mentioned. I would have to say that the new algo is no longer matching up the right people with our service, so who benefits from this change? No one.
| 6:12 pm on Jun 1, 2010 (gmt 0)|
Sounds like we are in the same boat...
| 6:55 pm on Jun 1, 2010 (gmt 0)|
I think this is only the second SERP update ever to negatively affect my traffic. The last one that trimmed my traffic was back in 2005? I'm very, very white hat.
When we figure out where the traffic went - we can then make the requisite adjustments.
Somebody somewhere is very happy!
| 6:57 pm on Jun 1, 2010 (gmt 0)|
I am hearing about more and more average people switching over to Bing, and when I ask why, they tell me they like the results better. MC and Google can think this new update is great, but the users will eventually speak with their pocketbooks.
Either they know G is broken and aren't letting the word out, or they are clueless and will suffer just as other corporations have by not listening to the webmasters on this site.
| 7:16 pm on Jun 1, 2010 (gmt 0)|
Both my sisters (very non-tech) spontaneously mentioned to me that they started using Bing in April, but clearly there was not yet a mass movement. In fact, Google grew in search market share during April. Google's downfall has been predicted during every significant update thread here, going back ten years.
It will be interesting to see the May browser figures. If Mayday is really bad for the average user, then we should see a drop in the search market share for May. But considering my own experience, I'm now with Paul Simon: "a man hears what he wants to hear and disregards the rest." That's what I did.
| 10:00 pm on Jun 1, 2010 (gmt 0)|
The average user doesn't look at the web with the same level of bias and scrutiny as we do. I really doubt that the update would cause any serious changes in search engine market share.
| 11:10 pm on Jun 1, 2010 (gmt 0)|
Is there any possible way to determine who GAINED traffic? It might help us figure out some of the factors.
| 11:46 pm on Jun 1, 2010 (gmt 0)|
So far I've seen evidence of this on a few US and Canadian sites that I've got access to but not on sites in other parts of the world.
Is there any evidence that this is a change that's already been rolled out worldwide, or is it only in North America, with a wider implementation still to come? Has anyone seen this in other markets yet?
| 12:58 am on Jun 2, 2010 (gmt 0)|
I want to put some minds at ease: I have two yellow-page-type websites; think of them as placeholders for future development. You know how thin they are and how often they are duplicated, yet traffic increased on both of them. I did get some (not so 'good') backlinks, but nothing major.
| 1:09 am on Jun 2, 2010 (gmt 0)|
walkman, I'm not sure if that is such good news for the rest of us. Google's idea of quality is a very odd thing indeed.