Home / Forums Index / Google / Google SEO News and Discussion
Forum Library, Charter, Moderators: Robert Charlton & aakk9999 & brotherhood of lan & goodroi

Google SEO News and Discussion Forum

This 215 message thread spans 8 pages: < < 215 ( 1 2 3 4 5 [6] 7 8 > >     
Google Updates and SERP Changes - June 2010
g1smd




msg:4151813
 11:10 pm on Jun 12, 2010 (gmt 0)

< continued from [webmasterworld.com...] >

I'm currently looking at an e-comm site where sales for the four complete weeks since May 16th are down 77% compared to the previous four weeks.

So far, I have no additional words of wisdom to add to this thread, and not much of a clue how to fix it.

Traffic levels haven't fallen by much, but instead it's as if Google is sending completely the wrong type of visitors.

The number of pages reported by a site: search kept falling, but in recent days it has started to go up. Sales returned for two days, and then dried up again.

[edited by: tedster at 12:05 pm (utc) on Jun 15, 2010]

 

Planet13




msg:4155354
 8:43 pm on Jun 19, 2010 (gmt 0)

Matt goes on to mention factors like "how long do people stay"


What is Matt Cutts implying here?

Is it that google is now able / willing to measure visitor time on page and bounce rate on our sites?

If so, how are they doing it? Through google analytics data?

What about sites that DON'T have GA installed?

And couldn't webmasters hire a couple of kids / bots to browse their sites for long periods of time, thus increasing time on page and pages visited, and reducing the bounce rate?

Or maybe they are able to measure the rate at which a user clicks on a link in the google SERPs and then clicks the back button to return to the same google SERP page?

If that were the case though, I would click on the link to my competitor's pages from the google SERP and then immediately click back to the SERP to lower their value...

tedster




msg:4155360
 8:53 pm on Jun 19, 2010 (gmt 0)

What about sites that DON'T have GA installed?

Many ISPs do sell their click stream data. And then, there's the huge Google Toolbar user base, too.

And couldn't webmasters hire a couple of kids / bots to browse their sites for long periods of time, thus increasing time on page and pages visited, and reducing the bounce rate?

Only back in the '90s. Over the years since then, Google has developed lots of safeguard technology for this kind of thing, especially to detect AdWords and AdSense fraud.

Planet13




msg:4155363
 8:58 pm on Jun 19, 2010 (gmt 0)

Thank You, Tedster:

So in your opinion then, is google using things like bounce rate, time on site, and page views to rank pages now?

Thanks in advance.

tedster




msg:4155367
 9:09 pm on Jun 19, 2010 (gmt 0)

If you're going to apply any metric at all, then it must be something you can measure for every site in the index. Anything else is bad science. So if they have a dependable source for that kind of data, and I think they do, then that's the kind of thing they use.

I also think googlebot information plays a big part. If googlebot sees a lot of technical missteps, that might well be a predictor of a low quality user experience. If the index shows lots of stub pages in the domain, or a lot of content that's duplicated elsewhere, etc, etc.
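Tedster's mention of "content that's duplicated elsewhere" can be made concrete. Below is a toy sketch, in Python, of one generic way near-duplicate pages can be detected: comparing word shingles with Jaccard similarity. This illustrates the general technique only - it is not a claim about what googlebot actually does, and the sample pages are made up.

```python
# Toy sketch: near-duplicate detection via k-word shingles and
# Jaccard similarity. Sample pages are invented for illustration.

def shingles(text, k=3):
    """Return the set of k-word shingles (overlapping word windows)."""
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    """Overlap of two shingle sets: 1.0 = identical, 0.0 = disjoint."""
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

page1 = "the quick brown fox jumps over the lazy dog"
page2 = "the quick brown fox jumps over a sleeping dog"
```

A pair of pages scoring near 1.0 would look like duplicates under this measure; real systems use far more robust variants (e.g. hashing the shingles), but the idea is the same.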

Planet13




msg:4155375
 9:36 pm on Jun 19, 2010 (gmt 0)

If googlebot sees a lot of technical missteps...


meaning broken links, invalid html/xml, etc.? Or do you mean something else by "technical missteps"?

Thanks in advance.

cien




msg:4155380
 9:47 pm on Jun 19, 2010 (gmt 0)

GA is your site's worst enemy. Bounce rate can get rankings dropped for some pages. I've proven this myself. I uninstalled that thing ages ago and my site has done a lot better since then; until "March Day", anyway.

Planet13




msg:4155382
 9:54 pm on Jun 19, 2010 (gmt 0)

GA is your site's worst enemy.


I would appreciate it greatly if you have any links to sites backing up this statement. If there is a body of scientific evidence about this, I would love to see it.

Thanks in advance.

dvduval




msg:4155448
 3:23 am on Jun 20, 2010 (gmt 0)

GA is your site's worst enemy. Bounce rate can get rankings dropped for some pages.


You've mentioned two different items here. Bounce rate may be something google is able to find without google analytics. I have seen sites all over the spectrum in terms of bounce rates compared with the ratio of uniques to google search traffic.

tedster




msg:4155457
 4:08 am on Jun 20, 2010 (gmt 0)

GA is your site's worst enemy. Bounce rate can get rankings dropped for some pages. I've proven this myself.

My experience is quite different. I've installed or removed GA from many sites and seen absolutely no pattern emerge for rankings.

meaning broken links, invalid html/xml, etc.? Or do you mean something else by "technical missteps"?

Invalid code, no problem - unless the mark-up is so bad the content can't even be deciphered. Broken links - that depends on how many there are and what they point to.

But mostly I mean poor http status codes - chaos between 301, 302, 401, 404, 503. That can cause major problems for indexing a site. You think you've got 120 URLs on the site, but googlebot finds thousands and is still turning up more. Canonical problems are not just some esoteric geek thing - they can really matter.

Most of the potential technical issues were untangled here a few years ago, during the heyday of the "Supplemental index" problems. The best of those threads are listed in the Hot Topics area [webmasterworld.com], which is always pinned to the top of this forum's index page.
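The status-code "chaos" tedster describes (301/302 chains, loops, URL counts exploding) is something you can audit yourself. The sketch below, in Python, walks a hypothetical redirect map - e.g. one assembled from your own crawl logs - and flags loops and over-long chains. The `redirects` data, URLs, and thresholds are illustrative assumptions, not anything from the thread.

```python
# Minimal sketch: audit a redirect map for chains, loops, and
# over-long hops of the kind that waste crawl budget.
# `redirects` maps URL -> (status_code, target_url); hypothetical data.

def follow_redirects(url, redirects, max_hops=5):
    """Follow 301/302 entries until a non-redirect, a loop, or max_hops."""
    chain, seen = [url], {url}
    while url in redirects:
        status, target = redirects[url]
        if status not in (301, 302):
            break
        if target in seen:            # redirect loop detected
            return chain + [target], "loop"
        if len(chain) > max_hops:     # overly long chain
            return chain, "too-long"
        chain.append(target)
        seen.add(target)
        url = target
    return chain, "ok"

redirects = {
    "http://example.com/old":   (302, "http://example.com/older"),
    "http://example.com/older": (301, "http://example.com/old"),
    "http://example.com/a":     (301, "http://example.com/b"),
}
```

Running this over every URL a crawler discovered quickly shows whether "120 URLs" have quietly become thousands of redirecting variants.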

aparraga




msg:4155550
 10:57 am on Jun 20, 2010 (gmt 0)

If bounce rate is a ranking factor, webmasters could ask mechanical turks to navigate their sites, right?
So I don't think that bounce rate is a ranking factor. Some time ago, other guys thought click-through was a ranking factor - same thing.

drall




msg:4155564
 11:49 am on Jun 20, 2010 (gmt 0)

If they are taking bounce rate into account I want all of the Mensa membership cards back. In many cases, giving the visitor exactly what they want in a clean and clear format can cause a high bounce rate.

Take one of our sites for example which is a script download site. We do not hide our download links. We do not make you jump through hoops to get what you need.

So the Google search user types in Apache blah blah. We pop up number one, the user clicks our result. They download the script they need and take off to go patch that server. This causes a huge bounce rate. But they know us, they trust us. They have trusted us for over ten years.

They are not looking to whittle around my site for hours. They get what they need and bail. Does this mean we should rank lower? I mean, we are the most up to date site out there for this stuff. We keep ads to 1 per page, which techs love. Bottom line: we make the user happy.

Is this why we are seeing spammy, almost blatant made-for-ads sites with little history take over our positions? Because they make the user hop through 10 pages to get that script they are looking for? Oh, and let's not forget the user must look at pages and pages filled to the brim with ads and internally scraped garbage.

On the surface it all makes sense now why we got hammered with Mayday. But I can't believe it's that simple.

mhansen




msg:4155568
 12:40 pm on Jun 20, 2010 (gmt 0)

In my case - pages suffering (query = page) the most also have higher bounce rates (+75%). That may just be an exaggeration of the long-tail issue, but it is what it is.

As far as HOW it could be tracked, you only need to look at a recent G Webmaster Tools discussion to learn the exact way! It's on a completely different topic, but well worth the read if you can ignore the noise.

[http://www.google.com/support/forum/p/Webmasters/thread?tid=43c85ef328efaea4&hl=en]

In reply to:
>>> If you'd like to explain your theories about how my technique
>>> interacts with the Google Toolbar, I'd be happy to criticize them.


A Bionic Googler replied:

Alright. The toolbar can monitor user behaviour. It'll detect that your visitors sometimes leave your site towards another one. How? The bot didn't see any links there? Quite a contradiction, isn't it? Something must be wrong with your site; let's send in the next quality rater available. S/he arrives with the source code the bot crawled in hand


The googler is saying that the toolbar MONITORS key indicators about your website at the client side... and when it finds certain behaviors or inconsistencies from googlebot, it automatically dispatches one of the 10,000 manual reviewers to your site.

I didn't realize Google were the Internet Police - but it's obvious they are!

pontifex




msg:4155576
 1:36 pm on Jun 20, 2010 (gmt 0)

@mhansen:

I didn't realize Google were the Internet Police - but it's obvious they are!


No, they are just the portal everyone wants to be on, and with that desire they earn a lot of money. Like in every other market, they act as the source of something people want and try to maintain and expand this desire with marketing. There they do a good job - like Coke, Adidas, Porsche or any other brand trying to make a profit.

The problem in this (Internet) market is: the desire for the product is not as old and not as explored as the other desires (thirst, having shoes, prestige). This market is young (around 17 years, or maybe just 15 if you look at the web in detail), fast (created, up, down, gone within years) and quite transparent (everything is just a word dot com away).

The current development of the SERPs is very understandable from a marketing perspective - keep the masses, feed the masses.

Google always states: "for the users" - in the world of Coke and Pepsi that means: we do not care too much about the suppliers of water and sugar, we care about the customers who buy a can of soda! Like Coke and Pepsi, they will do EVERYTHING to the suppliers to keep the soda drinkers as buyers. In that supply chain the average webmaster is a supplier of content that is in large part replaceable, so Google will continue to treat us like that as long as the "consumer" picks up their frontends. Simple as that.

Now pour the standard cost calculations into it and the whole picture falls into place: if you as the sugar supplier do something that creates the slightest hassle for Coke, there are others in line that make that sugar delivery easier for Coke, and you are out of the supply chain.

For Coke or Pepsi it's as much marketing, and exactly the same laws apply, as for Google and Bing.

Your perception of Google or Bing is just as emotionally influenced as your favorite cola! Would you call Coke the "soft drink police"?

P!

tedster




msg:4155593
 3:29 pm on Jun 20, 2010 (gmt 0)

A Bionic Googler replied:

For proper "weighting" of those statements, notice that the comments are from a "Bionic Poster" - a top contributor - but that is not the same as being a Google employee.

mhansen




msg:4155606
 4:28 pm on Jun 20, 2010 (gmt 0)

@ Pontifex -

I understand what you mean about the Consumer/Product/Supplier analogy.

In this case, if Coke or Pepsi were closely monitoring each step I (or the average consumer) take as I go from store to store looking for information on sugar, and then started closing roads leading to the stores that had brands they didn't particularly like because [insert super-secret reason here], I WOULD call Coke the soft drink police.

dvduval




msg:4155607
 4:30 pm on Jun 20, 2010 (gmt 0)

They are not looking to whittle around my site for hours. They get what they need and bail. Does this mean we should rank lower? I mean, we are the most up to date site out there for this stuff. We keep ads to 1 per page, which techs love. Bottom line: we make the user happy.

Is this why we are seeing spammy, almost blatant made-for-ads sites with little history take over our positions? Because they make the user hop through 10 pages to get that script they are looking for? Oh, and let's not forget the user must look at pages and pages filled to the brim with ads and internally scraped garbage.


You make a great case for why Bounce Rate or Time on Site should not matter. Additionally, this is why the internet can be so frustrating to people sometimes. You just want to find a review for a specific restaurant, or a recipe or TV show schedule or the side effects of a medication ...

And then you want to get on with your day!

If google is punishing sites for being a destination that provides the result users need, while helping sites that make the user go through ads or otherwise spend more time getting what they want, that sounds pretty horrendous.

With ISPs throttling the bandwidth of sites that use more (e.g. Google and YouTube), is it becoming "chic" to make people wait longer for what they want? If you don't make the consumer click through ads first, does that somehow not fit into "quality" search results now?

I do hope this is not correct, but there are certainly reasons to make a case that extends beyond just a sheer "conspiracy theory".

[edited by: dvduval at 4:55 pm (utc) on Jun 20, 2010]

arizonadude




msg:4155611
 4:38 pm on Jun 20, 2010 (gmt 0)

As far as GA goes, I don't think it's one size fits all.

I used to use it for some of my affiliate sites and every site in there got whacked, but NONE of the affiliate sites that were not in there did.

You can draw your own conclusions but I also think you will find there is no one size fits all pattern when it comes to webmasters using GA.

Some swear by it, others don't.

Personally, I'm not giving Google anything for free other than my content if they want it although they are doing more and more to keep people from leaving Google once my content has brought them there.

wingslevel




msg:4155620
 5:11 pm on Jun 20, 2010 (gmt 0)

On serp changes, what has everyone been seeing on the longtail stuff - in the past few days - any changes?

For me, there have been some very large swings in indices that haven't moved much in years - I am talking about conversion rates on medium/large sites (>10k users/day and >100k pages). For the last couple of years these rates have moved within a very tight band (like +/- 10% relative). Since these are bigger sites, the sampling is good enough to indicate a real change - so when I saw the needles move up 25% on Thursday and Friday of last week, and then down 40% on Saturday, I went scrambling to check my numbers.

No changes to the sites would account for this - no big change in number of visitors - just, in my opinion, huge variances in the quality of organic long-tail referrals.

anybody else?
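wingslevel's "tight band" observation can be expressed as a simple check: flag any day whose conversion rate deviates more than +/- 10% (relative) from a baseline. The sketch below is a minimal illustration; the numbers are made up to echo the +25% / -40% swings mentioned, and the baseline and band width are assumptions.

```python
# Sketch: flag days whose conversion rate falls outside a
# +/- 10% relative band around a baseline. Illustrative data only.

def flag_swings(rates, baseline, band=0.10):
    """Return (day_index, relative_change) for days outside the band."""
    flagged = []
    for i, rate in enumerate(rates):
        rel = (rate - baseline) / baseline
        if abs(rel) > band:
            flagged.append((i, round(rel, 2)))
    return flagged

# e.g. a 2% baseline, then the kind of +25% / -40% swings described
daily = [0.020, 0.021, 0.025, 0.012]
```

On a high-traffic site, a day landing outside the band is a strong signal that the mix of incoming visitors changed, not just sampling noise.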

anand84




msg:4155621
 5:13 pm on Jun 20, 2010 (gmt 0)

I don't think bounce rate is THE factor, but one of several factors that Google has in its new algorithm. Remember MC said "keep your engagement level with your customers high".

So the point is probably not just about giving your customers what they want and letting them leave, but about keeping them engaged and probably selling more (that is also the differentiator in the real world between good shops and great shops).

PS: How would people affected by Mayday rate the bounce rates on their affected sites?

wingslevel




msg:4155622
 5:14 pm on Jun 20, 2010 (gmt 0)

one more thing - if you chime in here, please differentiate your affected site between 1. content/adwords, 2. larger ecommerce, or 3. microsites - because we know that 3. has been specifically addressed in the algo change

wingslevel




msg:4155623
 5:17 pm on Jun 20, 2010 (gmt 0)

anand, amidst these big swings in conversion rate, my bounce rate hasn't moved one bit - a flat line for the last month...

dvduval




msg:4155654
 7:29 pm on Jun 20, 2010 (gmt 0)

Thanks to some changes I made to my site, my bounce rate is significantly better than it was previously, and has steadily improved since April.

If bounce rate is being counted as a key metric, wouldn't that mean you would want to avoid advertising with companies that might cause the bounce rate to increase? (e.g. stumbleupon) And if that is true, that would be another trump card for google adwords.

If we don't make some sense of this soon, I may have to start designing landing pages with google ads to slow down my users from getting to the content they are seeking so my bounce rate and time on site improve.

But at this point, this is only speculation, though some clarification would be in order.

tedster




msg:4155667
 8:35 pm on Jun 20, 2010 (gmt 0)

Many Google reps have stated very clearly, and several times, that bounce rate is too noisy a metric to use in the ranking algorithm. I also see no evidence for it in the data I see. I have a client whose highest search traffic page has an 85% bounce rate, but they still rank #1.

Still, lowering a page's bounce rate by attracting visitors into exploring your website, or by getting better targeted traffic, can be a good thing.

Planet13




msg:4155744
 11:38 pm on Jun 20, 2010 (gmt 0)

I have a client whose highest search traffic page has an 85% bounce rate, but they still rank #1.


I hate to admit this, but most of my pages with the highest traffic also have the highest bounce rate - by about 10% higher than the site average.

I have looked at the keywords that people are using to get to those pages from the search engines and tried ensuring that the content they want is front and center. I have also added links to what I consider very relevant pages in the text of those landing pages.

Yet the bounce rate remains high...

On the other hand, the Time On Page for those pages is generally about 15% or 20% higher than the site average, so I am guessing that people are reading those pages all the way through (since they tend to be text heavy articles and informational pages). So they are at least finding (more or less) what they were expecting, I guess, and then moving on...
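The two metrics Planet13 is comparing - bounce rate and time on page per landing page - can be computed from simple per-visit records. A minimal sketch, assuming hypothetical `(landing_page, pages_viewed, seconds_on_landing)` tuples rather than any real analytics export:

```python
# Sketch: per-landing-page bounce rate and average time on page,
# from made-up visit records. A "bounce" = a visit viewing one page.

def landing_page_stats(visits):
    """visits: list of (landing_page, pages_viewed, seconds_on_landing)."""
    stats = {}
    for page, pages_viewed, seconds in visits:
        s = stats.setdefault(page, {"visits": 0, "bounces": 0, "time": 0})
        s["visits"] += 1
        s["bounces"] += 1 if pages_viewed == 1 else 0
        s["time"] += seconds
    return {
        page: {
            "bounce_rate": s["bounces"] / s["visits"],
            "avg_time": s["time"] / s["visits"],
        }
        for page, s in stats.items()
    }

visits = [
    ("/article", 1, 180), ("/article", 1, 240), ("/article", 3, 120),
    ("/home", 2, 30), ("/home", 4, 45),
]
```

The pattern Planet13 describes would show up here as a page with a high `bounce_rate` but an `avg_time` well above the site average - visitors reading the whole article and then leaving satisfied.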

anand84




msg:4155816
 3:16 am on Jun 21, 2010 (gmt 0)

Like I said, bounce rate is just ONE metric to measure engagement levels. Here are two other things I can think of:

1. % New visits
2. User generated content (reviews, comments,etc.)

Amazon has been noticed ranking high. That's possibly because we keep going back to Amazon at least to check on the best price. This means visitors are returning, and hence it is a site that has engaged its customers well.

I have a wordpress based site, and since my content is fact-based (not opinion-based), people hardly comment. Also, over 96% are new visits - I'm just assuming that my website may be seen as less engaging than the most popular websites in my niche, since they are pretty big names and enjoy regular repeat visitors.

mercedesP




msg:4155919
 10:10 am on Jun 21, 2010 (gmt 0)

IMO Google always has "the user experience" at the heart of its experiments and changes to its ranking algorithm. If that's the objective, a user's behaviour within a site has as much value as understanding "what type of user" is using the site.
In marketing terms, as "shoppers" we all fall into one of these categories: The Leader, The Follower, The Insecure and The Secure.
All of these groups seek different information and act/interact differently. For instance, an insecure one will spend a lot more time on any given site than a secure one. So, for Google to "objectively judge a site" according to user behaviour, G. also needs an "online personal user classification", and IMO personalization would do the job.
The latest or future algorithms could put a lot more emphasis on the "user reactions" to any given site than on its backlink profile, which can be "manipulated".

Food for thought....

mcdarwin




msg:4155963
 12:13 pm on Jun 21, 2010 (gmt 0)

Back to the old Mayday topic - my traffic came back on Friday.

Numbers are almost back to what I had before a big drop on April 14th and the small difference could very well be seasonal. Traffic during the weekend was also good and today looks fine too. I checked with Google Analytics and it is long-tail traffic I have gained back.

I have seen no jumps up and down like others have reported.

Too early to be happy yet - but I'm curious if others are seeing this too?

It is a Danish site. I have made no big changes and hardly added new content either as the sudden drop nearly made me give up on this site...

ohno




msg:4155966
 12:17 pm on Jun 21, 2010 (gmt 0)

OK, last week was our best in months, today it's like a switch has been flicked yet again, no traffic, no sales, dead.

tomapple




msg:4155969
 12:28 pm on Jun 21, 2010 (gmt 0)

Traffic was slightly below our post "Mayday" average this weekend. We have not received an order since Saturday afternoon (6 sites).

ohno




msg:4155974
 12:42 pm on Jun 21, 2010 (gmt 0)

Which I'm assuming is not in any way normal for you?

tomapple




msg:4155976
 12:48 pm on Jun 21, 2010 (gmt 0)

Sorry for not being clear. No, this is not normal activity (even for the depressed state we have been in since April). Expected 10 to 15 orders in that time frame.

All trademarks and copyrights held by respective owners. Member comments are owned by the poster.
WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved