
Google SEO News and Discussion Forum

What user metrics does Google use to determine rank?
diberry
msg:4505616 - 8:36 pm on Oct 8, 2012 (gmt 0)

We've talked a lot around here about Google using user metrics to rank webpages. It's certainly an approach that would make sense, but it also opens a few questions. Assuming Google is being truthful when they say they don't use Analytics data to rank web pages:

--What data could they be using to gather user metrics? Adsense? Chrome? Cookies from the Google search page to see how long searchers stay on a SERP before clicking back?
--Exactly what metrics could they be gathering from their sources, and which metrics would they not be able to access?
--Once they gather these metrics, how are they interpreting them? For example, can they tell the difference between a bounce that happens because a visitor got all they wanted from a site and a bounce that happens because the searcher didn't like the page?

 

deadsea
msg:4506987 - 5:09 pm on Oct 11, 2012 (gmt 0)

Thanks for that link, Tedster. I haven't used Google Analytics in the past to track my bounce rate. But today I'm implementing Google Analytics event tracking for some of the key uses of my interactive pages. Hopefully I will have a better opinion of their bounce rate metric soon.
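
A minimal sketch of that kind of ga.js event tracking, assuming the standard async _gaq snippet is already installed on the page (the "Calculator" category and the trackCalculatorUse helper are hypothetical names, used only for illustration):

    // Assumes the standard ga.js async loader has already defined/queued _gaq.
    var _gaq = _gaq || [];

    // Hypothetical example: report a key use of an interactive page as a GA event.
    // By default an event counts as an interaction, so the visit stops being a bounce.
    function trackCalculatorUse(label) {
      _gaq.push(['_trackEvent', 'Calculator', 'calculate', label]);
    }

    // e.g. call trackCalculatorUse('mortgage-form') from the widget's submit handler.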

deadsea
msg:4507262 - 10:34 am on Oct 12, 2012 (gmt 0)

I got the Google Analytics ping back working when somebody uses my interactive content. My bounce rate has fallen from 82% down to 20% as reported by Analytics.

Has anybody tried to get ad clicks removed from the bounce rate? I've done it before when running a house ad system with log-file analytics, by having all the ad clicks go through a tracked redirector. It seems like it will be a little harder with these third-party tracking and ad products.

I was thinking about adding an "onmousedown" event to the div around the ads so that I can ping Google Analytics with an event when somebody is going to click.
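
A rough sketch of that idea, assuming the standard ga.js async tracker and a hypothetical wrapper id of "ad-wrapper" around the ad unit:

    // Assumes the standard ga.js async loader has already defined/queued _gaq.
    var _gaq = _gaq || [];

    // Hypothetical wrapper element around the ad unit; adjust the id to your markup.
    var adWrapper = document.getElementById('ad-wrapper');

    if (adWrapper) {
      // onmousedown fires before the click navigates away, giving the event a
      // chance to be queued; it also marks the visit as interactive (not a bounce).
      adWrapper.onmousedown = function () {
        _gaq.push(['_trackEvent', 'Ads', 'click-intent', document.location.pathname]);
      };
    }

There's no guarantee the event reaches Google Analytics before the browser leaves the page, and mouse events inside a cross-domain ad iframe won't bubble up to the wrapper div, which is part of why the tracked-redirector approach is more robust.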

diberry
msg:4507493 - 3:52 pm on Oct 12, 2012 (gmt 0)

Tedster, the Adjusted Bounce Rate sounds like an even better metric than Exit Rate for determining whether a user got what they wanted. Clicky (what I use for stats) has this built in - they only count as "bounces" users who both hit the back button AND stay less than 30 seconds.
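
For comparison, a commonly cited way to approximate an adjusted bounce rate in Google Analytics is a timer-based event; a minimal sketch, assuming the standard ga.js async snippet and a 30-second threshold:

    // Assumes the standard ga.js async loader has already defined/queued _gaq.
    var _gaq = _gaq || [];

    // Fire an event once the visitor has stayed 30 seconds. Because the event
    // counts as an interaction, single-page visits longer than 30 seconds are
    // no longer reported as bounces.
    setTimeout(function () {
      _gaq.push(['_trackEvent', 'Engagement', 'stayed-30-seconds']);
    }, 30000);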

Could Google measure the lag between someone clicking a result and hitting the back button to Google? If so, that would give them some sense of adjusted bounce rate to feed into the algos without using Analytics data. Would they be able to track things like that I click a result on Google, look at the webpage for 10 seconds and then exit the page by clicking a non-Google (say, Federated Media) ad? If so, this could be a pretty powerful indicator of user satisfaction.

If not, it would probably still work well on large sites, where lots and lots of incomplete data can still show patterns. And didn't a lot of us note that Panda seemed to start with bigger sites? So if claarky's right that Panda is mainly about one or two metrics, this one might be a biggie.

Checking this theory against my own sites: I'm pretty sure my Penguinized site just got Pandalyzed on the Sep 27 update, even though its adjusted bounce rate is pretty decent. But the pages that have been hardest hit are the ones that have a number of links to other websites. Users tend to visit these pages, click a link, come back to my page, click another link and often end up subscribing/bookmarking me... it may be that Google isn't recognizing that as site engagement because it can't track everything I can.

Whitey
msg:4507625 - 9:59 pm on Oct 12, 2012 (gmt 0)

Why wouldn't AdWords teach them something about how a site in the organic SERPs should look and perform?

webindia123
msg:4508106 - 10:16 am on Oct 15, 2012 (gmt 0)

Google's reply should be taken with a pinch of salt. We ran a controlled experiment at the end of 2010 and found that GA data was used by Google in deciding the SERPs.

Google uses data sets from the entry points of all of its verticals. In fact, to such an extent that if your info@example.com address is deemed a spam ID, then your site gets tanked big time in the Google SERPs. If they are not using Gmail data, how did they decide that?

They do use Chrome data points for their SERPs too. And don't forget, one of the bones of contention with Firefox was the sharing of user data for their internal research. That internal research can't be for building an opt-in list; it has to be for making search results personalized and relevant.

Ethical practices are invoked subject to business requirements, and 'Do No Evil' just becomes a disguised caption. It's not just Google: any company that becomes a monolith or market leader restates lies as 'facts' once it reaches a level where it cannot be questioned or argued with.

And their lobbyists do the rest.

- Lalit Kumar


diberry
msg:4508150 - 3:06 pm on Oct 15, 2012 (gmt 0)

Lalit Kumar, I do take it with a pinch of salt and I assume everyone else does as well. Google's under no obligation to be truthful to us. But the reason I suggested we take Google at their word for the purpose of this thread was to see if we could reverse-engineer the data they are getting, and where they would need to be getting it from... which could actually help illuminate whether they're using data sources they've said they weren't.

crobb305
msg:4508458 - 2:12 pm on Oct 16, 2012 (gmt 0)

I wonder how country/IP blocking in .htaccess could impact the metrics. I still see a lot of people from blocked countries hitting my 403 page from Google search. I have blocked those countries for a reason, and they have been getting a 403 for years. But Google still ranks me there and not in the U.S. It's very frustrating. I wonder if I should remove the country block and just risk the potential dangers that I encountered prior to implementation.

webindia123
msg:4508558 - 7:35 pm on Oct 16, 2012 (gmt 0)

@crobb305
But, Google still ranks me there and not in the US

So is this still the status quo even after:

1) Extensive localized SEO
2) Keeping the US as the preferred location through Google WMT
3) Targeting local directories and listings

- lalit kumar

diberry
msg:4519109 - 6:02 pm on Nov 14, 2012 (gmt 0)

I have a theory, and I can't think how Google could possibly determine it from user metrics, but it matches some movement I've observed in the SERPs over the past year.

Could Google be (a) determining which sites heavily rely on Google for traffic and (b) scoring them lower on the assumption that optimizing for Google has been that site's only attempt to get traffic, and therefore the site owner is all about rank manipulation rather than building a great internet for people to surf?

I've seen in the SERPs a number of good but heavily Google-dependent sites fall in the past year or so while not-so-great but less Google-dependent sites rose. Of course I can only make these observations on sites where I have information about their traffic - my own, or the sites of webmasters I've talked to.

Has anyone else noticed anything like this? And is there a way Google could tell that most of your traffic is coming from them?

indyank
msg:4519236 - 3:12 am on Nov 15, 2012 (gmt 0)

And is there a way Google could tell that most of your traffic is coming from them?


Their products like Google Analytics, Google AdSense, Ad Planner, Google Trends, etc. can easily determine your overall traffic. In addition, they can use data that is supposedly third party as far as the general public is concerned, such as data from ISPs, toolbars and other external sources.

GreenDog18
msg:4519391 - 4:26 pm on Nov 15, 2012 (gmt 0)

I had a site that was around 12 months old and I never really could get the traffic very high. My conclusion was that Panda had hit the site without me noticing due to the lower traffic.

Here is what I did:

I had a page on "Blue Widgets", a very long page at that. My line of thought was that visitors were visiting my site and returning to the SERPs after reading and getting the information they needed. I also figured they would continue browsing, which "could" tell Google they didn't find what they were looking for, when in reality they may have.

So I created a "Blue Widget" article and then another set of articles that covered the different sectors of "Blue Widgets", such as "Blue Widget Coupons", "Where are Blue Widgets manufactured", etc. All of these small articles were linked to from the main "Blue Widget" article. I also cross-linked these smaller articles to keep the visitor moving from link to link. Since these smaller articles only contained 200-300 words of text, I put a noindex tag on them.

So now if a google user searched "Blue Widgets":

1. Land on the base, "Blue Widget" article.
2. Read the article.
3. Click thru to the "Blue Widget Coupon" article.
4. Read the article.
5. Possibly click on to something else within the site.

Instead of:

1. Landing on the base page "Blue Widget". Getting what they needed.
2. Bouncing back to SERPS to continue browsing.

Traffic doubled on the last Panda update. I have charts to prove it.

Not saying this is right or is the answer but after one year the site has finally started taking off after these changes.

Final thought:

My view on Panda is that it measures engagement on your site. You could have the best content in the world, but if it doesn't appear the user is engaged, then Google has no way of telling how good your content is.

BTW, I wonder how WW ranks for the word, "Widget". ;)

diberry
msg:4519401 - 5:16 pm on Nov 15, 2012 (gmt 0)

Indyank, so it's *possible*, and now the only question is whether it's plausible that they would do that.

I think it is, because if your site is, say, way more Google dependent than others in your niche, that actually could indicate you're big on SEO, which they seem to consider a generally bad thing now. They may not drop a site's rankings for this alone, but I can buy it being a factor.

Shaddows
msg:4519403 - 5:21 pm on Nov 15, 2012 (gmt 0)

BTW, I wonder how WW ranks for the word, "Widget".

Not remotely well.

Traffic doubled

Not to be nit-picky, but what metric are you using? Monthly unique visitors? Monthly visits (sessions)? Pageviews?

Good to hear a positive Panda impact in any case. Revenue up?

GreenDog18
msg:4519570 - 4:10 am on Nov 16, 2012 (gmt 0)

I based the traffic growth on a 3-month average. Every day for the last week has been twice my average daily volume for the last three weeks.

Traffic had been the same for about 8 months leading up to the last Panda update.

Today it was up even more... Wish I could post a chart.

indyank
msg:4519742 - 4:59 pm on Nov 16, 2012 (gmt 0)

They may not drop a site's rankings for this alone, but I can buy it being a factor.


They might apply it indirectly too.

I had a page on "Blue Widgets", a very long page at that.


First, congratulations on the increased traffic.

Did you split the long page into smaller articles, or did you create new articles on topics that were closely related but still not covered in the long article?

Did you also "nofollow" the links to those new articles on the main "Blue widgets" page, in addition to adding "noindex" to those individual smaller articles?

Has traffic increased to this particular page or the whole site?
