
Google SEO News and Discussion Forum

Google algo moves away from links, towards traffic patterns
travisk




msg:764208
 11:11 pm on Apr 4, 2006 (gmt 0)

Does anyone else think that Google's actions over the last few years indicate a gradual shift in importance from inbound links to traffic patterns?

Think about it... the Google Toolbar, Google Analytics and click monitoring on the SERPs give Google an incredible picture of where people are going, what pages they stay on, what sites they frequently return to and where they go when they leave.

We know that Google is pushing the toolbar onto consumers. They're paying Dell a billion dollars to install it on 100 million consumer PCs. Imagine what the behavior patterns of 100 million Internet users could tell Google about a particular site's value.

What scares me is that this will push the blackhats from link spamming over to the busy spyware world. Imagine if I could pay some shady company to have the web browsers of 100,000 PCs randomly click on my #10 ranked link and stay on my site until Google decides that I should be #1. Who cares whether these users buy anything on my site? I just want Google to THINK that they're using it. Will Google start bundling anti-spyware with the toolbar to stop this?
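Purely to make that scenario concrete, here is a hypothetical Python sketch of one crude way an engine could spot that kind of paid-click pattern: clients that only ever "use" a single listing look like bots. The log format, the data, and the 50% threshold are all invented; this is not Google's actual method.

```python
from collections import defaultdict

# (client_id, clicked_result) pairs from a hypothetical toolbar/SERP click log
click_log = [
    ("bot-1", "example.com"), ("bot-2", "example.com"), ("bot-3", "example.com"),
    ("user-9", "example.com"), ("user-9", "news.example"), ("user-9", "wiki.example"),
]

# which distinct results each client has ever clicked
clicks_per_client = defaultdict(set)
for client, result in click_log:
    clicks_per_client[client].add(result)

def single_purpose_share(result):
    """Fraction of a result's clicking clients that click nothing else, ever."""
    clients = [c for c, seen in clicks_per_client.items() if result in seen]
    if not clients:
        return 0.0
    lone = sum(1 for c in clients if len(clicks_per_client[c]) == 1)
    return lone / len(clients)

# arbitrary cut-off -- a real system would rely on far richer signals
if single_purpose_share("example.com") > 0.5:
    print("example.com: click pattern looks artificial; discount its click data")
```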

Am I on to something, or has this been going on for years?

[edited by: tedster at 8:38 pm (utc) on April 6, 2006]

 

wanderingmind




msg:764268
 3:21 am on Apr 12, 2006 (gmt 0)

If the idea of giving the little guy a shot at traffic were true --

I know a niche site, very focused, with totally original content. Its visitors are currently all type-ins or from MSN and Yahoo. No AdSense on the site yet.

For some three weeks in March, its pages were in the SERPs. It's a new site, so the rankings were nothing great, but they were there. Around 300 visitors a day came from Google in that period.

Then the pages vanished from Google, and only the homepage remained.

To my mind, for a new niche site, the traffic from Google while its pages appeared in the SERPs should have at least kept them somewhere in the results, even deep down in the index.

That doesn't seem to be happening.

One of my own new sites, again with excellent content (it's an experiment, but I wrote 200 articles for it), has been supplemental forever. Once it escaped supplemental hell and traffic shot up; then it went back to supplemental, and no traffic.

That shot at traffic does seem to happen - but with no lasting effect.

Abhilash




msg:764269
 5:47 am on Apr 12, 2006 (gmt 0)

Does traffic influence Googlebot visitation frequency and, if so, to what degree?

Once we came out of supplemental hell, our traffic numbers picked up suddenly, and I also saw our pageviews per visit increase. Accordingly, I saw the site climb in the SERPs. However, the problem with any conjecture like this is that the conclusions can never really be isolated from other contributing factors.

For example, of course I was also building links and I had added some content a while back. Did the content finally hit? Or was it one of the links that carried quite a bit of weight?



The question is almost impossible to answer. However, there is one definite and problematic situation that I found recently, and if traffic numbers matter, then the problem is extra serious. Please, can anyone offer some help with this?

Two of my top 10 referrers are extremely strange sites that are displaying AdWords AND Yahoo Search Ads simultaneously. My site does not run AdSense or YPN. The sites look too cookie-cutter to have gone through the rigorous Google search partner approval process. The traffic has been extremely low quality for us, resulting in 98% bounce rates, so the terrible pageviews-per-visitor figure (1.1) could be hurting the site badly.

<snip>

I fear some real click fraud here, because one of these sites is showing ads in a very disturbing way. The potential value (PPC) of this traffic has already exceeded $10k. Has anyone else had to deal with this issue? Should I re-post this question somewhere else (although I thought it fit here, since it has to do with traffic numbers and Google)?
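As an aside, the kind of referrer pattern described above can at least be flagged mechanically from a site's own analytics export. A minimal sketch, assuming a simple list of per-referrer totals; the field names, the 1,000-visit floor, and the 95% bounce threshold are all invented for illustration.

```python
# Hypothetical analytics rows: referrer, total visits, single-page visits
referrer_stats = [
    {"referrer": "shady-ads-site.example", "visits": 4200, "bounces": 4116},
    {"referrer": "google.com",             "visits": 9800, "bounces": 3500},
]

for row in referrer_stats:
    bounce_rate = row["bounces"] / row["visits"]
    # flag high-volume referrers whose traffic almost never sticks
    if row["visits"] > 1000 and bounce_rate > 0.95:
        print(f"Review {row['referrer']}: bounce rate {bounce_rate:.0%}")
```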

Thanks in advance.

[edited by: engine at 2:34 pm (utc) on April 12, 2006]
[edit reason] No specific sites. [/edit]

deliriumtremens




msg:764270
 9:59 am on Apr 12, 2006 (gmt 0)

Did I hear someone say that newer sites wouldn't have a chance because the top sites in the serps would dominate the traffic?

New sites have a chance now?

Is user data the key to the true meaning of:
"There isn't a sandbox, but the algorithm might affect some sites, under some circumstances, in a way that a webmaster would perceive as being sandboxed."

As for a voting democracy, I've heard some compelling arguments in favour of a benevolent dictatorship. If Google could become just a little more benevolent, I'd be happy to oversee the human slaves in Google's underground PR mines....

voices




msg:764271
 11:12 am on Apr 12, 2006 (gmt 0)

Google needs to find a way to weed out spam and then rotate the top 50 or 100 results daily. If the results are constantly rotated, the little guy will have a chance and there will be no way for SEO to affect the rankings.
The old way of ranking a page by reading what was actually on the page was the best algo, but also the easiest to take advantage of.

Liane




msg:764272
 11:14 am on Apr 12, 2006 (gmt 0)

This all makes my head hurt. I am soooo glad to be a Mac user, and unless someone does a deal with Apple with regard to a toolbar or other such stuff ... I think I will remain a Mac user! :)

Does anyone know if users can delete the toolbar on the Dell computers once they receive their new machines? The toolbar really is spyware (as has already been mentioned) if you can't "opt out" by choice.

If the results are constantly rotated, the little guy will have a chance

And where does that leave the user? Is that in his/her best interest? Users should receive the best possible results ... not be fed some "little guy's" site just because Google wants to "give them a chance".

trinorthlighting




msg:764273
 1:01 pm on Apr 12, 2006 (gmt 0)

Here is a funny thing. I had one page that visitors were constantly hitting via MSN and Yahoo, but it was not even indexed in Google. I was getting 1,000 visits a day to this page via Yahoo and MSN. I started using Google Analytics, and a week later the page was indexed; now Google is sending a ton of hits to it. Obviously Google is using this data to look at the other search engines: if they see a worthy page that is not in their index but is getting a lot of traffic, they put it in their index....

I would say Google is using the data.

Simsi




msg:764274
 1:52 pm on Apr 12, 2006 (gmt 0)

It makes a lot of sense that user behaviour is preferred to factors that can be manipulated - i.e. linking, keyword domains, etc.

The idea has been around for a long time, and there are various ways to achieve it without major detriment to newer sites. Essentially, it would make for much more relevant SERPs, so I for one hope it continues to receive focus.

bwstyle




msg:764275
 2:03 pm on Apr 12, 2006 (gmt 0)

Google doesn't care about the little guy; it cares (or at least used to care) about delivering the highest quality results to its end-users. So rotating the top 10 or 50 wouldn't make much sense, would it?

voices




msg:764276
 3:12 pm on Apr 12, 2006 (gmt 0)

When I search for a product, I don't want to see the top sites like Amazon and BizRate always at the top. I know these sites and would go straight to them if that were what I wanted. I think the best results should include lesser-known sites. Since Google doesn't want anyone to be able to manipulate the results, rotation would help.

ronburk




msg:764277
 6:10 pm on Apr 12, 2006 (gmt 0)

Google doesn't care about the little guy; it cares (or at least used to care) about delivering the highest quality results to its end-users. So rotating the top 10 or 50 wouldn't make much sense, would it?

Makes perfect sense. Leaving the top 10 or 50 static for extended periods wouldn't make much sense. First, that would leave no way of using click data to correct poor choices that the rest of the ranking algorithm has made. Second, that would leave no way to respond quickly to changes in user needs.

A page that contains the words "Britney Spears" and "giraffe" might not deserve to rank highly for "Britney Spears" today. But if she gets bitten by a giraffe tonight, that could change -- much faster than can be handled by the crawl-index-rank-export_to_datacenter cycle.

It's not as though Google has to constantly make gigantic changes in the top 50. They merely have to conduct periodic sampling tests to see whether a lower-ranking listing is currently more relevant than those on page 1 or 2.
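To illustrate the kind of "sampling test" described above, here is a minimal Python sketch. The epsilon-greedy framing, the 5% exploration rate, and the position windows are my own assumptions for illustration, not anything Google has documented.

```python
import random

def serve_top_10(ranked_urls, epsilon=0.05):
    """Return the 10 results to show for one impression of a query."""
    top = ranked_urls[:10]
    # Occasionally swap a lower-ranked candidate into a low slot on page 1
    # and let its click/dwell data decide whether it deserves to stay.
    if random.random() < epsilon and len(ranked_urls) > 10:
        candidate = random.choice(ranked_urls[10:50])  # sample from positions 11-50
        slot = random.randrange(5, 10)                 # show it near the bottom of page 1
        top = top[:slot] + [candidate] + top[slot:-1]
    return top

ranked = [f"site{i}.example" for i in range(1, 51)]
print(serve_top_10(ranked))
```

Most impressions still see the stable top 10; only a small, measurable fraction carries the experiment, which is what keeps the idea compatible with "not constantly making gigantic changes".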

Oliver Henniges




msg:764278
 6:32 pm on Apr 12, 2006 (gmt 0)

> pageviews/visit

We had a related discussion recently. Two of us (including me) have an average ratio of 2.5, which seems very low compared to what I personally rack up on my visits to WebmasterWorld, for instance.

Low pageviews per visit and even a short visit time could also indicate very good usability on the website in question.

I do believe that Google does analyse and use visitor behaviour, but I also think the level of mathematical abstraction and analysis is far higher than suggested here, e.g. neural networks and statistical correlations we can't think of, simply because we have no insight into that data.

As has been pointed out several times elsewhere by MC and others, Big Daddy's new infrastructure is very much about improving search quality on non-English websites. Any idea whether or how this might be related to traffic patterns?

JanFer




msg:764279
 8:03 pm on Apr 12, 2006 (gmt 0)

That's a good point, Oliver. Average pageviews can represent opposite statistics.

A site with poor navigation and/or empty content would have a higher average pageview count than a site which gives its visitors exactly what they're looking for within a click or two.

Conversely, lots of pageviews can mean a site has a lot of interesting pages, which encourages visitors to dig deeper.

So, lots of pageviews can mean a site is crappy, but not so crappy visitors leave immediately, or it can mean that the site is relevant and has a lot of interesting content.

Low pageviews can show that a site is well organized and gives visitors exactly what they're looking for in an easy-to-find format, or that the site has nothing much of interest.

Therefore, pageview statistics should be taken with a grain of salt.

Wouldn't the rate of return visits be a more accurate demonstration of a site's relevance?

If google were to put emphasis on pageviews as an indicator of relevance, would that not encourage webmasters to make their sites harder to navigate?

Silvery




msg:764280
 8:27 pm on Apr 12, 2006 (gmt 0)

A number of top SEOs have opined that Google cannot yet use the toolbar or other usage info, such as their Urchin webbugs, for deciding the rank of pages. The general assessment has been that this would still be too prone to abuse/manipulation (hacking the Google toolbar would open this method up to being artificially influenced, because people could deploy automated requests against the toolbar APIs to try to make sites seem more popular than they really are).

Someone earlier mentioned that in a Meet the Engineers session, a Google engineer had said that they wouldn't use this as a direct method of deciding rankings, but might use it for indirect influence of some unspecified sort.

Here's my take, based on usage analysis of a Fortune 10 company website: Google is not using Toolbar stats to generate a rank of a page (yet), but they are likely using it as a component of their quality assessments to pull bad pages/sites out of rankings.

Consider: if Toolbar and Urchin usage data were used in combination with the methods proposed in the "Combating Web Spam with TrustRank" paper published out of Stanford, Google would have a strong tool for suppressing spammy, low-quality pages from their SERPs in an automated fashion. We know that they, like other SEs, use a number of staff members to manually assess the quality of search results and to identify spammers and other black hats. But the combination of methods mentioned in their "Information Retrieval Based on Historical Data" patent and the Stanford paper would pave the way to improved automated assessment methods.

Simplistic explanation of how this works: the TrustRank study suggested that human assessors could rate a sample set of internet pages on whether they were good results for a keyword search or not. An algorithm could then be applied to those pages which were ranked as bad, and link structure patterns could be used to take all the pages associated with the "bad" sample set and suppress the low-quality pages at those sites and their related network sites.

Toolbar or webbug usage data could then be used to flag sites with high abandonment rates, or links which users never chose to click, and then suppress the rankings of all sites with pages associated with the bad ones. This would be fairly trustworthy data, and not as prone to manipulation by black hats. After all, it would be using negative info rather than positive: pages not visited, and pages users don't stay on, could be automatically dropped in rankings. It's not as prone to manipulation because black hats can't easily generate a negative; they'd have to stay OFF sites and leave sites in droves, and compared with all the other browser users visiting a site, they wouldn't be able to artificially cause competitor sites to drop in rankings.

So, if my theory is correct, Toolbar/Analytics data is not being used to make your site rank higher, but it could be used to drop you down in the results or smack you out of the indices entirely (if you've got pages that users hate).
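A rough sketch of how the two halves of that theory could be wired together, under heavy simplification: propagate "badness" outward from human-labelled spam seeds along the link graph (TrustRank inverted), flag high-abandonment pages separately, and only demote pages that trip both signals. The graph, abandonment numbers, damping factor, and thresholds below are all invented for illustration.

```python
link_graph = {            # page -> pages it links to (toy data)
    "spam-seed.example":   ["spam-friend.example", "dictionary.example"],
    "spam-friend.example": ["spam-seed.example"],
    "dictionary.example":  [],
}
abandonment = {           # share of visits that bounce straight back to the SERP
    "spam-seed.example": 0.97, "spam-friend.example": 0.93, "dictionary.example": 0.90,
}

def propagate_badness(seeds, graph, damping=0.5, rounds=3):
    """Spread a 'bad' score outward from labelled spam seeds along outlinks."""
    bad = {page: 0.0 for page in graph}
    for s in seeds:
        bad[s] = 1.0
    for _ in range(rounds):
        nxt = dict(bad)
        for page, links in graph.items():
            for target in links:
                nxt[target] = max(nxt[target], damping * bad[page])
        bad = nxt
    return bad

badness = propagate_badness(["spam-seed.example"], link_graph)
for page in link_graph:
    # demote only when link-based badness AND usage-based abandonment agree
    if badness[page] > 0.4 and abandonment[page] > 0.92:
        print(f"demote {page} (badness={badness[page]:.2f}, abandonment={abandonment[page]:.0%})")
```

In this toy run the dictionary page picks up some badness from being linked by a spam seed, but its merely ordinary abandonment rate keeps it from being demoted, which is the "negative signal only" point made above.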

chadlerh




msg:764281
 8:38 pm on Apr 12, 2006 (gmt 0)

Isn't there proof that Google uses the toolbar to find new pages? If they have that capability, isn't it easy to use that same data for ranking purposes? I don't understand why some think Google is not capable.

2by4




msg:764282
 9:01 pm on Apr 12, 2006 (gmt 0)

Silvery, that's one of the more compelling ideas I've seen in this thread, and makes pretty good sense.

TrustRank is my favorite Google product, mainly because it seems to once again find a way to reward websites that are actually trying to create unique, valuable content. It's very hard to emulate the TrustRank link schemes, although I'm sure blackhats are hard at work on the problem, and have been for quite a while. But as long as the initial pool is kept relatively uncontaminated by SEO sites, it will be hard to break into that level, although I'd guess you won't see as much benefit in the heavily commercial categories.

It makes total sense to treat this type of data as a negative page ranking factor: when people leave a page in a consistent manner, it's fairly obvious that it's not useful to searchers.

When I first saw signs of TrustRank, it gave me a lot of hope that Google could work out a way to maintain both its income and the quality of its SERPs. That's a fine line, hard for them to walk.

chadlerh: yes, absolutely, we've had test sites spidered because of that. No question at all. But that part is just the Google indexing system being notified of a URL that someone with the toolbar visited. Even if Google isn't currently using the user data in all the ways it could, it's not throwing it away, that's for sure.

Demaestro




msg:764283
 9:04 pm on Apr 12, 2006 (gmt 0)

I would buy into the Urchin theory more if they hadn't pulled it as a free service.

The fact that it got too big as a free service and they stopped handing out free accounts tells me they either don't care about getting all that data or are simply not ready to use it all. Otherwise they would have found a way to make it work so they could keep getting the massive influx of data.

Once it becomes free again then I will muse over what their true intentions with that tool are.

grant




msg:764284
 9:42 pm on Apr 12, 2006 (gmt 0)

Click Metrics don't work.

Examples:

A user clicks on a SERP result, quickly leaves the site, and moves on. Is the site "relevant" and valuable?

Maybe the user thought the site sucked and left. Or maybe they found their information quickly (price comparing, answering questions, solutions, definitions, etc.).

In virtually every scenario with click metrics, there are completely inverse scenarios that perfectly explain user behavior.

I saw an engineer from MSN speak on this, and he said their research shows that click metrics have far too many assumptions built in.

2by4




msg:764285
 10:04 pm on Apr 12, 2006 (gmt 0)

grant, no need to think about it in complex ways; just look at a simple case:

Site A ranks number 3 for green widgets.

This term is searched for 1 million times a day.

If a large percentage of searchers return to the search page and click on the next option, it's quite clear that Site A is not what the searchers wanted. Individual behavior isn't that important, but it's clearly one way you can measure how close the SERPs are to user expectations; that's why Google included this in their recent patent application.

It's not the only way, of course, but it is one tool you have. And if I read the MSN Search JavaScript correctly, they are tracking clicks too, although that script is hard to read; Google's is radically simpler, about 1 line versus 50 or so. Typical Microsoft.
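As a toy illustration of that aggregation (not anyone's actual pipeline), here is how a per-position "came back and clicked something else" rate could be computed from a simplified click log. The event format and data are invented.

```python
from collections import defaultdict

# (search_id, position_clicked) in the order the clicks happened -- toy data
events = [
    ("s1", 3), ("s1", 4),   # clicked #3, came back, clicked #4
    ("s2", 3),              # clicked #3 and never came back
    ("s3", 3), ("s3", 5),
]

by_search = defaultdict(list)
for search_id, pos in events:
    by_search[search_id].append(pos)

clicks = defaultdict(int)   # total clicks each position received
pogo = defaultdict(int)     # clicks followed by another click in the same search
for positions in by_search.values():
    for i, pos in enumerate(positions):
        clicks[pos] += 1
        if i + 1 < len(positions):
            pogo[pos] += 1

for pos in sorted(clicks):
    print(f"position {pos}: pogo-stick rate {pogo[pos] / clicks[pos]:.0%}")
```

Individual searches tell you nothing, but once millions of them are aggregated this way, a position-3 result that most searchers abandon stands out clearly.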

gregbo




msg:764286
 10:41 pm on Apr 12, 2006 (gmt 0)

If a large percentage of searchers return to the search page and click on the next option, it's quite clear that Site A is not what the searchers wanted. Individual behavior isn't that important, but it's clearly one way you can measure how close the SERPs are to user expectations; that's why Google included this in their recent patent application.

Suppose a representative subset of them is doing SERP comparisons? It isn't clear (for all searches) that hitting the next link is a sign of dissatisfaction.

2by4




msg:764287
 10:56 pm on Apr 12, 2006 (gmt 0)

SEOs and website owners are, for all practical purposes, the only people in the world doing that stuff. Try to think like average searchers; nobody here is an average searcher. The overwhelming majority of users do a search, then either stay on the page, look at it for a bit, or come back right away. I believe the patent application referred to this exact behavior, though I can't remember the exact wording.

Average searchers just know that they typed in a search term and got some pages. They click on them; if a page is right, they stay, and if it's not, they go back and try the next result until they find one they like. Google's goal is to keep people using its product, which means giving users what they are looking for. This isn't particularly earth-shattering information, if you ask me.

Try to step out of the SEO box. SEOs have very little impact on search patterns; we're talking about far too many searches a day for SEOs to influence things, though of course that won't stop them from trying. Obviously they can and do influence which pages land in the SERPs in the first place; the trick here is to get rid of garbage and replace it with reasonably good quality results.

Most people type in a search, then go to a site. If they don't come back to the Google search, the result can be considered a success; if they do, it can be considered a failure. There's no need to make simple things complicated.

Keep in mind that if this is being used, it's simply one factor among many; nobody knows what weighting any one factor has, so stressing about a single component really isn't worth it.

To keep it ultra simple: it's a good thing if your page gives the searcher what they were looking for. It's not necessarily a bad thing if it doesn't, since it's hard to know what they are looking for. But once the search numbers get big enough, this all becomes pure statistics: stuff you can study and integrate into other components.

crobb305




msg:764288
 11:48 pm on Apr 12, 2006 (gmt 0)

Ranking sites based on visitor time spent and movement would penalize sites with easy, intelligent navigation and Google AdSense units with a high CTR.

mattg3




msg:764289
 12:03 am on Apr 13, 2006 (gmt 0)

I do believe that Google does analyse and use visitor behaviour, but I also think the level of mathematical abstraction and analysis is far higher than suggested here, e.g. neural networks and statistical correlations we can't think of, simply because we have no insight into that data.

I would guess the target is less an undefined quantity that is hard to optimise (the world population over all web pages; good luck to anyone trying) and more a clearly defined one, like maximising income. That is a neat figure you can optimise.

Silvery




msg:764290
 12:03 am on Apr 13, 2006 (gmt 0)

Ranking sites based on visitor time spent and movement would penalize sites with easy, intelligent navigation and Google AdSense units with a high CTR.

crobb305, there's a big difference between a user clicking through to content he finds on the page and just hitting the back button to find better results on the SERP.

So, using pattern identification based on visitor time spent on a page and movement doesn't necessarily penalize the good sites, and it could likely be used to red-flag the bad ones for special treatment.

blend27




msg:764291
 12:45 am on Apr 13, 2006 (gmt 0)

"to red-flag the bad ones for special treatment"

as well as to move them up in the SERPs. As long as the site has related content and has been around for a while, why not just 'lift' it up a 'notch', just because there is no budget invested in the site, while the two below it spend like monkeys at the G-Parade?

G-Parade is a registered XMark of the 'Show Me the Budget Corporation'.

P.S. 9 + 2 = 11

europeforvisitors




msg:764292
 12:58 am on Apr 13, 2006 (gmt 0)

So, using pattern identification based on visitor time spent on a page and movement doesn't necessarily penalize the good sites, and it could likely be used to red-flag the bad ones for special treatment.

Plus, it would be only one factor. Let's say that site A and site B both have visitors leaving after viewing only one page. Site A is a dictionary, and site B is a scraper site. Site A is likely to have quality inbound links and a low SEO "spam quotient," while site B is likely to have few (if any) quality inbound links while having site characteristics that make Google's virtual nose wrinkle in any sniff test. Furthermore, site B's users are likely to be leaving via ads or the back button, while site A's users may not have such obvious departure patterns. Given the preponderance of evidence, it wouldn't be unreasonable for Google to assume that site B's lack of stickiness suggests poor quality, while site A's lack of stickiness is neither good nor bad.
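A toy scoring function to make that "preponderance of evidence" point concrete. The signal names, weights, and numbers are invented; the only thing it is meant to show is that low stickiness on its own barely moves a site that otherwise looks clean, while it compounds the damage for a site that already looks spammy.

```python
def quality_score(site):
    score = 0.0
    score += 3.0 * site["inbound_link_quality"]   # 0..1, higher is better
    score -= 2.0 * site["spam_signals"]           # 0..1, higher is worse
    # stickiness only counts against sites that already look suspicious
    if site["spam_signals"] > 0.5:
        score -= 1.0 * site["bounce_rate"]
    return score

dictionary = {"inbound_link_quality": 0.9, "spam_signals": 0.1, "bounce_rate": 0.8}
scraper    = {"inbound_link_quality": 0.1, "spam_signals": 0.9, "bounce_rate": 0.8}
print(f"dictionary score: {quality_score(dictionary):.1f}, scraper score: {quality_score(scraper):.1f}")
```

Both sites share the same bounce rate, yet only the scraper is punished for it, which is the "neither good nor bad" asymmetry described above.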

gregbo




msg:764293
 3:59 am on Apr 13, 2006 (gmt 0)

SEOs and website owners are, for all practical purposes, the only people in the world doing that stuff. Try to think like average searchers; nobody here is an average searcher. The overwhelming majority of users do a search, then either stay on the page, look at it for a bit, or come back right away. I believe the patent application referred to this exact behavior, though I can't remember the exact wording.

What about people who are shopping for something? They may go to several sites from more than one SERP before they find what they were looking for.

annej




msg:764294
 7:32 am on Apr 13, 2006 (gmt 0)

If I could afford it, I'd try getting rid of AdSense to see if that increased time on the site. I'm sure that sites in the top 10 of the SERPs get less time on site simply because visitors are surfing around the search results more.

Another reason people may not stay is a long article, one that could be well researched and full of great, up-to-date information. People tend to take one glance at a long article and just move on.

dataguy




msg:764295
 11:45 am on Apr 13, 2006 (gmt 0)

I think another thing to keep in mind is that it is extremely easy for Google to know which of us are webmasters and which web sites we own, so our traffic patterns can be discounted or thrown out altogether.

I run a few small directories, and this is easy even for me to do with my visitors, and I don't have toolbar data, AdSense data, Urchin data, etc. One of Google's patents mentions using previous search criteria to customize new searches for specific users. If Google is aware that when my mother searches for "Turkey" she is looking for recipes and not a geo-location, they certainly know I'm a webmaster with certain search biases.

And I'm sure Google has a special category for those of us who continually search using the site: command.

I wonder how many of us have helped our competitors by searching our target keywords and clicking on their listings? How much time do YOU spend viewing scraper sites, trying to figure out how to destroy them? Maybe you are inadvertently improving their rankings.
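For what it's worth, a hypothetical sketch of the kind of heuristic described above: a user whose history is full of site: queries, or who keeps clicking the same domain, looks like that domain's webmaster, so their clicks could be down-weighted before aggregation. The history format and thresholds are purely illustrative.

```python
def looks_like_webmaster(history):
    """history: list of (query, clicked_domain) pairs for one toolbar/signed-in user."""
    if not history:
        return False
    site_queries = sum(1 for q, _ in history if q.startswith("site:"))
    domains = [d for _, d in history if d]
    top_domain_share = (max(domains.count(d) for d in set(domains)) / len(domains)
                        if domains else 0.0)
    # arbitrary cut-offs: repeated site: queries or one dominant clicked domain
    return site_queries >= 3 or top_domain_share > 0.6

history = [("site:mysite.example", "mysite.example"),
           ("blue widgets", "mysite.example"),
           ("site:mysite.example inurl:blog", "mysite.example"),
           ("site:mysite.example", "mysite.example")]
if looks_like_webmaster(history):
    print("down-weight this user's clicks in the aggregate stats")
```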

abacuss




msg:764296
 1:07 pm on Apr 13, 2006 (gmt 0)

This would be an unfair decision if Google implemented it at all. A website that already has good traffic would simply carry on with the same trend unless something serious happened.

This would be a serious blow for new sites. The sandbox, and now this?

I don't know where these Google people are heading.

Gimp




msg:764297
 1:39 pm on Apr 13, 2006 (gmt 0)

abacuss

This is a perfect non-issue. Until Google implements something, why start a row over something that does not exist? Advice on what to do is given repeatedly in these forums. Why not just follow it and stop worrying about the sky falling?

grant




msg:764298
 4:14 am on Apr 14, 2006 (gmt 0)

2by4:

If a large percentage of searchers return to the search page and click on the next option, it's quite clear that Site A is not what the searchers wanted.

This is incorrect. All it says is that the SERP description was not what the user wanted, not THE SITE, since the user hasn't seen the site yet.

Just as PPCers know that the text ad affects CTR, we know the SERP description does as well.

Therefore an assumption has been made about the site, whereas the description in the index is really the determining factor.
