Zombie Traffic from Google and Traffic Shaping/Throttling - Analysis
tedster
msg:4437837 - 4:24 am on Apr 6, 2012 (gmt 0)

We need a dedicated thread to look at this odd phenomenon being reported by a subset of our members. It really isn't about any particular "update" because the apparent signs have been reported since 2008.

I have personally seen just a few examples of traffic shaping and nothing I could really call zombie traffic, but I think it's time for all of us to take the reports seriously and at least give advice on how to analyze what these webmasters are seeing.

To truly make sense of this, we'll need to pull in many areas of Google that we rarely talk about. This ain't your daddy's SEO! Here's a pretty good overview, from 2010: [webmasterworld.com...]

 

Martin Ice Web
msg:4528041 - 9:09 am on Dec 14, 2012 (gmt 0)

backdraft, I agree with your theory, more than ever since they made Google Shopping paid. Organic results are now always below the fold.

All the good e-shops and informational sites that suffered from Panda/Penguin had been at the top because people liked them and linked to them. People were happy with them. This is how the internet and "static" business work. Google cut off those signals for its own profit.

Have you seen a new ecom or informational site come up in the last 8 months because people liked it and linked to it? I haven't. You'd think there would be at least one new site that gains some rankings under the new rules. Or is everyone running a website just an amateur?

I think the zombies are totally deliberately mismatched SERPs. I see it, you see it, we all see it: many of the search results are far away from the query. People click on everything (because G$$gle delivers the best results! :( ) and pretty quickly get the impression that your site is not about their search term. I need only 1-2 seconds to see whether I'm on a matching page.

The next thing that worries me: in an interview, LP said they have pretty good knowledge of how e-business runs. That means they really know who wants to buy and who is looking for information. The ads are always on target!

How they get the information:
- from web history
- from IP address (static, dynamic, mobile)
- from the query
- from analysing the result clicks

Panda is not about content or user experience, but all about how to channel the traffic streams.

Remember the Australian site that was hit by Penguin in April. It was in the press and was also discussed here on WebmasterWorld. I remember Cutts himself advised that the site had some bad links. The guy removed the links and, lo and behold, a Penguin update occurred 2 weeks later and he was back.
Many of us deleted links, and after 8 months: nothing has changed.

SEchecker
msg:4528053 - 10:02 am on Dec 14, 2012 (gmt 0)

There is no point in following G any more (we made that decision in-house)!

They can provide nothing for us, organic or paid. Organic = junk & zombies (no referrer in our statistics produces such a bad bounce rate or so many inactive users as G, and G refers 25% of our traffic, so we do have comparison data).

Paid is a never-ending story of sucking out the last cent for the POSSIBILITY of maybe selling. They put 100% of the risk on the back of the advertiser, support is not qualified and has just one goal: to take MORE money! There is absolutely no real control over the clicks you receive, and we all know G's statements if you complain: the issue is always on the advertiser's side! We have statistics here showing perversions like 1 IP downloading products 10 to 80 times, or zombie clicks en masse (click and close), no matter that we target only specific countries... the clickers come mostly from Asia or other non-target countries, and so on...

Their bidding system provides the best results to the highest bidder only. A bidding system that cannot guarantee results, that just offers a possibility or a chance, is like gambling, and favouring the highest spender is unethical in my opinion! I have seen nothing positive from 1 year of running AdWords.

G makes the rules, which is their right! But G is nothing but a BRIDGE - a bridge where a surfer can cross from ONE piece of CONTENT to ANOTHER. G displays OUR CONTENT and wants to take the money from us, the content providers. We, the webmasters and business owners, made them big by allowing them to index our content. Now they are big enough, and it seems they don't need us any more. But they might be wrong about that! The future will show whether I'm right or wrong...

Do not let yourselves be fooled, guys! The internet is built so that one piece of content leads to another. G knows that and is now starting to forbid links, whether you control them or not.

I don't buy the line that Penguin is about spam; in that case G could silently adapt its rating system or filter things out. But G goes public and says that controlled linking is against their guidelines. Because they know that we can all live without them if we control the links we have and get, to some degree! I'm not talking about links whose purpose is spam. I'm talking about freely linking, asking for links, selling links and so on. G has succeeded in scaring the whole webmaster world away from linking, since you never know exactly whether it's right or wrong. Linking is how the internet functions, and G will not change that, even if they try hard!

There are hangout places where you can find your target group; that's where you need to be present and that's where you need to link! Forget the BRIDGE! And maybe the BRIDGE will stop charging to connect ONE piece of content with the OTHER! Take the direct way, that's my advice!

Your best bet is to build your business without them if you can. Most likely the zombies will disappear and let you breathe normally again and focus on your main job: providing info, services and products. Do not give G more food and power...

TypicalSurfer
msg:4528088 - 1:42 pm on Dec 14, 2012 (gmt 0)

>> 2002 conspiracy theory

Google has publicly stated via its CEO that the game has changed - knowledge graph, paid shopping, "getting closer to the action"; even MC has said that ads have value in terms of relevance and quality. This is a far cry from the Google of 2002.

Beating down others who recognize that the ground has shifted is, at best, out of touch - I won't speculate any further than that - but the notion that Google is the pure search engine it was a decade ago is certainly not credible. This is supposedly a webmaster forum, not the Google Defense League.

I've seen zombie stuff and throttling, but backdraft and a few others are doing well enough with the description of the phenomena that I don't feel a need to add to it. I'm not alone.

One question on the mobile/zombie thing... Are the sites of yours experiencing this table-based? While some consider the speed difference between table-based and CSS-based sites laughable, it is real and could be a factor in how mobile traffic gets sent. A lightweight site - CSS-based or table-based - may win a few points in mobile page scoring, enough to send mobile searchers its way.

tedster
msg:4528090 - 2:46 pm on Dec 14, 2012 (gmt 0)

>> not more than one hour later, I get a call from the Google AdWords team offering to get me a #1 spot in Google. Coincidence?

Ouch! If there is any connection at all (and it sure sounds like there could be) that is at the very least a foolish and insensitive thing to do. Any other reports like this?

---------

For those who are seeing periods of zombie traffic, can you do a further analysis comparing the good traffic to the zombie traffic? In particular, I'm wondering whether the "keyword not provided" number is significantly different. That could be a further clue to how this is happening.
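If anyone wants to put a number on that, here's a rough sketch of the comparison, assuming you can export your Google organic visits to a CSV with "date" and "keyword" columns - the filename, column names and date ranges below are purely illustrative:

import csv
from collections import Counter

def not_provided_share(rows):
    # Fraction of visits whose keyword is "(not provided)"
    counts = Counter(r["keyword"].strip().lower() for r in rows)
    total = sum(counts.values())
    return counts["(not provided)"] / total if total else 0.0

with open("google_organic_visits.csv", newline="") as f:  # hypothetical export
    rows = list(csv.DictReader(f))

# Pick one zombie week and one normal week from your own data.
zombie = [r for r in rows if "2012-12-01" <= r["date"] <= "2012-12-07"]
normal = [r for r in rows if "2012-11-01" <= r["date"] <= "2012-11-07"]

print("zombie period: %.1f%% not provided" % (100 * not_provided_share(zombie)))
print("normal period: %.1f%% not provided" % (100 * not_provided_share(normal)))

A big gap between the two numbers would support the idea that more logged-in (personalized) searchers are being routed to the site during one of the periods.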

diberry
msg:4528109 - 4:06 pm on Dec 14, 2012 (gmt 0)

>> Google has publicly stated via its CEO that the game has changed... This is a far cry from the Google of 2002.

>> Beating down others who recognize that the ground has shifted is, at best, out of touch... This is supposedly a webmaster forum, not the Google Defense League.


+1

I believe any theory that ends with "...and then Google makes more money" should not be dismissed out of hand, because making money is what Google does. Such theories should be tested for plausibility: how much money would G make [from zombie traffic]? How could they guarantee the webmaster won't buy advertising elsewhere, or just give up on the site entirely? If they're doing this on a big enough scale to make real money, why aren't we hearing more about it?

BTW, after filing my reinclusion request a day or so before Backdraft mentioned that phone call, I got my "no manual action" email and no suggestion that I should buy traffic. Then again, I don't use AdWords, so maybe that accounts for the difference.

gadget26
msg:4528205 - 6:44 pm on Dec 14, 2012 (gmt 0)

I was one of the "originals" who caused the start of this thread. In light of the current discussion, I thought I would give you an update. Just the facts, because every theory presented here gets shot down.

Old established US Ecom site.

I ran a decent sized adwords campaign for many years. I ramped it down then killed it due to non-performance a few months before Panda 1.0 hit me hard.

I've been infested by zombies and under strict throttling for over a year. Backdraft7 may remember that my zombie pattern seemed to run exactly inverse to his.

Aug 13: Started adwords back up for google product search. It worked pretty well so...

Oct 1: Started regular adwords back up. (Not as big as before, but big enough to test.) Results are mixed. I probably need to do a lot more tuning.

Nov 20: Zombies left and never came back. Throttling also seems gone. Both the quantity and type of visitors seem to be back to normal and smoothly random. Overall referrals seem to average about the same, but the daily and weekly counts now vary the way I would expect them to. The inter-day plots also look natural now, with the expected bumps at the expected times.

I stopped paying, and was hit by Panda several times then by zombies and throttling.

I started paying and the zombies and throttling were lifted.

Coincidence? You decide.

Cheers,

Jeff

Panthro
msg:4528245 - 9:00 pm on Dec 14, 2012 (gmt 0)

Lol.

@Jeff - I'd be interested to know what changes you've made to your site in the meantime, though.

Sorry if you've already mentioned them in this thread - I haven't read the middle chunk.

SEchecker
msg:4528250 - 9:14 pm on Dec 14, 2012 (gmt 0)

Well, as far as I understand, he started to pay again - that was the biggest change. And that's what the majority is doing, if they can afford it. That's one of the reasons why I don't believe in any REAL 100% recovery stories: if you recover, you most likely are not willing to pay any more, or at least not to offer that big a budget!

Make sense?

tedster
msg:4528255 - 9:59 pm on Dec 14, 2012 (gmt 0)

@gadget26 - did you learn anything detailed about your zombie traffic? Any technical reasons why it was so unresponsive and poor for your site, for instance?

Awarn
msg:4528271 - 11:32 pm on Dec 14, 2012 (gmt 0)

Here's one for you. My traffic was off over 50%. I worked and managed to make it to the top of page 2 (after living at the top of page 1 for years). I increased AdWords because now is the busy season and I need to be on page 1. Guess what: after a week of hefty AdWords spend I can't even locate my homepage in the search results. I have increases in Bing and Yahoo, and only paid traffic from G. Personally I think it's time for lawyers.

diberry
msg:4528296 - 2:36 am on Dec 15, 2012 (gmt 0)

Jeff, that's extremely interesting data. It does stretch credulity as a mere coincidence.

Another possibility occurred to me today:

Despite quarterly earnings, AdWords (and indeed all net advertising) is under pressure to prove itself to advertisers. What if Google is feeding the best traffic to the sites that pay for AdWords, and the zombies are simply all that's left for the rest of you during certain periods?

And maybe this could explain the weird on/off effect - the sense you guys get that somebody flipped a switch and here came the zombies. That could just be the moment some major advertiser launches a big campaign.

tedster
msg:4528315 - 3:40 am on Dec 15, 2012 (gmt 0)

@diberry, how could Google be "feeding the best traffic to the sites that pay for AdWords"? That is, what mechanism could guarantee that "the best" SERP users click on an ad rather than an organic result?

TheMadScientist
msg:4528322 - 3:54 am on Dec 15, 2012 (gmt 0)

Okay, I skimmed and got to this part:
>> I get a call from the Google AdWords team offering to get me a #1 spot in Google. Coincidence?

I can see why you're going "ah ha!" (and I can definitely see how it would feel like a b*tch slap), BUT I can only see it as definitively zombie-traffic related if people who submit a reinclusion request for other reasons, while they're still in the results, don't get a similar 'opportunity' to be number one in Google for the right price.

IOW: I can see pushing AdWords to people who submit requests, especially if they're still in the results, because by submitting a request while still in the results they're indirectly saying "I want to rank better." The only way I could see the call being definitively correlated to zombie-traffic periods is if everyone else who submits a reinclusion request is excluded from the advertising offer. Otherwise, all I see is: a reinclusion request is made, which indirectly says "I'd like to rank better", so Google's AdWords team tries to capitalize on a sales opportunity and "sell a number one ranking". That's not definitively cause-and-effect of zombie traffic; it's cause-and-effect of a reinclusion request filed while still in the results - and possibly of any reinclusion request in general.

Bewenched
msg:4528323 - 4:15 am on Dec 15, 2012 (gmt 0)

>> AdWords (and indeed all net advertising) is under pressure to prove itself to advertisers.

If they really want to prove it to advertisers, then they need to send us the referrer.

What I also see in our niche is that the big-time advertisers are showing up much, much higher in the natural SERPs... even when searching for a very long handwritten phrase that we have on our page and they do not... coincidence?

TheMadScientist
msg:4528328 - 4:25 am on Dec 15, 2012 (gmt 0)

>> even when searching for a very long handwritten phrase that we have on our page and they do not... coincidence?

But it sounds to me like you're still thinking search is "keyword based" rather than, for lack of a better way of explaining it, "definition based". If the competing site has a phrase that's algorithmically defined to equate to essentially the same thing your phrase is about, then I'm not surprised at all - especially if you click on the competing site's page(s) to see whether they contain the phrase more often than you click on your own site for the same query or type of queries... That has nothing to do with AdWords spend and everything to do with the new algo we're dealing with.

I expanded on this a bit with a question in the 'All About the Links' thread [webmasterworld.com...] because it raised a question for me and I don't want to get way OT here in this thread.

tedster
msg:4528339 - 5:14 am on Dec 15, 2012 (gmt 0)

OK - so far our analysis of "zombie traffic" has come up with one pretty solid explanation for some cases: mobile traffic. I can see how that might create the on-again, off-again phenomenon - if Google were testing the site in mobile search results.

But how might a stable total level of traffic ("throttling") also result from mobile traffic? That part I cannot even theorize about.

TheMadScientist
msg:4528341 - 5:32 am on Dec 15, 2012 (gmt 0)

I don't think I'd tie that to mobile specifically, but rather to the testing mechanism in general... Result set type 1 for 100 (or N) queries... result set type 2 for 100 (or N) queries... on/off... If there's a relatively constant number of queries per device type for the terms, and the site/page(s) are shuffled back and forth between result types, they'd likely end up with an essentially-the-same number of visits. Where visits would increase, imo, is if they were "defined" as flexible and showed in both sets of results simultaneously. To me it really sounds like there's a test of some kind going on and sites are getting "stuck" in the middle.

Say, for instance (I'm making numbers up as an example): when 100 people with a smaller device (say an iPad) visit a site and the "signal" derived from those users == "user satisfied", the site hits a "check for mobile" threshold and is thrown into the mobile queries for a period of time, or a number of queries, to determine whether it's mobile friendly. To get a good sample, it could be thrown into mobile during different day parts and/or query intent types - on/off... on/off... on/off for a while... It doesn't meet the mobile threshold, so it goes back to the regular results... then it hits the "test for mobile" threshold again and it's back to on-and-off again.

Another trigger besides my "small screen" example for throwing a site into the mobile mix could be G's mobile bot... Has anyone who's seeing mobile traffic checked whether there's any relationship between G's mobile bot and the traffic changes?
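If you want to check, here's a rough sketch that counts hits from Google's mobile crawler per day in a common-format access log, so you can line the counts up against your traffic graph (the log path is illustrative; "Googlebot-Mobile" is the token the mobile crawler puts in its user agent):

import re
from collections import Counter
from datetime import datetime

DATE = re.compile(r'\[(\d{2}/\w{3}/\d{4}):')  # date portion of a common-format log line

daily = Counter()
with open("access.log") as f:  # illustrative path
    for line in f:
        if "Googlebot-Mobile" in line:  # mobile crawler identifies itself in the UA
            m = DATE.search(line)
            if m:
                daily[datetime.strptime(m.group(1), "%d/%b/%Y").date()] += 1

for day in sorted(daily):
    print(day, daily[day])

If the crawler's visits consistently lead the on/off switches in your traffic, that would be evidence for the testing theory above.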

tedster
msg:4528343 - 5:51 am on Dec 15, 2012 (gmt 0)

The reason I zeroed in on mobile is that some members in this discussion pulled that difference out of their log files. So we have some members who actually caught that critter in the wild.

I agree that the entire testing mechanism Google apparently uses is worth considering, but it would be good to pin down what exactly is being tested - again, now that we have the idea, let's capture the actual critter. Here are ideas I've had - differences to look at between zombie traffic periods and regular periods (a rough script sketch for a couple of these follows the list):

1. Analyze the number of "not provided" keywords - Google could be testing some searches against personalized, logged in users.

2. Analyze whether the variety of search terms shifts between long tail and short tail.

3. Google could be testing clearly purchase intent searches against informational searches - trying to determine which type of page your page is.

4. Analyze the country of the user's IP address during the two periods.
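For items 2 and 4, a rough sketch along the same lines, assuming a visit export with "keyword" and "country" columns plus a "period" label you've assigned yourself - all file and column names are hypothetical:

import csv
from collections import Counter

def profile(rows, label):
    # Item 2: share of long-tail queries (4+ words), ignoring "(not provided)"
    words = [len(r["keyword"].split()) for r in rows if r["keyword"] != "(not provided)"]
    longtail = sum(1 for w in words if w >= 4) / len(words) if words else 0.0
    # Item 4: where the visitors come from
    countries = Counter(r["country"] for r in rows)
    print("%s: %.0f%% long-tail, top countries %s"
          % (label, 100 * longtail, countries.most_common(3)))

with open("visits_export.csv", newline="") as f:  # hypothetical export
    rows = list(csv.DictReader(f))

profile([r for r in rows if r["period"] == "zombie"], "zombie")
profile([r for r in rows if r["period"] == "normal"], "normal")

A clear shift in either number between the two periods would tell us which kind of test the site is being rotated through.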

TheMadScientist
msg:4528344 - 6:19 am on Dec 15, 2012 (gmt 0)

Yeah, I noticed it was definitely determined to be mobile in some cases. But for your next question - how "mobile" specifically could do it - I was thinking broadly (for those who haven't determined it's mobile yet) that it might be best to explore a testing mechanism in general. Since we know it's mobile on some sites, though, tracking that a bit more and digging into it further may help us narrow down how it could apply to sites not noticing the mobile/non-mobile shifts...

gadget26
msg:4528347 - 6:29 am on Dec 15, 2012 (gmt 0)

@Panthro:
>> I'd be interested to know what changes you've made to your site in the meantime, though.

After the first Panda I fixed a lot of technical issues. I removed duplicate content. I spell-checked everything. I wrote new descriptions and took new pictures. I used rel=canonical and noindex to prune the site to a fraction of its former size, removing similar and lower-quality pages from Google's index. Then I focused on user engagement statistics, testing and implementing dozens of changes. The last of these major changes was completed over 6 months ago. Every change seemed to result in reduced traffic.

Then I finally relented and split the informational pages out of my main ecommerce website into a sub-domain. (I hated to do this because it was worse for my users, but I wondered if google was confused about whether I was an ecomm or info site.) This was completed about 3 months ago.

The only thing I did in the last three months was reindex a bunch of the pages I had noindexed earlier. Looking at the top pages in the SERPs, it's clear that many of them would not be there if there were a duplicate-content penalty; they actually seem to thrive with many similar pages. (Similar products - remember, we're ecomm.)

Keep in mind that I am positive I am still heavily Pandalized. Google organic referrals are still roughly 30% of what they were before Feb 2011, and Google organic conversions are about normal for this season. It's just that they are now consistent, with no zombie periods. In the long run, that makes a BIG difference.

And if the throttling is also lifted, I will now be "allowed" to increase traffic as I improve further.

Why? I dunno. Guesses?

1) Splitting my info pages onto a sub-domain made my main site more targeted, so Google was no longer confused about whether I was an ecomm or info site. (Were my zombies just info referrals to an ecomm site? An idea I put forth earlier.)

2) Restarting adwords brought many highly targeted visitors to my site, which changed the overall user metrics considerably. Could this cause the lifting of a hypothetical zombie/throttling "penalty"?

Other ideas?

Jeff

diberry
msg:4528357 - 6:52 am on Dec 15, 2012 (gmt 0)

>> @diberry, how could Google be "feeding the best traffic to the sites that pay for AdWords"? That is, what mechanism could guarantee that "the best" SERP users click on an ad rather than an organic result?

You're right, they can't control what we click, but I think they can nudge us in certain directions. Let's say I bid on "red widgets", and Google can tell I'm selling red widgets, not just presenting info about them. If it can separate the buy searchers from the info searchers and show my ad only to the buyers, chances are I'll get some conversions (while someone else gets nothing but info searchers to their ecom site).

I'm sure they could get a lot more specific than that - through their services, I'd bet they have a lot of searchers thoroughly profiled. Another possibility is localization. What if red widgets sell far better to people in Kansas than anywhere else in my selling area? They could make sure my red-widget ads gobble up plenty of exposure to Kansas IPs.

gadget26
msg:4528360 - 7:05 am on Dec 15, 2012 (gmt 0)

@tedster:
>> Did you learn anything detailed about your zombie traffic? Any technical reasons why it was so unresponsive and poor for your site, for instance?

Whether it's intentional or not, Google makes this hard to analyze. The zombies were never on for the majority of a day, and the analytics tools I have work on a daily basis. Plus all of the "keyword not provided" stuff doesn't help.
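(If you have raw server logs, one way around the daily granularity is to bucket Google referrals by hour yourself. A rough sketch, assuming a combined-format access log with a referrer field; the path and the simplified regex are illustrative:)

import re
from collections import Counter

# Combined Log Format: hour is inside [dd/Mon/yyyy:HH:...], referrer is the 2nd quoted field.
ENTRY = re.compile(r'\[\d{2}/\w{3}/\d{4}:(\d{2}):.*?"[^"]*" \d+ \S+ "([^"]*)"')

hourly = Counter()
with open("access.log") as f:  # illustrative path
    for line in f:
        m = ENTRY.search(line)
        if m and "google." in m.group(2):  # referrer contains google => organic click
            hourly[int(m.group(1))] += 1

for hour in range(24):
    print("%02d:00 %s" % (hour, "#" * (hourly[hour] // 10)))  # crude histogram

An on/off pattern that never shows up in daily totals tends to jump right out of a chart like that.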

As I've said before, the main thing I noticed just watching traffic was that the zombies came mostly from informational queries. (Which is why I finally relented and split my informational pages onto a sub-domain.)

I don't know about mobile traffic. I seem to get a lot of mobile traffic that converts (albeit with smaller sales), so I don't think that's it in my case. Sorry, but as with most things Google, there may be no one-size-fits-all answer.

If zombies are real people, I hope we can all agree that they are "mismatched" with the page/site they were referred to - maybe mobile users sent to a non-mobile-friendly site, or informational users sent to an ecomm page/site.

The question then becomes: "Why is Google sending mostly mismatched users to a page/site for hours and days at a time?"

Jeff

bluntforce
msg:4528362 - 7:27 am on Dec 15, 2012 (gmt 0)

@tedster:
>> The reason I zeroed in on mobile is that some members in this discussion pulled that difference out of their log files

Not to nit-pick, but I believe that difference came from a tracking script, not log files. I'm a big believer in log files; they're what allows a deeper understanding of individual users and overall user interaction.

It's also why I don't put a lot of trust in user agents.

TheMadScientist
msg:4528364 - 8:06 am on Dec 15, 2012 (gmt 0)

Am I really still up posting? Gotta sleep, but...

If the JS is getting bad info, so is the log file: if a visitor is spoofing their user agent or IP address to the JS, they're spoofing it in the request to the server too.

They're both based on the same info from the browser - except for browsers not running JS, and we're not concerned with those right now, because we know the "zombie traffic" triggers JS events. So JS is actually the most narrowed-down way to extract the most information from these visitors: more info is available via JS than is available to the server via the request. If you went by server logs alone, you would not get time-on-page, and you wouldn't know which visitors trigger JS events and which don't. All you would see are bounces, with no idea what's really going on or how long a visitor spent on the single page they visited - there's no exit time sent to the server to log, the way you can get one from JS. (IOW: server logs, in this case, would not answer anything near what JS will, and the raw logs would likely send you on a wild-goose chase.)

And since we're being a bit nit-picky (not meaning to be, too much): they're technically both "scripts". One is written in a server-side language and writes the default info sent by the browser at the time of the initial request to a log file. The other is written in a browser-side language, gathers the same default info (and then some) from the same browser that made the request, and sends it to a server-side script to be processed.
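To make that concrete, here's a rough sketch of the kind of cross-check I mean, assuming you log one line per HTML pageview (visitor IP first, from the server side) and one line per beacon your tracking JS phones home - both file names and formats are hypothetical:

def ips(path):
    # First whitespace-separated token on each line is assumed to be the visitor IP.
    with open(path) as f:
        return {line.split()[0] for line in f if line.strip()}

html_visitors = ips("pageviews.log")  # every IP that requested a page (server side)
js_visitors = ips("beacons.log")      # every IP whose browser ran our JS and phoned home

silent = html_visitors - js_visitors  # fetched HTML but never executed the tracking JS
print("%d of %d visitors never triggered a JS event" % (len(silent), len(html_visitors)))

Since we know the zombie visits do fire JS events, a small "silent" set would confirm the JS data is the right place to keep digging.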

xcoder
msg:4528368 - 8:54 am on Dec 15, 2012 (gmt 0)

^ Spot on!

@bluntforce

A user agent sending spoofed headers will affect your server logs exactly the same way as your JS tracking... that's tracking ABC...


Awarn
msg:4528450 - 4:37 pm on Dec 15, 2012 (gmt 0)

I once saw in the data that Google hit my site-search page and ran a series of searches for various terms. I believe it was all mobile traffic, since the user agents mentioned Safari. It was coming from Europe, and it was a lot of searches. I have since changed things so Google doesn't index that page for random words, and now I don't really notice that type of traffic. It could just be a mobile bot that gets into a site's search area, hits the database, and keeps digging.

I think Google is having issues with the whole mobile thing, and maybe we are something of a test group. I wonder how many of the sites hit by Panda or Penguin, or seeing the zombie traffic, are sites that have made attempts to become mobile friendly? Is there a pattern? Are we using redirects for mobile, or CSS, or what? Are we really better off pursuing this mobile market if Google may in fact be having trouble interpreting the results?

For example, many of us sit here and say our traffic is off 40% or more. We are seeing more mobile traffic, which seems to be harder to convert to sales. So we are being hit by Google with normal traffic losses plus increases in mobile traffic that just doesn't convert. Would we be better off playing dumb until the system is perfected - basically abandoning mobile development?

backdraft7
msg:4528459 - 5:28 pm on Dec 15, 2012 (gmt 0)

I think the zombie traffic issue is about more than testing. I think it's simply a methodical stripping of all the good converting keyphrases from a particular site - and it doesn't take much testing to determine what those phrases are. Why would they do such a thing?
1. The money.
2. They can.

What we've been left with is phrases that attract less targeted and lower volumes of traffic. I've now seen a 50%+ loss of sales, yet strangely traffic is only down 25 to 30%. This actually makes sense, because as your traffic drops, the 2% rule becomes harder to fulfil. (For those who don't know: two percent is the conversion rate you can expect, under typical conditions, on your best days. In my book it's the Holy Grail of sales - if you can do 2%, you're doing great. In most cases it's less.)
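The arithmetic bears that out (illustrative numbers only, plugging in the figures above):

baseline_visits = 10000
baseline_sales = baseline_visits * 0.02          # the 2% rule: 200 sales

visits = baseline_visits * (1 - 0.275)           # traffic down ~27.5%
sales = baseline_sales * (1 - 0.50)              # sales down 50%
print("implied conversion rate: %.2f%%" % (100 * sales / visits))  # ~1.38%, well under 2%

So the traffic that's left isn't just less of the same - it converts at roughly two-thirds the old rate, which is exactly what stripping the good keyphrases would look like.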

Look, it's not about quality content - the current SERPs prove that.
Google wants diversity, and the obvious side effect of diversity is sending the customer to multiple places before they find what they were looking for. When your users have the attention span of a flea (most do), by the time they find something useful they've been diverted to a different topic and have forgotten what they were originally looking for.

Google has simply pulled one of the oldest tricks in the grocery business: rearrange the store so your old reliable customers have to look harder for what they came for. Is that a better "user experience"? No. If it were better, we as users would be experiencing it.

I also suggest we not believe everything we're told by MC, LP and the Mountain View PR office - or at the very least, filter it with common sense and real-world observations.

bluntforce
msg:4528470 - 6:27 pm on Dec 15, 2012 (gmt 0)

Six-minute snapshots from a tracking script ignore the other 90% of the hour. Are those snapshots once an hour, once a day, or only when there's an increase in mobile visitors? That isn't defined.

If those snapshots were taken towards the top of the hour and the site had a younger-person focus, I could see a spike in mobile visits between class times as normal. But that information hasn't been provided.

A complete day's server log allows analysis of the overall traffic, including requests for images, scripts and CSS. If images load as the page scrolls, you can of course see those user actions.

It is possible that the zombie phenomenon is due to mobile devices; I just don't feel comfortable arriving at that conclusion based on script excerpts.

FWIW: I use server logs with an analyzer and with a database program, plus GA and Piwik. They each have their purpose.
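As one example of what full logs buy you, here's a rough sketch that flags visitors who requested pages but never fetched a single image/script/css file - behaviour you'd expect from bots rather than people (log path and regexes are illustrative, as before):

import re
from collections import defaultdict

ASSET = re.compile(r'\.(css|js|png|jpe?g|gif)\b', re.I)
REQ = re.compile(r'^(\S+).*?"(?:GET|POST) (\S+)')  # visitor IP and requested path

counts = defaultdict(lambda: {"pages": 0, "assets": 0})
with open("access.log") as f:  # illustrative path
    for line in f:
        m = REQ.search(line)
        if m:
            ip, path = m.groups()
            counts[ip]["assets" if ASSET.search(path) else "pages"] += 1

html_only = [ip for ip, c in counts.items() if c["pages"] and not c["assets"]]
print("%d of %d IPs never fetched an image/script/css" % (len(html_only), len(counts)))

That kind of per-visitor breakdown simply isn't visible in a script snapshot, which is the point about full logs.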

TheMadScientist
msg:4528482 - 6:47 pm on Dec 15, 2012 (gmt 0)

>> If images load as the page scrolls, you can of course see those user actions.

And of course you're using JS for tracking in that situation.


bluntforce
msg:4528483 - 6:55 pm on Dec 15, 2012 (gmt 0)

I have no issues with JS to collect information.

I have issues with conclusions drawn from limited information, although sometimes those conclusions can be correct.

TheMadScientist
msg:4528484 - 6:59 pm on Dec 15, 2012 (gmt 0)

>> Not to nit-pick, but I believe that difference came from a tracking script, not log files. I'm a big believer in log files; they're what allows a deeper understanding of individual users and overall user interaction.

>> I have no issues with JS to collect information.

Really, which is it?
