Forum Moderators: Robert Charlton & goodroi

Tracking placement in SERP for many keywords?

xnavigator

10:18 am on Apr 24, 2010 (gmt 0)

10+ Year Member



Hello guys, these days my website is getting some weird traffic changes...

Unique visitors can change by 20% from one day to the next (we are talking about a website that gets more than 20k uniques/day).

Since almost all of my traffic comes from Google, is there a way to record my keywords' positions in the SERPs without spamming Google with automated requests?

I can't do this manually since I have more than 10,000 different keywords generating traffic to my site...

Thanks everyone for the help

tedster

5:02 pm on Apr 24, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



First, appreciate that there is no one position for a URL on any keyword search. Google's results vary according to many factors. For example, variation is introduced by which data center Google's load-balancing sends a request to, by geographic location of the user, by browser, by "search history" personalization when a user is logged in to any Google account, and by cookie-based customization when a user is logged out. In recent times, some people are even reporting that "time of day" also seems to affect some search results.

That said, many rank tracking tools do exist, some free and many paid. But they all send automated requests to Google. And the positions they record are subject to all the above fuzziness.

There is one way to get some ranking data without sending automated queries - see this current thread WMT Now Shows ALL Keywords - with CTR and Position [webmasterworld.com]. As you'll note in the discussion, the accuracy of the data is questioned by some, and certainly given my earlier comments, the positions reported must be averages.

In your situation, the first place I would turn is whatever analytics package you are using - it should be capturing traffic by keyword.

It seems unlikely that ALL keywords are down by 20% across the board. Instead, something like the 80-20 rule is likely in play - and it may well be more like 95-5 or even 99-1 in the case of a large website. With analytics data, the loss of traffic can probably be pinned down to a significant degree, and that would give you far fewer keywords to examine in your diagnosis.
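To illustrate the 80-20 idea, here is a rough sketch of how you might check what fraction of keywords drives most of your traffic. The keyword names and visit counts below are made up for the example; the real numbers would come from your analytics export.

```python
# Hypothetical keyword -> visits data; in practice this would be
# loaded from an analytics export rather than typed in by hand.
visits = {
    "blue widgets": 9000,
    "buy widgets": 4500,
    "widget repair": 1200,
    "widget colors": 300,
    "antique widget": 40,
}

total = sum(visits.values())
ranked = sorted(visits.items(), key=lambda kv: kv[1], reverse=True)

# Walk down the ranked list until we've covered 95% of all visits.
head, covered = [], 0
for keyword, v in ranked:
    head.append(keyword)
    covered += v
    if covered / total >= 0.95:
        break

print(f"{len(head)} of {len(visits)} keywords cover "
      f"{covered / total:.0%} of traffic: {head}")
# -> 3 of 5 keywords cover 98% of traffic
```

With a distribution like this, a 20% sitewide traffic drop can usually be traced to a handful of head keywords rather than all 10,000.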

xnavigator

10:22 pm on Apr 25, 2010 (gmt 0)

10+ Year Member



Well, after searching on Google for SERP tracking apps I have found many of them.
Can you suggest one? (free or paid isn't a big problem)

For analytics I use Google Analytics (of course).
The problem is that there are more than 10,000 keywords generating traffic daily, so I can't browse through all of them manually...

And I don't think GA can sort the results to put the keywords with the biggest drops on top, or can it?

thanks

tedster

10:25 pm on Apr 25, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



No, sorry - this forum does not discuss specific SEO tools (see the Google Forum Charter [webmasterworld.com]).

You can export the Analytics data in a .csv file, and then use Excel or another spreadsheet application to massage the data however you want.
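A script can do the same massaging as a spreadsheet. The sketch below assumes two keyword exports (one per date range) with hypothetical "Keyword" and "Visits" columns; the real export's column names and layout may differ, and the data here is inlined only to keep the example self-contained.

```python
import csv
import io

# Stand-ins for two exported files, e.g. open("before.csv") in practice.
before_csv = """Keyword,Visits
blue widgets,9000
buy widgets,4500
widget repair,1200
"""

after_csv = """Keyword,Visits
blue widgets,8800
buy widgets,2100
widget repair,1150
"""

def load(text):
    """Parse a keyword export into a {keyword: visits} dict."""
    return {row["Keyword"]: int(row["Visits"])
            for row in csv.DictReader(io.StringIO(text))}

before, after = load(before_csv), load(after_csv)

# Biggest absolute losers first - these are the keywords to diagnose.
drops = sorted(((before[k] - after.get(k, 0), k) for k in before),
               reverse=True)
for lost, kw in drops:
    print(f"{kw}: -{lost} visits")
# -> buy widgets tops the list with a loss of 2400 visits
```

Sorting by the drop puts the few keywords responsible for most of the lost traffic at the top, which is exactly the 80-20 diagnosis described earlier in the thread.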

xnavigator

11:28 pm on Apr 25, 2010 (gmt 0)

10+ Year Member



Will try the CSV export, lol.

I didn't think of that before.

xnavigator

9:00 pm on Apr 29, 2010 (gmt 0)

10+ Year Member



OMG

It seems Google Analytics has some weird bug in its export tool...

I get two totally different values in Google Analytics and in the exported CSV file...

Can someone test it please? I am going mad right now.

[edited by: tedster at 10:55 pm (utc) on Apr 29, 2010]