Forum Moderators: DixonJones
I'd like to do a little more than what's offered by most run of the mill trackers, so what would you like to see? I'm especially looking for things that you feel help you with SEO, and are not currently available (as a standard report), or cost $$$ to add-on to your current tracking system.
Thanks in advance!
Auto-detection managed by logging on to a central server where bot IP's can be added as new ones are discovered - maybe WebmasterWorld can manage a bit of bandwidth for that?
TJ
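A central list would only help if trackers can refresh and apply it easily. A minimal sketch of the consuming side, assuming a plain one-IP-per-line file format with `#` comments (the format and any central server are assumptions, not an existing service):

```python
# Sketch: tag hits as bot traffic using a shared bot-IP list that the
# tracker refreshes periodically. The one-IP-per-line format with '#'
# comments is an assumption about how such a list might be published.
known_bot_ips = set()

def load_bot_ips(text):
    """Parse a plain-text list: one IP per line, '#' starts a comment."""
    for line in text.splitlines():
        entry = line.split("#", 1)[0].strip()
        if entry:
            known_bot_ips.add(entry)

def is_bot(ip):
    return ip in known_bot_ips

# Example list as it might be fetched from the central server.
load_bot_ips("66.249.66.1  # googlebot\n202.165.102.7\n")
print(is_bot("66.249.66.1"))  # True
```

Newly discovered bot IPs would just be new lines in the shared file, so no code change is needed to pick them up.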
bcc1234, do you mean P3P string rather than CPC?
I meant CPP (Compact Privacy Policy) which is a part of a P3P specification.
You don't need the rest of it. Just CPP will do it for most browsers.
<added>
Even while typing this response, I typed CPC. I guess I type CPC way too much and my hands got used to it :)
Freudian typo? :)
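For anyone trying this: the compact policy travels as an ordinary HTTP response header on the tracking pixel. A minimal sketch, with placeholder CP tokens that would need to be generated from your site's real privacy policy:

```python
# Sketch: response headers for a tracking pixel carrying a P3P compact
# policy (CP). The CP tokens here are placeholders only; a real policy
# must be derived from your site's actual privacy practices.
def pixel_headers():
    return {
        "Content-Type": "image/gif",
        "P3P": 'CP="NOI DSP COR NID CURa ADMa OUR"',
        "Cache-Control": "no-cache, no-store",
    }

print(pixel_headers()["P3P"])
```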
Tracker wish list - drilling down visitor paths, i.e. the ability to track visitors to a certain page all the way back to their original referrer.
I second trillian on bot detection, and also on the ability to add new bots. I also like customisation of search engine tracking, so I can add a new search engine referrer and tell the tracker where to look for the query string. Lots of options = good ;)
I second trillian on bot detection, and also on the ability to add new bots.
That's the beauty of tracker pixels. Even though I have never even included the pixel URL in robots.txt, I haven't had a single request from a bot.
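For readers who haven't used one: a tracker pixel is just a tiny image endpoint that logs whoever fetches it, and bots that don't fetch images in pages never hit it. A minimal WSGI sketch (the logged fields are illustrative):

```python
import time

# The classic 43-byte 1x1 transparent GIF used as a tracking pixel.
PIXEL = (b"GIF89a\x01\x00\x01\x00\x80\x00\x00"
         b"\x00\x00\x00\xff\xff\xff"
         b"!\xf9\x04\x01\x00\x00\x00\x00"
         b",\x00\x00\x00\x00\x01\x00\x01\x00\x00"
         b"\x02\x02\x44\x01\x00;")

hits = []  # in a real tracker this would be a database, not a list

def pixel_app(environ, start_response):
    """Sketch of a WSGI tracking pixel: record the request, return the GIF."""
    hits.append({
        "time": time.time(),
        "ip": environ.get("REMOTE_ADDR", ""),
        "referrer": environ.get("HTTP_REFERER", ""),
        "ua": environ.get("HTTP_USER_AGENT", ""),
    })
    start_response("200 OK", [("Content-Type", "image/gif"),
                              ("Cache-Control", "no-cache")])
    return [PIXEL]
```

The page embeds it as `<img src="/pixel.gif">`, and every row in `hits` is a real browser view rather than a raw log line.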
In other words, with a year's data, tell me what unique activity has been going on in the last week.
Also, storing that info helps to determine patterns. For example you might find that people come to a certain page and then leave, but another similar page keeps visitors longer and they tend to make a purchase.
You can cross-reference many things and find out much more than you ever could with raw log files. If you find that people who come from a source A and visit a page B during their session are more likely to buy - then you might direct traffic from source A straight to the page B.
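The source-A/page-B idea can be sketched directly from stored session records. The field names below are assumptions about how sessions might be kept:

```python
from collections import defaultdict

# Sketch: cross-reference stored sessions to find which (referrer source,
# page visited) pairs convert best. The session record fields ("source",
# "pages", "bought") are illustrative assumptions.
sessions = [
    {"source": "A", "pages": ["/home", "/b"], "bought": True},
    {"source": "A", "pages": ["/home"], "bought": False},
    {"source": "C", "pages": ["/b"], "bought": False},
]

def conversion_by_source_page(sessions):
    stats = defaultdict(lambda: [0, 0])  # (source, page) -> [visits, buys]
    for s in sessions:
        for page in set(s["pages"]):
            stats[(s["source"], page)][0] += 1
            stats[(s["source"], page)][1] += s["bought"]
    return {k: buys / visits for k, (visits, buys) in stats.items()}

rates = conversion_by_source_page(sessions)
print(rates[("A", "/b")])  # 1.0
```

A high rate for ("A", "/b") is exactly the signal for routing source-A traffic straight to page B.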
I was really against pixels for a long time and considered them to be somewhat messy, until I actually tried it.
Another approach I might take (if I could only find some time) is to write an Apache module that would process every request and do the same thing as a tracker pixel does. There is a similar module out there, but it does not do exactly what I need. In that case, I would really have to worry about bots. Image files won't be a problem because I could filter the requests by the content type Apache itself sends in the response.
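Short of writing the module, one can approximate that filtering on raw logs, using the requested file's extension as a stand-in for the Content-Type Apache would have sent. A rough sketch:

```python
import re

# Sketch: keep only page requests from a combined-format access log,
# skipping static assets by extension (a stand-in for filtering on the
# response Content-Type, which an Apache module would see directly).
LOG_RE = re.compile(r'^(\S+) \S+ \S+ \[[^\]]+\] "(?:GET|POST) (\S+)[^"]*"')
SKIP = re.compile(r'\.(gif|jpe?g|png|css|js|ico)(\?|$)', re.I)

def page_requests(lines):
    for line in lines:
        m = LOG_RE.match(line)
        if m and not SKIP.search(m.group(2)):
            yield m.group(1), m.group(2)  # (client IP, requested path)

sample = [
    '1.2.3.4 - - [01/Jan/2004:10:00:00 +0000] "GET /index.html HTTP/1.0" 200 512',
    '1.2.3.4 - - [01/Jan/2004:10:00:01 +0000] "GET /logo.gif HTTP/1.0" 200 99',
]
print(list(page_requests(sample)))  # [('1.2.3.4', '/index.html')]
```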
So if you ever get to starting a project for the tracking system, send me a sticky, I might get on it too.
What about newbies though? I tend to focus on the nitty-gritty when thinking about tracking, but what would be really useful to show newbies (we'll have lots of them) that's not easily available now?
I've been thinking of things like top keywords by page with the ability to drill down and see those same words by engine. Another would be simply showing top engines by page. Much of this is out there in other packages, but either the $$$ is high, or the knowledge level needed to get at the data is.
Ideally, I want to provide some eye opening reports that the average business person could understand if they just had easy access to them. By 'easy' I mean just clicking a link to the report, no fussing around. For now we are focusing on search engine related reports since it ties nicely with another of our products. Any ideas along those lines?
top keywords by page with the ability to drill down and see those same words by engine
Sounds good, plus make sure you include a link to the actual serps url that the visitor came through. Nothing better than clicking back to see what your visitor saw.
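Both the keyword report and the clickable SERP link can come from the same stored referrer URL: the referrer is the SERP, and each engine's query parameter yields the phrase. A sketch, assuming an illustrative engine-to-parameter map that users could extend (tying in with the earlier wish to add new engines by configuration):

```python
from urllib.parse import urlsplit, parse_qs

# Sketch: recover (engine, search phrase) from a stored referrer so the
# report can show the keywords next to a link back to the actual SERP.
# The engine -> query-parameter map is illustrative and user-extensible.
ENGINES = {"google.": "q", "search.yahoo.": "p", "altavista.": "q"}

def search_phrase(referrer):
    parts = urlsplit(referrer)
    host = parts.netloc.lower()
    for engine, param in ENGINES.items():
        if engine in host:
            vals = parse_qs(parts.query).get(param)
            return engine.rstrip("."), vals[0] if vals else None
    return None, None

print(search_phrase("http://www.google.com/search?q=ruritanian+widgets&hl=en"))
```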
By 'easy' I mean just clicking a link to the report
Summary reports are nice, and are especially helpful for people new to visitor analysis who don't want to mess around with statistics to get the information they want.
For now we are focusing on search engine related reports
I think this is a very sensible direction to go in, as many of the established stats packages ignored search engines for too long ;0)
[added]County/state information - I second that too :)[/added]
A feature to spy on web traffic on an Ethernet interface would be a real killer ;)
1. total session control; from referrer - site path - forwarder information
2. SE referrals as a % of total referrers (excl. direct hits)
3. easy configuration of dynamic urls into measurable content/text
4. easy configuration of specific content groups (country data, product group data e.g.)
5. webbased access
6. fast performance (I use NetTracker and that is extremely good but also extremely slow at analysing)
7. affordable pricing / licensing structure
Search strings grouped by referrer.
E-Commerce links grouped by referrer. (Where do my paying customers come from)
Reverse link from a summary report to the detail entry that caused it, e.g. 4 hits from googlebot.
Bot id should be done by a lookup table in a separate ASCII file.
Custom visitors list in a separate ASCII file so that we can track any domain or IP we want to highlight in a "VIP visitors" list.
The tiniest of tracking codes on web pages.
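The two ASCII-file wishes above (bot ID table, VIP visitor list) could share one loader; the tab-separated pattern/label format below is an assumption about how such hand-editable files might look:

```python
# Sketch of the ASCII-file lookup idea: one pattern per line, optional
# tab-separated label, '#' for comments, so users edit the files by hand.
# The file format itself is an assumption.
def load_table(text):
    table = {}
    for line in text.splitlines():
        line = line.strip()
        if line and not line.startswith("#"):
            pattern, _, label = line.partition("\t")
            table[pattern] = label or pattern
    return table

# One table matches user-agent substrings, the other exact IPs.
bots = load_table("Googlebot\tGoogle\nSlurp\tYahoo\n")
vips = load_table("66.249.66.1\tGoogle crawler\n# comment line\n")

def classify(user_agent, ip):
    for pattern, label in bots.items():
        if pattern in user_agent:
            return "bot:" + label
    if ip in vips:
        return "vip:" + vips[ip]
    return "visitor"

print(classify("Mozilla/5.0 (compatible; Googlebot/2.1)", "66.249.66.1"))
```

New bots or VIP domains are then just new lines in a text file, no reinstall needed.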
For this purpose (pro use, along the lines of the wishes posted), I have one concern only: scalability.
It must be able to handle tons of pageviews. If it can, I could think of quite a few customers.
Anyway, you wrote:
What about newbies though?
- and that's something completely different. You might wish to make two different versions. Newbies, and management as well, don't like too many bells and whistles, imho.
What I think of is something extremely basic:
The newbie/management-wishlist
1) Figures for visitors per page (ranking/toplist), both with and without www. and searchstrings/DB-id's. (simply add them)
1a) a search function to get stats for that one special vip-page that's always there somewhere.
1b) some kind of map showing the many roads that lead to this very interesting page
1c) ability to see how many leave the road at which stages.
2) Ability to get customized stats (no. of visitors/pw) "clustered" like, eg.:
- [domain.com...] (with all underlying url's)
- [domain.com...]
- [domain.com...]
...etc.
2b) The url-hierarchy is not always that logical, so it would be best if some specialist (@ the customer) could set up the relevant groups.
3) time-series (!): so many this week, so many last week, so many same week the year before.
3b) Ability to aggregate # of visitors or pw for a specified period of time (typically an ad-campaign)
4) loyalty: repeat visitors: how many of our visitors this week were also here last week, how many of our current visitors are new?
4a) DNS-lookup on the IPs visiting. The list should show host names, not IPs (IPs are too technical).
4b) toplists of visitors, aggregated by day, week, month. Just three figures: visits, visitors, pageviews.
5) Referrers: Toplist, aggregated by domain. It should not be the very detailed list, including search phrases, that a professional would like to see.
5a) ability to search, by inputting some current advertising-url or affiliate site.
5b) Some possibility for a specialist at the company to aggregate, so that e.g. google.com and google.co.uk become "google".
5c) listing of searchwords by engine.
Those were five basic suggestions. They will provide good and valuable info without getting "too specific". A product that could do only 1-5 would be valuable. I could think of quite a few customers for this product as well. Other customers, that is.
Pay attention to design, though. The interface must be very simplistic, imho.
Hope this was of some use.
/claus
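Wish 5b above (rolling google.com and google.co.uk up into one referrer line) could work from a small rules table that the specialist edits; a sketch, where the rules format is an assumption:

```python
from urllib.parse import urlsplit
from collections import Counter

# Sketch of wish 5b: collapse country-specific engine domains into one
# referrer group via a specialist-edited prefix -> name table. The rules
# format below is an assumption about how such config might look.
GROUPS = {"google.": "google", "search.yahoo.": "yahoo"}

def referrer_group(url):
    host = urlsplit(url).netloc.lower()
    if host.startswith("www."):
        host = host[4:]
    for prefix, name in GROUPS.items():
        if host.startswith(prefix):
            return name
    return host or "(direct)"

refs = ["http://www.google.com/search?q=x",
        "http://www.google.co.uk/search?q=y",
        "http://example.org/page"]
print(Counter(referrer_group(r) for r in refs))
```

The same grouping function feeds the toplist in wish 5, so the aggregation is one config file rather than a code change.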
That is what I would like too. I realise there is no way, other than hard work, to acquire the info, but one can dream.
It is difficult to know whether "ruritanian widgets" and the various combinations of singular and plural widgets, ruritania or ruritanian, word order, or something else altogether will get more searches
The software available (yes, I have it, and know also where to look for free info), while perhaps good for high-volume searches, tends not to be statistically valid for many small businesses. All one can then do is trawl web logs and tweak sites to try to hoover up a wider span of searches.