Forum Moderators: DixonJones


What's your tracking wish list?

Building a web-based tracker; would love some input.


Craig_F

1:57 pm on May 2, 2003 (gmt 0)

10+ Year Member



Looking for some help here. After looking at other options, we decided to build a web based tracking system for our members.

I'd like to do a little more than what's offered by most run of the mill trackers, so what would you like to see? I'm especially looking for things that you feel help you with SEO, and are not currently available (as a standard report), or cost $$$ to add-on to your current tracking system.

Thanks in advance!

trillianjedi

2:30 pm on May 2, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Auto-detection of bots, then naming them and telling you what has been botted, and what hasn't (especially those pages visited a lot by humans).

Auto-detection managed by logging on to a central server where bot IP's can be added as new ones are discovered - maybe WebmasterWorld can manage a bit of bandwidth for that?

TJ

bcc1234

2:52 pm on May 2, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



A tracker pixel and cookies are the way to go.
Make sure you set two cookies: one for the session and one for the visitor. The visitor cookie should have an expiration of at least 6 months.

And make sure your server returns a decent CPC string, otherwise new IE won't accept your cookies by default.
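A minimal sketch of the pixel-and-cookies setup described above, in Python for illustration (the cookie names and the privacy-policy tokens are assumptions, not from the thread; the header in question is the P3P compact policy, as clarified further down):

```python
import uuid
from datetime import datetime, timedelta

# A 1x1 transparent GIF, the classic tracking-pixel payload.
PIXEL_GIF = (b"GIF89a\x01\x00\x01\x00\x80\x00\x00\x00\x00\x00"
             b"\x00\x00\x00!\xf9\x04\x01\x00\x00\x00\x00,"
             b"\x00\x00\x00\x00\x01\x00\x01\x00\x00\x02\x02D\x01\x00;")

def pixel_response(existing_visitor_id=None):
    """Build headers and body for a tracking-pixel response.

    Sets two cookies: a session cookie (no Expires, so it dies with
    the browser) and a visitor cookie good for ~6 months, plus a P3P
    compact-policy header so IE6 accepts the cookies in a third-party
    context. The policy tokens here are illustrative only.
    """
    visitor_id = existing_visitor_id or uuid.uuid4().hex
    session_id = uuid.uuid4().hex
    expires = (datetime.utcnow() + timedelta(days=182)).strftime(
        "%a, %d %b %Y %H:%M:%S GMT")
    headers = [
        ("Content-Type", "image/gif"),
        ("Content-Length", str(len(PIXEL_GIF))),
        # Session cookie: expires when the browser closes.
        ("Set-Cookie", "session=%s; Path=/" % session_id),
        # Visitor cookie: ~6 months, per the advice above.
        ("Set-Cookie", "visitor=%s; Path=/; Expires=%s" % (visitor_id, expires)),
        # Compact privacy policy; without one, IE6 rejects
        # third-party cookies by default.
        ("P3P", 'CP="NOI DSP COR NID"'),
    ]
    return headers, PIXEL_GIF
```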

dkubb

6:11 pm on May 3, 2003 (gmt 0)

10+ Year Member



And make sure your server returns a decent CPC string, otherwise new IE won't accept your cookies by default.

bcc1234, do you mean P3P string rather than CPC?

bcc1234

7:42 am on May 4, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



bcc1234, do you mean P3P string rather than CPC?

I meant CPP (Compact Privacy Policy), which is part of the P3P specification.
You don't need the rest of it. Just the CPP will do for most browsers.

<added>
Even while typing this response, I typed CPC. I guess I type CPC way too much and my hands got used to it :)

pixel_juice

9:51 am on May 4, 2003 (gmt 0)

10+ Year Member



>>Even while typing this response, I typed CPC. I guess I type CPC way too much and my hands got used to it

Freudian typo? :)

Tracker wish list - drilling down visitor paths, i.e. the ability to track visitors to a certain page all the way back to their original referrer.

I second trillian on bot detection, and also on the ability to add new bots. I also like customisation of search engine tracking, so I can add a new search engine referrer and tell the tracker where to look for the query string. Lots of options = good ;)
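The customisable engine tracking described above can be sketched as a data-driven lookup: each engine is just a host fragment plus the name of its query-string parameter, so adding an engine never touches the parsing code. A minimal sketch in modern Python (the engine table entries are illustrative):

```python
from urllib.parse import urlparse, parse_qs

# Engine name -> (host substring, query parameter holding the search terms).
# New engines can be added here without changing the parsing logic.
ENGINES = {
    "google":    ("google.", "q"),
    "yahoo":     ("yahoo.", "p"),
    "altavista": ("altavista.", "q"),
}

def classify_referrer(referrer):
    """Return (engine, search phrase) for a referrer URL, or (None, None)."""
    parsed = urlparse(referrer)
    qs = parse_qs(parsed.query)
    for engine, (host_fragment, param) in ENGINES.items():
        if host_fragment in parsed.netloc and param in qs:
            return engine, qs[param][0]
    return None, None
```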

dcheney

10:10 am on May 4, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



To me the most useful would be some sort of "what's different" feature. In other words, with a year's data, tell me what unusual has been going on in the last week. For example, proportionally more visitors from this search engine or this site compared to the historic trends. Also, more of my visitors going to this page than historically, etc.
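One simple way to implement a "what's different" report is to compare each source's share of traffic last week against its historical share and flag large deviations. A sketch under assumed inputs (plain source-to-count dictionaries; the 1.5x threshold is an arbitrary choice):

```python
def whats_different(history, last_week, threshold=1.5):
    """Flag referrer sources whose share of last week's traffic deviates
    from their historical share by more than `threshold` (as a ratio).

    `history` and `last_week` map source name -> visit count.
    Returns {source: ratio}, where ratio > 1 means "proportionally more
    than usual" and ratio < 1 means "proportionally less".
    """
    total_hist = sum(history.values()) or 1
    total_week = sum(last_week.values()) or 1
    flagged = {}
    for source in set(history) | set(last_week):
        hist_share = history.get(source, 0) / total_hist
        week_share = last_week.get(source, 0) / total_week
        if hist_share == 0:
            if week_share > 0:
                flagged[source] = float("inf")  # brand-new source
            continue
        ratio = week_share / hist_share
        if ratio >= threshold or ratio <= 1 / threshold:
            flagged[source] = round(ratio, 2)
    return flagged
```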

bcc1234

7:38 pm on May 4, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I second trillian on bot detection, and also on the ability to add new bots.

That's the beauty of tracker pixels. Even though I have never even included the pixel url in robots.txt, I haven't had a single request from a bot.

In other words, with a year's data, tell me what unusual has been going on in the last week.

Also, storing that info helps to determine patterns. For example you might find that people come to a certain page and then leave, but another similar page keeps visitors longer and they tend to make a purchase.

You can cross-reference many things and find out much more than you ever could with raw log files. If you find that people who come from source A and visit page B during their session are more likely to buy, then you might direct traffic from source A straight to page B.

I was really against pixels for a long time and considered them to be somewhat messy, until I actually tried it.

Another approach I might take (if I could only find some time) is to write an Apache module that would process every request and do the same thing a tracker pixel does. There is a similar module out there, but it does not do exactly what I need. In that case, I really would have to worry about bots. Image files won't be a problem because I could filter the requests by the content type Apache itself sends in the response.

So if you ever get to starting a project for the tracking system, send me a sticky, I might get on it too.

Craig_F

1:06 pm on May 5, 2003 (gmt 0)

10+ Year Member



This is going along nicely. Some great info here.

What about newbies though? I tend to focus on the nitty-gritty when thinking about tracking, but what would be really useful to show newbies (we'll have lots of them) that's not easily available now?

I've been thinking of things like top keywords by page with the ability to drill down and see those same words by engine. Another would be simply showing top engines by page. Much of this is out there in other packages, but either the $$$ is high, or the knowledge level needed to get at the data is.

Ideally, I want to provide some eye opening reports that the average business person could understand if they just had easy access to them. By 'easy' I mean just clicking a link to the report, no fussing around. For now we are focusing on search engine related reports since it ties nicely with another of our products. Any ideas along those lines?

Chris_R

1:09 pm on May 5, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Good location information

Country
State
or whatever.

So few have this - and you need to do a little bit (ok, a lot) of work to provide it, but it can be done [and yes, I know it isn't 100% accurate].
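The usual approach to country/state lookup is a binary search over a sorted table of IP ranges. A tiny sketch in Python (the two ranges below are made up for illustration; a real tracker would load a full geolocation database):

```python
import bisect
from ipaddress import IPv4Address

# Illustrative range table only; these example ranges are invented.
RANGES = [  # (start_ip_as_int, end_ip_as_int, country)
    (int(IPv4Address("3.0.0.0")),  int(IPv4Address("3.255.255.255")),  "US"),
    (int(IPv4Address("62.0.0.0")), int(IPv4Address("62.31.255.255")), "GB"),
]
STARTS = [r[0] for r in RANGES]  # must stay sorted for bisect

def country_for_ip(ip):
    """Binary-search the sorted range table for the IP's country."""
    n = int(IPv4Address(ip))
    i = bisect.bisect_right(STARTS, n) - 1
    if i >= 0 and RANGES[i][0] <= n <= RANGES[i][1]:
        return RANGES[i][2]
    return None  # as the poster notes, lookups are never 100% accurate
```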

pixel_juice

12:40 am on May 6, 2003 (gmt 0)

10+ Year Member



top keywords by page with the ability to drill down and see those same words by engine

Sounds good, plus make sure you include a link to the actual serps url that the visitor came through. Nothing better than clicking back to see what your visitor saw.

By 'easy' I mean just clicking a link to the report

Summary reports are nice, and are especially helpful for people new to visitor analysis who don't want to mess around with statistics to get the information they want.

For now we are focusing on search engine related reports

I think this is a very sensible direction to go in, as many of the established stats packages ignored search engines for too long ;0)

[added]Country/state information - I second that too :)[/added]

Craig_F

4:37 pm on May 8, 2003 (gmt 0)

10+ Year Member



> Auto-detection of bots

I REALLY want this too, but have no idea how to do this. Anyone know how? We are currently using JS tracking code with an image backup, but most bots ignore all this so we don't see them in the reports.

Thanks!

cfx211

5:13 pm on May 8, 2003 (gmt 0)

10+ Year Member



In my wish list would be the ability to do path analysis based on database attributes. For instance give me the most common entry points for everyone who created a membership, or for all of our purchasers yesterday show me the most common path to adding to bag.

pixel_juice

5:15 pm on May 8, 2003 (gmt 0)

10+ Year Member



Anyone know how

You need to use a server-side language like Perl or PHP to track bots. Alternatively, this information is almost certainly in your server logs anyway, if you have them. You just need a program to extract only the information about bots.
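Server-side bot detection usually starts with a User-Agent match against a table of known signatures, kept in data so new bots can be added without code changes (as trillianjedi suggested earlier). A sketch in Python; the signature entries are examples, and a real tracker would also verify by IP or reverse DNS:

```python
# Known-bot User-Agent substrings mapped to a display name.
# Kept as data so new bots can be added without touching the code.
BOT_SIGNATURES = {
    "googlebot": "Googlebot",
    "slurp":     "Yahoo! Slurp",
    "msnbot":    "MSNBot",
    "teoma":     "Teoma",
}

def identify_bot(user_agent):
    """Return the bot's display name if the User-Agent matches, else None."""
    ua = (user_agent or "").lower()
    for fragment, name in BOT_SIGNATURES.items():
        if fragment in ua:
            return name
    return None
```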

Perplexed

9:22 am on May 9, 2003 (gmt 0)

10+ Year Member



What would I really want from a tracker? The impossible, of course... I don't want to know so much about which search words found my site (well, I do, but not for the point of this post). What I really want to know is which search words were used that DIDN'T find my site when I would have liked them to.

Storyteller

11:19 am on May 24, 2003 (gmt 0)

10+ Year Member



1. Hassle-free, plug-in installation. I've done a tracker as an Apache Perl module, but a good one needs to be a C-written DSO.
2. Featureful web-based interface. Easy if you're doing it as an Apache module.
3. Logging to a relational DB with ODBC capability (say, MySQL), so users can develop their own reports easily with common office tools (Excel, Access, etc).
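Point 3 above amounts to a single hits table that users can query with their own tools. A runnable sketch of one possible schema (Storyteller suggests MySQL; sqlite3 stands in here so the example is self-contained, and the column choices are assumptions):

```python
import sqlite3

# Illustrative schema for a relational hit log; column names are
# assumptions, not from the thread.
SCHEMA = """
CREATE TABLE hits (
    id          INTEGER PRIMARY KEY,
    ts          TEXT NOT NULL,
    visitor_id  TEXT NOT NULL,
    session_id  TEXT NOT NULL,
    url         TEXT NOT NULL,
    referrer    TEXT,
    user_agent  TEXT
);
CREATE INDEX idx_hits_visitor ON hits (visitor_id, ts);
"""

conn = sqlite3.connect(":memory:")
conn.executescript(SCHEMA)
conn.execute(
    "INSERT INTO hits (ts, visitor_id, session_id, url, referrer, user_agent) "
    "VALUES (?, ?, ?, ?, ?, ?)",
    ("2003-05-26 06:03:00", "v1", "s1", "/index.html",
     "http://www.google.com/search?q=widgets", "Mozilla/4.0"),
)
count = conn.execute("SELECT COUNT(*) FROM hits").fetchone()[0]
```

With the data in a plain table like this, users can build their own reports in Excel, Access, or anything ODBC-capable, as suggested.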

A feature to spy on web traffic on an Ethernet interface would be a real killer ;)

shrirch

6:03 am on May 26, 2003 (gmt 0)

10+ Year Member



Here's my wishlist

-- Search key phrases by day by search engine
-- Google adwords detection
-- Visit length / depth by referrer source

:)

pardo

12:52 pm on May 28, 2003 (gmt 0)

10+ Year Member



If I could have...

1. total session control: from referrer - site path - forwarder information
2. SE referrers as a % of total referrers (excl. direct hits)
3. easy configuration of dynamic urls into measurable content/text
4. easy configuration of specific content groups (country data, product group data e.g.)
5. web-based access
6. fast performance (I use NetTracker and that is extremely good but also extremely slow at analysing)
7. affordable pricing / licensing structure

anallawalla

3:03 pm on Jun 4, 2003 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



Search strings grouped if identical.

Search strings grouped by referrer.

E-Commerce links grouped by referrer. (Where do my paying customers come from?)

Reverse link from a summary report to the detail entries that caused it, e.g. 4 hits from googlebot.

Bot id should be done by a lookup table in a separate ASCII file.

Custom visitors list in a separate ASCII file so that we can track any domain or IP we want to highlight in a "VIP visitors" list.

The tiniest of tracking codes on web pages.

claus

10:26 pm on Jun 27, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Craig_F, the wishlist seems long enough already, but i just found it now. With the current wishes, it could become a very interesting, high-end tool. Just don't build something that a spreadsheet is better at - or a database. Make it easy to export and save instead.

For this purpose (pro use, along the lines of the wishes posted), i have one concern only: Scalability.

It must be able to handle tons of pageviews. If it can, I could think of quite a few customers.

Anyway, you wrote:

What about newbies though?

- and that's something completely different. You might wish to make two different versions. Newbies, and management as well, don't like too many bells and whistles, imho.

What i think of is something extremely basic:

The newbie/management-wishlist

1) Figures for visitors per page (ranking/toplist), both with and without www. and searchstrings/DB-id's. (simply add them)

1a) a search function to get stats for that one special vip-page that's always there somewhere.

1b) some kind of map showing the many roads that lead to this very interesting page

1c) ability to see how many leave the road at which stages.

2) Ability to get customized stats (no. of visitors/pw) "clustered" like, eg.:

- [domain.com...] (with all underlying url's)
- [domain.com...]
- [domain.com...]
...etc.

2b) The url-hierarchy is not always that logical, so it would be best if some specialist (@ the customer) could set up the relevant groups.

3) time-series (!): so many this week, so many last week, so many same week the year before.

3b) Ability to aggregate # of visitors or pw for a specified period of time (typically an ad-campaign)

4) loyalty: repeat visitors: how many of our visitors this week were also here last week, how many of our current visitors are new?

4a) DNS-lookup on the IP's visiting. The list should be the host names, not the IPs (this is too technical)

4b) toplists of visitors, aggregated by day, week, month. Just three figures: visits, visitors, pageviews.

5) Referrers: Toplist, aggregated by domain. It should not be the very detailed lists, including search phrases, that a professional would like to see.

5a) ability to search, by inputting some current advertising-url or affiliate site.

5b) Some possibility for a specialist at the company to aggregate, so that e.g. google.com and google.co.uk become "google".

5c) listing of searchwords by engine.

That was five basic suggestions. They will provide good and valuable info without getting "too specific". A product that could do only 1-5 would be valuable. I could think of quite a few customers for this product as well. Other customers, that is.

Pay attention to design, though. The interface must be very simplistic, imho.

Hope this was of any use.
/claus
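Claus's item 4 (repeat-visitor loyalty) falls out almost for free once visitors carry a persistent cookie ID: it is just set arithmetic on the two weeks' visitor-ID sets. A sketch under that assumption:

```python
def loyalty_report(this_week, last_week):
    """Split this week's visitors into repeat and new visitors.

    `this_week` and `last_week` are sets of visitor-cookie IDs,
    one entry per distinct visitor seen in that week.
    """
    repeat = this_week & last_week   # seen both weeks
    new = this_week - last_week      # first seen this week
    return {
        "visitors": len(this_week),
        "repeat": len(repeat),
        "new": len(new),
        "repeat_pct": round(100.0 * len(repeat) / len(this_week), 1)
                      if this_week else 0.0,
    }
```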

cornwall

11:10 am on Jun 28, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>>What I really want to know is which search words were used that DIDN'T find my site when I would have liked them to<<

That is what I would like too. I realise there is no way, other than hard work, to acquire the info, but one can dream.

It is difficult to know whether "ruritanian widgets" and the various combinations of singular and plural widgets, ruritania or ruritanian, word order, or something else altogether will get more searches

The software available (yes, I have it, and know also where to look for free info), while perhaps good for high-volume searches, tends not to be statistically valid for many small businesses. All one can then do is trawl web logs and tweak sites to try to hoover up a wider span of searches.

lukasz

1:35 pm on Jun 28, 2003 (gmt 0)

10+ Year Member



1. Support languages other than English (including Asian languages).
2. Reliably tell how many people added my page to favourites.
3. Let me know the real bounce rate (percentage of visitors who left after seeing only one page).