Forum Moderators: DixonJones


Webtrends: Ad Views vs Ad Clicks

How to make a page unique.


FMG_Jeff

11:19 pm on Dec 10, 2004 (gmt 0)

10+ Year Member



I am looking to set up the Ad Views vs. Ad Clicks reporting abilities in Webtrends but I have stumbled upon an interesting question.

Our ads point to content that can also be accessed through regular navigation. This will bloat the visits recorded to the landing page. If I make the landing page unique by adding parameters to the URL, I don't want that page with parameters to be indexed by search engines. If the page with parameters is indexed, and visited, it will be recorded as a hit from the ad.

Has anyone come up with a creative way to get around this problem?

cgrantski

5:52 pm on Dec 12, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I've got a suggestion that may be entirely unsatisfactory to you, but here goes. It only works accurately if you are using page tagging (SDC) to get your data and if your site has a session or permanent cookie. Data on visitors that don't accept cookies will be slightly inaccurate. It assumes that if somebody clicks on the ad twice in one visit, you want to count it twice for your purposes. But, at the same time, if you're going to use the technique you described, you'll still have inaccuracies for visits in which the ad was clicked on more than once ... but that's another discussion.

This also assumes that the page that displays the ad does NOT have any other means to navigate directly to the ad's destination page.

Instead of a URL with a marker (and tabulating parameters for the destination page), consider using path analysis, particularly the WT feature called Single-Step REVERSE path analysis, and count the number of predecessors of the destination page that are the ad's display page.

Hitmatic

2:06 pm on Dec 13, 2004 (gmt 0)



Jeff,

If you are only posting the appended URLs with an Ad serving agency, why would they be indexed?

However, two possible solutions would be:

1) Point the ads to duplicates of the landing pages hosted within a directory that is disallowed in the robots.txt file.

and/or

2) If appending the destination URL:

e.g. landingpage.html?campaign=AdServer

why not cross check that the referrer for that user came from the adservers domain?
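The referrer cross-check could look something like the sketch below. This is a minimal illustration, assuming a CGI-style environment where you have the query parameters and the HTTP referrer in hand; the `campaign` parameter value and the domain `adserver.example.com` are hypothetical stand-ins.

```python
from urllib.parse import urlparse

AD_SERVER_DOMAIN = "adserver.example.com"  # hypothetical ad server domain


def is_ad_click(query_params: dict, referrer: str) -> bool:
    """Count the hit as an ad click only when the campaign parameter is
    present AND the referrer actually came from the ad server's domain."""
    if query_params.get("campaign") != "AdServer":
        return False
    host = urlparse(referrer).hostname or ""
    return host == AD_SERVER_DOMAIN or host.endswith("." + AD_SERVER_DOMAIN)


# A search-engine visitor arriving at the indexed, parameterized URL
# carries no (or a different) referrer, so it is not counted:
print(is_ad_click({"campaign": "AdServer"}, "http://adserver.example.com/banner"))  # True
print(is_ad_click({"campaign": "AdServer"}, ""))  # False
```

Note that referrers can be stripped by proxies or privacy settings, so this undercounts slightly; it filters out indexed-URL traffic rather than guaranteeing an exact click count.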

James

FMG_Jeff

5:21 pm on Dec 13, 2004 (gmt 0)

10+ Year Member



This technique is for pages served on our site. We are looking to track the performance of our front page and right column ads.

I have considered the idea of duplicate landing pages in a folder that is disallowed in robots.txt. This does mean that we would need to maintain a duplicate set of pages though.

Thanks for the input.

larryn

6:10 pm on Dec 13, 2004 (gmt 0)

10+ Year Member



Jeff,

I'm not sure about a Windows server, but on a Unix server, instead of duplicating the pages, you could put symbolic links in the file system pointing to the original content.
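On a Unix host, the symlink approach means the disallowed directory serves the same bytes as the originals with nothing to keep in sync. A minimal sketch (file names and the `/ads/` directory are hypothetical; it uses a temp directory as a stand-in for the web root):

```python
import os
import tempfile

site_root = tempfile.mkdtemp()               # stand-in for the web root
ads_dir = os.path.join(site_root, "ads")     # directory disallowed in robots.txt
os.mkdir(ads_dir)

# The original landing page, reachable through normal navigation.
original = os.path.join(site_root, "landingpage.html")
with open(original, "w") as f:
    f.write("<html>landing page</html>")

# The ad points at /ads/landingpage.html; the symlink serves the same
# content without maintaining a duplicate copy.
link = os.path.join(ads_dir, "landingpage.html")
os.symlink(original, link)

print(open(link).read())  # identical to the original page
```

The web server must be configured to follow symlinks (e.g. Apache's `FollowSymLinks` option) for this to work in practice.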

Larry

FMG_Jeff

6:56 pm on Dec 17, 2004 (gmt 0)

10+ Year Member



We are in a Windows environment. Thanks for the dialogue.

Am I on the right track here? Tracking real estate on the homepage is important. What is the conventional way to do this?

I did have another solution to this. From the homepage ad, a parameter could be sent, perhaps indicating the location of that ad on the front page. On the landing page, that parameter could switch the robots meta tag to NOINDEX. That should keep the page out of most search indexes.
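The parameter-switched meta tag could be sketched as follows. This is a minimal illustration, not tied to any particular server stack; the parameter name `adpos` is a hypothetical example of an ad-position marker.

```python
def robots_meta(query_params: dict) -> str:
    """Emit the robots meta tag for the landing page: NOINDEX when the
    ad-tracking parameter is present, so the parameterized URL stays out
    of search indexes while the plain URL remains indexable."""
    if "adpos" in query_params:  # e.g. ?adpos=right-column (hypothetical name)
        return '<meta name="robots" content="noindex">'
    return '<meta name="robots" content="index,follow">'


print(robots_meta({"adpos": "front-top"}))  # ad-tracked URL: noindex
print(robots_meta({}))                      # plain URL stays indexable
```

The landing page template would call this when writing the document head; crawlers that honor the robots meta tag will then drop the parameterized variant.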

cgrantski

7:24 pm on Dec 17, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Or, you could mark the link from the home page with a parameter as described above, then filter out (or simply subtract) visits that started at the parameterized internal page. The remainder would be visits in which the home page link was clicked on.

Another idea: use Reverse Path Analysis with the parameterized internal page as the starting point. You'd get a list of pages that preceded your target page, and the list would also tabulate instances where that page had no preceding page (was the start of a visit). When the preceding page is the home page, you'll have your count. However, this method counts clicks, not visits, so if a given visit clicked on that link twice, this method would count it as two.

Either of these would allow you to not worry about whether a search engine is indexing the parameterized link (if the page is deep, you might want this link to be spidered since it may be the only way that page gets into an index).