
Google AdWords Forum

    
Keeping track of AdWords/analytics details

Wicketywick
msg:4267890 - 11:00 am on Feb 16, 2011 (gmt 0)

Anyone have best practices for keeping track of AdWords data and optimizing it? Excel spreadsheets? Access databases? Or do you just eyeball the details and make educated guesses?

 

RhinoFish
msg:4267941 - 2:18 pm on Feb 16, 2011 (gmt 0)

Optimizing AdWords depends on your goals, so you've asked a very broad question here. Could you narrow it down some?

Wicketywick
msg:4267964 - 3:21 pm on Feb 16, 2011 (gmt 0)

Well, the main problem is documenting campaigns, ads and their results. Right now I am using AdWords Editor to manage my campaigns and ads, Hypertracker to A/B split-test landing pages, and Google Analytics to track goals (opt-ins, time on site and pages viewed).

Of course I have been tweaking and comparing ads a bit, which has increased the CTR quite significantly. For future reference I have not deleted any old ads; this does, however, clutter my AdWords Editor.

I would like to store this information; it might prove helpful in the future.

The goals, however, are more opt-ins, more calls and ultimately more cash.

What do the real pros do? They must somehow document campaign results.

LucidSW
msg:4268264 - 2:37 am on Feb 17, 2011 (gmt 0)

AdWords data is best handled by a database. Spreadsheets like Excel just don't cut it; they are not meant for the sort of thing you really need to do. You've got to use the right tool for the job: don't use a hammer to saw wood.

I download the daily stats into my database (I use FoxPro, since that is my specialty). That gives me the highest data resolution possible, at least of what Google provides: each day's data for each keyword and each ad. It's then a relatively simple matter to run queries (reports) on all sorts of things to help me optimize a campaign on many levels.
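For anyone who wants to try the same approach without FoxPro, here is a minimal sketch of the idea in Python with SQLite. The report filename and column names are assumptions for illustration; map them to whatever your AdWords report export actually contains.

import csv
import sqlite3

# Assumed layout of a daily keyword report export (CSV). Adjust the column
# names to match the report you actually download from AdWords.
REPORT_FILE = "daily_keyword_report.csv"   # hypothetical filename

conn = sqlite3.connect("adwords.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS results (
        keyword_id  INTEGER NOT NULL,   -- your own generated keyword ID
        served      TEXT    NOT NULL,   -- date the stats apply to (YYYY-MM-DD)
        imps        INTEGER NOT NULL,
        clicks      INTEGER NOT NULL,
        adposition  REAL,
        cost        REAL,
        conversions INTEGER,
        network     TEXT                -- Google / Search Partners / Content
    )
""")

with open(REPORT_FILE, newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        conn.execute(
            "INSERT INTO results VALUES (?, ?, ?, ?, ?, ?, ?, ?)",
            (
                int(row["keyword_id"]),
                row["date"],
                int(row["impressions"]),
                int(row["clicks"]),
                float(row["avg_position"]),
                float(row["cost"]),
                int(row["conversions"]),
                row["network"],
            ),
        )
conn.commit()

# Example "report": lifetime CTR per keyword from the stored dailies.
for kw_id, imps, clicks in conn.execute(
    "SELECT keyword_id, SUM(imps), SUM(clicks) FROM results GROUP BY keyword_id"
):
    print(kw_id, f"{(clicks / imps if imps else 0):.2%}")

The point is simply that once the dailies sit in one table, those "reports" become one-line queries.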

Wicketywick
msg:4268344 - 9:43 am on Feb 17, 2011 (gmt 0)

Hello LucidSW

Thanks for the idea. I kind of thought it would come to that.
I love Excel, but you are spot on about it not being suited to this kind of job.

How do I get a decent database solution? I am no database developer, so creating something from scratch is not really an option.

Are there any pre-made databases for AdWords/marketing campaigns?

Operating and customizing a database system once it's developed would be doable - I have some experience there.

LucidSW
msg:4268414 - 2:16 pm on Feb 17, 2011 (gmt 0)

Unfortunately, if you are serious, you will have to invest time in learning databases. When I went to school to learn computer programming, there was a whole course lasting a full year just on databases (I still have the book we used on my desk), on top of other courses such as C++, other languages and computer technologies in general.

There is no commercially available software that I know of specifically for PPC. Then again, I've never looked for any. A search on "ppc management software" may turn up some things. Or you could hire a programmer to build you one.

RhinoFish
msg:4268424 - 2:51 pm on Feb 17, 2011 (gmt 0)

My view is that things change so often, no two sites behave the same, and historical data is very difficult to make sense of looking back because of the compounding effect of the changes made... I think the value of storing the data is far less than that of case-specific testing, so there's a quixotic time-thief factor in play here.

Wicketywick
msg:4268426 - 2:52 pm on Feb 17, 2011 (gmt 0)

OK, sounds a bit daunting.

I did a search on PPC management software, but the tools are all solely for PPC campaigns - in my case I wouldn't be able to track the results of offline campaigns.

Probably too much to ask, but would you mind sharing the structure of your database, i.e. how you split the information across your tables?

That would give me a starting point - establishing a structure is always the hardest part.

Wicketywick
msg:4268446 - 3:53 pm on Feb 17, 2011 (gmt 0)

Well, I want to be able to re-use tests already done. Once documented, I can always show solid numbers to customers.

Of course, if done wrong, the time spent outweighs the advantages gained.

LucidSW
msg:4268659 - 8:39 pm on Feb 17, 2011 (gmt 0)

> would you mind sharing the structure of your database, i.e. how you split the information across your tables?

It's called normalizing the database. I might normally mind, but this is pretty simple - any other DB guy/gal could do it, and AdWords' structure and wording make it easy - so here goes:

Campaigns table. A field containing the full name of the campaign; I limit it to 50 characters to keep things simple (not sure what AdWords' actual limit is). I generate a unique ID, since Google does not provide one, but that's simple. I also have a field for the service (Google, Bing...), again a unique one-character code.

Adgroups table. Linked to the campaigns table through the campaign ID. I generate a unique ID here too, since Google doesn't provide one. The length of the groupname field is currently 50, but I need to increase that.

Ads table. Linked to the adgroups table through the adgroup ID. Here Google is nice and provides its own unique 10-character ID, so I use that as the primary key. Also headline, adline1, adline2, dispurl and desturl fields, all memo fields (called variable-length fields in other databases). Since I manage client accounts, I also have a field telling me whether I created the ad or the client did, just for statistical purposes.

Keywords table. Again I generate my own unique ID. Links to the ads table. An 80-character field for the keyword (this is Google's limit) and a matchtype field. Note that since this links to the ads table, the same keyword will be repeated within an ad group. That was a design decision that made sense to me: a keyword is linked to the ad, not the group. It's something to keep in mind when I run certain types of queries.

Results table. Links to the keywords table. I store each day's data for each keyword here, so I have the date (I call the field served), imps, clicks, adposition, cost, conversions and network (a code for Google, Search Partners or the Content network).

Since I manage client accounts, I also have a clients table that links to the campaigns table. If you have an MCC account, one field Google provides is its unique client ID, so I use that. But I generate my own in case the client also uses Bing, Yahoo or whatever else.

The next thing I want to implement is a landingpages table. The idea is to track versions of landing pages. It would contain the destination URL and a serial version number with start and end dates, and it would link to the ads table through a unique page ID (the desturl field in that table would be replaced with it). The problem is tracking the changes: I like to know about even the smallest change on a page, but clients don't tell me. I could periodically read the page (say once a month) and compare the code. Not a perfect solution, but better than nothing.

You may be wondering about QS. I could track that over time too. I used to do it daily (keyword ID, date and QS), but the table becomes huge. Plus, QS belongs to the keyword-ad combination, not just the keyword, so that's problematic. One solution to the table-size problem would be to record QS only when it changes (with start/end dates). I finally decided it wasn't worth doing. If I need QS for some calculation, it's best to just download the current QS and use that. If QS had better resolution, such as one decimal place, I might reconsider, as that would be more useful.

One thing I should do, but don't right now, is keep a table of bid values: again, start and end dates with the bid. This could be taken from the Changes report.
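To make the change-only idea in the last two paragraphs concrete, here is a minimal sketch in Python with SQLite: keep a history row open until the value changes, then close it and open a new one. The table and column names are invented for illustration, not what LucidSW actually uses, and the same pattern would cover the bid-values table as well.

import sqlite3
from datetime import date

conn = sqlite3.connect("adwords.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS qs_history (
        keyword_id INTEGER NOT NULL,
        qs         INTEGER NOT NULL,
        start_date TEXT    NOT NULL,
        end_date   TEXT                 -- NULL while this QS is still current
    )
""")

def record_qs(keyword_id, qs, today=None):
    """Store a QS value only if it differs from the currently open row."""
    today = today or date.today().isoformat()
    current = conn.execute(
        "SELECT rowid, qs FROM qs_history "
        "WHERE keyword_id = ? AND end_date IS NULL",
        (keyword_id,),
    ).fetchone()
    if current and current[1] == qs:
        return                          # unchanged: nothing to store
    if current:                         # close the previous period
        conn.execute("UPDATE qs_history SET end_date = ? WHERE rowid = ?",
                     (today, current[0]))
    conn.execute(
        "INSERT INTO qs_history (keyword_id, qs, start_date) VALUES (?, ?, ?)",
        (keyword_id, qs, today),
    )
    conn.commit()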

There's other data I could download and store but don't right now, the product extension data for one. Some that I do occasionally store are the Search Query Report and the hourly report; these don't link to the main tables.

Does that answer your question? A pretty simple setup really.
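For readers who want to follow along in SQLite or Access rather than FoxPro, here is a rough sketch of the tables described above. Field names and lengths follow the post where given; the types, the exact ID columns and the foreign-key wiring are assumptions for illustration, not LucidSW's actual schema.

import sqlite3

conn = sqlite3.connect("adwords.db")
conn.executescript("""
CREATE TABLE IF NOT EXISTS clients (
    client_id   TEXT PRIMARY KEY,          -- MCC client ID, or self-generated
    name        TEXT
);

CREATE TABLE IF NOT EXISTS campaigns (
    campaign_id INTEGER PRIMARY KEY,       -- self-generated
    client_id   TEXT REFERENCES clients(client_id),
    name        TEXT,                      -- limited to 50 chars in the post
    service     TEXT                       -- one-character code: Google, Bing...
);

CREATE TABLE IF NOT EXISTS adgroups (
    adgroup_id  INTEGER PRIMARY KEY,       -- self-generated
    campaign_id INTEGER REFERENCES campaigns(campaign_id),
    groupname   TEXT
);

CREATE TABLE IF NOT EXISTS ads (
    ad_id       TEXT PRIMARY KEY,          -- Google's 10-character ad ID
    adgroup_id  INTEGER REFERENCES adgroups(adgroup_id),
    headline    TEXT,
    adline1     TEXT,
    adline2     TEXT,
    dispurl     TEXT,
    desturl     TEXT,
    created_by  TEXT                       -- who wrote the ad: me or the client
);

CREATE TABLE IF NOT EXISTS keywords (
    keyword_id  INTEGER PRIMARY KEY,       -- self-generated
    ad_id       TEXT REFERENCES ads(ad_id),
    keyword     TEXT,                      -- 80-character Google limit
    matchtype   TEXT
);

CREATE TABLE IF NOT EXISTS results (
    keyword_id  INTEGER REFERENCES keywords(keyword_id),
    served      TEXT,                      -- date of the stats
    imps        INTEGER,
    clicks      INTEGER,
    adposition  REAL,
    cost        REAL,
    conversions INTEGER,
    network     TEXT                       -- Google / Search Partners / Content
);
""")
conn.commit()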

Wicketywick
msg:4268696 - 9:45 pm on Feb 17, 2011 (gmt 0)

Hi, what a great reply. I saved this page as a PDF for the project - oh, I can see it coming, long nights working on this. :-)

I will have to work on that for a while. I do have experience with Access, so I am tempted to try to build a structure similar to yours in Access. The links/relationships are probably the hardest part to get right. In Access one has to select which type of relationship - any tips on that?

I will implement your design step by step, hopefully reaching my ultimate goal.

Even if it doesn't work, I can still pass it on to any database designer as a template.

To track changes on landing pages (including those niggly changes made by the clients themselves), I would recommend two free services:

[trackengine.com ] checks bookmarked pages at intervals you define and sends you an e-mail with the HTML content, highlighting the changes. I use this to track rankings for a particular keyword on Google.

The second solution is the "Update Scanner" add-on for Firefox, which is easier to use and runs locally, but it eats into your computer's resources and ultimately slows Firefox down (I don't think this add-on on its own demands that much; it just adds up with every new add-on).

Thanks again - this is by far the best response I have gotten on a forum.

LucidSW
msg:4268795 - 2:51 am on Feb 18, 2011 (gmt 0)

> On Access one has to select which type of relationship

I think you mean Unique, Candidate or Regular; at least those are the terms used in FoxPro. I never specify this and it defaults to Regular, but all the IDs I mentioned are unique. I think all the setting does is make sure the values are unique; my program that pulls the data in already handles that, although it may be good practice to actually set the index to Unique.
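In SQLite (or Access) terms, the database-level equivalent is a unique index or primary key constraint, which rejects duplicate IDs instead of relying on the import program to catch them. A tiny self-contained sketch, with made-up table and column names:

import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE campaigns (campaign_id INTEGER, name TEXT, service TEXT)")

# A "Unique" index in FoxPro/Access terms: duplicates are rejected by the
# database itself rather than by the program loading the data.
conn.execute("CREATE UNIQUE INDEX idx_campaign_id ON campaigns (campaign_id)")

conn.execute("INSERT INTO campaigns VALUES (1, 'Brand terms', 'G')")
try:
    conn.execute("INSERT INTO campaigns VALUES (1, 'Duplicate', 'G')")
except sqlite3.IntegrityError as err:
    print("duplicate ID rejected:", err)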

The service you mention would be impractical. The total number of pages I'm talking about is around 250k. Plus, I wouldn't need to know what the changes are, just that something has changed, in order to run queries on the different pages. Best to do it myself, although I know checking them all would take about 7 hours based on recent tests - obviously an overnight job. But there's no need to do it every night; just rotate through them once a month or every two months. That works out to about 10 minutes of computer time a day to check 4,000 pages, something the machine can do while I take a break.
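A minimal sketch of that kind of "has anything changed?" check, assuming you keep a hash of each page's last-seen HTML; the URL, table and column names here are invented for illustration:

import hashlib
import sqlite3
import urllib.request
from datetime import date

conn = sqlite3.connect("adwords.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS page_versions (
        url          TEXT NOT NULL,
        content_hash TEXT NOT NULL,
        checked      TEXT NOT NULL      -- date this hash was recorded
    )
""")

def page_changed(url):
    """Fetch a page, hash its HTML and compare with the last stored hash."""
    html = urllib.request.urlopen(url, timeout=30).read()
    new_hash = hashlib.sha256(html).hexdigest()
    last = conn.execute(
        "SELECT content_hash FROM page_versions "
        "WHERE url = ? ORDER BY checked DESC LIMIT 1",
        (url,),
    ).fetchone()
    if last and last[0] == new_hash:
        return False                    # nothing changed since the last check
    conn.execute(
        "INSERT INTO page_versions (url, content_hash, checked) VALUES (?, ?, ?)",
        (url, new_hash, date.today().isoformat()),
    )
    conn.commit()
    return True

# Example: tonight's slice of the rotation (a few thousand URLs per night).
for url in ["https://www.example.com/"]:   # placeholder URL
    if page_changed(url):
        print("changed:", url)

This only records that something changed, which is all that's needed to split queries by page version; storing the full HTML as well would let you see what changed.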
