On some low-traffic campaigns I run a redirect script for tracking, which forwards the user on to the destination page and also sends me an email as a quick way to see which terms are getting hit. Even though one campaign is paused, I've just had a flood of emails for every term it was targeting.
Checking the stats shows that these are hits from Googlebot, arriving at about two per second:

184.108.40.206 - - [05/Apr/2005:17:50:52 +0100] "GET /redirect.php?term=1234 HTTP/1.1" 302 5 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
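For anyone curious what that kind of redirect script amounts to: mine is basically "look up the term, fire off a notification, send a 302". A rough sketch of the logic (in Python rather than the PHP the actual redirect.php presumably uses; the term ID and URLs here are made up):

```python
# Hypothetical sketch of a tracking redirect like redirect.php:
# record the term that was hit (the real script emails it), then
# issue a 302 to the destination page.

DESTINATIONS = {"1234": "http://www.example.com/landing"}
FALLBACK = "http://www.example.com/"

def handle_redirect(term, notify):
    """Return (status, headers) for a tracking redirect.

    notify is a callable standing in for the email alert.
    """
    target = DESTINATIONS.get(term, FALLBACK)
    notify("term hit: %s" % term)
    return 302, [("Location", target)]

hits = []
status, headers = handle_redirect("1234", hits.append)
# status == 302, Location header points at the landing page
```

The point being: every single bot request triggers the notify step, which is exactly why a crawl at two requests per second turns into a flood of emails.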
Has anyone else seen this? Do you think Google might be doing it to ensure the display URL matches the true destination of the link (about time), or do you think they are going to start crawling AdWords users' sites more for natural search?
I know that second option is unlikely and controversial, but they've always said the Mediapartners bot and Googlebot are totally separate, so it would seem odd to use the Googlebot name for a crawl that wasn't related to natural search.
Not sure if it's related, but I've recently seen a domain whose AdWords account I manage come up in the SERPs because the AdWords ads were apparently cached as plain text on the (content) site.
For example, when searching for one of the targeted keyword combos, "widget relief program", the site www.example.com shows up in the SERPs. Clicking through to the live page, the search terms are nowhere to be found. But if I look at the Google cache of that page, my AdWords ad is highlighted and apparently triggered the placement.
Just this morning (around two hours ago) I noticed Googlebot hitting a whole bunch of my AdWords destination URLs (all the same page with different reference tags added on to the end of the URL). Those URLs aren't used anywhere besides my AdWords ads, unless someone else picked them up and linked to them (and I haven't seen any evidence of that).
This was a big problem for me too, as I got thousands of hits from Googlebot. It was fetching every keyword I had on AdWords, even for many of the deleted campaigns, ad groups and ads. My web host threatened to close down my site because I was chewing up too much CPU. I then had to force all requests from Googlebot to return an empty response, and things stabilized.
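The "return empty to Googlebot" fix is just a user-agent check in front of the normal redirect. A sketch of that idea (in Python for illustration; the real script is presumably PHP, and the landing URL is made up):

```python
def response_for(user_agent, term):
    """Serve the usual tracking redirect, but hand Googlebot an
    empty 200 so the bot's hits burn no CPU and trigger no email."""
    if "Googlebot" in user_agent:
        return 200, ""                                  # empty body, nothing logged
    return 302, "http://www.example.com/landing?ref=" + term
```

Note that matching on the User-Agent string is trivially spoofable, which becomes relevant further down the thread.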
> I hope not, that would make tracking URLs rather useless...
That's what I meant by true destination URL: the page the user actually sees, as opposed to the destination URL you enter into the management center.
As I understand it, at least the display domain should match the true destination URL, but many advertisers are showing fake domains (e.g. .co.uk instead of .com, or adding in hyphens) to get past the one-ad-per-domain rule.
I hope they didn't use Googlebot to check ad quality, because I return an empty file for all hits coming from Googlebot ever since it started chewing up my CPU time. And if they do use it for quality checks, black-hat operators can easily get around it: all one has to do is redirect to the "correct URL" whenever the User-Agent is Googlebot. It would take a couple of seconds to code.
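For what it's worth, the cloak being described really is only a couple of lines, which is exactly why a User-Agent-based quality check would be so weak. A sketch of the branch (Python for illustration only; any names here are made up):

```python
def destination(user_agent, real_target, clean_target):
    """The trivial cloak described above: show the bot whatever URL
    passes the check, send everyone else to the real target."""
    if "Googlebot" in user_agent:
        return clean_target      # the bot sees the "correct URL"
    return real_target           # real visitors go wherever you like
```

So any check Google runs under the Googlebot name can be gamed this way, unless they also verify from undisclosed IPs or user agents.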