It looks like you can now add sites on which you do not want your ads to show. Right now it seems to be only for the content network, though.
This has been one of my top-priority feature requests.
Thanks Google for listening to us and implementing it.
As usual you hear it first on Webmaster World :)
It would be great if there were a central repository of scraper URLs that we could just go and grab... there's an idea: a scraper site that scrapes the scrapers!
... it is a shame that Google doesn't make this a bit easier, but it is better than nothing.
Awesome! This will be incredibly helpful.
I'll be linking to this entire thread in the Advertiser Feedback report, and am interested to hear what you have to say, whether it's positive or negative.
By the way, you all must know that feedback from this Forum played a real role in getting this tool in your hands.
;) AWA
Unfortunately there is no "easy" way of finding the domains you want to cull. I spent an hour or so searching Google in the manner I mentioned above, which gave me a list of a dozen or so that I culled. I could go through the logfiles, pull all the records with the googlesyndication tag, derive the domain names from those, and block them (a rough sketch of that approach follows below)... but it's a very time-consuming way of going about things.
Another problem with going through the logs is that you're only getting a list of the domains that ARE sending you traffic -- what about the ones that are NOT sending any traffic?
Looking in my AdWords account, there is one ad group in particular where the impression numbers are totally mental. While I do get a fair number of clickthroughs from that group, my listings must also be running on some very popular yet irrelevant sites that generate zero clickthroughs for me -- it would be great to be able to knock those kinds of sites off, as all they do is skew my clickthrough rate.
Still, if every user went through and barred a dozen sites, it's a good start!
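For anyone who would rather crunch the logs than click around by hand, here's a rough, untested sketch of that approach in PHP. It assumes an Apache-style "combined" access log (where the referrer is the second quoted field) and that the publisher page travels in the referrer's url query parameter, as mentioned elsewhere in this thread; the access.log path is just a placeholder.
<?php
// Rough sketch: tally AdSense (googlesyndication) referrals per publisher domain.
// Assumes an Apache "combined" log whose quoted fields are request, referrer,
// user-agent, and that the publisher page is carried in the referrer's "url"
// query parameter. 'access.log' is a placeholder path.
$counts = array();
$log = fopen('access.log', 'r');
while (($line = fgets($log)) !== false) {
    if (stripos($line, 'googlesyndication.com') === false) {
        continue; // cheap pre-filter before doing any parsing
    }
    // Quoted fields in a combined log: [0] request, [1] referrer, [2] user-agent.
    if (!preg_match_all('/"([^"]*)"/', $line, $m) || count($m[1]) < 2) {
        continue;
    }
    $referer = $m[1][1];
    if (stripos($referer, 'googlesyndication.com') === false) {
        continue; // the string appeared somewhere other than the referrer
    }
    // Pull the publisher page out of the referrer's query string.
    parse_str((string) parse_url($referer, PHP_URL_QUERY), $params);
    if (empty($params['url'])) {
        continue;
    }
    $domain = parse_url($params['url'], PHP_URL_HOST);
    if ($domain) {
        $counts[$domain] = isset($counts[$domain]) ? $counts[$domain] + 1 : 1;
    }
}
fclose($log);
// Busiest publisher domains first - the ones worth a manual look.
arsort($counts);
foreach ($counts as $domain => $hits) {
    echo "$domain\t$hits\n";
}
?>
Run it offline against a copy of the log; domains that rack up lots of visits but no conversions are the obvious first candidates for the exclusion list.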
How about showing publishers something like: "14 advertisers have opted out of displaying advertisements on your site."
It may give some an impetus to improve their copy/content and also save them endless days of banging their heads against the wall trying to figure out why all they are getting is PSAs...
Just a thought.
For finding sites to potentially filter, the easiest thing I found was to go through my raw logs and pull out all the referrals from content sites. Then I extracted the publisher page URL from those long Google URLs and went through the process of visiting each and every one ;) And if you want to get right down to it, you can check your ROI conversions for each content site and filter accordingly.
My biggest surprise? How few sites I actually ended up adding to the filter.
Is anyone considering creating a public (or semi-public) ban list (and/or a list of "desirable" publishers)?
MG
1. Learn to use your stats system's filtering ability.
2. Set the filter to show only visitors whose referrer includes the URL pagead2.googlesyndication.com.
3. Navigate to the page that contains referrers and orders (often a campaign settings page that lists ROI and referrers, or orders and referrers).
4. Sort in reverse order, from fewest to most orders.
5. Determine your cut-off point (how many orders/$$/etc. per visitor you want); if a site doesn't meet that threshold, block it.
There are probably ways to do this with log crunchers as well; however, I have limited familiarity with crunching logs to sort by orders.
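If anyone does want the log-cruncher flavor of steps 4 and 5, here's a hypothetical sketch. It assumes you've already exported per-publisher figures to a CSV -- the publisher_stats.csv name and the domain,visits,orders layout are made up for illustration -- and it just sorts from least to most productive and flags anything under your cut-off.
<?php
// Hypothetical sketch of steps 4-5 done outside the stats UI.
// Input (made-up format): one CSV line per publisher domain:  domain,visits,orders
$cutoff = 1.0;  // minimum orders per 100 visits you're willing to accept

$stats = array();
$fh = fopen('publisher_stats.csv', 'r');
while (($row = fgetcsv($fh)) !== false) {
    if (count($row) < 3) {
        continue;
    }
    list($domain, $visits, $orders) = $row;
    $stats[$domain] = $visits > 0 ? 100 * $orders / $visits : 0;
}
fclose($fh);

asort($stats);  // step 4: least to most orders per visit
foreach ($stats as $domain => $rate) {
    // Step 5: anything under the cut-off is a candidate for the exclusion list.
    $flag = ($rate < $cutoff) ? '  <-- consider blocking' : '';
    printf("%-40s %6.2f orders/100 visits%s\n", $domain, $rate, $flag);
}
?>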
I expect a few of the larger companies to release a few FAQs on this soon.
The question is, does Urchin explain this process? ;)
<?php
// Log any visit that arrives via a googlesyndication.com (AdSense) referrer.
$referer = isset($_SERVER['HTTP_REFERER']) ? $_SERVER['HTTP_REFERER'] : '';
if (stripos($referer, 'googlesyndication.com') !== false) {
    // Rebuild the landing URL on our own site.
    $url = "http://" . $_SERVER['SERVER_NAME'] . $_SERVER['PHP_SELF'];
    if (!empty($_SERVER['QUERY_STRING'])) {
        $url .= '?' . $_SERVER['QUERY_STRING'];
    }
    $today = date("F j, Y, g:i a");
    $host  = gethostbyaddr($_SERVER['REMOTE_ADDR']);
    // Append one line per hit.
    $logfile = @fopen('googlesyn_log.txt', 'a');
    @fputs($logfile, "$today - Googlesyndication - $url - $host - $referer\n");
    @fclose($logfile);
}
?>
That will write all the entries into a logfile called googlesyn_log.txt on your site. You can then check it every day and decide which sites you want to keep and which you want to ban. The above stores the entire Google referrer string; if you could be bothered, you could split it up to grab just the url value out of the query string, so you'd get a smaller logfile (see the snippet below).
If you really had no life you could get the above to email you every time a hit came in ;-)
Not perfect, but it beats going through 300meg of log files manually LOL.
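Along the lines of that "just grab the url value" idea, here's an untested variation of the same logger that stores only the publisher page instead of the whole referrer string. Same assumption as elsewhere in this thread: the publisher page sits in the referrer's url query parameter.
<?php
// Variation on the logger above: store just the publisher page (the "url"
// value in the googlesyndication referrer's query string) to keep the
// logfile small. Falls back to the full referrer if the parameter is missing.
$referer = isset($_SERVER['HTTP_REFERER']) ? $_SERVER['HTTP_REFERER'] : '';
if (stripos($referer, 'googlesyndication.com') !== false) {
    parse_str((string) parse_url($referer, PHP_URL_QUERY), $params);
    $publisher = !empty($params['url']) ? $params['url'] : $referer;
    $today = date("F j, Y, g:i a");
    $logfile = @fopen('googlesyn_log.txt', 'a');
    @fputs($logfile, "$today - $publisher\n");
    @fclose($logfile);
}
?>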
There's a fairly widely accepted (proven?) proposition that people who type in "BlueWidgets.tld" are often focused on finding a quick way to the best source for blue widgets. A well optimized landing page for BlueWidgets.tld - one that presents a list of links to sites selling blue widgets or reviewing blue widgets, etc. - might just do the trick for your campaign.
Of course, the problem with a lot of PPC domain landing pages is that they present all manner of generic or off-topic links. However, some domain parking companies are much better than others, as they allow you to "force optimize" links onto the pages your domains trigger. For example, if I'm the registrant of BlueWidgets.tld, you can bet that I'll optimize the page with landing page links that a direct nav surfer can click, such as "Buy blue widgets", "Discount Blue Widgets", "Wholesale Blue Widgets", "Blue Widgets Reviews". Those links, in turn, will present a second page of, say, Google syndicated links for sites that fit the link text.
The downsides to parked domains are: 1) many landing pages are not, in fact, optimized with on-topic links that will trigger domain-appropriate results, though most at least offer a fallback in the form of a search box; and 2) not every PPC company has a relationship with the best search feed providers, the ones that are able to feed the best and most relevant commercial-quality website links. Also, not everyone typing in BlueWidgets.tld is looking to buy blue widgets. They may simply be looking for available domains or looking out of curiosity. However, most such visitors are likely to leave without costing you any money, and even visitors that find your site by all other means share the same "just looking" characteristics.
Here's a little trade secret that some members may not want to see leaked. My advice, based upon my participation in domain landing pages, is to research what's going on with the best descriptive keyword domains that relate to what your website is about. Then, if those related keyword domains are parked and using search feeds, 1) see if the page is actually well designed and optimized (automatically pulling up relevant results), and 2) see who is behind the search feed (you may have to dig around or go directly and ask). If it's a good feed source, ahem... such as G syndication, you will then have an idea that the domain parking company (there are 3 or 4 major players) has passed the test of sending valuable traffic, that is, it employs safeguards to filter out bogus traffic, bogus clicks, etc.
Hope this all helps a bit. If you want to see an example of how direct navigation on pretty well optimized PPC domain landing pages can look and how they are designed to work, sticky me. Not all domain landing pages are created equal, and many, in my observation, are in fact crap.
A little research into specific parked domains is likely to yield some gems and some mother lodes of cheap, highly targeted traffic.
1) Find spot-on domains associated with your interests that are parked.
2) Determine if the domain parking program sponsoring the domain allows for the optimization of the landing page with keyword links, ones calculated to pull specific search results from a feed database. (I know of 2 major players that allow such forced optimization.)
3) Determine if the search feed for that domain includes results from a source that you trust.
4) Contact the domain registrant (or the parking company if it's an anonymous WhoIs) and ask that the landing page be optimized to include specific, highly focused 2-3-4 word trigger links - keyword phrases of your choice, best calculated to stimulate or filter direct navigators' behavior to your advantage.
5) Buy those keyword phrases from the search feed provider, who will in turn send their feed to the domain parking sponsor. Your friends, the domain registrants, will hopefully have populated the relevant landing pages with the necessary trigger links, and the circle is complete.
6) Profit (hopefully).
Food for thought for the creative and/or aggressive web marketer.
Caveat: Assuming I have a somewhat valuable traffic domain, chances are I will not populate it with low-paying keyword phrases if I'm limited (most are) in the number of optimized links I can put up. Secondly, most good domain parking programs will not allow domain registrants to populate a landing page with off-topic links.
Okay, so maybe a $2.50 tip. You don't know until you try.
Why are we limited to removing only 25 domains? Why can't I remove as many as I darned well please? If I remove too many and stop getting clicks, my bad, but Google shouldn't try to protect me from myself. Actually, I suppose they are just trying to protect their revenue stream, but in any case, I'd like to have removed all 86 sites I found. Oh well.
My guess on the question above: I would think that if advertisers were micro-focusing and opting out of hundreds of websites, Google would lose the other (quality) publishers who are, say, in a small niche-type market. The cost per click would be too low to keep the publisher in the program. The more your money is spread out (fewer eggs in one basket), the more secure it is.
Thought about this: what if one of the content URLs is sending you people who visit, then come back later to buy... are we deleting a potential profit center if it sends a large amount of traffic our way but converts later? :)
You raise an interesting point. A recent DoubleClick study on "search before the purchase" showed, among other things, that a majority of purchasers complete their online research at least two weeks before the "purchase event" and only 23.1% buy during the same session.
Still, if such behavior is general (as the DoubleClick study suggests it is), we can assume that what matters for tracking purposes is the [i]difference[/i] between conversions for traffic from various sites, not the raw conversion rates. In other words, if Site A has a better conversion rate than Site B, it doesn't really matter that more than half of Site B's referrals come back later to buy, because that's likely to be true of Site A's referrals, too. (For example, if delayed purchases roughly double each site's measured rate, a site converting at 2% on the spot still beats one converting at 1%.)
Also, a "conversion" for AdWords/AdSense purposes doesn't have to be a transaction. It can be any trackable "business action" that the advertiser defines. If you're a travel agent selling luxury cruises, the odds are remote that a prospect will come to your site and buy a $5,000 or $10,000 cruise on the spot. What really matters to you is the fact that Site A is producing leads (inquiries, for example) while Site B is referring users who look at the home page and bail out.
Side note: As an AdSense publisher who spent 20+ years in the ad business, I've been among those who have been lobbying for advertiser controls since 2003. It's nice to see that Google is finally starting to give advertisers the kind of domain blocking that publishers have enjoyed since the beginning. This may be a small step toward making the content network more appealing to skeptical advertisers, but it's an important one.