I am concerned that the nature of these changes is leading to many look-alike ads. I just searched on a term that I know is commercially valuable. For illustration I will refer to it as "expensive blue toy". Seven of the first 8 ads had no text in the title other than the exact phrase "expensive blue toy". Five of those 7 repeated the phrase "blue toy" on the first description line. It seems likely to me that all of those ads had {keyword} as the sole entry in the top line. Plus, with today's new "large bold font" highlighting of matching keywords, it looked like all 8 ads were basically the same ad. A chocolate mess in my opinion.
I can only assume that the advertisers for that particular phrase have all reached similar conclusions about which secret "other" factors increase quality score. Theoretically, there must be an optimal ad text that maximizes quality score. Whether people are arriving at that text through independent experimentation or just by copying the ads appearing above theirs, I don't know.
But it's hard for me to imagine that many of the 8 identical ads are going to be particularly eye-catching. It reminds me of the banner blindness that has seriously damaged the effectiveness of that form of advertising.
Earlier this year Google limited each domain name to appearing only once, rather than appearing for each affiliate of that domain who bid. But even though the domains were the same, the ads could often provide different pieces of relevant information. For example, for a key phrase like "beatles albums", the following ads might appear:
Sgt. Pepper's Lonely
Hearts Club Band. The
Beatles Biggest Hit Album.
www.beatles album world.com
Beatles Rubber Soul
The most critically
acclaimed Beatles Album.
www.beatles album world.com
Worlds Biggest Inventory
of Beatles Albums.
See the selection today.
www.beatles album world.com
Each of those ads might take the surfer to a completely different landing page within that site.
With the "improvements" this year, the surfer seems to be seeing multiple ads all of which have the text:
Beatles albums
We sell Beatle albums.
See our Beatle albums.
www.nonredundanturl.com
with the only material difference in the text of the ads being the URL. It surprises me that this improves the user experience.
It seems to me that if Google knows the optimal text that maximizes user experience, then maybe the bidders should only provide a display URL and a landing-page URL. Google could then show the optimal text once and just list the URLs. For example:
Beatle Albums*
We Sell Beatle Albums.
See Beatle Albums at
xyz.com or
abc.com or
def.com or
ghi.com or
etc.com
etc1.com
etc2.com.
*Our robot scored the landing pages for each of these domains. We multiplied this score by the bid from each of these advertisers and sorted them accordingly.
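The footnote's ranking rule (landing-page score times bid, sorted descending) can be sketched in a few lines of Python. All of the URLs, bids, and page scores here are invented for illustration:

```python
# Hypothetical advertisers bidding on the same phrase. "page_score" is
# the robot's landing-page score from the footnote above; numbers are
# made up.
bidders = [
    {"url": "xyz.com", "bid": 1.50, "page_score": 0.9},
    {"url": "abc.com", "bid": 2.00, "page_score": 0.5},
    {"url": "def.com", "bid": 0.75, "page_score": 1.0},
]

# Rank score = landing-page score multiplied by the bid.
for b in bidders:
    b["rank_score"] = b["bid"] * b["page_score"]

# Sort the domain list in the order it would be displayed.
ranked = sorted(bidders, key=lambda b: b["rank_score"], reverse=True)
print([b["url"] for b in ranked])  # ['xyz.com', 'abc.com', 'def.com']
```

With these numbers, xyz.com (1.50 x 0.9 = 1.35) outranks abc.com (2.00 x 0.5 = 1.00) despite the lower bid, which is the whole point of weighting by page score.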
If Google's only measure of whether these changes improved relevancy is whether or not their revenue increases, they may be outthinking themselves. For example, if Google made the arbitrary decision to reduce the quality score for all advertisers whose names begin with the letters A-L, I suspect that Google's revenues would increase. That is because the 50% of advertisers whose names start with A-L would squeeze another ounce out of ROI and increase their bids in order to compensate for the quality score decrease. Then the other bidders would raise their bids in an attempt to displace the first group. Google's revenue goes up, but there would be no increase in relevancy, because all Google did was lower the quality score of the A-L advertisers.
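The arithmetic behind that thought experiment is simple. Assuming (for illustration only) that ad position is determined by quality score times bid, an advertiser whose quality score is arbitrarily halved must double the bid just to stay put:

```python
# Assumed ranking rule: position determined by quality_score * bid.
# All numbers are invented for illustration.
original_qs = 1.0
original_bid = 1.00
rank = original_qs * original_bid        # position before the penalty

penalized_qs = 0.5                       # arbitrary penalty on A-L names
compensating_bid = rank / penalized_qs   # bid needed to hold that position
print(compensating_bid)                  # 2.0: the bid must double
```

Same ad, same landing page, same relevancy, but the cost per click doubles, which is exactly the revenue-without-relevancy dynamic described above.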
Every time Google makes a change like this that helps some and hurts others, they should see this dynamic take place. I hope they are not confusing this "shakeup" dynamic with an actual improvement, thinking that because revenue increased following a change, relevancy must have increased.
If they really want to improve the relevancy of their ads, I think they need to retrace their steps and use either human editors or their algorithm solely to determine whether the text of an ad is deceptive. Then let the market, through CTR and the advertisers' real-world conversion rates, determine the score for each ad.
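A minimal sketch of that market-driven scoring: once an editor (human or algorithmic) has confirmed the ad text is not deceptive, the score comes only from observed CTR and conversion rate. The function name and the multiplicative weighting are my own invention, not anything Google has published:

```python
def ad_score(impressions, clicks, conversions):
    """Score an ad purely on observed market performance."""
    ctr = clicks / impressions if impressions else 0.0
    conversion_rate = conversions / clicks if clicks else 0.0
    # Reward ads that both attract clicks and actually convert them.
    return ctr * conversion_rate

# An ad shown 1000 times, clicked 50 times, with 5 of those clicks
# converting: 0.05 CTR * 0.10 conversion rate, roughly 0.005.
print(ad_score(1000, 50, 5))
```

Under this scheme a deceptive-but-clicky ad would still be filtered out at the editorial step, while among honest ads the market decides the ranking.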
Apologies for the long disjointed post.
[edited by: jim2003 at 1:40 am (utc) on Dec. 10, 2005]
There are many reasons why users eventually get ad blindness:
1. The novelty wears off
2. Experience shows that the landing pages are garbage.
3. The ads all look the same.
etc.
To an extent, it is the advertisers who need to figure out a way to make their ad stand out from the crowd. Although that is pretty difficult when an advertiser is limited to so few characters...
My ad copy may be unique, relevant and informative, but if 10 other bidders have "optimized" their ads to appease the Google AdWords algorithm, my ad will never get a chance. I don't mind bidding up for a chance on a level playing field, but I may not be able to bid significantly higher than the competition because of arbitrary quality score issues that I can't control for.
The nature of any quality score scheme is to breed conformity. First it is the ad text; next it will be look-alike SEO'd web pages resulting from the latest algo change. Conformity is what breeds ad blindness.
Google loves these {keyword} ads because they are not very taxing on their system. They like accounts that are "keyword dense". The only way you can be keyword-dense enough to please them (and get an increase in your keyword limit) is to run this kind of content-free ad.
I think that Google may have reached some capacity limits that they don't want to talk about.