In fact it is the only way to "reclaim" some control over "expanded" broadmatching.
Many of our 3-worders perform well.
I suspect they are dealing with server load issues. Since they "broke" (IMHO) the old broadmatching, lots of word combos are the only way to get some of that traffic back. I suspect many more advertisers are figuring this out, and consequently they are having lots of accounts with thousands of words (ours has over 2k).
Of course, they could just "fix" broadmatching and all would be well again.
The whole reason they did "expanded" BM was to fill in the gaps for (and wring more $$ out of) less skilled advertisers.
I know they view themselves (Google) as an 'engineering' company, but there are limits to how much they can "think" for their customers.
You watch, as advertisers understand this more and more, the "rules" will get ever more byzantine and ridiculous...
(it is like a fractal)
The guidelines state:
1. Do not use variations that are more than three words long.
2. Do not use a three-word keyword in multiple matching variations at one time: [buy red bicycle], "buy red bicycle", buy red bicycle.
and I read it this way:
Taken together, 1 AND 2 mean:
use multiple matching variations only for one- or two-word keywords.
BUT I also read on the same page:
Do not use misspellings ...
Oh man, why? I (and Google too) make nice money on it!
I think the message behind this is:
"Hey, our machine became so complicated, please be nice to it!" In the other words - i agree with nyet:
they should minimize their "thinking" for the customers (the BM problem, the On Hold - On Trial problem, etc.).
Of course, they could just "fix" broadmatching and all would be well again.
What exactly needs to be fixed--in what sense is broadmatching "broken"? What is IMHO? I use almost all broadmatching.... mostly because it is the default option and I always assumed it was the best (for me at least). I didn't know there were problems with it.
It is a matter of perspective. It is not officially "broken"; that is just my harsh characterization of the changes made when they went to "expanded" broadmatching.
In the old days you could have Buy Widgets as a BM (broad match), and as long as you maintained the appropriate CTR, all was well. But the user had to be smart as well: if there were a lot of matches which were not good, you might have to add negative words which "protected" the CTR of your BM.
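The old mechanics described above can be sketched roughly like this. This is only an illustration of the idea (keyword words must appear in the query, negatives veto the match), not Google's actual logic, which has never been public; all names and the example negatives are invented.

```python
# A minimal sketch of old-style broad matching with negative keywords.
# The real system is proprietary; this just models the behavior described.

def broad_match(keyword, query, negatives):
    """True if every word of the keyword appears in the query
    and no negative keyword does."""
    query_words = set(query.lower().split())
    kw_words = set(keyword.lower().split())
    if not kw_words <= query_words:
        return False
    return not any(neg.lower() in query_words for neg in negatives)

# The advertiser bids broad on "buy widgets" and protects CTR with negatives:
negatives = ["free", "repair"]
print(broad_match("buy widgets", "where to buy widgets online", negatives))  # True
print(broad_match("buy widgets", "free widgets to buy", negatives))          # False
```

The point of the old arrangement was that the advertiser, not the algorithm, chose the negatives that kept the CTR healthy.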
Now with EBM (expanded Broad Match) the Algo *adds* words to your terms that perhaps you didn't think of.
Problem is that the English language is not so simple. Just because there are lots of people searching "Buy SpecificWidget" does not mean there is another entire subset of people searching "Buy SomeOtherWidget". Related words, but *different* products! So, since I cannot *see* the "expanded" words, I have no real idea *what* other words are being added.
So I had a 9% CTR on words (with appropriate minuses) that went to .2% when G *added* what they thought were related words. Unfortunately they weren't, because of the imprecise "algorithm-ability" of language and meaning. So we lost those words! Good words with high CTR and high ROI!
What to do?
Well, we got out a spreadsheet and turned 150 broad matches into 2000 more specific terms, with all three matching options:
Buy Red Specific Widget
Buy Blue Specific Widget
Purchase green specific widget
How do I find these widgets
etc.etc.
We got most of the traffic, CTR and ROI back, but went from 150 words to 2000!
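The spreadsheet exercise above can be sketched as a simple generator: combine the variant words, then enter each term in all three match types (broad, "phrase", [exact]). The verbs, colors, and product name here are placeholders, not the actual terms.

```python
# A hedged sketch of expanding a few broad matches into many specific
# terms, each in all three AdWords match types. Terms are hypothetical.

verbs = ["buy", "purchase"]
colors = ["red", "blue", "green"]
product = "specific widget"

keywords = []
for verb in verbs:
    for color in colors:
        term = f"{verb} {color} {product}"
        keywords.append(term)           # broad match
        keywords.append(f'"{term}"')    # "phrase match"
        keywords.append(f"[{term}]")    # [exact match]

print(len(keywords))  # 2 verbs x 3 colors x 3 match types = 18
```

Scale the word lists up and it is easy to see how 150 broad matches balloon into 2000 entries.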
If that happens on a large scale......computing issues.
IMHO they should have left it alone. If users were not savvy enough to figure out BM and negative words, then help them *become* more savvy; don't try to think *for* them!
I understand the English language *much* better than any algorithm, thank you very much. And since it is my $$, I have an incentive to keep on top of it as well.
Problem is that the English language is not so simple. Just because there are lots of people searching "Buy SpecificWidget" does not mean there is another entire subset of people searching "Buy SomeOtherWidget". Related words, but *different* products! So, since I cannot *see* the "expanded" words, I have no real idea *what* other words are being added.
Personally I think they actually made it harder for a newbie to succeed because they are no longer given all the information. As you mentioned, who knows what additional words Google is attaching to your new BM keywords?
Freq---
Just because nearly all the people who search "Michellan" are looking for tires does not mean that word might not also work very well for another product like "michellan donuts" as long as the appropriate negative words are applied.
It is like G wants all the Algo-Making for themselves. If I construct good words, good matching, and good negative words, *as long as I have good CTR*, then bully for me!
Google: let us make our own effective "mini" algos (ie words, negative, matches) and find our own "mini" niches and *more* people can make money at the same time! Even Google!
The beauty (and I really mean Beauty) of all this is that it *empowers* people to enter the marketplace. So give us a little credit Google, don't hog all the Algo to yourself!
If the "new" broadmatching behaves like an exact match until the broadmatch gets impressions or clicks on its own, does it make any sense to use the same phrase as a broadmatch as well as an exact or phrase match?
I, and many others, always thought that was a good strategy. However, now if your broadmatched phrase is also an exact or phrase match, the latter 2 will pick up all the impressions. The broadmatch will never get a chance to *prove* itself.
Now I've been experimenting and I tried one AdGroup with ONLY broadmatches and it got far fewer impressions than a similar "traditional" AdGroup of mine.
The mix and match had the advantage of spreading out the impressions so you wouldn't hit the "danger zone" of 500 or 1000 or 10 or whatever it is now so soon, and have your words disabled. It could also be cleverly employed to pick up a variety of searches too large to include in a keyword list:
Some will search precisely for [particular widget]. I want first crack at that.
Others will search for where can i find this "particular widget". I thought the phrase match would pick that up.
Others may search for i want a widget of this particular type. I thought that was where my broadmatch of particular widget jumped in.
Now I find NO match which consists solely of particular widget will even get chosen for the search phrase the particular widget. I even miss out because of the article "the".
This is not true for all campaigns, only newer ones.
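The three match types, as the posts above describe their traditional behavior, can be modeled roughly: exact requires the query to equal the keyword, phrase requires the keyword to appear as a contiguous substring, broad requires all keyword words in any order. This is an illustration of the traditional semantics only, not Google's code, and it deliberately does not model the new behavior being complained about.

```python
# Rough model of the three AdWords match types (traditional behavior).

def exact_match(keyword, query):
    return query.lower() == keyword.lower()

def phrase_match(keyword, query):
    return keyword.lower() in query.lower()

def broad_match(keyword, query):
    return set(keyword.lower().split()) <= set(query.lower().split())

kw = "particular widget"
print(exact_match(kw, "particular widget"))                         # True
print(phrase_match(kw, "where can i find this particular widget"))  # True
print(broad_match(kw, "i want a widget of this particular type"))   # True
```

Under these traditional rules the mix-and-match strategy covers all three search styles; the complaint is that the newer system no longer honors the broad case.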
I agree, bring back the old broadmatch! There's an earlier post:
[webmasterworld.com...]
where many express their disappointment with the broad match of late.
The expanded broad match for my phrases which are eligible for it turns up on some strange queries. Additionally, for a new broad match to ever qualify as anything but an exact match, it has to make it as an exact match first.
There are two separate problems with broadmatching of late.
patient2all
[edited by: patient2all at 7:35 pm (utc) on Mar. 15, 2005]
they are haveing lots of accounts with thousands of words (ours has over 2k).
Nyet,
FYI, there appears to be an absolute maximum of 50,000 active keywords per campaign. I hit that and could not add or edit any keywords. I have my reasons for having so many. I have about 700+ AdGroups all told, and I was trying to cover all the possibilities. I advertise many different widgets and was trying to target every customer that searched. These were targeted keywords, not bulk uploaded. All entered by hand.
Of course, I had to get rid of some so I went into one AdGroup with about 800 keywords and figured I'd delete all with no impressions ever. Of course, I forgot to switch the view to "all time" and ended up deleting every keyword without an impression that day :)
Some may criticize me for having 50k keywords, but I have my reasons and I'm making money. Before the broadmatch change, I was able to get by with 1/4 of that.
Really, these days by computing standards, 50k keywords are not that many, even multiplied by the many advertisers. Just off the top, I think of how since the free email services have increased their capacity from 500k to 500mb or a gig per customer, I'll bet many people aren't as scrupulous about cleaning out their bulk mail folders, etc as they once were. That doesn't seem to be causing any internet resource drain.
Google brags they have 8 billion pages indexed in the SERPs and they have no problem serving those and many of those are useless. We pay good money for each of those words.
I want to keep far less of a database than 50k. As soon as Google gives some *specific* insight into how matching is performed lately, I'll be happy to dump thousands of them. More than 1/2 have impressions though and I create new AdGroups for particular widgets each day.
patient2all
2 things.
G will disable any words now which have no searches system wide after 90 days.
And the computing power problem they face is not storing 50k words, it is in running the logic on thousands of advertisers and millions of words on the fly.
If they have to "evaluate" your list to determine which of your ads to display, and which of the multiple words you might have apply, and you have 50k words, and there might be 1000 other advertisers who *also* need to be evaluated and they have 50k words.....you see the problem.
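The back-of-envelope arithmetic behind that worry, using the post's hypothetical numbers (these are illustrative figures, not real ones):

```python
# Illustrative scaling of per-query matching work, per the post's numbers.
advertisers = 1_000
keywords_per_advertiser = 50_000

candidates_per_query = advertisers * keywords_per_advertiser
print(f"{candidates_per_query:,}")  # 50,000,000 keyword evaluations per search
```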
Scientists know that when you take a simple and elegant equation (BM) and it starts to get uglier and uglier and more byzantine (EBM and all the 'new' logic rules) you should start to rethink your equations.
You got me started on another one of my peeves about "modern" life.
I'm well aware of all the indexing that goes on in order to store our keywords; I've designed elegant solutions for databases with millions of entries indexed on hundreds of fields.
My point is that if they can devote the resources to manage spam-ridden SERPs that pay them nothing, they should be able to accommodate well-paying customers with sufficient resources.
I recall in the early days of computing, the only way to generate reports and updates was to run them overnight in "batch" mode. Then the concept of the 'index' evolved. We could have a few smaller files that could grab all the file items where field = x.
That started to get a little out of hand, and rather than reserving indexing for those fields most frequently searched on, virtually every field had to support both its own updates and updates to all the ancillary indexes too.
Then someone would reboot their PC mid-update and the CEO would say my $1,000,000.00 sale from February is missing. From that problem evolved the concept of rebuilding the indexes, perhaps nightly, to keep them in sync with the actual file.
Nowadays, computing power is sufficient in most cases that many of these embedded indexes are no longer needed. The file itself can be queried just as fast. Yet the indexes endure, because no one is going to rewrite complicated systems that may never have been properly documented.
We'd do better to get back to something simpler in the case of AdWords keywords. Reportedly, Google tracks the use of each broad match and keeps a load of stats on the particular query it appeared for. AWA indicated this the other day. Has that resulted in a better system? None of us appear to see it.
It needn't be so complicated.....
patient2all
I am focused on making money for my company so I can only complain about the result.
Whether or not it is a "capacity" issue or not is really beside the point for me.
My sense is that they have tried to "do the thinking" for advertisers because I am sure *many* people are confounded by the logic(s) involved. Tried to make it easier, tried to increase people's relevant traffic, all that...
I just think the border they have set between where advertiser intelligence ends and Algo "intelligence" begins is in the wrong place. The Algo should not "do it for us"; it should make it so "we can do it ourselves".
The great power of the internet marketplace is NOT in cornering the market in a big-business way; it is in making a myriad of smaller niches possible, wringing much more efficiency out of the marketplace. To do that, as many people as possible need to be empowered to *find* those niches.
G knows this, it is practically their credo. But IMHO EBM *restricts* people from exploiting all the possible niches that there might be - it reduces their empowerment.
(sorry for the philosophy)
This is contrary to the general direction of the internet and ultimately will need to be fixed.
G will disable any words now which have no searches system wide after 90 days
Nyet,
You know what, that "feature" doesn't seem to be working any more either! I went back through some of my notes and was able to find *many* words that had sat 6 months with 0 impressions and have remained normal in status. And I wasn't even looking that hard. I got rid of them myself.
I have no quarrel with that purging method either. Too bad it can't be depended on.
patient2all
It does not delete words with zero impressions; it deletes words with zero system-wide impressions. Meaning that you may not have had impressions, but if the words are searched upon at all, you'll keep them.
But believe it or not I have a beef with this as well. There are words which our competition is allowed to use (and does) and we are not because we tried them earlier and they were disabled (forever). Our competition is using them now and because the words now get system wide use *they* can keep them.
Yes I recall that and reflecting upon it again, I have to wonder:
If the words got disabled forever because during the period you used them they received no impressions by you or anyone else using them, how could another advertiser add them to their campaign at a future time?
We know now that we can enter a keyword and have it immediately disabled, presumably because it's been tried before with zero success. So how come the competition didn't have them jump right into "disabled" when they tried to use them?
It's an absurd policy however you cut it, but the part that I can't grasp is how the competition was able to get them in there at all. Unless the disabled period has an expiration date, in which case they should have become useable to you too.
I'm missing something basic, right?
patient2all
It is my understanding that words are not immediately disabled because of prior poor performance of other advertisers, but because of lack of impressions system-wide.
Otherwise, how could "new" words ever be added to the lexicon?
Are these statements of mine accurate?
If you ever had a word disabled because of poor system-wide performance, you may never use it again.
Even if the keyword should become popular in the future, new users can take advantage of it but anyone who used the keyword before cannot.
Is this not completely unfair?
patient2all
I'd like to know how if it is possible.
Every time I have tried it, the word gets about 5 impressions and then it is disabled again. It never truly escapes its past account history.
Here is what I know. I have a word I added long ago which has received ZERO all-time impressions and is disabled. My new-to-the-market competition is allowed to use this word.
Why?
Tech support on the phone said it was because, with the new 90 day rule, it had zero system-wide usage. Now the competition has added it, and it must be getting some impressions.
Without the 90 day rule, I would just go ahead and generate some myself, but I'd be in the same boat again.
The bottom line is that this rule prevents the merchant from getting too far ahead of the marketplace of competitors in anticipating the language used for future searches as products and marketplaces develop.
That is pretty counter to standard business operation - "try NOT to get ahead".
Here is an easy fix: put future terms in a "word locker", and when those terms *start* to generate system-wide impressions, they are automatically migrated into your account.
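The proposed "word locker" could work something like the sketch below: speculative terms sit in a locker, and only once the wider system starts showing searches for a term does it move into the live account. All function and variable names here are invented for illustration; nothing like this exists in AdWords.

```python
# Hypothetical sketch of the "word locker" proposal above.

def migrate_ready_terms(locker, account, systemwide_impressions):
    """Move a term from the locker to the live account once it
    starts receiving impressions anywhere in the system."""
    for term in list(locker):
        if systemwide_impressions.get(term, 0) > 0:
            locker.remove(term)
            account.add(term)

locker = {"next-gen widget", "widget 2006 model"}
account = {"buy widget"}

# Suppose the system later reports searches for one locker term:
migrate_ready_terms(locker, account, {"next-gen widget": 42})
print(sorted(account))  # ['buy widget', 'next-gen widget']
print(sorted(locker))   # ['widget 2006 model']
```

The advertiser gets to anticipate future language without burning the term's eligibility while it has no searches.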