Forum Moderators: buckworks & skibum


Wasted Spend -- Still Hurts the Most


RhinoFish

6:28 pm on Oct 12, 2017 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



King Wasted Spend still rules the Kingdom of ROAS!
[searchengineland.com...]

It is way more important to avoid inefficient spend than anything else.
Quality Score is important, but Wasted Spend is more important.
Long live the King!

buckworks

7:39 am on Oct 13, 2017 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



>> avoid inefficient spend

Negative keywords are your friend.

So are automated bidding rules that keep you out of the most expensive positions.

Other thoughts?

engine

10:40 am on Oct 13, 2017 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



I totally agree about negative keywords.

I regularly see ads appearing for irrelevant searches, and that poor performance, whether clicked or not, is a waste.

Poor targeting will do nothing to help.

Also, the most competitive terms are not always the best to play in, especially if you're trying to avoid clicks that are unlikely to convert. Let those with deep pockets fight over the top terms, especially if your budget is small, or you want to keep it under tight control.

RhinoFish

10:24 pm on Oct 14, 2017 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Search Term reports are good friends too!

buckworks

1:09 am on Oct 15, 2017 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



My biggest negative keyword list has over 2,000 terms, but even now I still find things to block when I check the search term reports.

tangor

2:45 am on Oct 15, 2017 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Just check what kind of clothes the emperor is wearing. Fashion of the day is always a clue....

RhinoFish

9:00 pm on Oct 15, 2017 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



When aiming for 4x, $1 of wasted spend is a loss of $4 in revenue.
Most settings are a question of degree; wasted spend is a question of survival.
The relentless pursuit of ad spend efficiency has been belly belly good to me.
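The arithmetic behind that claim can be sketched in a few lines (a minimal Python illustration; `forgone_revenue` is a hypothetical helper name, and the 4x target is the ROAS goal mentioned above):

```python
# Sketch of the ROAS arithmetic above: at a 4x target, every
# wasted ad dollar forgoes the revenue that dollar would have
# produced if it had been spent efficiently.
def forgone_revenue(wasted_spend, target_roas=4.0):
    """Revenue given up by spending `wasted_spend` on clicks that never convert."""
    return wasted_spend * target_roas

print(forgone_revenue(1.0))    # 4.0
print(forgone_revenue(250.0))  # 1000.0
```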

buckworks

10:40 pm on Oct 16, 2017 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



Just check what kind of clothes the emperor is wearing. Fashion of the day is always a clue....


I don't understand that. Could you rephrase it so your point has a better chance to penetrate my thick skull? ;)

tangor

7:08 pm on Oct 17, 2017 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Technology and economics can appear as magic to those without knowledge of "how it works" (or are deliberately kept in the dark).

ogletree

6:38 pm on Oct 30, 2017 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Be very careful with negative keywords. This is not unlike messing with disavow lists: if you don't know what you're doing, you could easily throw the baby out with the bathwater.

1. Make negatives phrase match unless you have a well-thought-out reason not to, even if it is just one word. If you use the broad term FREE without being a phrase match you might stop freedom or freelance.
2. Don't just make something a negative because you think it is irrelevant. It could be relevant and you just don't know why. I never assume I understand searchers. It could be you should be making a new ad group, not a negative. I'm surprised every day how people search and I have been monitoring a lot of keywords since 2003. Not everybody thinks like you. As a matter of fact most people don't think like you or me or anybody.
3. Audit your negative list every once in a while. Things may have changed. A keyword that used to not work might work now. You might have been tired or drunk when you added one.
4. Be very careful when using conversion data to pick negative keywords. If you look in your "Top Conversion Path" report in Google Analytics and set it to "search query", you will see that many times people try lots of different keywords before they buy something. If you have your attribution set to a single keyword, like First or Last, you might end up cutting out a very important part of the chain. If you then set the secondary dimension to "source", you might even notice that somebody used 6 keywords, some from Bing and some from Google. Also set the lookback window to 90 days. This can help you manage your keywords by intent. The important thing is to make sure any keyword you set as a negative was not part of a chain. It is OK to break the chain if it is a bad chain. Just make sure you have all the information before you ban that keyword.
5. Use a lowest common denominator approach. Don't do exact match negatives that are long. Find 2 or 3 words that are clearly bad as a phrase or broad match and use that. It will help keep your list manageable. You can block a thousand search queries by just using a good phrase match.
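A rough sketch of the matching semantics behind points 1 and 5 (a simplified Python model of Google Ads negative-keyword match types, not the real implementation; note that negatives match whole words only, which is why a broad negative "free" does not actually block "freedom", as discussed later in the thread):

```python
# Simplified model of negative-keyword matching. Negatives match
# whole words, not substrings: broad = all words in any order,
# phrase = words in order as a contiguous run, exact = whole query.
def blocked(query, negative, match_type="phrase"):
    q = query.lower().split()
    n = negative.lower().split()
    if match_type == "broad":    # all negative words present, any order
        return all(word in q for word in n)
    if match_type == "phrase":   # negative words appear in order, contiguously
        return any(q[i:i + len(n)] == n for i in range(len(q) - len(n) + 1))
    if match_type == "exact":    # entire query must match the negative
        return q == n
    raise ValueError(f"unknown match type: {match_type}")

print(blocked("free widget samples", "free", "broad"))           # True
print(blocked("freedom widgets", "free", "broad"))               # False (whole words only)
print(blocked("cheap blue widgets", "cheap widgets", "phrase"))  # False (not contiguous)
```

One short phrase-match negative like "cheap widgets" can block every query that contains that run of words, which is the "lowest common denominator" idea in point 5.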

RhinoFish

11:13 pm on Oct 31, 2017 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



"If you use the broad term FREE without being a phrase match you might stop freedom or freelance."
Negative keywords don't work the same as positive keywords.
Learning the difference between a single-word broad negative and a multi-word broad negative is critical.

More here: [support.google.com...]

Ogletree makes a GREAT point with #2, use data, try not to assume too much.
:-)

Mark_A

2:08 pm on Nov 1, 2017 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I just checked my CPC against my quality scores.

No that can't be right. Grrr ..

Mark_A

8:19 am on Nov 3, 2017 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



OK, so I just checked my average cost per click by Quality Score, and there seems to be no correlation between Quality Score and average CPC.

In other words, the theory that a higher Quality Score means a lower average cost per click is not borne out by my data, anyhow.
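For anyone wanting to run the same check, here is one quick way (a sketch assuming a keyword-report export with "Quality score" and "Avg. CPC" columns; adjust the column names to match your actual CSV headers):

```python
# Quick sanity check of the QS-vs-CPC relationship from a keyword
# report export. Column names are assumptions; rename to match
# whatever your report actually uses.
import pandas as pd

def qs_cpc_correlation(df):
    """Pearson correlation between Quality Score and average CPC."""
    return df["Quality score"].corr(df["Avg. CPC"])

# e.g. df = pd.read_csv("keyword_report.csv")
sample = pd.DataFrame({"Quality score": [3, 5, 7, 9],
                       "Avg. CPC": [2.40, 1.90, 1.60, 1.10]})
print(qs_cpc_correlation(sample))  # strongly negative on this toy data
```

A correlation near zero on your own data would match Mark_A's observation; a clearly negative value would match the conventional "higher QS, lower CPC" theory.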

RhinoFish

8:40 pm on Nov 3, 2017 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



The QS they show you is for the exact match version, because anything else becomes mathematically complex.
Also, for each keyword, your actual CPC depends on the bid (and QS) of the bidder below you, which is going to vary considerably.
So the data you're trying to correlate, depends on many factors besides QS, making it complicated to unwind.

I've seen enough shifts in QS due to site issues, both up and down, correlate with CPC that I don't doubt the relationship.

Do something to shift your QS, something unrelated to bid or CTR, see what happens. (Okay, that's bad advice, I should have said shift your QS "up").
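The pricing RhinoFish describes is often summarized with a simplified formula: your actual CPC is roughly the ad rank of the bidder below you divided by your own QS, plus a penny. This is the widely cited simplification of Google's auction, not the full formula, but it shows why a higher QS lowers your CPC even against the same competitor:

```python
# Simplified auction pricing: actual CPC depends on the ad rank of
# the bidder below you and on your own QS. This is the common
# textbook simplification; the real auction uses more inputs.
def actual_cpc(next_ad_rank, your_qs):
    return round(next_ad_rank / your_qs + 0.01, 2)

# Same competitor below you (ad rank 12), different QS values:
print(actual_cpc(12, 4))   # 3.01
print(actual_cpc(12, 8))   # 1.51
```

Because the denominator is your QS and the numerator is someone else's ad rank, the observed CPC varies with every auction, which is exactly why a simple QS-vs-CPC scatter is hard to unwind.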

Mark_A

3:08 pm on Nov 4, 2017 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I don't think QS is my issue.

My issue is that only a small number of impressions result in clicks, and of those that do click a significant proportion bounce.

RhinoFish

8:55 pm on Nov 4, 2017 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



A low CTR can kill your QS, raising CPC and tanking your efforts completely.
Low (relative) CTR can come from bad ads, poor keyword-to-search relevancy, or both.

CTR is the most important element in the QS game, in my opinion.
It is a great barometer of quality and relevancy, and G uses it as such.

Turn on these columns under Keywords tab: Ad relevance, Landing page exp., Exp. CTR
Any general consensus in the data there?

Mark_A

9:20 am on Nov 6, 2017 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Turn on these columns under Keywords tab: Ad relevance, Landing page exp., Exp. CTR
Any general consensus in the data there?

Interesting columns RhinoFish, I didn't know about them before, thanks.

In some cases I have a good QS, above-average Ad Relevance, but below-average Landing Page exp. In those cases it seems straightforward what I have to do :-)

Other instances indicate other actions. It is interesting.

RhinoFish

7:58 pm on Nov 6, 2017 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Google likes to hide all of the good stuff. Hahaha!
When you look at the various scores and aspects, do think of CTR as the main thing that determines your QS, but that assumes your landing page is good.
If your landing page isn't good, and there are lots of things that can be wrong, your CTR won't save you.
In this regard, while CTR is very important, think of it as a linear scaling factor, but think of landing page experience as more of an exponential negative scaling factor, a QS override, or even an on/off switch.
If your landing page has issues, you're doomed regardless of everything else.

Mark_A

8:18 pm on Nov 6, 2017 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



@RhinoFish I think one of the issues may be that I actually don't have any dedicated landing pages, I just point ads to the most relevant page on the site. But however good that page might be it might not be equally relevant for perhaps 7 slightly different ads.

I have hummed and hawed about landing pages on here before, pondering whether I should in fact have dedicated, detailed landing pages for each ad, but I haven't yet done anything about it.

Going to do some more analysis tomorrow.

RhinoFish

7:13 pm on Nov 7, 2017 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I do PPC for eComm stores and I always just find the best store page for a particular search, meaning I don't create landing pages, I identify them on a functioning eComm website.
I like the ad viewer to land on a store, not a page - let them browse if they prefer, or buy now if they found the thing they were looking for.
This also means that when they return, the website is the same, easily navigable to their item of interest, but they can also find other items if they want.

1) Drop someone searching for cereal, in the cereal aisle at the grocery store, so they're standing in front of the cereal they're looking for.
2) Drop someone just inside the locked stage door for a timeshare presentation.
I prefer 1, seems others agree.
:-)

Mark_A

9:19 pm on Nov 7, 2017 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Our issue is that we may have a generic page about widgets which has sections on blue and green and yellow widgets. But we have dedicated adwords ads for blue or green or yellow widgets which all deliver clickers to the general page.

I probably should create stand alone pages for each - within the normal navigation of the site - because it seems people who don't find a page immediately specific to their search bounce.

Whatagreatdayitis

11:16 pm on Nov 7, 2017 (gmt 0)

10+ Year Member Top Contributors Of The Month



Is AdSense/AdWords arbitrage still a concern among publishers? Google now seems to be encouraging AdSense publishers to use AdWords to generate traffic, so the days of worrying about Google canceling an account over arbitrage are a thing of the past, right? Does anyone have experience with this?

mack

6:58 pm on Nov 8, 2017 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



It also makes sense to ensure that people in different areas of the business know what each other is doing. I have carried out a search, clicked an advert and been taken to a page with an "Under maintenance" placeholder. This is a wasted click and opportunity.

Mack.

simplytheresa

7:40 pm on Nov 13, 2017 (gmt 0)

10+ Year Member



Thanks for the heads up on those hard to find columns, RhinoFish! I've been hovering over the status bubble to try to get those insights. This is much better!

Mark_A

10:57 am on Nov 14, 2017 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



@RhinoFish.

I am working through my ad groups looking at landing page relevance, because we have both click-through and bounce issues. I notice that in some cases Google is assigning a below-average landing page experience even though no searcher has ever actually clicked on that combination of keyword and ad.

So G can't be basing landing page experience only on bounce, as on these particular keywords there has been no visit and therefore no bounce. G must be rating the page by crawling it, and might be doing something as basic as comparing search terms with words on the landing page?

RhinoFish

2:20 am on Nov 16, 2017 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



When you say you have "bounce issues", just know that I've heard that 1,000 times, and in about 850 of them the issue was imagined.
Bounce is just like my mom's pretend friend, old bad Dorothy: she got blamed for most things, cuz the blaming was so easy.
Bounce is a relative measure, and it is widely misunderstood and misinterpreted.
Sometimes a bounce event indicates you immediately and completely fulfilled the visitor's quest for info... so a high Bounce Rate, and a subsequent lack of further searches on that topic, would be a good thing.

Generally, they are crawling and measuring.
[support.google.com...]
#1 is the game you are playing (2,3,4,5 must be good, regardless of relevancy).

In other words, if 2-5 are tanked, they will wreck all of your keywords' QS.
And, 2-5, most often, are site-wide issues, not page or search specific.

The game is, how hard to push...
If you shotgun and aim wide, the incremental traffic gets far more expensive per click as relevancy decreases; the less relevant shots are costly.
If you rifle and aim narrow, the CPC stays low, and QS is your pal.

G achieves this by pushing down your QS when you shotgun it.
If your landing page lists 12 breeds of live rabbits for sale... and you bid on "apology gift"... it's buckshot baby!

buckworks

6:10 am on Nov 16, 2017 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



Baby bunnies --> apology gift made emotional sense to me. :)

"Buckshot" is sometimes productive because human minds sometimes recognize relevancies that Google misses.

But segregate buckshot into its own campaign(s), and set tight limits on your bids.

Mark_A

11:25 am on Nov 16, 2017 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Thanks for that thoughtful post RhinoFish. It has caused a lot of reading and lots of learning! It had been easy for me to think a bounce rate of 70% means 70% of all visitors on that page leave but it isn't true. A bounce rate of 70% means that 70% of your arrivals / entrances on that page leave, and as you said, they could have been satisfied with what they saw.

So on a page where the bounce rate might be 70% that could be 70% of entrances, of which there might have been just 100, so 70 true bounces, whereas there could have been 700 who visited the page on their way through browsing the website.
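That entrance-vs-pageview distinction, written out as arithmetic (toy numbers from the example above):

```python
# Bounce rate applies only to sessions that ENTERED on the page,
# not to everyone who viewed it mid-session.
entrances = 100       # sessions that started on this page
pageviews = 800       # total views of the page
bounce_rate = 0.70    # 70% of entrances left without viewing a second page

bounces = round(entrances * bounce_rate)  # true bounces
pass_through = pageviews - entrances      # mid-session visitors, never counted as bounces
print(bounces, pass_through)  # 70 700
```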

RhinoFish

5:33 pm on Nov 16, 2017 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



"Buckshot" is sometimes productive because human minds sometimes recognize relevancies that Google misses.
But segregate buckshot into its own campaign(s), and set tight limits on your bids.

Headline should read... Buckworks Nails Buckshot, Makes Bucks in Buckets!

RhinoFish

5:48 pm on Nov 16, 2017 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Mark_A, on Bounce Rate, it also has to do with biz model, margin, and other factors.

I used to have a page that compared Dish Network to DirecTV, and I had affiliate links to installers of each.
A Bounce could mean they left, not liking the page at all.
Or a Bounce might mean they click my affiliate link and purchased.
When a metric can so easily represent two extreme outcomes in the same way, the metric can be misleading.

I also had a client recently, who noticed their Bounce Rate was 6%, and according to a class they attended, they were killing it.
Then I told them that their webmaster had accidentally installed the Analytics script twice, on many, many pages.

I have also seen redirects that break the internal page referrer and / or session, so people appear, in Analytics, as Direct / (Not Set), though they were actually on page 7 of a 12 page visit.
Breakage can artificially raise Bounce rate as well.

In Lead Gen, where people often have a one-page lander with a dynamically loading "thanks for submitting" confirmation notice (one that does not reload the whole page or redirect to a receipt-page URL), Bounce Rate is rendered completely insensitive to what is actually happening.

And with event triggered loading, like viewing an embedded video, there are ways to artificially raise or lower Bounce Rate.

In an eComm site with a multi-page checkout, Bounce can be a decent metric, but it can again be very misleading.
Many times a shopper lands on your product page and likes what they see (perhaps they were in Google Images and then visited your page), then emails a link to a friend or posts it on Facebook, but they are recorded as a Bounce.

Whether eComm or not, if you take phone calls and close biz, imagine how many people Google for "Acme Rockets 800 Number", land on your contact us page, and are a Bounce.

I trust Bounce like I trust the govt.
:-)