|Brainstorm: Improving Adwords/Adsense system|
Help Take MFAs out of the Advertising Equation
A recent post by RonS [webmasterworld.com] (msg 51) demonstrates how it would be in the financial interests of genuine publishers to help Google take pure MFAs out of the advertising equation. A post by Toomer in that same thread also illustrated to me that the idea of having a minimum bid on publisher sites would probably help MFAs more than it hurts them.
I suggest, therefore, that we have a brainstorm on ways that Google could improve the quality of the Adwords/Adsense system - ultimately, this will be in both our interests and theirs.
To start the ball rolling, it seems to me that Google's "landing page quality" criteria are being circumvented in some way - perhaps by using cloaking techniques to present a different page to the Google bot from the one presented to the surfer. Perhaps one way to catch this might be:
- each time an Adsense ad block is called by a page, the ad script could collect some key information about the page (e.g. file size, last-modified date) and compare it with the same information about the page stored on Google's server (collected at the time of analysis for 'landing page quality' purposes).
- if it differs, ads would still be served, but the Googlebot would be despatched to review the page
- if on review by the Googlebot it seems that the page has genuinely changed (i.e. the information provided at the time of the ad display is correct), then the Google database is updated, the landing page quality score is updated, and ads continue to display
- if the review shows that there is still a significant discrepancy, then the page is flagged for display of PSAs only.
This might be one way of stopping Adwords appearing on MFA sites that use tricks to deceive the Googlebot, but there might be circumstances where this catches genuine publishers - e.g. a forum page that has gained additional messages. Therefore, pages that change by adding content would be allowed.
Also, there may be some pages that use cloaking in a constructive way, to provide locally-relevant content. Therefore, pages that have substantially similar content with minor variations would also be allowed.
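The decision flow above can be sketched in a few lines of Python. This is purely illustrative - the function names, the fingerprint fields, and the return codes are my own invention, not anything Google actually does:

```python
import hashlib

def page_fingerprint(content: bytes, last_modified: str) -> dict:
    """Summarise the key information an ad script might report about a page."""
    return {
        "size": len(content),
        "modified": last_modified,
        "digest": hashlib.sha256(content).hexdigest(),
    }

def check_on_ad_request(reported: dict, stored: dict) -> str:
    """Decide what to do when an ad block is called.

    Returns 'serve' if the reported fingerprint matches what Google stored
    at quality-analysis time, or 'review' to keep serving ads while a bot
    is despatched to re-check the page.
    """
    return "serve" if reported == stored else "review"

def after_bot_review(reported: dict, recrawled: dict) -> str:
    """After the bot re-fetches the page, confirm or penalise.

    'update' means the page genuinely changed (the reported info was
    accurate), so the stored data and quality score are refreshed;
    'psa' flags a persistent discrepancy - suspected cloaking - so the
    page shows PSAs only.
    """
    return "update" if reported == recrawled else "psa"
```

The fuzzy exceptions for added forum messages or locally-varied content would sit on top of these exact-match comparisons.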
Any other ideas?
One question about this.
I have read numerous threads on this forum along the lines of "google is showing cached pages from 2 years ago"
How would this be overcome?
You connect "landing page quality" and "page freshness" too closely to each other. OK, frequently updated pages usually tend to be of higher quality, but that cannot be taken as a standard. A page can still contain great info even if it hasn't been updated for 10 years. Also, it is my preference to disallow bots from caching my pages.
There is no connection being made in my idea between 'freshness' and landing page quality. It is simply to check that the page on which the ad appears is the same as the one shown to the bot. Whether it is 10 years old or in the cache (or not) doesn't matter.
What does matter is if cloaking is being used to show different pages to the surfer & bot to circumvent Google's 'landing page quality' analysis.
A bit off topic, but I suggest having one (moderated, stickied?) thread where everyone on WebmasterWorld can enter their feature requests.
There have been so many feature requests in the past, that Google may have difficulties following them all.
I have very dynamic pages, and the MediaBot (when I last looked) does often revisit after nearly every user page view.
Also, I do return *slightly* different pages to bots than to humans, partly because (bad) bots are up to 90% of my page views and I could not afford to serve them the full "expensive" bits of the page, and I also simplify the page to help AdSense targeting (AS can get tripped up by some of the more "left-field" links that can get inserted dynamically).
Far from being an MFA, I am a NFP (not for profit!), but your proposed scheme would hurt my revenue!
PS (Which is not to knock your idea, but just to suggest how clever an implementation of it would have to be!)
1.) Something for PSA ads. Something to clarify the content of a page, so no PSAs are shown.
2.) A hint system for AdLinks, for pages where AdLinks show only the ad search box
3.) A statistic about "ads appearing in channel and date range" with the possibility to block certain ads.
I've realised that my thread title wasn't specific enough. By 'improve quality' I didn't mean 'add new features', I meant 'get rid of junk ads'.
That is, what ideas can we come up with to improve the quality of ads in the system by getting rid of junk ads.
(1) Set a standard for adsense publishers.
(2) Define clearly what is an MFA and what is not.
(1) is possible. (2) seems difficult to implement, but is possible. (3) With both Adwords and Adsense running simultaneously, this is impossible.
Without worldwide competition, G is going to do nothing and that is the truth.
Competition is the key. Competition from others like YPN and MSN is the only way G will improve.
We can keep shouting ourselves hoarse, but G will not care till market forces force them to knock out so-called pure MFAs.
|(1)Set a standard for adsense publishers. |
There already are some standards, the problem seems to be one of enforcement. For example, the policies state:
- No Google ad may be placed on any non-content-based pages
- No Google ad may be placed on pages published specifically for the purpose of showing ads, whether or not the page content is relevant.
- Site may not include:... Excessive advertising
There are plenty of MFA/junk sites that already violate these standards, yet still show Adsense.
|Also, I do return *slightly* different pages to bots than to humans |
I understand your reasons for doing this, Damon, but isn't it technically against the TOS, which say "Do not employ cloaking"? How could we tell between someone who uses cloaking for constructive purposes (such as your example, or geo-relevance) and someone who uses cloaking to circumvent the quality criteria?
I don't return significantly different content for bots/spiders cf humans/browsers, and I'm quite happy for anyone to see any version of the content (select the "lite" option for your session and you'll see the quicker/smaller pages that a bot does by default). Indeed, if the MediaBot comes in with exactly the same headers as the user that it follows, then the MediaBot will see *exactly* what the user does; for example I don't check IP addresses or UA headers other than to keep out (or at least keep in check) SPAMmers and bad bots.
This is primarily a mechanism to prevent my bandwidth and CPU being sucked dry by rogue bots which are ~90% of page requests. This is the same way I avoid inserting the keywords tag for any client other than a bot so that I can avoid wasting users' bandwidth, and get the breadcrumbs line to them faster.
The default slimmer/cheaper page that I show to bots does also seem to help the AdSense bot target better, but the core page content is unchanged.
This *is* *technically* cloaking, but it is not underhand (black hat) at all. It is simply a type of "peephole" optimisation that makes running the site possible at all in terms of bandwidth, etc. - exactly like blocking hotlinking, which I also do, since I cannot support tens of thousands of MySpace hotlinks per day either.
It happens to help with AdSense, but I hope not to be slapped down for that. Thus my original post about the sensitivity with which G would need to handle the apparently-black-and-white problem.
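For concreteness, here is a toy sketch of the kind of "lite page for bots" serving Damon describes - same core content for everyone, just with the expensive extras stripped for bots. The UA patterns and page fragments are illustrative assumptions, not his actual setup:

```python
# Illustrative bot signatures - a real site would keep a fuller list.
BOT_SIGNATURES = ("bot", "spider", "crawler", "mediapartners")

def is_bot(user_agent: str) -> bool:
    """Crude user-agent sniff; real sites often combine this with
    behavioural checks rather than trusting the UA header alone."""
    ua = user_agent.lower()
    return any(sig in ua for sig in BOT_SIGNATURES)

def render_page(core_content: str, user_agent: str,
                lite_opt_in: bool = False) -> str:
    """Serve identical core content to everyone, but skip the
    'expensive' dynamic extras for bots (or for any user who selects
    the lite option for their session)."""
    lite = lite_opt_in or is_bot(user_agent)
    # heavy dynamic widgets only go to interactive human sessions
    extras = "" if lite else "<!-- heavy dynamic widgets here -->"
    # the keywords meta tag is only useful to bots, so omitting it for
    # humans saves their bandwidth
    keywords = '<meta name="keywords" content="example">' if lite else ""
    return keywords + core_content + extras
```

The point of the sketch is that any automated cloaking check would need to treat "same core content, different trimmings" as benign.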
OK, Damon, I can see the rationale - as you suggest, any checks would have to be more fuzzy than straight comparisons, though this doesn't negate the principle of performing some form of check when an Adsense ad is displayed.
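One way such a fuzzy check could work is to compare the visible text of the stored and served pages for overall similarity rather than exact equality - here sketched with Python's standard difflib. The 0.8 threshold is an arbitrary illustration, not a tuned value:

```python
from difflib import SequenceMatcher

def substantially_same(stored_text: str, served_text: str,
                       threshold: float = 0.8) -> bool:
    """Treat two versions of a page as equivalent if their visible text
    is mostly unchanged - tolerating, e.g., a few new forum replies or
    minor localised variations, while still catching wholesale
    substitution of the content."""
    ratio = SequenceMatcher(None, stored_text, served_text).ratio()
    return ratio >= threshold
```

A real check would also need to normalise markup and handle very large pages, but the principle is a similarity score with a tolerance band instead of a straight comparison.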
I'm sure we must have a significant collective IQ in this forum. Anyone else have any other practical, workable ideas for reducing pure MFAs without hurting bona fide, added-value or original content sites?
In the end, the only option that will remain is a manual check by a Google employee.
IF Google really wanted to put a stop to MFA sites, they could eliminate them all before they even begin showing ads. The thing that must be changed - and which, as you can guess, Google should already have thought of - is a requirement to submit every domain name for review before the publisher is allowed to show ads on it. Don't you think Google could hire thousands more employees just for this task? I have no doubt.
Besides the first obvious reason why Google doesn't do this - a drop in profit and share values - the other reason is that those MFAs might actually offer some good ROI to the advertisers.
So, this is a stick with "something smells bad" on both ends. If an MFA sells my product when I advertise on it using AdWords, that's OK for me. On the other hand, I very much dislike publishing ads for MFA sites on my own sites.
There's no one solution to this problem, there can't be, and the way Google solves it is assigning varying rules and showing tolerance to different publishers. I don't blame them. This is such a big business and justice may vary depending on the situation.
I'm fairly sure Google makes good balanced decisions for the prosperity of each *group.
*Group: advertisers, web users, publishers, Google itself, shareholders.
That's what I think, and I'm not intending to break in and say you're wasting your time. Good luck in the storm.
MFAs perform another important function for Google.
With their increased CTR, they help in using up the budget of the advertisers. In many cases, they 'help' Google convert budgets to revenue.
Given this scenario, I suggest the only way we can tackle them is by continuous monitoring - pushing for a bigger filter and filtering them out ourselves.
We need to fix them at our site level, as G is hardly going to cooperate...
So let us keep requesting G for a bigger, better filter ..
Maybe someday they will listen to us and oblige us.
|that those MFAs might really offer some good ROI to the advertisers |
There are some debates over at the Adwords forum about content network quality vs. search quality. Listening to them makes me feel that people equate MFAs with the content network these days, which is very, very bad for us as publishers.
I personally doubt that MFAs do convert well, but this is just a guess. To me, they just sit between my site and the genuine advertiser. But this is another discussion that -as you probably are fully aware of- has been made many times here.
|Anyone else have any other practical, workable ideas for reducing pure MFAs without hurting bona fide, added-value or original content sites? |
I think the answer lies in the ad approval process, and critiquing the quality of landing pages. As 21 suggested, there also needs to be some sort of follow up check that determines whether a page's content has changed from what Google initially saw...(and used to determine whether the ad would be accepted.)
If you are familiar with ad affiliate programs, some businesses will allow you to become a part of their network, while others will not. Over the years I have learned that those who disallow certain sites to display their ads, often base their critique on site traffic, or site content.
Meaning, if they do not like or approve of your content, they do not allow you to display their ads.
I think this is a crucial step in limiting (or eliminating) MFAs, and one that Google should consider.