Seriously, how can a bot determine whether a user has had a good experience on a site? How can presenting duplicate content automatically mean the site is awful?
I've explained how these bots (actually algos) are built, and the process makes sense and works (although a webmaster may disagree with the results, and no algo is perfect). Sadly, when threads get hijacked, they tend to be deleted here, and that's the case with the last detailed explanation I posted, so I'm not going to post it yet again.
As for duplicate content, absolutely no question it's used as an indicator by google in search results and in adwords, and that google sees duplicate pages as less valuable. It makes sense.
Google does not want to have dozens of ads all pointing to the same content, because that adds absolutely no value to the user, and it damages google's reputation long term.
The question is: how is having ad after ad displayed on a page, all pointing to exactly the same content, valuable to the visitor, to google, or to the ad reader? I don't know about you, but as a surfer, I don't need or want that, and if that's what they serve up, I just learn to ignore the ads.
The issue isn't awful or not - it's whether it's valuable or not (adding value). It's a core business concept, offline and online.
If something doesn't add value to the user, to google, etc, it's not sustainable over time. It may work for a while, but then it won't work anymore.
Okay, tell me how you think the algo and bot work then, and we can discuss, because I will be amazed if it is anywhere near as sophisticated as you are making it out to be. Remember - just a few years ago this was a bot that could not even read more than one line of html.
Plenty of big sites, including Google, rehash content - look at Google News for instance - yet people still use it. So I don't buy that unique content thing. I see your point, but I could give you thousands of examples where sites have mashed content for the better, given it a unique angle or something that differentiates it from others.
Then it's not duplicate content, is it?
It adds value, doesn't it?
It's more than a feed that is used by 100 other sites, right?
It's not bot produced, is it?
This is on G search. I haven't tried it when clicking an adwords ad, since I try not to do that.
When I click on a search result site and come back to the search using the back button, I'm getting a little message: 'Was this site helpful?' Yes / No.
So today, for me anyway, they are measuring search result quality.
I tried it on another computer and it does not show there.
Anyone click on an adwords ad and notice this when coming back to G search?
This has been tested a lot over the last 6 months if not more... a few discussions have centered on whether or not this could work and whether it is open to abuse.
Like me clicking 'no' for all of your sites ;-) for instance.
Yes okay, but that doesn't demonstrate to me that you understand how the bot can gather and process the information required to evaluate that.
Ok, Nutshell, and for the absolute last time.
1) Gather a panel of reviewers (human)
2) Choose a sample of websites (let's say 500)
3) Have each person evaluate each website for value.
4) Classify websites into two categories - high value, low value.
That's the first step. It's more complicated than this, but that's the nutshell.
Next
5) Use discriminant analysis to identify variables from the vast data you have to determine what BEST discriminates between the two piles.
6) Optimize and weight the variables.
7) Create algo based on (6)
8) Implement and re-evaluate with human eyes to eliminate false positives and negatives.
9) Reimplement and go back to (8).
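The steps above can be sketched in code. This is a toy illustration only: the feature names (unique_text_ratio, ads_per_page, load_time), the ratings, and the crude separation score standing in for real discriminant analysis are all my own invented assumptions, not anything Google actually uses.

```python
# Toy sketch of the classifier-building process described in steps 1-9.
# Feature names and data are hypothetical.

def best_discriminator(high, low):
    """Return the feature whose class means differ most, scaled by spread.

    A crude stand-in for step 5 (discriminant analysis): find which
    variable BEST separates the human-rated "high value" pile from
    the "low value" pile.
    """
    scores = {}
    for f in high[0].keys():
        h = [site[f] for site in high]
        l = [site[f] for site in low]
        mean_h = sum(h) / len(h)
        mean_l = sum(l) / len(l)
        spread = (max(h + l) - min(h + l)) or 1.0  # avoid divide-by-zero
        scores[f] = abs(mean_h - mean_l) / spread
    return max(scores, key=scores.get), scores

# Steps 1-4: a human panel has already rated a sample of sites
# and sorted them into two piles.
high_value = [
    {"unique_text_ratio": 0.9, "ads_per_page": 2, "load_time": 1.1},
    {"unique_text_ratio": 0.9, "ads_per_page": 3, "load_time": 1.4},
]
low_value = [
    {"unique_text_ratio": 0.1, "ads_per_page": 9, "load_time": 1.2},
    {"unique_text_ratio": 0.2, "ads_per_page": 8, "load_time": 1.3},
]

# Step 5: identify what best discriminates between the two piles.
feature, scores = best_discriminator(high_value, low_value)
print(feature)  # -> unique_text_ratio
```

Steps 6-9 (weighting, building the algo, and the human re-evaluation loop) would then iterate on the weights produced here; the point of the sketch is only that the winning variable is chosen empirically, not by any logical theory of "quality".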
Google has acknowledged some of this publicly.
What's important is that the variables they use don't have to be logically related to quality. They only have to discriminate mathematically between the sites classified as valuable and not valuable.
Final example: in WW2, they wanted to choose people to become pilots who would be good pilots. They developed a paper-and-pencil test (using discriminant analysis) to do so. One question was clearly the best at predicting who would wash out and who would be a good pilot.
The question was: Do you take showers or baths?
Notice there's no logical connection between bathing habits and pilot success. It's an EMPIRICAL AND MATHEMATICAL CONNECTION THAT WORKS.
Thanks for the tip though, you got there in the end. Even though most disagree with you about adwords and selling! ;-)
[edited by: engine at 1:02 pm (utc) on Nov. 24, 2006]
[edit reason] specifics [/edit]
Ever considered that a competitor may have just raised their bids? I can't believe QS has this much of an effect on such low-traffic, low-priced keywords.
It has nothing to do with "low traffic and priced keywords" - it's all algo-controlled; the same rules apply, whatever they are.
The criteria for the Adwords Quality Score are more or less the same as the normal Google webmaster guidelines. So for me there was only one factor left to explain the difference, and that was the text in my Adwords ad.
What I did was simply put the exact text of my ads somewhere on my landing pages. Several weeks later the minimum bids dropped significantly - not to the level they were before, but at least to a reasonable amount.
So if you want my opinion of the Google Quality Score: It's garbage. To evaluate a site on the basis of a ten word Adwords Ad is just plain stupid.
If Adwords is too expensive, simply don't use it. There are alternatives.
Google's explanation of the "quality score" makes perfect sense. If an advert is relevant and has a high click-through-ratio it has a high quality score, and lower cost. This makes sense, I get sick of seeing "Buy widget on eBay" adverts for silly things, like abstract terms.
Matt
To evaluate a site on the basis of a ten word Adwords Ad is just plain stupid.
According to a google engineer quoted by ewhisper in a QS explanation session at pubcon2006, there are over 100 variables that can be (or are) used to calculate QS.
If you accept that information from a reliable source, then clearly a) it's IMPOSSIBLE to attribute QS to any SINGLE variable on its own, and b) it's also impossible to establish a firm cause-effect relationship between what you do and what happens with QS.
Interestingly, I just found that one of our test campaigns did see its QS change to some degree recently. It was paused, and we made no changes to ad text, or in fact anything. I don't conclude anything from this except that it's just those 100+ variables rocking and interacting with each other.
According to a google engineer quoted by ewhisper in a QS explanation session at pubcon2006, there are over 100 variables that can be (or are) used to calculate QS.
Of course there are. But some variables are more important than others. And if you fulfill 99 of those variables but miss one of the more important ones this can mean your minimum bids rise.
And one of those more important variables seems to be that Keywords in your Ad text must match keywords on your landing page. At least that is my conclusion since this was the only change I made.
The only other plausible explanation would be that the reevaluation may have been a coincidence and it may have had nothing to do with changing the pages at all. Maybe they have changed the Algorithm and that brought the bids down again. Who knows.
rbacal:
* It makes algos exceedingly hard to game or manipulate, which makes figuring out possible metrics very hard
* According to a google engineer ... there are over 100 variables that can be (or are) used to calculate QS.
* it's also impossible to establish a firm cause-effect relationship between what you do, and what happens with QS
Yet, we have:
rbacal: I've explained how these bots (actually algos) are built, and the process makes sense and works
QS algo is simple and stupid. Many guys here have figured it out.
Arrgh. Pricing has NOTHING to do with competition any more.
This is getting pretty entertaining when people start to claim that bids have nothing to do with things, while others claim everything google does involves a cash grab. Contradictions?
In any event, again according to ewhisper's pubcon2006 presentation, here's the formula for ad rank.
Ad Rank = (Quality Score) X (Max CPC)
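As a toy illustration of that formula (the quality scores and bids below are invented numbers, not real AdWords data):

```python
# Ad Rank = (Quality Score) x (Max CPC), per the formula quoted above.

def ad_rank(quality_score, max_cpc):
    return quality_score * max_cpc

ads = {
    "ad_a": ad_rank(quality_score=7, max_cpc=0.50),  # 3.5
    "ad_b": ad_rank(quality_score=4, max_cpc=1.00),  # 4.0
}

# A lower quality score can still outrank a higher one if the bid
# is high enough - and vice versa.
winner = max(ads, key=ads.get)
print(winner)  # -> ad_b
```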
Many guys here have figured it out.
Can you name, let's say, ten, and have them talk about how they "figured it out"? Nah, didn't think so. You can appear to have it figured out over the short term, and by coincidence, and you might be able to "figure it out" enough to tweak your OWN site, but you can't generalize.
I guess you haven't figured out that you can understand HOW an algo is created, and how it works generally, but you still can't game it?