Forum Moderators: Robert Charlton & goodroi


Influence of Brand and Usage on Google Serps

         

Nutterum

7:22 am on Apr 14, 2015 (gmt 0)

10+ Year Member Top Contributors Of The Month




System: The following 4 messages were cut out of thread at: http://www.webmasterworld.com/google/3003595.htm [webmasterworld.com] by goodroi - 7:57 am on Apr 14, 2015 (utc -5)


Brand bias is real. Not because Google made it real, but because the algorithm works in a way that makes it real. The more content you have, and the bigger you appear to be as a business, the more you will climb the SERP ladder. I think you will all agree with this notion, and there are more than enough examples to back it up. The only thing still unaffected by brand bias is "fresh content" tailored to specific stacks of long-tail seasonal keywords (think events, concerts, etc.). Search for a red widget, however, and you will see nothing but Amazon, regardless of how fresh, unique or superior your product or content is.

Machine learning has the fundamental flaw of using mathematics and the inherent "entropy laws" associated with it, where the big become bigger and the small smaller.

Just my 2c on the topic.

RedBar

9:42 am on Apr 14, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



The more content you have and the bigger you as a business appear to be, the more you will climb the SERP ladder.


I couldn't disagree with that more... unless one is an American company. In my widget sector there is a definite bias toward US companies; outside of the US, G seems to prefer spammy, scraped, keyword-stuffed sites.

Kratos

10:36 am on Apr 14, 2015 (gmt 0)

10+ Year Member Top Contributors Of The Month



One thing I have seen correlated with SERP success is whether people are searching for the actual brand, or for "brand" + keyword.

Some of the biggest increases in SERP coverage that I have seen outside of major updates came when a business had its name searched in Google. For example, harrysfastplumbing.com, with people searching for "harrys fast plumbing phone number" or "harrys plumbing fast price" (without the quotes).

I kept track of a couple of businesses over a year and noticed that the more people searched for the brand, the more the site appeared for strong keywords. Now, this could simply be that the business became more popular and thus got more links (hence the increase in SERPs), but it could just as well be Google trusting the site more as more people look for it directly. It's probably both variables driving the correlation, but I cannot confirm it, and I don't place much focus on it because you need a serious advertising campaign to build that amount of brand awareness.

And no, using automated bots to search for keyword-plus-business-name queries and click on the SERPs doesn't work; I already tried it :-) (Although it could have worked if I had been more thorough with the bot, the benefit is minimal, and it would be easy for Google to detect this manipulation since you're on their turf, after all.)

toidi

12:46 pm on Apr 14, 2015 (gmt 0)

10+ Year Member Top Contributors Of The Month



The more content you have and the bigger you as a business appear to be, the more you will climb the SERP ladder.

I don't know. I am up against some pretty heavy hitters (my main site included), but I still manage to beat them for first-page placement with 5 sites that each have fewer than 10 pages, next to zero content, fewer than 5 links, and iframes that keep viewers on a page for very long periods of time.

adder

2:01 pm on Apr 14, 2015 (gmt 0)

10+ Year Member Top Contributors Of The Month



The more content you have and the bigger you as a business appear to be, the more you will climb the SERP ladder.

Yep, you're right, but when it comes to branding signals, it's not so much about the content. I think the main reason small brands are struggling is volume: "Dell" gets 673,000 monthly searches with 1,270,000,000 pages returned; "Amazon" gets 55,600,000 monthly searches with 1,360,000,000 pages returned. Now beat that! And some small brands are killing it by making sure they understand their market: they're creating content that resonates with the right visitors, targeting the keywords that bring sales, etc.

Search for a red widget however and you will see nothing but amazon, regardless of how fresh, unique or superior your product or content is.

Yeah, but Amazon hasn't got much valuable content, so it contradicts your first statement :)

rish3

3:05 pm on Apr 14, 2015 (gmt 0)

10+ Year Member Top Contributors Of The Month



I've seen some oddness that would hint at some other factor.

An example:

The employee/company store for Berkshire Hathaway ranks well for several generic ecommerce terms for clothing. Berkshire's main website is laughably amateurish and small. The employee/company store is more modern, but still small, and the only inbound links to it are from Berkshire Hathaway's main site.

This employee/company store is not attempting to sell to the general public. It allows it, but that's not its purpose, and the prices/merchandise aren't competitive either. I would imagine a very low dwell time for regular people who land there.

Edit: Also...what was the Vince update correcting, if brand bias is just a side effect of size, links, etc?

EditorialGuy

3:55 pm on Apr 14, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



The more content you have and the bigger you as a business appear to be, the more you will climb the SERP ladder. I think you all agree with this notion and there are more than enough examples to back it up.


That may be true to some degree, but I think it's less true than it was a year ago.

Early in 2014, Matt Cutts said that Google was working on an algorithm change that would benefit sites or authors with authority on specific subjects, and there's some evidence that this change made its way into the May, 2014 Panda 4.0 update.

For the informational queries that I watch, a number of megasites have become noticeably less dominant than they were in the past.

Nutterum

7:27 am on Apr 20, 2015 (gmt 0)

10+ Year Member Top Contributors Of The Month



My point is that if you grow as a business, you are bound to perform better in the SERP results. Of course, if you mess things up as a webmaster, things can go south very fast, but as long as you have a decent team, you can maintain a big presence at a fraction of the effort small businesses make to climb to, and keep, a good position. The myth that big businesses spend millions on SEO, content creation and marketing is just that: a myth. In reality, I personally know many SEO professionals with small teams of 1-4 people managing huge corporate websites and doing a good job of it. Hell, I personally witnessed a company, recently sold for over USD 400 million with a website of over 15,000 pages, that had only _one_ SEO specialist (granted, the marketing team was around 20 people), and they owned the SERPs so hard that, if they could, their company logo would have appeared in place of the Google logo.

Can small guys rank? Yes! Is there a brand bias created by the algorithm, be it intentional or a consequence of the way it works? Yes!

RedBar

9:56 am on Apr 20, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



My point is that if you grow as business you are bound to perform better on the SERP results.


And once again I have to totally disagree. I have absolutely no idea which sector you are in; it certainly does not work like that in mine.

Nutterum

8:48 am on Apr 23, 2015 (gmt 0)

10+ Year Member Top Contributors Of The Month



The examples I gave of big companies doing next to nothing in terms of SEO are all in the IT sector, from custom CMSes to software downloads. The companies these SEO guys handle are medium-sized to say the least (500+ employees, several offices worldwide, etc.), yet the only things these companies spend budget on (and I am talking thousands, not millions) are marketing and visual design. The SEO and webmaster part of the deal is handled by one guy. Their products are good enough to get backlinks very naturally, and their admittedly very good website design helps quite a bit, but in the end there are competing companies fighting tooth and nail to get a fraction of the SERP exposure, while these leading companies enjoy a free ride in the first spot of Google for many high-volume keywords.

martinibuster

12:29 pm on Apr 23, 2015 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



Brand is simply name recognition and goodwill that allows you to charge more for a product or service. It is not a quality that any search engine looks for. I've read hundreds and hundreds of research papers and not a single one mentions brand or a preference for brand.

It's all fairly transparent. What they're looking for is authority and relevance. Then there is the flip side to that coin, what they're not looking for. What they're not looking for are pages that were designed with search queries in mind.

Further, it's no longer about ranking for search queries but about what the user expects to see, which can change. Here's an example: SERPs can change according to data gathered from the way users interact with them.

If you do not have a solid understanding of what implicit feedback is and how that affects your ranking, then move your chair to the front of the class and pay close attention because there's a world of SEO that I believe the industry does not yet fully comprehend. I think it's this gap in knowledge that partially explains the fallback to brand bias as a reason for the SERPs. Again, there are no scientific research papers that focus on brand as a quality. It is always about relevance to the user. The only instances where anything remotely close to brand is mentioned is in the context of when a user expects to see a specific site in the SERPs, like when someone types "WMW google update" in Google and expects to see WebmasterWorld in the SERPs. Try it.

Here's another example: certain queries (that aren't even for local services) return different results depending on which part of the country they originate from. But it's not limited to IP. There are many other factors that can subtly personalize the SERPs, changing them according to factors unique to you.

Beyond that, I will assert that Google has killed on-page SEO. Almost everything the industry understands as "on-page SEO" is not only inconsequential but now self-defeating. All the talk about "brand bias" is perceiving a tree trunk when all along it was an elephant's legs you were touching. Your challenge, in order to survive, is to figure out what has replaced on-page SEO. Can you guess?

[edited by: martinibuster at 1:06 pm (utc) on Apr 23, 2015]

Nutterum

1:04 pm on Apr 23, 2015 (gmt 0)

10+ Year Member Top Contributors Of The Month



I easily can: good-quality backlinks coming from good-quality products or content. They are hard, very hard, to obtain, but for some landing pages of the main website I work for, I have seen the vast difference between having those links and not having them.

My original point, however, got badly derailed. What I was saying initially is that it's a self-reinforcing cycle: you make good content -> you get backlinks -> your SERP position increases -> you get relevant traffic -> that traffic links to you because of the relevancy Google has established -> repeat until you reach the top of the SERPs.

I am not saying brands have it easy because they are brands. I am saying that the Google algo works in such a way that, once it recognises you as a brand, or an authority, or whatever else you want to call it, you are given a free ride in the first positions for the keywords you rank for, never mind that your UI/UX is terrible, or you have ads above the fold, or you have many thin pages. You will still rank well from that point onward simply because the algo works like that, until other great content comes along to push you back down.

martinibuster

4:05 pm on Apr 23, 2015 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



you are given a free ride


Kind of. But kind of not. The SERPs are not a listing of sites ranked 1 - 10 according to links. Links play a role, yes. But links only get you past the velvet rope. There is still more scrutiny before you get past the door and into the SERPs.

It's not like your page scored 50 points and your competitor scored 50 points and you're both tied for first position in the SERPs. There are no ties. There are no link counts versus link counts. No research paper ever discusses the SERPs in terms of ties that need breaking or winners based on just links. Current research for the past ten years has focused on content, understanding content. That's partially what Panda is about, quality, the ability for a machine to see your site with the same critical eyes a quality rater does.

Ever have a tough time cracking a position in the top ten or top five? Ever bounce up, enjoy traffic then the plug gets pulled? That could be implicit feedback working against you. This is what I'm talking about when I say that Google and Bing have pretty much killed on-page SEO.

Machines are quality raters now. Google and Bing do not depend on human quality raters. Did you know that? Human quality raters make conflicting judgments. Machines do not. Human quality raters cannot scale. Machines can scale.

The SERPs are not the best links ordered 1 - 10. Implicit and explicit feedback metrics are used for ranking and re-ranking, for quality control of the algorithm and to help determine what users in aggregate want to see in the SERPs.

EditorialGuy

4:45 pm on Apr 23, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



... and to help determine what users in aggregate want to see in the SERPs.

Different users--whether individually or as subgroups--have different tastes, preferences, and needs. How do you see personalization fitting into this "implicit and explicit feedback metrics being used for ranking [etc.]" scenario (today or in the future)?

martinibuster

4:58 pm on Apr 23, 2015 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



Of course, you're correct. There is no one single algorithm that is scoring sites with points against a checklist of ranking factors.

Search queries are treated differently according to user intent. What I meant by "in aggregate" is a group of users making the same query. It might not be what the individual user wants, which is why the lower SERP positions may look different from what's in the upper positions. Unless the user is signed in, we're not at the point yet where the engines can strictly personalize the SERPs. So by necessity it must be done in aggregate, to satisfy the intent of most people.

rish3

5:09 pm on Apr 23, 2015 (gmt 0)

10+ Year Member Top Contributors Of The Month



Machines are quality raters now. Google and Bing do not depend on human quality raters. Did you know that? Human quality raters make conflicting judgments. Machines do not. Human quality raters cannot scale. Machines can scale.


I was under the impression that human quality raters were still the source for bucketizing a "sample set" of pages into "good", "better", "best", "spam", etc. buckets, using factors that currently only a human could reliably assess (like "would you trust this website with your credit card info").

Then the machines take the result of that work and use a completely different set of signals and factors to find criteria that correlate across sites in (or not in) specific buckets. Once that's done, they apply that model to all websites.
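That two-stage process (raters label a small sample, then a model generalizes those judgments from crawlable signals) can be sketched in a few lines. Everything here is invented for illustration: the buckets, the signal vectors, and the nearest-centroid rule are stand-ins, not anything the engines have disclosed.

```python
def train_centroids(rated_sites):
    """Average the feature vectors within each human-assigned bucket."""
    sums, counts = {}, {}
    for features, bucket in rated_sites:
        acc = sums.setdefault(bucket, [0.0] * len(features))
        for i, value in enumerate(features):
            acc[i] += value
        counts[bucket] = counts.get(bucket, 0) + 1
    return {b: [v / counts[b] for v in acc] for b, acc in sums.items()}

def classify(features, centroids):
    """Assign the bucket whose centroid is closest (nearest-centroid rule)."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda b: dist(features, centroids[b]))

# Hypothetical rater sample: (signal vector, bucket). The signals could be
# anything a crawler can see -- HTTPS, contact page, ad density -- all stand-ins.
sample = [
    ([1.0, 1.0, 0.1], "high_trust"),
    ([1.0, 0.9, 0.2], "high_trust"),
    ([0.0, 0.1, 0.9], "spam"),
    ([0.1, 0.0, 0.8], "spam"),
]
centroids = train_centroids(sample)
print(classify([0.9, 0.8, 0.2], centroids))  # an unrated site -> high_trust
```

The point of the sketch: once the centroids exist, the raters are out of the loop, and every site on the web can be bucketized by machine.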

Are you saying it's not working that way (roughly) any more?

martinibuster

6:19 pm on Apr 23, 2015 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



Are you saying it's not working that way (roughly) any more?


That's exactly what I'm saying. Not just Google but Bing, too. We're talking about many things here, not just ranking algorithms but also quality algorithms like Panda. It started out with human quality raters but their judgments were used as a basis for creating a scaled approach. That's what Matt was talking about in 2011 in that famous Wired Magazine article: [wired.com]

...you look for signals that recreate that same intuition, that same experience that you have as an engineer and that users have....we actually came up with a classifier to say, okay, IRS or Wikipedia or New York Times is over on this side, and the low-quality sites are over on this side. And you can really see mathematical reasons...


The voodoo SEO crowd saw that as a smoking gun admission that Google prefers brands. But the truth is that Google was doing what is described in numerous scientific research papers, feeding a machine the data points that matched what the quality raters were seeing so that the judgments of the quality raters could be reproduced, scaled up.

Those who continue to promote the idea that Google prefers brands do a disservice to themselves and to those who listen to them. They are willfully keeping themselves in the dark. I'm working to get more of this information out to the community and hopefully one day the community will look back at the "Google prefers brands" phase of SEO that we are in today and rightly consider it the Dark Ages of SEO.

Here is an interesting paper from 2010 [research.microsoft.com] that can serve as an introduction to this kind of machine learning. This isn't THE algorithm. It's just an introduction, like drawing the shades open and letting a little light into the room. There are scores more papers on this topic that will take you out of the room into a larger world of science behind the rankings.

A new trend has recently arisen in document retrieval, particularly in web search, that is, to employ machine learning techniques to automatically construct the ranking model... In web search engines, a large amount of search log data, such as click through data, is accumulated. This makes it possible to derive training data from search log data and automatically create the ranking model. In fact, learning to rank has become one of the key technologies for modern web search.
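To make the quoted passage concrete, here is a toy version of how click logs can become training data for learning to rank, using the well-known heuristic that a click below unclicked results implies a relative preference. This is only an illustration of the general idea, not any engine's actual pipeline; the domains are made up.

```python
def preference_pairs(ranked_urls, clicked):
    """Derive (preferred, skipped) pairs from one SERP impression: each
    clicked result is preferred over every unclicked result ranked above it."""
    pairs = []
    for pos, url in enumerate(ranked_urls):
        if url in clicked:
            for above in ranked_urls[:pos]:
                if above not in clicked:
                    pairs.append((url, above))
    return pairs

serp = ["brand.example", "blog.example", "smb.example", "spam.example"]
clicks = {"smb.example"}  # the user skipped the top two and clicked the third
print(preference_pairs(serp, clicks))
# [('smb.example', 'brand.example'), ('smb.example', 'blog.example')]
```

Aggregated over millions of impressions, pairs like these are exactly the kind of training data the quote says can be derived from search logs.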

rish3

8:18 pm on Apr 23, 2015 (gmt 0)

10+ Year Member Top Contributors Of The Month



Those who continue to promote the idea that Google prefers brands do a disservice to themselves and to those who listen to them


That's a bit over the top to me. When you consider both the main algorithm and the pieces that work outside it, there's plenty of opportunity for either intentional or unintentional brand bias.

If you mean to say that you don't believe there's a simplistic white list of sites, I think that's reasonable.

There's certainly brand bias culturally. For example, there are many documented cases of companies with connections getting manual penalties lifted in a few days, weeks, or months. That absolutely doesn't happen with SMB websites, ever.

martinibuster

9:36 pm on Apr 23, 2015 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



For example, there are many documented cases of companies with connections getting manual penalties lifted...


Now, now... Ahem... I'm trying to be polite... But I have heard that one before. That's a rhetorical trick. I'll give you the benefit of the doubt that you're simply repeating what someone else wrote. Here is the rebuttal to your example: I am talking about apples when discussing a ranking bias. When you talk about connections and the lifting of penalties, you are talking about oranges.

I am referencing the concept that Google has a ranking bias that prefers brands. That has nothing to do with penalties getting lifted.

rish3

9:53 pm on Apr 23, 2015 (gmt 0)

10+ Year Member Top Contributors Of The Month



That's a rhetorical trick.

No, it's not. A culture that shows bias in one area is more likely to be exhibiting it in others than a company that avoids it across the board.

There was no attempt to trick anyone, or imply that they were related technically. I started with "There's certainly brand bias culturally". How's that a "trick" ?

I'll give you the benefit of the doubt that you're simply repeating what someone else wrote.

That's just flat out condescending. You can take that crap elsewhere.

rish3

10:03 pm on Apr 23, 2015 (gmt 0)

10+ Year Member Top Contributors Of The Month



Anyhow, the truth of the matter is that, lacking inside information, nobody is going to prove or disprove brand bias.

Even with inside information, machine learning can exhibit unintentional bias. It's also notoriously difficult, with some machine learning algorithms, to reverse engineer exactly why they decide things the way they do.

I can think of lots of attributes that would be more common on a brand-run website (as opposed to a typical SMB website) that a crawler could see. Many of them would not be tied to quality, relevance, etc, directly.

Edit: Also, setting aside the technical and going back to culture: the leaked FTC report clearly showed deliberate tweaking of criteria and/or weighting for purposes other than "quality".

To me, it's not hard to imagine a similar scenario, where someone in Google's leadership specifically asked to have some criteria and/or weighting tweaked. Perhaps, for example, to bury spam...but knowing full well the change would have a high false-positive effect on smaller, but quality, sites.

martinibuster

10:25 pm on Apr 23, 2015 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



doesn't happen with SMB websites, ever.


It happens to SMBs who hire Virante, InternetMarketingNinjas, and BruceClay. SMBs have recourse to quality representation. All three of those companies are skilled and have years of experience serving SMBs. If your car is broken, bring it to a professional if you want it fixed right. If an SMB's site is penalized they also hire professionals who help them out.

In any case, whether SMBs have recourse to professional representation has nothing to do with the Voodoo SEO contention that Google has a ranking bias that favors brands. Two different topics. ;)

And I wrote an article in SEMPost that completely demolishes the idea, starting with every quote Voodoo SEOs have cited from Matt Cutts to Schmidt. The contentions are built on misrepresentations.

rish3

11:06 pm on Apr 23, 2015 (gmt 0)

10+ Year Member Top Contributors Of The Month



It happens to SMBs who hire Virante, InternetMarketingNinjas, and BruceClay...

So, a small company can sometimes buy influence.
Two different topics.

Zooming out, I think company culture is at least as strong an indicator as anything technical of whether they would be open (or blind) to unwanted bias.
You don't seem to think so. So, we disagree.
I wrote an article in SEMPost that completely demolishes the idea.

Well, for example, you provided the full context of what Schmidt said about brands. I don't personally think that demolishes the idea. It shows he wasn't specifically talking about the algorithm, but to me it does nothing to change the idea that he feels brands are how you "sort out the cesspool". How can you read that in any way other than "brands == quality"?

A couple of questions for you...

a) Do you feel the leaked FTC documents show a clear manipulation of the algorithm for purposes other than presenting the best results to the end searcher?

b) What do you think the Vince update was? What perceived issue was it fixing?

aristotle

11:08 pm on Apr 23, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



martinibuster --
I think many of the complaints about bias stem from the observation that near-worthless pages from big brands often rank higher in Google's results than pages from small sites that have lots of valuable and useful content. Naturally people wonder why this happens, and the obvious explanation is an intentional bias in favor of big brands. So that's where the belief comes from.

EditorialGuy

11:21 pm on Apr 23, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



To me, it does nothing to change the idea that he feels that brands are how you "sort out the cesspool". How can you read that in any way other than "brands == quality"?


The "cesspool" quote needs to be understood in context. He was speaking to a group of magazine publishers, so his comment simply confirmed a reality that most of us would readily acknowledge: When people are confronted by trillions of pages of crap (and let's face it: Most of the content on the Web *is* crap), they're going to prefer publishers they trust (whether big, small, or in between) to publishers they've never heard of.

(Note that I said "publishers." Contrary to common belief, Schmidt wasn't talking about e-commerce sites. He was talking specifically about Web content.)

Nutterum

6:32 am on Apr 24, 2015 (gmt 0)

10+ Year Member Top Contributors Of The Month



OK, let's agree there is no brand bias in the way it is preached by the flavor-of-the-month SEO wannabe personas. And let us agree that their rants about how Google is playing against small businesses to increase revenue are in actuality an inability of many old-school SEO "professionals" to adapt to the current state of Google search ranking.

But then how do you explain the plethora of search results showing the same domain in the first 8 slots? I see this behaviour of the algorithm more and more, and to me as a user it is in no way showing relevant results. Even if I check the same query via a machine checker, it shows the same results. I can't accept the notion that there is no other relevant content eligible to be shown to the user, so something else must be in play.

I want to underscore that I too believe a whitelist or intentional preference is out of the question. What I argue is that there is an unintentional one. Call it an artifact of machine learning if you will. Whatever the case, it is there and we can all see it. So my question is: what do you think is causing this, and how can we battle it?

rish3

5:20 pm on Apr 24, 2015 (gmt 0)

10+ Year Member Top Contributors Of The Month



Here's a made up example:

Suppose the algorithm is fed all of the HTML markup from a bunch of sites in a bucket marked "high trust" as something it should analyse and later use to bucketize the web at large. As it happens, a very large percentage of the sites in the bucket are recognizable brands. The raters weren't told to pick recognizable brands; it's an unconscious bias on their part.

Also, as it happens, those sites have a lot of DoubleClick markup in them, specifically "DFP premium" markup, a product only available to well-heeled, branded publishers.

Might not that, combined with other similar attributes, lead to brand bias? Further, if there's a knob you can turn to increase the importance of certain criteria (perhaps not at this level of granularity)...
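For what it's worth, the mechanism described above is easy to demonstrate on paper. In this toy sketch (all data invented), a feature that has nothing to do with quality, a hypothetical "premium ad markup" flag, comes out looking like the strongest trust signal simply because the rater-chosen sample skews toward sites that carry it:

```python
def learned_weight(labeled, feature):
    """Naive weight for a feature: how much more often it appears on
    'high_trust' sample sites than on the rest of the sample."""
    trusted = [s for s, label in labeled if label == "high_trust"]
    others = [s for s, label in labeled if label != "high_trust"]
    rate = lambda group: sum(feature in s for s in group) / len(group)
    return rate(trusted) - rate(others)

# The rater sample skews toward big brands, which (hypothetically) all carry
# premium ad-server markup; the small quality sites mostly don't.
sample = [
    ({"premium_ad_markup", "https"}, "high_trust"),
    ({"premium_ad_markup", "https"}, "high_trust"),
    ({"premium_ad_markup"}, "high_trust"),
    ({"https"}, "low_trust"),
    (set(), "low_trust"),
]
print(learned_weight(sample, "premium_ad_markup"))  # 1.0 -> looks like a strong signal
print(learned_weight(sample, "https"))              # much weaker, despite being quality-relevant
```

Nothing in the learner is "biased"; the bias rides in on the sample and on a feature that merely correlates with being a brand.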

Nutterum

1:51 pm on Apr 29, 2015 (gmt 0)

10+ Year Member Top Contributors Of The Month



Well @rish3, this is still relevant to my notion of an "artifact of machine learning". It's not that Google wants brand bias; it's the way machine learning interacts with certain sites that creates one, unwillingly. Of course there are probably many examples where this statement can't hold true, but if you step back and look at each vertical as a whole, it is there, whether you like it or not.

It is no longer about competition; it is about a SERP leader and the underdogs fighting for the leftovers. And sometimes some of the underdogs get a piece from the table.

martinibuster

3:06 pm on Apr 29, 2015 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



We can construct examples with variables of our choosing and make them come out however we want. They are fictitious; whatever we put into them will come out the way we engineered it. Let's instead look at this with whatever knowledge we possess of information-retrieval algorithms. I think Nutterum is onto something:

...this is still relevant to my notion of "artifact of machine learning" . It`s not that Google wants brand bias - its the way machine learning interacts with certain sites to create one, unwillingly.


Good! Let's review how machine learning works, using the research information that is publicly available, and try to stay awake at the same time! :)

The thing about machine learning is that the machines are testing the SERPs for user satisfaction. From the very beginning, which as far as we know was about 2004 (GoogleGuy confirmed that here on WebmasterWorld), CTR and other data were used for "quality control." When we discuss machine learning, a large part of it is discussing user satisfaction. So when they use CTR data, time on page, and other metrics to evaluate the SERPs, the machines are identifying user dissatisfaction and correcting the algo so that those kinds of sites don't pop up. Secondly, they are identifying instances of user satisfaction, which may result in re-ranking to promote lower-ranking sites with similar content. I've been studying the research papers on machine learning, SERP re-ranking and the metrics used for those purposes, and the focus has always been on user satisfaction.
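The satisfaction loop just described can be sketched like this. The blending formula and all numbers are invented; it's only meant to show how observed clicks could re-rank a result past one with a higher base score:

```python
def rerank(results, base_scores, clicks, impressions, weight=0.5):
    """Blend each result's original ranking score with its observed CTR
    (clicks / impressions), then re-order by the blended score."""
    def ctr(url):
        return clicks.get(url, 0) / impressions if impressions else 0.0
    blended = {u: (1 - weight) * base_scores[u] + weight * ctr(u) for u in results}
    return sorted(results, key=lambda u: blended[u], reverse=True)

# Hypothetical SERP: the brand site outranks the SMB on base score alone,
# but users click the SMB result far more often.
results = ["big-brand.example", "smb.example"]
scores = {"big-brand.example": 0.9, "smb.example": 0.7}
clicks = {"smb.example": 620, "big-brand.example": 180}
print(rerank(results, scores, clicks, impressions=1000))
# ['smb.example', 'big-brand.example']
```

Note the implication for the brand-bias debate: in a scheme like this, sustained user dissatisfaction with a "brand" result actively works against it, regardless of whatever got it ranked in the first place.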

Research on this has been ongoing for many years, far longer than we have been discussing it on these boards. It's not well understood, but I welcome the opportunity to share the little I know and introduce these concepts. For example, from 2006: Improving Web Search Ranking by Incorporating User Behavior Information [research-srv.microsoft.com].

We show that incorporating user behavior data can significantly improve ordering of top results in real web search setting. We examine alternatives for incorporating feedback into the ranking process and explore the contributions of user feedback compared to other common web search features. We report results of a large scale evaluation over 3,000 queries and 12 million user interactions with a popular web search engine. We show that incorporating implicit feedback can augment other features, improving the accuracy of a competitive web search ranking algorithms by as much as 31% relative to the original performance.


What were we discussing in 2006? Reciprocal linking? That paper from 2006 shows how user satisfaction metrics like CTR are used to improve "the accuracy of a competitive web search ranking algorithms..."

Machine learning is not about homogenizing the SERPs with brand type sites. To the contrary, it's the opposite. Search engineers create classifiers that introduce a certain homogeneity (call it "brandness" if you wish) but the machine learning algorithms kick in to measure user satisfaction and they will actually suggest dismantling ranking factors if they are leading to user dissatisfaction. So even if you build in a bias, if the bias results in user dissatisfaction then those biases that caused the dissatisfaction are revised or removed.

What you see in the SERPs can be said to be what users have generally voted up the SERPs with their clicks. Machine learning algorithms, I think, can be seen as data mining: they sift through billions of user interactions and construct a suggestion of what users themselves mean when they search for Jaguar, Prince Nymph, or Cheap Travel to Vegas.

rish3

5:19 pm on Apr 29, 2015 (gmt 0)

10+ Year Member Top Contributors Of The Month



Machine learning is not about homogenizing the SERPs with brand type sites. To the contrary, it's the opposite.

Well, to be fair, machine learning is about whatever the implementer wants it to be. While I believe that on the whole Google wants the SERPs to reflect user preference, there are real examples (from the FTC report, for example) where they deliberately tuned to devalue something they knew users liked.

but the machine learning algorithms kick in to measure user satisfaction and they will actually suggest dismantling ranking factors if they are leading to user dissatisfaction

In a perfect world, where the intent is pure, and the implementation is without flaw, sure.

What you see in the SERPs can be said to be what users have generally voted up the SERPs with their clicks.

I do believe that clicks are a factor, but I'm not convinced that on its own the click signal is always more powerful than other signals, which would have to be true for the above to be true. There's also a complex interplay between click-through rate and current rank.

At a high level, you seem to be arguing that there's no brand bias because it wouldn't be good for users, and that Google would never intentionally do something that wasn't good for users; or that they are so good at what they do that they would never let unintentional bias affect their product for an extended period of time. Forgetting the FTC, what they've done to the organics with aggressive ad placement, reducing from 10 to 9, to 8, to in some cases 7 organic results, etc., seems at odds with that theory.