Google versus Bing - from someone who watches very closely

     
11:41 pm on Aug 16, 2009 (gmt 0)

Senior Member

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:May 26, 2000
posts:37301
votes: 0


There's a very interesting blog article comparing Google results to Bing results. We normally don't link to blog articles here, but this isn't just any old blog. It belongs to Tom Costello, CEO and founder of the Cuill search engine -- and husband to former Google engineer Anna Lynn Patterson (she's the one on Google's phrase-based indexing patents).

So this is a knowledgeable commentary from someone who sees a lot more data than most of us can even dream about. I've extracted four observations out of many, many more.

  • Bing had 2.9% spam, Google had 2.56% spam, while Yahoo had 4.9%
  • Bing prefers URL matches more
  • Bing seems to prefer pages where the term occurs with its first letter capitalized
  • Bing does less term-rewriting than Google.

Tom's Blog [cuil.com]

It's that last observation above that caught my eye the most. If Google is going to lose ground to Bing/Yahoo it will be in this area -- too much giving you what they THINK you mean instead of what you actually typed. We see related comments [webmasterworld.com] here quite often in recent times.

12:10 am on Aug 17, 2009 (gmt 0)

Senior Member

WebmasterWorld Senior Member 5+ Year Member Top Contributors Of The Month

joined:Aug 5, 2009
posts:1232
votes: 136


Incredible find. I appreciate you posting that. I won't get into what the "spam" definition is; that's another story altogether.

There are two things I like from the four items. Bing prefers URL matches more. This is huge, actually. I would predict that in the future, if they keep doing this, you will see people putting up websites and removing those domains from Google altogether. I agree with your point about term-rewriting. Rewriting completely ignores the work that webmasters put into their sites. I've seen "cheap" and "buy" removed from my 2 or 3 term search phrases in Google recently. Another example of Google playing God. I certainly hope a competitor like Bing will change this apparent philosophy. Time will tell.

3:08 am on Aug 17, 2009 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Feb 27, 2003
posts:844
votes: 0


too much giving you what they THINK you mean instead of what you actually typed.

This has actually been my biggest problem with Google lately. There is no way a room full of programmers and PhD types knows what I'm thinking other than the "exact phrase" I typed in at the time I'm searching.

I think in the past people were not as savvy about what to type, but today users know exactly what to type to find what they want, and they don't want some automated algo telling them, "no, you don't want that, try this." This is indeed where they will lose ground to Bing. It's already happening, and while I know that national reporting metrics say it's not happening a lot, in my circles of people and clients it's actually happening a lot more than the trends say. I would guess it's true outside my circles as well.

There are those that will remain die hard fans of Google even if they presented a page full of spam. But those that are not die hard fans and use it out of habit will try something new if what they are used to is not giving them what they are looking for anymore.

I would love to see an eventual 50/50 split for Bing/Google in searches and I think that day is going to be sooner than most would think.

This goes back to: Google has overcooked the sauce, and it's starting to spoil.

3:22 am on Aug 17, 2009 (gmt 0)

Senior Member

WebmasterWorld Senior Member 5+ Year Member Top Contributors Of The Month

joined:Aug 29, 2006
posts:1312
votes: 0


"exact phrase"

If you put your "exact phrase" in quotes Google returns results for the "exact phrase".

In my experience, other searches tend to be either keyword based or borderline literate.

In both these cases Google's approach seems very effective to me.

...

4:05 am on Aug 17, 2009 (gmt 0)

Senior Member

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:May 26, 2000
posts:37301
votes: 0


Here's one counterintuitive result from Google that I see pretty often these days. I type in a query phrase and notice that the results aren't finely targeted enough to zero in on what I'm looking for. So I add a word or two to the query phrase. Historically that would mean the total number of results should be fewer, right? But now, the total number of results might go UP, because Google's programming has taken on such a fuzzy quality.

Another of Tom Costello's observations might be in play here: "Bing is weaker than Google where proximity is important." In recent weeks, if you want proximity on Google, you darned well better use those quote marks.

As far as keyword-in-URL goes (and from the examples given, I think he means keyword-in-domain), I've just about had enough of that easy crutch in the algorithm, both for Google and for any other search engine that's still taking this cheap path.

Keyword domains probably do draw the click more often when they appear in the SERP - but that doesn't mean they are the better result. It just means the eye naturally tracks to those bolded letters, and that factor alone may be distorting the user data for Google and the other search engines.

I call reliance on the keyword domain as a strong factor a "crutch" because that's how I see it. Yes, you can be pretty sure about basic relevance. But beyond that, every page needs to "prove itself." Some definitely do - and many really don't.

That's why I say it's time for some better programming work from all the search engineers in this area.

8:32 am on Aug 17, 2009 (gmt 0)

Senior Member

WebmasterWorld Senior Member beedeedubbleu is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:Feb 3, 2004
posts: 6099
votes: 6


...but this isn't just any old blog. It belongs to Tom Costello, CEO and founder of the Cuill search engine

Cuill? Is that supposed to add authority to this thread Ted? ;)

Actually, the problem with search engine results comparisons now (as it always has been) is that we can never remove subjectivity.

12:29 pm on Aug 17, 2009 (gmt 0)

Preferred Member

5+ Year Member

joined:Dec 19, 2007
posts:404
votes: 0


There are tolerances and parameters that can be attached to subjectivity, allowing for a range where it becomes right or wrong. If I search for apples and all the results are about bricks, it is clearly wrong even to a simple eyes-only observation. It's outside the range of blurred subjectivity. Therefore, yes, mathematically you can adjust for subjectivity.

3:27 pm on Aug 17, 2009 (gmt 0)

Senior Member

joined:July 3, 2008
posts:1553
votes: 0


IMHO, the differences between Google, Yahoo, and Bing results are pretty subtle, so the few users who switch from one to another are likely to be doing it for reasons other than minuscule differences in spam percentages or whether the search engine tries to deduce whether "free people" is about democracy or an e-commerce offer.

In the real world, most people aren't obsessive about comparing product details. If the product works reasonably well, is easy to use, and fits the user's self-image, the user isn't likely to have a compelling reason to change.

4:00 pm on Aug 17, 2009 (gmt 0)

Senior Member

WebmasterWorld Senior Member 5+ Year Member

joined:May 6, 2008
posts:2011
votes: 0


It belongs to Tom Costello, CEO and founder of the Cuill search engine

Yeah, he's a real winner. Wonder how much money he lost investors on that massive failure of a project. The fact you didn't even spell Cuil right shows just how awful it all is (name, results, presentation, image matching, everything).

4:19 pm on Aug 17, 2009 (gmt 0)

Senior Member from GB 

WebmasterWorld Senior Member 5+ Year Member

joined:Aug 11, 2008
posts:1269
votes: 44


In fairness, building an SE isn't the same as being able to crunch numbers. I dare say Quool's failure to create a brand, present (or even build) results, or any other executive problem bears no relation to the technical abilities of its creator.

Indeed, I find most outstanding "presenters" to be lacking on the technical side, and your outstanding "technicians" unable to present. Used car salesmen aren't mechanics.

And Kuil's creator guy, as tedster says, will have (or have had) access to quantities and quality of data of which we can barely conceive.

Anyway, he probably thought ripping off mis-spelt words (just like Google did with "googol") was pretty khul. No?

4:44 pm on Aug 17, 2009 (gmt 0)

Senior Member

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:May 26, 2000
posts:37301
votes: 0


cuill.com does redirect to cuil.com, by the way - however they use a 302!
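
For anyone who wants to check that kind of redirect themselves, here is a minimal sketch (mine, not from the thread; the old domain may no longer answer) that reads the status code and Location header without following the redirect:

```python
# Minimal sketch: inspect a redirect's status code (301 vs. 302) and its
# target without following it. Standard library only; the domain is just
# the one mentioned above and may no longer resolve.
import http.client

conn = http.client.HTTPConnection("cuill.com", timeout=10)
conn.request("HEAD", "/")
resp = conn.getresponse()
print(resp.status, resp.getheader("Location"))  # e.g. 302 http://www.cuil.com/
conn.close()
```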

I also liked this observation from the article:

First the bad news for Bing. It overlaps Google too much. On our test queries it overlapped Google 29% of the time, more than Yahoo (25%)

The point being, if Bing wants to grab market share, they've got to differentiate themselves.

5:01 pm on Aug 17, 2009 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Mar 6, 2002
posts:1825
votes: 21


MSN has always preferred URL matches as far as I can remember; they loved keywords in the URL and in folders.

5:07 pm on Aug 17, 2009 (gmt 0)

Senior Member

WebmasterWorld Senior Member 5+ Year Member

joined:May 6, 2008
posts:2011
votes: 0


I dare say Quool's failure to create a brand, present (or even build) results, or any other executive problem bears no relation to the technical abilities of its creator.

As the CEO, he naturally takes the fall for everything that Cuil became. I understand I can't build a search engine to compete with Google, but I'm also not taking millions of dollars' worth of investment to make a big ol' failure.

6:09 pm on Aug 17, 2009 (gmt 0)

Senior Member

WebmasterWorld Senior Member aristotle is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:Aug 4, 2008
posts:2684
votes: 95


I don't have the link, but I remember reading an article last year about Microsoft's Live Search, which said that it was making extensive use of user behavior in determining its rankings. But I don't know if this also applies to Bing, since I haven't seen anything about the relationship between Live Search and Bing.

6:26 pm on Aug 17, 2009 (gmt 0)

Senior Member

joined:July 3, 2008
posts:1553
votes: 0


The point being, if Bing wants to grab market share, they've got to differentiate themselves.

That comment brings up an interesting point: Is there a way that Bing can differentiate itself in its search results that attracts more users than it drives away? If Google is the standard for most people, isn't there a risk in producing SERPs that are too different from what those users expect?

For Bing, trying to distinguish its results from Google's could be a case of "damned if you do and damned if you don't."

7:25 pm on Aug 17, 2009 (gmt 0)

Preferred Member

10+ Year Member

joined:Aug 25, 2005
posts:419
votes: 0


For Bing, trying to distinguish its results from Google's could be a case of "damned if you do and damned if you don't."

In five years time someone will stumble across this thread and say "Bing... what the heck is that?"

7:41 pm on Aug 17, 2009 (gmt 0)

Senior Member

WebmasterWorld Senior Member swa66 is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:Aug 7, 2003
posts:4783
votes: 0


"I just googled that" vs. "I just binged that": your choice.
7:43 pm on Aug 17, 2009 (gmt 0)

Senior Member

WebmasterWorld Senior Member 5+ Year Member

joined:July 29, 2007
posts:1525
votes: 9


There are two kinds of search engine rankers: those that are machine-learned from training data, and the hand-built ones. The machine-learned ones often outperform the hand-built ones when you measure them against that training data, but when tested by other metrics (like click-through, or time on landing page) they turn out to be worse.

This is because they learn what people say they like, not what people actually like.
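
As a purely hypothetical illustration of that gap (none of this data comes from the post), a ranking tuned to human relevance labels can score perfectly on those labels while a different ranking does far better on a behavioral signal like click-through:

```python
# Toy example (hypothetical data): a ranking that pleases the human raters
# is not necessarily the ranking that earns the clicks.
def precision_at_k(ranking, relevant, k=3):
    """Fraction of the top-k results that the raters labelled relevant."""
    return sum(1 for doc in ranking[:k] if doc in relevant) / k

def click_through_rate(ranking, clicks, impressions):
    """Observed clicks per impression across the ranked documents."""
    return sum(clicks.get(doc, 0) for doc in ranking) / impressions

judged_relevant = {"a", "b", "c"}        # what raters said they like
clicks = {"a": 10, "d": 90, "e": 40}     # what users actually clicked
impressions = 1000

learned = ["a", "b", "c"]                # tuned to the labels
hand_built = ["a", "d", "e"]             # happens to surface the clicked pages

print(precision_at_k(learned, judged_relevant),              # 1.0
      click_through_rate(learned, clicks, impressions))      # 0.01
print(precision_at_k(hand_built, judged_relevant),           # ~0.33
      click_through_rate(hand_built, clicks, impressions))   # 0.14
```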

Something to keep in mind when creating page titles and descriptions. You've got to choose your words wisely so as to please both types. Thanks Tedster.

7:50 pm on Aug 17, 2009 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Apr 29, 2005
posts:1871
votes: 41


Bing are on to a loser on all four of the initial points.

1. Spam. Who wants spam?

2. URL matches. How pathetic can Bing be to rate URL matches? In most cases a URL match simply means a website owner trying to rank high with a URL match. Nothing more. Bing needs to expend money and effort to analyse what the page is saying rather than the cheap and nasty solution of looking at the URL.

3. First letter capitalised? They must be joking. Another very, very cheap and nasty way to provide SERPs. Spend more money, Bing, on real analysis of a web page.

4. Term re-writing. After 30 years in the computer industry, one fact sticks out like a flashing red beacon: users of any system have very little idea of what they want. A little judicious term re-writing by Google goes a very long way to providing users with what they really want.

7:58 pm on Aug 17, 2009 (gmt 0)

Senior Member

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:May 26, 2000
posts:37301
votes: 0


First letter capitalised?

That may be just correlation rather than causation. If the Bing algo is using keyword prominence as one scoring factor, then the highest prominence would be for the first word in a sentence or paragraph - hence it would most often be capitalized. So this could be a side effect, rather than a direct part of the algo.
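
To make that hypothesis concrete, here is a toy prominence score (purely illustrative, not Bing's actual algorithm): the earlier a term appears, the higher it scores, and the earliest occurrences are exactly the ones that tend to be capitalized.

```python
# Toy illustration only -- not Bing's scoring. A naive "prominence" score
# that rewards terms appearing early in the text. Sentence-opening
# occurrences (which are capitalized) score highest, so a capitalization
# "preference" could show up as a side effect.
def prominence(text: str, term: str) -> float:
    words = [w.strip(".,!?").lower() for w in text.split()]
    try:
        first = words.index(term.lower())
    except ValueError:
        return 0.0
    return 1.0 / (1 + first)  # first word scores 1.0, later words score less

print(prominence("Widgets are great. Buy widgets here.", "widgets"))  # 1.0
print(prominence("We sell many widgets here.", "widgets"))            # 0.25
```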

8:16 pm on Aug 17, 2009 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Nov 24, 2005
posts:1009
votes: 0


Seems everyone has forgotten the great points brought up in the original Bing! threads and gotten all hyper-geeky.

Bing doesn't NEED to be "recognizably" different (or better) to be successful.

Any more than Pepsi needs to taste recognizably different from (or better than) Coke.

They just need to gain enough market share with big money ADVERTISERS to be a true force in the debate.

And this is more a matter of market PERCEPTION (aka branding) than anything having to do with "best quality according to geeks" measurements.

It's my tired ole analogy of
"Who makes a better burger?
McDonald's?
or that great hometown diner on Main Street?
Yet who makes more money?"

As webmasters, we do not want Bing! to wipe Google off the face of the earth.
We DO want Bing! to grab enough marketshare (with the masses, not us geeks) to keep Goog from having a virtual monopoly on traffic.

--------------
I agree with tedster that term re-writing is the key.

Again, let's ignore the geekiness, and look at pop culture.

Comedians have been making fun of this aspect of Goog for years now.

And when something as geeky as term rewriting makes it into most comedians' "pop culture observations," then it's something that resonates with the culture as a whole as a "problem."

Bing! has done a great job of realizing this and openly mocking it in their commercials
and at the same time,
subtly focusing on the big money advertisers.

Watch those commercials VERY CAREFULLY and see how they appeal to both
Goog's term rewriting arrogance
and
high paying CPC ad terms.

Very smart!

8:31 pm on Aug 17, 2009 (gmt 0)

Junior Member

joined:Jan 26, 2004
posts:143
votes: 0


"Bing had 2.9% spam, Google had 2.56% spam, while Yahoo had 4.9%"

This information is not enough :). I would ask how many legitimate websites had to go to make sure the spam index is low...

8:58 pm on Aug 17, 2009 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Jan 25, 2004
posts:1042
votes: 3


I would ask how many legitimate websites had to go to make sure the spam index is low...

That's a good point as well; penalties are one of the dominant themes these days when it comes to Google. Just look at the number of threads and posts devoted to that subject here.

Google has become overly reliant on penalties to keep things clean. If Bing could keep the spam down without losing so many quality sites through collateral damage the way Google does, that would help give it a bit of a different flavor.

10:05 pm on Aug 17, 2009 (gmt 0)

Senior Member

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:May 26, 2000
posts:37301
votes: 0


Also the general user wants some consistency. Even if an engine does term rewriting, as long as they do it consistently - so the user can find the same site they found last week - then a dash of term rewriting can be part of the recipe.

However, Google's "everflux" (to say nothing of yo-yo rankings) has gotten the upper hand. Even my great aunt is complaining - she blames me, I think ;)

So there's the recipe. It will be expensive to grab some mindshare, but Bing already knew that and committed to it. Once they get my great aunt to visit, then just a bit of stability will make her happy - and a couple bajillion more just like her.

11:44 pm on Aug 17, 2009 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Jan 4, 2001
posts:1076
votes: 3


Bing prefers URL matches more

And how! In my niches, if the search term is not in the domain name and/or the page file name, then forget it. Older sites that predate naming conventions driven by SEO do not do well in Bing searches.

Until on-page relevance becomes more of a determining factor in Bing's search results, a lot of quality pages will remain out of sight for Bing users.

12:07 am on Aug 18, 2009 (gmt 0)

Junior Member

5+ Year Member

joined:Aug 11, 2008
posts:100
votes: 0


Here are some of my stats, which I think are interesting:

I searched Google vs Bing for 337 keywords/phrases for my industry and site and scored it this way:

If Bing listed me on Page 5 and Google had me on Page 6, then that's a win for Bing; they get 1 point, and vice versa.
If both search engines had me on the same page, it's a tie.

Results: Bing 67, Google 60, Tie 221

It shows me that their results are very similar, although I will note that Bing often had a better landing page URL listed, whereas Google would often have a miscellaneous page from my site, which is not as good. I would say Bing won because it had the more appropriate URL in its results.
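
For what it's worth, the scoring scheme described above is easy to sketch (the sample data here is made up, not the poster's 337 queries):

```python
# Sketch of the page-position scoring described above (hypothetical data).
# For each query: whichever engine ranks the site on an earlier results
# page gets a point; the same page on both engines counts as a tie.
def score(positions):
    bing = google = tie = 0
    for bing_page, google_page in positions:
        if bing_page < google_page:
            bing += 1
        elif google_page < bing_page:
            google += 1
        else:
            tie += 1
    return bing, google, tie

sample = [(5, 6), (3, 3), (8, 2)]  # (Bing page, Google page) per query
print(score(sample))  # (1, 1, 1)
```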

12:56 am on Aug 18, 2009 (gmt 0)

Senior Member

WebmasterWorld Senior Member steveb is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:June 20, 2002
posts:4652
votes: 0


"The point being, if Bing wants to grab market share, they've got to differentiate themselves."

Not at all. The point is for Bing to grab market share they have to be *better* than Google at identifying sites searchers want to see. The overlap is irrelevant if it isn't 100%. What matters is listing better sites, not different ones.

1:22 am on Aug 18, 2009 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Dec 19, 2004
posts:1939
votes: 0


"Bing doesn't NEED to be "recognizably" different (or better) to be successful. "

I totally agree actually.

They only need to prove relevant enough to enough users, have consistently better results, and be better at marketing over the next 5 years to start clawing back the inroads Google has made.

I already see Bing making inroads in advertising with the younger gen.

2:27 am on Aug 18, 2009 (gmt 0)

Senior Member from CA 

WebmasterWorld Senior Member 10+ Year Member

joined:June 18, 2005
posts:1692
votes: 3


If Bing keeps doing referrer spam, it's not going to win the mind share of webmasters.

3:11 am on Aug 18, 2009 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Feb 27, 2003
posts:844
votes: 0


If Bing keeps doing referrer spam, it's not going to win the mind share of webmasters.

Really? In case you haven't noticed, it already is winning the mind share of a good number of webmasters.

Bing becoming a bigger player will be better for everybody.

This 118-message thread spans 4 pages.