So this is knowledgeable commentary from someone who sees a lot more data than most of us can even dream about. I've extracted four observations out of many, many more.
- Bing had 2.9% spam, Google had 2.56% spam, while Yahoo had 4.9%
- Bing prefers URL matches more
- Bing seems to prefer pages where the term occurs with its first letter capitalized
- Bing does less term-rewriting than Google.
Tom's Blog [cuil.com]
It's that last observation above that caught my eye the most. If Google is going to lose ground to Bing/Yahoo it will be in this area -- too much giving you what they THINK you mean instead of what you actually typed. We see related comments [webmasterworld.com] here quite often in recent times.
There are two things I like from the four items. Bing prefers URL matches more. This is huge, actually. I would predict that in the future, should they continue doing this, you will see people putting up websites and removing those domains from Google altogether. I agree with your point about term-rewriting. Rewriting completely ignores the work that webmasters put into their sites. I've seen "cheap" and "buy" removed from my 2- or 3-term search phrases in Google recently. Another example of Google playing God. I certainly hope a competitor like Bing will change this apparent philosophy. Time will tell.
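To make the complaint concrete, here is a toy sketch of the kind of term-rewriting being described, where "commercial" modifiers get dropped from a short query. The drop list and the logic are invented for illustration; nobody outside Google knows the actual rules.

```python
# Toy illustration of the term-rewriting complaint above.
# The dropped-terms list is hypothetical, not Google's actual behavior.
COMMERCIAL_MODIFIERS = {"cheap", "buy", "discount"}

def rewrite_query(query: str) -> str:
    """Drop 'low-information' commercial modifiers from a short query."""
    terms = query.lower().split()
    kept = [t for t in terms if t not in COMMERCIAL_MODIFIERS]
    # Never rewrite a query down to nothing.
    return " ".join(kept) if kept else query

print(rewrite_query("cheap blue widgets"))  # -> "blue widgets"
```

The searcher typed three terms on purpose; the engine silently searches on two. That is exactly the "what they THINK you mean" behavior being objected to.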
too much giving you what they THINK you mean instead of what you actually typed.
This has actually been my biggest problem with Google lately. There is no way a room full of programmers and PhD types knows what I'm thinking other than the "exact phrase" I typed in at the time I'm searching.
I think in the past people were not as savvy about what to type, but today users know exactly what to type to find what they want, and they don't want some automated algo telling them, "no, you don't want that, try this." This is indeed where they will lose ground to Bing. It's already happening, and while I know the national reporting metrics say it's not happening much, in my circle of people and clients it's actually happening a lot more than the trends suggest. I would guess it's true outside my circles as well.
There are those who will remain die-hard fans of Google even if it presented a page full of spam. But those who are not die-hard fans and use it out of habit will try something new if what they are used to is no longer giving them what they are looking for.
I would love to see an eventual 50/50 split for Bing/Google in searches and I think that day is going to be sooner than most would think.
This goes back to: Google has overcooked the sauce, and it's starting to spoil.
Another of Tom Costello's observations might be playing in: "Bing is weaker than Google where proximity is important." In recent weeks, if you want proximity on Google, you darned well better use those quote marks.
As far as keyword-in-url (and from the examples given, I think he means keyword-in-domain) I've just about had enough of that easy crutch in the algorithm both for Google and for any other search engine that's still taking this cheap path.
Keyword domains probably do draw the click more often when they appear in the SERP - but that doesn't mean they are the better result. It just means the eye naturally tracks to those bolded letters, and that factor alone may be distorting the user data for Google and the other search engines.
I call reliance on the keyword domain as a strong factor a "crutch" because that's how I see it. Yes, you can be pretty sure about basic relevance. But beyond that, every page needs to "prove itself." Some definitely do - and many really don't.
That's why I say it's time for some better programming work from all the search engineers in this area.
...but this isn't just any old blog. It belongs to Tom Costello, CEO and founder of the Cuil search engine.
Cuil? Is that supposed to add authority to this thread, Ted? ;)
Actually, the problem with search engine results comparisons now (as it always has been) is that we can never remove subjectivity.
In the real world, most people aren't obsessive about comparing product details. If the product works reasonably well, is easy to use, and fits the user's self-image, the user isn't likely to have a compelling reason to change.
Indeed, I find most outstanding "presenters" to be lacking on the technical side, and your outstanding "technicians" unable to present. Used car salesmen aren't mechanics.
And Kuil's creator guy, as tedster says, will have (or have had) access to quantities and quality of data of which we can barely conceive.
Anyway, he probably thought ripping off mis-spelt words (Just like Googol) was pretty khul. No?
I also liked this observation from the article:
First the bad news for Bing. It overlaps Google too much. On our test queries it overlapped Google 29% of the time, more than Yahoo (25%)
The point being, if Bing wants to grab market share, they've got to differentiate themselves.
I dare say Quool's failure to create a brand, present (or even build) results, or any other executive problem bears no relation to the technical abilities of its creator.
As the CEO, he naturally takes the fall for everything that Cuil became. I understand I can't build a search engine to compete with Google, but then I'm not acquiring millions of dollars' worth of investment to make a big ol' failure.
The point being, if Bing wants to grab market share, they've got to differentiate themselves.
That comment brings up an interesting point: Is there a way that Bing can differentiate itself in its search results that attracts more users than it drives away? If Google is the standard for most people, isn't there a risk in producing SERPs that are too different from what those users expect?
For Bing, trying to distinguish its results from Google's could be a case of "damned if you do and damned if you don't."
There are two kinds of search engine rankers: those that are machine learned from testing data, and the hand built ones. The machine learned ones often outperform the hand built ones when you measure them against training data, but when tested by other metrics (like click-through, or time on landing page) they turn out to be worse.
This is because they learn what people say they like, not what people actually like.
Something to keep in mind when creating page titles and descriptions. You've got to choose your words wisely so as to please both types. Thanks Tedster.
1. Spam, who wants spam.
2. URL matches. How pathetic can Bing be to rate URL matches? In most cases a URL match simply means a website owner trying to rank high with a URL match. Nothing more. Bing needs to spend money and effort analysing what the page is saying rather than the cheap and nasty solution of looking at the URL.
3. First letter capitalised? They must be joking. Another very, very cheap and nasty way to produce SERPs. Spend more money, Bing, on real analysis of a web page.
4. Term re-writing. After 30 years in the computer industry, one fact sticks out like a flashing red beacon. Users of any system have very little idea of what they want. A little judicious term re-writing by Google goes a very long way to providing users with what they really want.
First letter capitalised?
That may be just correlation rather than causation. If the Bing algo is using keyword prominence as one scoring factor, then the highest prominence would be for the first word in a sentence or paragraph - hence it would most often be capitalized. So this could be a side effect, rather than a direct part of the algo.
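The side-effect argument can be sketched in a few lines. The scoring scheme below is invented purely for illustration (Bing's actual algorithm is unknown): if prominence just means "earlier in the text scores higher," then the top-scoring occurrence of a term is the first word of a sentence, which in English prose is almost always capitalized.

```python
# Sketch of a position-based prominence score. Invented for illustration;
# this is NOT Bing's actual ranking formula.
def prominence_score(text: str, term: str) -> float:
    """Score a term higher the earlier it first appears in the text."""
    words = text.lower().split()
    term = term.lower()
    if term not in words:
        return 0.0
    pos = words.index(term)   # position of first occurrence
    return 1.0 / (1 + pos)    # position 0 -> 1.0, later -> smaller

# The maximum score goes to a term that opens the sentence -
# exactly the spot where English capitalizes it:
print(prominence_score("Widgets are great. I love them.", "widgets"))  # -> 1.0
```

So pages where the term happens to start a sentence (and is therefore capitalized) would score highest, even though capitalization itself is never examined.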
Bing doesn't NEED to be "recognizably" different (or better) to be successful.
Any more than Pepsi needs to taste recognizably different (or better) than Coke.
They just need to gain enough market share with big money ADVERTISERS to be a true force in the debate.
And this is more a matter of market PERCEPTION (aka branding) than anything having to do with "best quality according to geeks" measurements.
It's my tired ole analogy of
"Who makes a better burger -
the big fast-food chain, or that great hometown diner on Main Street?
Yet who makes more money?"
As webmasters, we do not want Bing! to wipe Google off the face of the earth.
We DO want Bing! to grab enough marketshare (with the masses, not us geeks) to keep Goog from having a virtual monopoly on traffic.
I agree with tedster that term re-writing is the key.
Again, let's ignore the geekiness, and look at pop culture.
Comedians have been making fun of this aspect of Goog for years now.
And when something as geeky as term rewriting makes it into most comedians' "pop culture observations," then it's something that resonates with the culture as a whole as a "problem."
Bing! has done a great job of realizing this and openly mocking it in their commercials
and at the same time,
subtly focusing on the big money advertisers.
Watch those commercials VERY CAREFULLY and see how they play to both:
Goog's term-rewriting arrogance, and
high-paying CPC ad terms.
This information is not enough :). I would ask how many legitimate websites had to go to make sure the spam index is low...
I would ask how many legitimate websites had to go to make sure the spam index is low...
That's a good point as well; penalties are one of the dominant themes these days when it comes to Google. Just look at the number of threads and posts devoted to that subject here.
Google has become overly reliant on penalties to keep things clean. If Bing could keep the spam down without losing so many quality sites through collateral damage the way Google does, that would help give it a bit of a different flavor.
However, Google's "everflux" (to say nothing of yo-yo rankings) has gotten the upper hand. Even my great aunt is complaining - she blames me, I think ;)
So there's the recipe. It will be expensive to grab some mindshare, but Bing already knew that and committed to it. Once they get my great aunt to visit, then just a bit of stability will make her happy - and a couple bajillion more just like her.
Bing prefers URL matches more
And how! In my niches, if the search term is not in the domain name and/or the page file name, then forget it. Older sites that predate naming conventions driven by SEO do not do well in Bing searches.
Until on-page relevance becomes more of a determining factor in Bing's search results, a lot of quality pages will remain out of sight for Bing users.
I searched Google vs Bing for 337 keywords/phrases for my industry and site and scored it this way:
If Bing listed me on Page 5 and Google had me on Page 6, then that's a win for Bing; they get 1 point, and vice versa.
If both search engines had me on the same page, it's a tie.
Results: Bing = 67...Google 60...Tie 221
It shows me that their results are very similar, although I will note that Bing often had a better landing-page URL listed, whereas Google would often have a miscellaneous page on my site, which is not as good. I would say Bing won because of having the more appropriate URL in their results.
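The page-position tally described above is easy to reproduce. This is a minimal sketch, assuming each query's result is recorded as the page number where the site appears in each engine (lower page = better ranking):

```python
# Tally the Bing-vs-Google page-position comparison described above.
# Input: list of (bing_page, google_page) tuples, one per keyword.
def tally(results):
    """Lower page number wins; equal pages count as a tie."""
    score = {"bing": 0, "google": 0, "tie": 0}
    for bing_page, google_page in results:
        if bing_page < google_page:
            score["bing"] += 1
        elif google_page < bing_page:
            score["google"] += 1
        else:
            score["tie"] += 1
    return score

print(tally([(5, 6), (3, 3), (4, 2)]))
# -> {'bing': 1, 'google': 1, 'tie': 1}
```

Run over 337 such tuples, this produces exactly the Bing/Google/Tie counts reported in the post.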
Not at all. The point is for Bing to grab market share they have to be *better* than Google at identifying sites searchers want to see. The overlap is irrelevant if it isn't 100%. What matters is listing better sites, not different ones.
I totally agree actually.
They only need to prove relevant enough to enough users, deliver consistently better results, and be better at marketing over the next 5 years to start chasing down the inroads Google has made.
I already see Bing making inroads in advertising with the younger gen.