Forum Moderators: martinibuster
Here are the facts about one of my company's websites:
- We were included in one of the hand coded 1st pages for 2 very competitive keywords, which means Yahoo intentionally decided to have us on the 1st page...so we were quite happy. The placement was totally legitimate and consistent with our rankings on GG and MSN.
- The 1st page of Yahoo for these 2 keywords did not move an inch for about 1 year :)...still good.
- Yesterday I queried Yahoo for these keywords and the site was totally gone along with 2 competitors, all replaced.
>>So far, fine: they decided we'd had enough free traffic from their pitiful engine, which is OK. Even so, they failed to notice that only 12 websites actually sell our products online; the rest are affiliates, and apparently the 'editor' could not tell the difference. My point is: if you review an industry and hand code the results, at least do it RIGHT.
What bothers me is both the intention and the results beyond position 10.
The intention is dishonest, but they do what they want. We don't rely on their traffic that much; MSN provides as much as they do in this market, and Google does much better!
The results beyond position 10 are a complete bunch of crap, totally irrelevant to the queries. For that matter, I consider Yahoo by far the worst search engine; they seem unable to do anything except local searches, and even there the Yellow Pages do just as well - no need for a so-called 'search engine'.
I simply wonder when the masquerade will end. It's pitiful and dishonest.
Even on their end, Yahoo Slurp hammering websites like no bot ever did, while refreshing their index little or not at all, must be ridiculously expensive for a total waste.
Possibly their business model is to keep webmasters busy thinking that Yahoo will eventually reflect their website in the SERPs, while what really requires resources, investment AND some brain activity on their end is not in their plans: building a real algo, understanding how the internet works, building a real bot based on real site activity and popularity.
My 2 cents. It needed to be said one day, especially for those who did not know yet that Yahoo is slow and hand codes popular searches.
Is that a joke? You've never noticed Slurp hitting your sites like crazy while still not reflecting any change in the Yahoo index?
That's weird, because quite a few people here and elsewhere say exactly the same thing.
As for hand coded results, well yes, very popular terms are - that's not really a secret with Yahoo anymore. As you said, it seems to be the best and ONLY way for Yahoo to get clean SERP's. That is fine with me, but in that case don't call Yahoo a search engine...it's not.
BTW thanks to the moderator for accepting this post; it is tough on Yahoo but SO true -
I wish they had not abandoned the market. Google vs. MSN might be fun someday when MSN gets a little better, but with Yahoo in the game it would be so much better...yet apparently they are totally off.
>>Yesterday I queried Yahoo for these keywords and the site was totally gone along with 2 competitors, all replaced.
All three of the SEs have their own personalities and traits, which is a big positive in my mind. People around here rate the "quality" of the results based on how they rank; it's just human nature. They all have strengths and weaknesses. I'm glad there are three, and I wish there were a few more.
The central issue with Yahoo these days, though, is that they aggressively encroach upon the individuality of the user. It's their position that they know better than the user what the user is looking for or wants. This is where the hand coding comes in. What hand coding means is that they think they know better than their own algorithm what the searcher wants. Hand coding is one man's idea of what you, the user, want to see. Does it potentially look better than what their algorithm produces? Who knows, but it's a bad habit that gets worse the more you do it. It just keeps reinforcing the notion that you know what people are looking for better than they do. Fact is, you can't. You might have cleaned up the mess a little, but now someone has decided what I get to view and what I don't.
Hand coding is only a small part of Yahoo embracing a culture of knowing better than the searcher what the searcher wants. In addition, Yahoo has a very concerted penalty program. Every time you remove a site from the index, you make a decision for every single one of your users. That decision is: we know better than you what you're looking for, and what you don't want. (And please note I'm not talking about trash, spam sites, auto generated pages, link farmed sites, etc., or any of the well known junk that a lot of us have produced at one time.) It's an established fact that Yahoo has removed a great many good sites for reasons ranging from hyphenated domains to the egregious fact of being an affiliate site. No SE has hand penalized more sites than Yahoo, and no search engine is more difficult to get re-included in.
It's a great site, and a good search engine. I just wish they would give people a little more credit for being able to decide which sites they wish to visit. If your site isn't showing, chances are someone at Yahoo didn't think too much of it, and thinks a little too highly of their own tastes.
I have to laugh when people #*$! about Slurp hitting their sites hard. Please. You are getting extremely lucky. Count your blessings.
Besides the fact that I have no clue what blessing you're talking about: how blessed am I when they waste my bandwidth while the SERP's almost never change? Explain.
In fact I would prefer them to hit less hard but do a better job on the algo and index updates, but I can only dream on...
MSN and Google hit the site much less, yet they reflect the slightest change within 1 to 5 days.
Besides, that level of crawling is trivial in terms of bandwidth, but awesome in terms of everything else. Once again, complaining about something wonderful is really bizarre. I'd love to have Slurp 1/4 as active as Googlebot (and no, I don't mean just hitting robots.txt dozens of times a day). That would be awesome.
What does whether the serps ever change have to do with anything?
Not much from the webmaster's perspective, as bandwidth is dirt cheap. However, as a shareholder, one has to wonder what on earth Yahoo is wasting all these resources for if they are not using the newly compiled data to score pages.
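For anyone who wants to put numbers behind the "Slurp hammers harder than Googlebot" complaint, the crudest check is to tally bot hits per user-agent in the access log. A rough sketch in Python (the log sample is invented for illustration; the bot-name substrings Slurp, Googlebot and msnbot are the identifiers those crawlers actually send):

```python
from collections import Counter

# Hypothetical user-agent strings as they'd appear in an access log.
# The list is made up; on a real server you'd parse these out of each log line.
log_user_agents = [
    "Mozilla/5.0 (compatible; Yahoo! Slurp; http://help.yahoo.com/help/us/ysearch/slurp)",
    "Mozilla/5.0 (compatible; Yahoo! Slurp; http://help.yahoo.com/help/us/ysearch/slurp)",
    "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
    "msnbot/1.0 (+http://search.msn.com/msnbot.htm)",
    "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1)",  # ordinary visitor
]

# Substring each bot's user-agent reliably contains.
BOTS = {"Slurp": "Slurp", "Googlebot": "Googlebot", "msnbot": "msnbot"}

hits = Counter()
for ua in log_user_agents:
    for name, marker in BOTS.items():
        if marker in ua:
            hits[name] += 1

print(dict(hits))
```

Run over a week of real logs and set against how often the index actually changes, a tally like this is exactly the crawl-versus-refresh mismatch being complained about in this thread.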
IN ADDITION, and this is as funny as it is ridiculous, we get great rankings on Yahoo UK - come on! The market is US based ONLY (it does not exist anywhere else), the IP is in the US, 95% of the sites linking to us have US IPs, and the site is included in their US directory....explain that to me! How relevant is that?
Well, I am going to give the answer: US competitive searches are HAND CODED - and even though I can understand (with difficulty, but I can) why they do it, it simply shows (eventually) that they are unable to build a reliable search engine.
They don't even trust their own engine; why should we?
-Note: this site receives an outstanding amount of traffic from fair to very good rankings on GG and MSN, plus our brand is well known enough that people very often find the site by typing the domain name (almost 40% of queries). So no, I am not pissed just because Yahoo's audience reach is not that big for us. Their whole concept is just kinda stup..d
>>why they do it, it simply shows (eventually) that they are unable to build a reliable search engine.
>>They don't even trust their own engine, why should we?
True. Of course they do not trust their algo. The hand coding is getting to be more and more prevalent - to the point that they are approaching full-on editorial directory status, now that they are hand editing the related searches as well.
It's pretty damn sad that Yahoo sunk a quarter bill into search technology when they bought Ink, etc....and later decided to bag it and go the directory route. If this is the direction they are going (and it appears to be), they should dedicate more resources to the hand edited areas. There is no possible excuse for waiting years to update certain areas.
Furthermore, they should disclose which serps are hand coded and give users some explanation on why they are seeing these results. Are they partners with Yahoo? Did they like the site design? Are the editors qualified in the areas they are picking sites in?
Seriously, put up a little FAQ about how they rank sites, like all SE's do (vague, but something), and ADD the truth: "Often, when we feel our technology is not sufficient, we choose to select the ordering of pages."
I know it's coming...so let's be clear on this: it's not about how good or bad the hand coded serps are, it is about the fact that they are not handling it properly. If they want to be a good directory, they HAVE to have employees that are somehow qualified to be editors, and they HAVE to update the serps occasionally.
Lastly, there is an awful lot of room for corruption here. They blatantly place sites wherever they see fit, not by software but by the preference of their staff. One has to wonder how, and if, corruption is avoided....
Going down that road, hand sorting competitive searches, they cannot end up with very poor SERP's - but how qualified are these people to tell you what you should see, when they totally disregard the true popularity/quality/reality of the market?
We can be sad/upset at Google and MSN SERP's sometimes, but we can NEVER accuse those two of being corrupt, incompetent, or dishonest at least...their rankings are based on algorithms (mostly), unlike Yahoo's.
When you think about it, it makes sense that their search engine is the SLOWEST ENGINE OF ALL TIME: since they rely on sneaky human review, they obviously don't have the resources to review the whole web in a timely fashion :)
As has been said before, if I were holding shares in this company, I would seriously worry about what the H... these guys are doing with my money!
What kind of search engine is that? What kind of professionalism is that? It's not only inefficient at larger scale but totally wrong.
It means that not all websites are equal; they are subject to sneaky penalties based on editors' personal interests or tastes. That's an issue when you want to call yourself a search engine.
steveb, face the truth: you dislike MSN, that I know, but the reality is that Yahoo is worse - without human review Yahoo would be the worst of all across the board.
Moreover, the human review is a clear indication that technically they cannot (or don't even try to) get it right.
What kind of business decision is that? Investing money in research and ending up hand sorting websites?
It isn't a purebred, that's for sure. It's kind of mongrelized, injecting editorialism like a directory into what's supposed to be search.
>>Paying people to slowly review websites and subjectively choose those they prefer?
Aside from the possibility that old habits die hard, I seriously doubt that it could be all subjective considerations. It took a long time for me to become convinced that it was really happening, but after watching for a while it became impossible to deny, even to myself.
[edited by: martinibuster at 6:34 pm (utc) on Mar. 24, 2006]
[edit reason] TOS #4 & 19 [/edit]
By far the best one on the Internet. No results on any search engine can compare with the high quality hand sorted ones.
I simply can't understand how you can now assert that just because your site is no longer among the hand sorted results, this makes it evil. Get a grip.
Greater hand sorting is clearly the future, given the extremely poor quality of the engines results for high spam terms. And it is a great thing. Humans will always do a far better job than an algo created by humans.
"steveb, face the truth, you dislike msn, that I know, but the reality is that Yahoo is worst - Without human review Yahoo could be the worst of all across the board."
Seeing the handsorted results updated more often is certainly a good idea, but complaining that the top ten results are way better than the next ten doesn't make much sense! Yahoo's results should be better (they are way ahead of MSN but way behind Google), but the handsorted ones are outstanding, and they should be applauded, loudly, for taking the steps needed to do something user-friendly that increases the quality of their product.
Not when it's absolutely retail consumer terms and wholesalers who don't even sell product are put into top spots. They may be outstanding companies, but they're not relevant for consumers looking to buy widgets online.
If you disagree with someone's post, then offer a coherent and well reasoned explanation of why you believe the opposite is true. And remember, this isn't a winner take all debate. It's just a discussion. No one has to walk away from this thinking they won or lost anything.
As in most things that come down to an opinion that cannot be substantiated as well as a math equation (1+1=2), you must expect a divergence in opinion. It's to be expected. It's not the end of the world. :)
Just a friendly reminder to offer a well reasoned opinion and to please play nice with each other.
We all know the issues: The nature and structure of different categories (i.e., historical information sites versus widget shops) necessarily results in very different kinds of sites (form follows function...and marketing follows function too, btw...). Depending upon the category and nature of a site within a category, we might see differences in structure, navigation, content, internal and external linking patterns and a host of very important measures.
Then there are the issues of philosophy, i.e., which kinds of results are better? G has moved in the direction of favoring information over commerce. That makes some of their SERP's cleaner and more useful, but it makes some of their SERP's largely useless, especially when people are searching for products.
Set against these issues, one of the holy grails for any great SE is to be able, regardless of category, to feature the most relevant and useful sites for a given query.
IMO, this is all but impossible for a single algo. So SE's are faced with either hand coding, or segmenting algos in one way or another by category, or finding, if possible, one algo so keen and intuitive that it can show for any given query, excellent results, albeit not perfect ones.
A search engine, working well, should be able to give good if not excellent results across the largest number of categories possible. It should, and hopefully will, someday be able to do this with a minimum of human intervention.
When humans intervene with hand coded SERP's, the problems multiply exponentially. What standards are used? Who makes the choices? Are the choices evaluated by others? If so, who evaluates the choices? How is research done to verify that the choices are good ones? Are the choices regularly updated? Is so how does that process work? Is there a mechanism for the SE to identify worthy sites that were left out of a past hand edit but should probably be included? I'm sure I'm just scratching the surface with these questions.
Hand coding is not consistent, is not reliable, is not subject to sufficient review and vetting, and lacks the brute force credibility that algo's provide when assessing mountains of data without emotion or personal bias.
I'm not arguing against hand coding as a valid way of providing help to Web surfers. I love hand coding, when done in an above-board and consistent way. But when it's done that way, we call it a directory. ;-)
You seem to think algos spring from the head of Zeus. Algos are created by humans to decide the exact things/problems that you say "multiply exponentially".
What standards are used?
Who makes the choices?
Are the choices evaluated by others?
If so, who evaluates the choices?
You are arguing no point at all. Every one of those decisions relates not at all to whether a serp is handsorted or algo sorted.
Handsorted serps, though, have the advantage of far greater accountability, which in itself should make them preferable to anyone who cares about quality and responsibility. Furthermore, unlike an algo, handsorting can far more easily include a diverse range of resources...ecommerce, info, historical, etc. Higher accountability, greater quality, better diversity: there is no downside anywhere, unless your stuff is low quality junk.
>>I'd love to have Slurp 1/4 as active as Googlebot (and no I don't mean just hitting robots.txt dozens of times a day). That would be awesome.
Only if Slurp is hitting and the results are being reflected in listings. Otherwise hits by the bot are pointless.
>>You seem to think algos spring from the head of Zeus
Nah, I'm not that stupid. Zeus doesn't get involved in stuff like that. I'm pretty sure those decisions are made by some sort of engineer ... or maybe a couple of 'em? Or ... ummm ... maybe some Web researcher? College students getting paid to surf the Web? Is that wrong? ;-)
Well daaaaah, of course it's all man made in the end, but the algos need to be able to function scalably. Right now, things are too much left to someone's (or some group's?) indecipherable whim to determine which categories, and then which sites, get hand coded ... and why ... and in what order. Based on ... money? ... inability to control spam? ... other? Puhlease.
I can tell you, though I'm sure I don't have to, that there are a LOT of categories in Yahoo where the SERP's could probably be improved by hand coding. So what? It's a strawman argument, usually (though not always) employed by people who are currently hand coded into some hot SERP's.
This ain't no directory, this ain't no foolin' around. Hand coded SERP's are part directory. We've got directories already. Y especially. Anyone who wants to use Y's directory, or G's, or any other one, can do that.
The challenge here is to get the algo's right, not to shove a handful of directory listings at the top of some SERP's but not others, in an effort to fool some surfers into believing that the search engine is doing what it's supposed to be doing: Showing us at any given point in time, on an evolving Web, what the most relevant possibilities are for the queries we're entering.
Even I know that.
Geez, I thought everyone knew that. :p
1: There is a discussion of the quality and merits (or lack of merits) of a hand sorted results page. The OP questions the quality and accuracy of the hand sorted results, and refers to it as a masquerade. Which leads to the second discussion.
2: There is a discussion of whether hand sorted results pages should even be considered SERPS, and therefore should not even be compared to Google, MSN, etc.
Let's Define Search Engine
As I see it, the troublesome aspect of hand sorted serps is that, by definition, a search engine is automated: it performs its function (in this case sorting) without human intervention, much like any other engine performs its function (for instance, combustion and locomotion).
Yes, you can clean your engine, upgrade the spark plugs, add a new turbocharger, etc., but that doesn't change the fact that the engine will turn fuel into locomotion without human intervention apart from turning the key (which is analogous to making a search query).
In this case, it sorts a database by means of a software program, the software program being the engine itself. So the moment you introduce hand sorting, it's no longer a search engine.
It's the difference between an automobile that runs under the power of its engine and the foot powered vehicle Fred Flintstone uses. Fred Flintstone's vehicle is not an automobile. It is a pedi-mobile.
Similarly, once Yahoo introduces hand sorting, it is no longer a search engine; by definition it becomes a directory, or a hybrid of one.
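To make the engine-versus-directory distinction concrete, here's a toy sketch (Python, every site, score, and query invented): an engine ranks by a scoring function applied to any query, while a hand coded SERP simply overrides the function's output with an editor's list.

```python
# Crude relevance score: count of query-term occurrences in the page text.
def algo_rank(pages, query):
    return sorted(pages, key=lambda p: -p["text"].lower().count(query.lower()))

pages = [
    {"url": "widgets-shop.example", "text": "buy widgets widgets widgets online"},
    {"url": "widget-history.example", "text": "the history of the widget"},
    {"url": "unrelated.example", "text": "nothing to see here"},
]

# Editor's hand coded SERP for one hot term; everything else falls to the algo.
hand_coded = {"widgets": ["editor-pick.example", "widgets-shop.example"]}

def serve(query):
    if query in hand_coded:  # directory behavior: the editor's list wins outright
        return hand_coded[query]
    return [p["url"] for p in algo_rank(pages, query)]  # engine behavior

print(serve("widgets"))  # the hand coded list, not the algo's ordering
print(serve("widget"))   # falls through to the algorithm
```

The point of the toy: serve("widget") is the engine at work, serve("widgets") is a directory lookup wearing an engine's clothes, which is exactly the hybrid being described above.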
Are Hand Sorted Results Pages Good or Bad?
The quality of the results is the other point we keep going around on. But IMO that point cannot be argued until it is acknowledged that we are comparing apples to oranges, because hand sorted serps are not the product of an engine.
Search engines have always manually altered the results, most often by removing/banning sites. A search engine returns results for a query. How it does it is of no consequence to its mission. An engine is just a means of accomplishment. That is also what the word means.
Forget the grasping at semantics that aren't there, that's just a dead end. An engine that uses an algo to get a core of 1000 results, then hand removes blatant redirects is not any less of a search engine than one that doesn't, and will be a helluva lot better than the one that doesn't.
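The pipeline described above, an algo producing a core result set with humans then stripping the blatant redirects, might look like this toy sketch (Python, all domains and scores invented):

```python
# Algo-scored core results, highest relevance first. Data is made up.
algo_results = [
    ("good-site.example", 0.91),
    ("redirect-spam.example", 0.88),   # scores well, but it's a blatant redirect
    ("another-good.example", 0.75),
]

# Blocklist maintained by human reviewers.
hand_removed = {"redirect-spam.example"}

# Final serp: the algo's ordering, minus the hand removed junk.
final = [(url, score) for url, score in algo_results if url not in hand_removed]
print(final)
```

The key feature is that the ordering still comes from the algo; the human touch is only a subtractive filter, which is why this version arguably still counts as a search engine while a fully hand sorted page does not.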
Whether the results blow so badly they have to hand sort is just a sign of how algos alone are going the way of the dodo. A search engine exists to provide practically useful results for users: not webmasters, not SEOs, not search engineers, not linguists.
They can do it however they want, and hopefully will do whatever it takes to make the results better for users because that is the path that makes sense.
As for the quality issues, and to steveb's comments, and all kidding aside, I agree with the points that the algo's are not cleanly automated and that human intervention in various ways is a part of what happens. In the end it's all from the minds of humans and if the SERP's are not to human's liking, then things get tweaked. That signifies overall human control and judgement on an ongoing basis. Fair enough.
But I don't agree that some of the practical matters associated with hand coding are unrelated to the discussion.
Yes, human judgement is a part of all of this. But when the process is working best, human intervention has more to do with defining kinds of sites, and site traits and link profiles that indicate quality products.
When the process is working badly - and as I see it almost as a last resort - people step in after the SE processing has occurred, toss up their hands, admit defeat, and drop hand coded results into the mix. And to make it worse, only in some categories. Seems to me almost inescapable that if they are doing it in some categories, then the algo is also producing less than ideal results in most categories, but hand coding is being employed on only a limited scale.
Hey why not hand code all the categories? Then the SERP's would be universally good!
Hand coding is not only a sign of trouble, it creates more problems than it solves.
Questions like "who does the hand coding," "how is it checked," and "how often are the results updated," are important because, in contrast to better functioning SE's making those determinations algorithmically and on the fly, hand coded results are subject not only to all the benefits of human intervention but to all of its problems: intentional and unintentional bias, skewed perspectives, etc. They are also unable to keep pace with site and category evolution the way the algos can. So if hand coding is judged to be necessary, then constant review, assessment, and assessment of assessment are necessary. Problem is, those things are not happening and in all likelihood never will, because the SE's are businesses dealing with practical realities, and either cannot afford to do what it takes to make hand coding really work, or will not do it.
Solution: Get the algos working better.
Algos are not getting better, not for any of the engines. Hand coding of results, then even rerunning algos based on these core results is certainly something the engines need to look at because they are just flat out failing at their jobs now.