Forum Moderators: Robert Charlton & goodroi
Ah, that clarifies everything. And self-important people, in your universe, don't create their own websites where they can promote themselves, but instead hide their importance by providing anonymous information where they can't promote themselves.
Here on planet earth it's mostly the other way around.
Actually doing things like posting on Wikipedia is EXACTLY how self-important and devious politicians do promote their views in the real world.
Propaganda is far more effective if you don't consciously realise it's propaganda.
Actually doing things like posting on Wikipedia is EXACTLY how self-important and devious politicians do promote their views in the real world. Propaganda is far more effective if you don't consciously realise it's propaganda.
Give this man/woman a gold star! This is EXACTLY the problem with Wikipedia. Wikipedia is a perfect venue for spreading one's propaganda/biased views/disinformation without having to worry about being outed as the real source of the information. There are some topics where knowing who the messenger is is exceedingly important. The big three examples are science, politics and religion. In those areas, information has NO value without knowing who its source is. I publish the names of my authors alongside the articles they write for me, and I provide biographies of those writers. If someone questions what was written, they can do a Google search on the writer's name, find out more about that writer, and gain insight into what that writer is saying elsewhere.
In another example, it is important to know whether an article about global warming is by the likes of Al Gore or by a coal industry executive. This information is fundamentally important. Without it the article is less than worthless; it can actually be damaging, because disinformation from a biased writer with a hidden agenda can end up quoted as fact.
Sorry to disagree, but that is ALL the Google algorithm is. It is an extension of the company's and/or creators' preferences and feelings in automated form, based on whatever factors they feel should equate to quality, merit, etc. Links are subjective opinions, as you say, "popular votes", which do not mean quality. They can mean quite the opposite in many instances. This subjectivity is part of the Google algo, which further shows why I disagree. The algorithm is pretty much devoid of complete understanding on a human level.
Although... humans are not logical beings, and our understanding of the world comes from emotions skewing our own "logical" thinking processes. Humans aren't as good at complete understanding as we think we are; we just do it better than most life forms on this planet. Keep in mind that we will always have some sort of biased intent and perception based on our emotional needs, beliefs, etc. at any given time - authority or not.
As far as the wiki goes: well, they should rank well only if they have the content to back it up. Google cannot know whether the information is absolutely correct or not; that would require Google to algorithmically understand each and every subject known to man, and to update that understanding when facts and evidence of facts change, which then leads to problems with opinion pieces. Much of that can be left up to the reader and their decision to remain ignorant or analyze the subject further.
Ranking for nothing more than a word on a page is a bit over the top, especially if there are other sites more worthy of that position. I guess we've all got to learn to do what we do better: capture the minds (and links) of our visitors and rise into internet greatness.
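For anyone who hasn't seen it spelled out: the "links as votes" idea can be sketched as a toy PageRank-style iteration. This is only an illustration of the general principle, not Google's actual algorithm, and the site names below are made up:

```python
# Toy PageRank sketch: inbound links act as weighted "votes".
# Illustration only - not Google's real algorithm.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping page -> list of pages it links to."""
    pages = set(links) | {p for targets in links.values() for p in targets}
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, targets in links.items():
            if targets:
                # Distribute this page's rank among its outbound links.
                share = damping * rank[page] / len(targets)
                for t in targets:
                    new_rank[t] += share
            else:
                # Dangling page: spread its rank evenly over all pages.
                for t in pages:
                    new_rank[t] += damping * rank[page] / n
        rank = new_rank
    return rank

ranks = pagerank({
    "wikipedia": ["niche-site"],
    "niche-site": ["wikipedia"],
    "blog-a": ["wikipedia"],
    "blog-b": ["wikipedia"],
})
# The heavily linked page ends up with the highest score,
# regardless of whether its content is any good - which is the point above.
```

Notice that nothing in the computation looks at content quality: the score is purely a function of the link graph, which is exactly why "popular votes" and "quality" can come apart.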
The current difference?
Nofollow.
Wikipedia has become a victim of the nofollow tag. Google has no choice but to give them all the credit.
Um, and that makes the wiki different from what human source, exactly how? You can easily find people with academic degrees who are idiots (in their chosen field) but maintain impeccable academic credentials. Encarta is just a bad joke, of course, but it's not at all hard to find errors in Britannica. Don't even get me started on TV network news or the big Urban News. And all the other human authorities are equally constrained. If you don't know of a way to verify the information you get, then... just figure it might as well be wrong if it isn't already, and live in a fantasy world... and that's true of ALL the information you get from EVERYWHERE.
Thanks for spelling this out hutcheson. I just had to repeat it. ;)
Sad that they had to resort to such measures. Yet now we know we can all just put a nofollow tag on any outbound links and keep the credit for ourselves. Wonder what would happen to the Google algo, as far as links go, if every website did the same?
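For what it's worth, nofollow is just a rel attribute on the anchor tag; a crawler that honors it skips those links when passing credit. A minimal sketch of how such filtering might look, with made-up example URLs:

```python
# Sketch of a crawler separating followed links from rel="nofollow" ones.
# The URLs are hypothetical examples.
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect hrefs, keeping followed and nofollowed links apart."""
    def __init__(self):
        super().__init__()
        self.followed = []
        self.nofollowed = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href")
        if href is None:
            return
        rels = (attrs.get("rel") or "").split()
        if "nofollow" in rels:
            self.nofollowed.append(href)   # no credit passed
        else:
            self.followed.append(href)     # counts as a "vote"

page = '''
<a href="http://example.com/partner">followed link</a>
<a href="http://example.com/outbound" rel="nofollow">credit withheld</a>
'''
parser = LinkExtractor()
parser.feed(page)
# parser.followed   -> ["http://example.com/partner"]
# parser.nofollowed -> ["http://example.com/outbound"]
```

If every site did this on all outbound links, a link-counting algorithm would effectively see an empty link graph, which is the thought experiment posed above.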
Which part of the point? Forget the links to the main page. Wikipedia articles will often have far fewer links, of lower PR and lower quality, than leading niche sites, which means their ranking is the result of other algo factors, specifically misapplied "trust". That's not to say Wikipedia doesn't deserve some trust, just that even though it is less trustable than niche heavyweights, it manages to outrank them (often) because of Google's algo that favors generic over on-topic.
That's not to say Wikipedia doesn't deserve some trust, just that even though it is less trustable than niche heavyweights, it manages to outrank them (often) because of Google's algo that favors generic over on-topic.
What exactly do you mean by favoring "generic" over "on-topic"? Label one extreme black and the other white, and you'll have countless grey shades in between.
Also, even if we could agree on definitions of "generic" and "on-topic," and even if all of us were willing to assume that Google favored the former over the latter, we'd be stuck with the reality that there are many other factors in Google's algorithm that determine search rankings.
disinformation from a biased writer with a hidden agenda
Just because someone signs their name to an article and has a biography published most definitely does not mean that they are unbiased, agenda free or even accurate in what they say. This is especially the case when it comes to scientific publication.
The standard in scientific publication, certainly when it comes to using material for citation, is publication following peer review, not simply an editorial process.
The standard in scientific publication, certainly when it comes to using material for citation, is publication following peer review, not simply an editorial process.
What does that have to do with Google search rankings? Google isn't a search engine for academic libraries; its mission is simply to "organize the world's information and make it universally accessible."
Um, and that makes the wiki different from what human source, exactly how? You can easily find people with academic degrees who are idiots (in their chosen field) but maintain impeccable academic credentials. Encarta is just a bad joke, of course, but it's not at all hard to find errors in Britannica. Don't even get me started on TV network news or the big Urban News. And all the other human authorities are equally constrained. If you don't know of a way to verify the information you get, then... just figure it might as well be wrong if it isn't already, and live in a fantasy world... and that's true of ALL the information you get from EVERYWHERE.
That humans make mistakes is a trivial statement.
To minimise errors, human societies have over the centuries developed degrees. That saves most of us from having to go to quacks that sell miraculous cures instead of accepted hospitals.
Wikipedia dilutes the process of education. So in the future we will basically ride on confidence instead of ability. I guess there are millions who think they are as clever as any expert. The difference with degrees and advanced degrees, like PhDs, is that you actually have to prove it - that you are not just generating overconfident hot air.
To validate that excerpt up there, you would need to prove, with a sufficiently big sample, that on average people in all disciplines with a degree make at least as many mistakes as John Doe from Humdrum.
Yes, occasionally John Doe might be cleverer, but is that the case so often that it is statistically significant?
[edited by: mattg3 at 2:43 am (utc) on Feb. 28, 2007]
bmw.com is an authority site on BMWs; Wikipedia is not. caranddriver.com is niche; Wikipedia is generic. Let's not make this thread even sillier with nonsense like this. Niche authority is plain. General-interest sites are too.
Joe-blows-car-blog.com is niche, too, and Johann-brauns-bmw-site.com is even "nichier." That doesn't mean either is better than Wikipedia's articles on cars or BMWs. More to the point, if either site is better, there's no way for Google to know that except by counting and analyzing inbound links. Google is a search engine, not a human-edited directory--and in any case, one man's meat is another man's poison (or vice versa), as we've seen in this thread.
[en.wikipedia.org]
There is a lot more going on there than meets the eye. When you have that much content about almost everything, you are going to rule. Not to mention those 2.5 million links mentioned earlier. ;)
However...
When I check wikipedia's stats at alexa.com (hope that site reference is OK, Tedster), I do not notice any real jump in their traffic that would correspond to a recent increase in top positions in the SERPs.
(A curious drop near Christmas 2006, but I suspect that is more a relative drop due to the temporary advance / interest of other shopping-related sites...)
Other than that, the present numbers seem about the same as the pre-Christmas 2006 numbers. Just a natural growth due to a natural growing popularity of the site, but nothing dramatic that might suggest a sudden shift in google SERPs favoring wiki...
I was also beginning to subscribe to the conspiracy theory centering around wikipedia, but now I don't think so.
I'm sure I'll get slapped on the hand for this, but its theme is probably more completely optimized than any other site's. DMOZ has pages, but the wiki has PAGES. So it doesn't matter how accurate the info is; the keywords being in the right places give it power. And the 2 million-plus links give it speed. (It's a right-brain thing.)
Structurally, the wiki is made in the image of its father (the cybernetic web). So in a way, it's not competing for the top spot. It's occupying its space by right of birth.
Just because someone signs their name to an article and has a biography published most definitely does not mean that they are unbiased, agenda free or even accurate in what they say.
By knowing who the writer is, the reader can filter what the writer stated based on who the writer is and what their bias might be.
The standard in scientific publication, certainly when it comes to using material for citation, is publication following peer review, not simply an editorial process.
Agreed, and Wikipedia IS NOT a peer-review process, because again everything is anonymous, and thus we have no idea who and how many people actually reviewed a given article, and no way of knowing what their qualifications were to review said article.
They are not generally in my market.....they do show in the top ten from time to time!
The answer is links, and the idiots that encourage them.
In years to come they could be a significant player in my market; however, due to their non-commercial nature they are only likely to upset my buyers by wasting their time on intellectual nonsense!
Worst-case scenario: I have 5 wiki links above me... all worthless... when someone hits the 6th search result they are much more likely to buy... simply because they have run out of patience with dumb search results by then!
Even worse scenario: the Internet becomes unusable because the search results are junk!
Maybe it's time to concentrate on the things they just can't do well, because of technological constraints or their non-commercial ..... nature.
This is a good suggestion and is something I have been doing with parts of my site. It can be highly effective. There are just some things that Wikipedia can not do for technical reasons. This can provide a weakness that can be exploited with great success when developing one's own site.
In years to come they could be a significant player in my market; however, due to their non-commercial nature they are only likely to upset my buyers by wasting their time on intellectual nonsense!
Like it or not, Google's stated corporate mission is to "organize the world's information [or 'intellectual nonsense,' to use your term] and make it universally accessible."
Wikipedia's mission as an encyclopedia is to provide users with information on a vast range of topics.
Your mission is to sell things.
Which mission (providing information or selling things) is likely to fit in best with "organizing the world's information and making it universally accessible"?
If you have used the wiki site, you know that it has a lot of information about a lot of subjects. It is like having an amazingly large encyclopedia at your fingertips. However, you also know that it is just a starting point for any real research, because the information you find may not be accurate and will probably have changed the next time you visit.
The people at Google are obviously aware that the wiki is consistently on top of the SERPs. G seems to be playing around with its algo lately, and the wiki is still holding strong. So the question remains: why does Google WANT Wikipedia.org on top of the SERPs? [conspiracy theory] Or, to phrase it another way, why is Google's algo ALLOWING them to be on top? [Is the wiki the perfect model for Google that we should all be following?]
I don't want to see wikis in the search results for products and services, and I don't want to compete with them for eyeballs when trying to sell my products and services. The more off-target sites there are, such as About and Amazon (few services there), the more frustrated users get, and the more they have to click on ads. Some might think that is OK, but Google has forced a lot of us advertisers out. So we are f'd.
Don't know if this issue is the cause but my Y and MSN visitors have tripled in the last 4 months.