|Search Engine Algorithms, Inbound Links, Website Rot|
The Difficulties in Ranking as an Authority
| 6:26 am on Mar 23, 2006 (gmt 0)|
One of the biggest challenges a webmaster faces is to build and maintain an “up-to-date”, content-rich site ... and achieve the recognition the site deserves as an authority site.
Aging of Authority Sites and the Putrefaction Factor
As the internet ages, so do established websites.
The unfortunate part is that there's a lot of old, very old information rotting on the vine and putrefying the web. The information contained on some very old sites is long past being useful. Site owners who may have moved on to bigger and better things, or who do not have the time or inclination to update the info, just keep throwing money at the registrar in order to keep the URL live, because it happens to be a great URL and they may have plans to sell it or intentions of updating the info “someday”.
In their day, these so called “authority sites” were very useful and therefore had many, many inbound links which they have managed to keep because human beings are inherently lazy. Very few webmasters delete links to old sites.
Along comes a web author looking for authority sites in a specific field, about which they plan to write. They take a cursory look at the sites which pop up in their keyword searches and, because they don’t know beans about the topic (and because they are too lazy to go past page one of the search results), they erroneously assume that because a particular site comes up for many different keyword searches, it must be “the” authority site for that particular topic.
After writing their article (including old or erroneous information gleaned from the very outdated site(s) they used as reference), they proceed to link to them, perpetuating the myth that this particular site is “the authority”!
... And the Rich get Richer!
It is infuriating to work your butt off, writing a truly content rich site which is clearly head and shoulders above the old, "established" sites ... only to have it beaten consistently by these outdated sites containing erroneous information, broken links and photos which no longer represent the person, place or thing to which they are referring. In my sector, some so called “authority sites” need only mention the keyword or phrase in order to come up number one in the search engines!
My site on the other hand may have 15 pages of in depth information, photos, data and links for a specific topic ... yet it is relegated to fourth or fifth place (or even further down the list) for certain keyword searches behind these very old sites with virtually no content at all.
I am in the travel industry and own a niche site for a very small country. There is no question (in anyone’s mind) that my site contains much more information than any other site (including the Tourist Board, other government sites and other more established sites) for this particular little bit of the world. Yet older, so called "authority" sites and sites with keyword targeted URLs and not much else to offer, often beat me in the search results.
Years ago, another site in my little area, had the foresight to buy “THE” perfect URL ... even before the government ever dreamt of having their own website. Very smart indeed. Years ago, it contained some very useful information and attained many, many inbound links. This was back in the day when webmasters actually willingly “gave away” links to any sites they deemed useful. But times have changed!
This site (with the perfect URL) called itself a "directory" and claimed that it represented every company in the country. That was a flat out lie. The truth is that they charged $400.00 annually for all outbound links and represented less than 20% of the companies in the country. It too was allowed to rot on the vine for many years with little or no new content of consequence. It was a cash cow!
Yet another "authority" site is currently charging $1,700.00 per annum for a link from its PR7 site! This site belongs to an official organization to which my company belongs and to which I pay annual membership fees. Sorry, but their site has WAY less information than my site does ... and I simply refuse to pay them for a link!
How to Compete if Yours is the New Site on the Block?
So ... along comes a small business owner (me), desperately trying to do business on the web. I can’t afford to pay the outrageous amounts some “authority sites” are demanding in return for a link. What to do?
I was told by those in the know here at WebMaster World to “build content and they will come”. And it's true! A good content provider will be recognized in time ... lots and lots of time!
The search engines slowly but surely began to discover my site and listed it “somewhere” in the SERPS. Respectable ranks were achieved (in time) and my “new site” began to receive traffic and a modicum of recognition from other webmasters. Mostly though, it received a lot of attention from travellers on travel forums looking for information. I thought that was great and expected the site to soar in the rankings ... but it never really happened because forums are too easy to spam and the search engines know it.
I am just guessing here, but I think links from forums are not deemed terribly important by the search engines ... and therein lies the tragedy! The very people you are targeting are "voting" for your site by linking to it, but the search engines do not give those links any sort of "real" value because forums are too damned easy to spam. Sigh!
My site has never seemed to measure up to the “big boys on the block”, in regards to ranking. Why? Because the web has changed and webmasters expect something in return for a link or choose to steal your content. It's a vicious circle!
Now what should I do if I still can't afford to or do not wish to buy links? Build more content of course! And away I go ... writing fiendishly, spending months doing research, spending thousands on camera equipment, trying to learn as much as possible about photography, spending more money travelling to various places to take photos, hiring planes and helicopters to take aerial photos ... more writing, more research, interviews with people, travellers and other business owners, locals who have lived in the area for 50 years or more ... and so on, adding content, lots of original content. Now surely, my site will be recognized as an “authority site”? Well ... maybe not!
Inbound and Outbound Links
I link to all sites in the territory for free because I still believe we should all be looking out for the greater good and we (webmasters) should link "freely" to anyone and everyone with content which our readers and potential clients may find useful.
Despite what Google or any other search engine thinks, I even link to sites which have been deemed "bad neighbourhoods". Where I live, very few business owners are webmasters, nor do they understand the first thing about search engine rules and penalties. As a result, they are subject to mistakes made by their webmasters ... through no fault of their own.
I will link to anyone who is deserving of a mention on my site. Full stop. I don't care what the search engines think about this practice. I only care about what my potential clients and readers think.
Once again, I live in a very small country and the people here are not necessarily very worldly or educated in the ways of the internet. It is unforgivable to penalize them because their webmasters (who are not very internet savvy themselves) broke some unwritten rule of which they are completely unaware!
Call me silly or stupid if you wish, but I will continue to do what I think is right and fair in this regard and if my site gets penalized for doing so, then so be it!
In a perfect world
The truth of the matter is that ranking a website based on inbound links is an outdated premise which must be abandoned. In theory, and in a perfect world, it should work ... but what the search engines did not take into account (back in the day) was that the world is not perfect and human beings are flawed. We are greedy and (to a degree) morally corrupt. If that weren’t true ... there would be no need for organized religion or laws of any kind.
So of course, as soon as webmasters began to understand the value of an incoming link ... it became a type of currency which could be bought, sold or traded. The true value of the website being linked to often doesn’t even come into consideration when webmasters choose to link or not to link.
It is not the fault of the search engines that my site and hundreds of thousands of other deserving sites do not achieve the recognition they deserve. It is the fault of lazy and/or greedy and/or corrupt webmasters! Many (lazy) webmasters do not refresh their links on a routine basis, leaving links to outdated sites in place and failing to look for new, better or more deserving sites to which they might link. Some webmasters choose to steal content rather than providing a link to the site which rightfully owns the content.
Stealing content creates a nightmare for search engines, which must use logic to determine which site owns the content and which site is just regurgitating the original.
One thing I have found is that I can write content, and two years later, someone steals it. I then update my page, and "some" search engines suddenly "assume" that because my page is now newer than the copy ... the site with the oldest date must be the original owner! This is really annoying.
Google on the other hand assumes that the site with the most inbound links should be recognized above the original owner! This is particularly infuriating!
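The attribution problem described above can be made concrete. Below is a toy sketch (entirely my own illustration, not any search engine's actual method) of the kind of logic an engine might use: compare two pages by their overlapping word "shingles" and, if they look like duplicates, credit the copy that was crawled first. It also shows exactly why updating your own page can backfire under such a heuristic.

```python
# Hypothetical sketch of duplicate detection and "who owns it" attribution.
# All names and thresholds here are invented for the illustration.

def shingles(text, k=4):
    """Return the set of k-word shingles (overlapping word runs) in text."""
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def similarity(text_a, text_b, k=4):
    """Jaccard similarity of the two pages' shingle sets (0.0 to 1.0)."""
    a, b = shingles(text_a, k), shingles(text_b, k)
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)

def likely_original(page_a, page_b, threshold=0.5):
    """If the pages look like duplicates, guess that the earlier-crawled
    copy is the original. Pages are dicts: {'url', 'text', 'first_crawled'}.
    Note the flaw: if the true author rewrites their page, its shingles
    change and the thief's stale copy can end up looking 'older'."""
    if similarity(page_a['text'], page_b['text']) < threshold:
        return None  # not duplicates
    return min((page_a, page_b), key=lambda p: p['first_crawled'])
```

Under this kind of first-seen heuristic, the complaint in the paragraph above follows directly: recency of your own edits works against you once a copy exists.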
The End User
Because the world wide web is the single largest entity containing an almost infinite amount of information, the challenge posed to any search engine is to be able to determine (algorithmically) which site is more deserving to be recognized as an "authority" than another. In fairness to all search engines and taking into account that human beings are inherently greedy, this is pretty much an impossible task.
The best any search engine can ever hope to do is "come close" to delivering what a surfer wants to find. That's why there are often millions of results for any given keyword or phrase. It is up to the searcher to determine which sites they want to look at.
So how do searchers determine which sites to look at within all those sites being presented? The search engines can only "suggest" which sites their algorithm has determined to be the most relevant based on their limited, mathematical ability to make such a determination. Right? ... right.
The various elements included with each listing are intended to help a searcher determine which site they'd like to visit. Those elements are: title, description, URL and the "similar pages" link.
But once again, searchers (being people instead of machines) are inherently lazy and they will almost always click on the first result regardless of the elements contained in the search results being offered. As a result, if they don't find what they want immediately, they may look at the next few results or eventually refine their keywords out of frustration.
Each element is important (to a degree) and all are pretty much self-explanatory.
One would "hope" that the webmaster responsible for creating the page will carefully craft his/her title in order to describe, in 7 words or less, what a particular page is about.
Note: 7 words is debatable ... but let's assume that this is the magic number for the sake of argument rather than getting into a discussion about how many words work best on which engine. OK?
Many believe that meta tag "descriptions" are useless and outdated. I disagree. Most of my descriptions are used by all the search engines. A description is intended to be an accurate synopsis of what the surfer will find on a given page.
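For what it's worth, the title and description guidelines above are easy to sanity-check mechanically. Here is a small helper (my own illustration; the 7-word and 160-character limits are just this thread's rules of thumb, not official search engine numbers):

```python
# Toy checks for the "7 words or less" title rule of thumb and for a
# meta description that reads as a short synopsis. Limits are assumptions.

def check_title(title, max_words=7):
    """Return (ok, word_count) for a page title."""
    count = len(title.split())
    return count <= max_words, count

def check_description(description, max_chars=160):
    """Return (ok, length); empty descriptions always fail."""
    length = len(description.strip())
    return 0 < length <= max_chars, length
```

Something like this could be run over a whole site to catch pages whose titles have crept past the budget or whose descriptions were never written.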
I happen to use URLs like www.mysite.com/photos_this_topic.html or www.mysite.com/information_this_topic.html
These are key elements used by Google to help determine what a page is about and they do factor into search results ... though I don't know what weight they are given.
This is where Google falls flat on its face! Suppose you are searching for something which has little to do with the main topic of a site, but the site happens to contain very relevant info on the topic you searched. If you click on "similar pages", you will likely find absolutely nothing about the topic you searched for!
This is because your site has been "classified" with your main topic through various other web resources including DMOZ and other directories.
This is a very basic and immense flaw in the current system which I have been waiting years for Google to address!
If you own a website which sells car parts, but you write the absolute best and very detailed history of the automobile ever written ... you may never achieve your goal of being recognized as the best and most authoritative website on the internet in regards to the history of the automobile simply because your site is classified under "car parts".
The "similar pages" link is where Google and all search engines and directories are really lacking in "intuitive or smart" cross categorization for medium to large sites with exceptional content.
In Defense of Google
I am a widget broker. I built my own web site (with the help of the folks here at WebMasterWorld) because I could no longer afford to pay webmasters to do it for me and was within days of becoming homeless when my website tanked on Inktomi. Back then, Ink was "the" force within the search engine industry. I believe that Google has recognized the inherent flaw in its original premise of ranking sites based on "popularity votes" via inbound links from other web sites. Links have become a "currency" which can be sold, bought and traded in order to manipulate search results. As a result, manipulation of search engine results has flourished and must be stopped in order for search engines to survive and remain useful on any level.
The following are assumptions on my part based upon my very limited and possibly flawed understanding of how search engines work ... so please take that into consideration prior to slagging me off for defending Google.
For some time, they (Google) have been grasping at straws, trying to prevent outside manipulation of search results (by unfair means) by implementing penalties, filters, etc. I believe Google has recognized the inherent flaws associated with search engine results based on any one or even three or four values being used to determine relevance. As a result, they have decided to experiment with many different factors in an attempt to emulate the deductive reasoning of the human brain.
We all know that (so far) humans can still outdo any machine in determining the true value of a website by simply surfing to it and reading the content. But it doesn't matter how big and wealthy a search engine becomes, there is no way they can hire enough humans to do the job! No company on earth could afford to do that! It must be done through an algorithm and that's a pretty lofty goal!
However, human beings are (as already noted) ... inherently flawed. We are greedy and not always morally concerned when it comes to money!
So my assumption is that Google's goal and newest algorithm/infrastructure will be based upon many and varied factors, including many of those things a human being (given sufficient data) would be able to detect or take into account if intimately familiar with a given topic and reviewing a website. Those things would include:

- PR or PageRank [webmasterworld.com], representing votes from other websites
- TR or TrustRank (which has yet to be added to the WebmasterWorld glossary), based upon the age of a site as well as the quality of sites linking in
- Stemming [webmasterworld.com], which has to do with variations of the root word used on a web page, such as boat, boating, boater, boats or yacht, yachting, yachts
- Anchor Text [webmasterworld.com] or Off-page Criteria [webmasterworld.com]
- Cloaking, Stealth or Obfuscation [webmasterworld.com]
- Cross Linking and blatant webmaster abuse by related sites [webmasterworld.com]
- Dead Links [webmasterworld.com]
- The use of doorway pages or domain doorways [webmasterworld.com]
- Splash or entry pages [webmasterworld.com]
- Filter Words [webmasterworld.com] and Stop Words [webmasterworld.com]
- Hidden Text or Hidden Links [webmasterworld.com]
- Keyword Stuffing [webmasterworld.com]
- Link Farms [webmasterworld.com]
- Link Rot [webmasterworld.com]
- Mirror Sites or pages [webmasterworld.com]
- Page Jacking [http]
- Reciprocal Links [http]
- Referrer Strings [webmasterworld.com]
- Spamdexing [webmasterworld.com]
- IP Spoofing [webmasterworld.com]
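To illustrate the idea of blending many factors (purely a sketch of my assumption above, not Google's actual formula), here is how varied signals and spam penalties might be folded into one relevance score. All of the signal names, weights and penalty values are invented for the example:

```python
# Toy multi-signal ranking sketch. Every name and number is an assumption
# made up for illustration; real engines keep their weights secret.

WEIGHTS = {
    'pagerank':    0.25,  # votes from other sites
    'trust':       0.20,  # age of the site plus quality of inbound links
    'on_page':     0.30,  # title, body text, stemmed keyword matches
    'anchor_text': 0.15,  # what inbound links say about the page
    'freshness':   0.10,  # how recently the content was updated
}

PENALTIES = {
    'hidden_text':      0.5,  # multiply the score by this if detected
    'keyword_stuffing': 0.6,
    'link_farm':        0.3,
}

def score(signals, flags=()):
    """signals: dict mapping signal name -> value in [0, 1].
    flags: spam techniques detected on the page."""
    s = sum(WEIGHTS[name] * signals.get(name, 0.0) for name in WEIGHTS)
    for flag in flags:
        s *= PENALTIES.get(flag, 1.0)
    return s
```

The point of the sketch is simply that no single factor dominates: a page strong on content but weak on links can still outscore one propped up by links alone, and detected manipulation drags the whole score down multiplicatively.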
Much of the above is related to guerrilla marketing techniques intended to manipulate search results. Ultimately Google wants to put a stop to the manipulation of their algorithm so that they can level the playing field for all available "content" versus those who choose to steal, manipulate and ultimately control any given field of information.
I for one applaud Google's efforts and although many have suffered tremendously throughout the growing pains, Google and other search engines have supplied many of us with many years' worth of free traffic.
They have all undergone massive changes over the years and from time to time, there are those of us who think Google, Inktomi, Yahoo or MSN just plain suck ... and you are entitled to your opinion just as much as I am entitled to mine.
I don't think any of the search engines suck because all of them send me the traffic I need to make a living. Some send more than others and some drive me crazy because I just can't figure out what they want ... but the fact of the matter is, this is how I make my living and I accept the fact that I will just have to do my best to figure it out the best I can!
I think the folks at Google are the most incredibly focused group of people on the planet who have one goal ... and that is to gather all the written information in the entire world, categorize it and deliver the most relevant search results possible for any specific search term.
Anyone who thinks about that for even five minutes will realize what a monumentally impossible task that really is. Kudos to the people at Google for even making such an attempt ... and many thanks for the free traffic you deliver to my site daily! It is greatly appreciated.
If my traffic ever dries up (again) ... I promise, I will not be slagging Google (or any other search engine) off as I once did. I "get it" now ... and am in awe of what they are all attempting to do.
I hope that eventually, they are each able to stop outside, biased and greed driven manipulation of the search results. Even if my website should suffer throughout all the growing pains, I believe in what they are doing and will accept that they have to do what they have to do to achieve their goals!
I hope at least some of you will agree ... but even if you don't, that's OK with me. I'm used to my opinions being quite some distance from the accepted norm.
Now ... if Google or any other search engine were to require payment to be considered for a listing in their search results, I would not be nearly as idealistic. I would pay whatever fee was required and then I would employ every guerrilla marketing tactic available to man in order to fight and claw my way to the top!
The way I look at doing business on the internet
I work very hard to provide the content the search engines like to use (for free) and for their own profit to supply their surfers with the answers they are looking for. I don't charge the search engines to spider my site and in fact I welcome them!
In return, the search engines send me the traffic I need to make a living. We (the search engines and myself) enjoy a symbiotic relationship. There are no short cuts to being successful (free of charge) on the internet which I find acceptable. Provide the content and you will be rewarded with free traffic. Try to cheat me out of the traffic I have worked so hard to cultivate by working hand in hand with the search engines ... and I will report you through any means possible! I am sick to death of content thieves and will take any and all infringers to task!
Surfers want what they want and I do my best to deliver it for my specified field. I get traffic in return from the search engines. What I do with the free traffic received from search engines is up to me and hopefully, once I get an inquiry, I will perform my job well enough to convert surfers to buyers!
Jeeze I'm tired now! :)
[edited by: trillianjedi at 5:25 pm (utc) on Mar. 24, 2006]
| 5:11 am on Mar 25, 2006 (gmt 0)|
I agree with you, Liane.
I have thought for a while now that the link system that the search engines use is outdated. The search engine results are no longer "created" by people who use the web, they are created by people who make the web. Even if the world were perfect and everyone played nice (no SE spam), the way the SEs use linking is outdated.
Back in the day, it was easy for anyone on the web to start a website. The people that made websites were really the only people who used the internet.
That has changed. Yeah, yeah, I know that anyone who wants can start a blog... but that is not the same. Webmasters now decide what pops up in search engines, but webmasters only make up a very small part of the internet population. From a marketing standpoint, this is a very bad thing for the SEs.
This may be why so many web developers feel that the results are bad. We know in our heart of hearts that it's not really what the public wants to see.
| 6:15 am on Mar 25, 2006 (gmt 0)|
|So my school report on Google is: Nice personality, means well, trying hard, but failing to meet required standard. |
Sorry nedguy, but your post exemplifies the reason I began this thread! It appears you have missed my point entirely ... OR, perhaps you have simply chosen to ignore it and took this opportunity to reiterate what others have written in countless threads on WebmasterWorld.
It's OK ... we don't have to agree. I was more or less trying to say:
Webmasters are generally nice people who mean well. "Some" work hard while others hardly work ... preferring instead to "borrow" content from others, use dodgy SE ranking tactics, fail to maintain up to date info on their sites and fail to meet many accepted professional standards.
| 9:33 am on Mar 25, 2006 (gmt 0)|
>>"It appears you have missed my point entirely ... OR, perhaps you have simply chosen to ignore it and took this opportunity to reiterate what others have written in countless threads on WebmasterWorld."
Yes, I think I missed your point entirely.
You were talking about the stale and duplicated nature of web content and how well Google deals with it. I was talking about Google and searchers, and how badly Google deals with them.
MY point is that your point is made irrelevant when Google gets the basics of its job so wrong.
| 11:25 am on Mar 25, 2006 (gmt 0)|
Every search engine/user experience conflict has a solution, and in this case a very simple one.
If you must link to a bad neighbourhood to complete the user experience, do so with a scripted link, not an HTML link.
You don't have to let Google do all the thinking.
| 5:46 pm on Mar 25, 2006 (gmt 0)|
|In my sector, some so called “authority sites” need only mention the keyword or phrase in order to come up number one in the search engines! |
It isn't just long-established "authority sites." For some Google searches, I see keyword-driven, template-based, computer-generated pages ranking at or near the top in Google searches for which they have little or no information. This was a bigger problem a number of months ago, but it's resurfaced in at least some BigDaddy searches (where, with luck, it will be a temporary problem).
Note: My comment above is pretty objective, I think. I do well in Google, for example (better than the button-pusher sites for most keyphrases that I track), so my observation wasn't made out of bitterness.
As for the unduly high rankings of "authority sites" that are never updated, I think there's some truth in what Liane says, but I also think that seldom- or never-updated "authority sites" can be more useful than newer sites in many cases--especially in this era of Adsense-powered, crank-out-millions-of-pages button-pusher sites, when only a small fraction of the new pages being published on the Web are likely to be of any value to the user. Most of the time, I'd rather read a five-year-old page at [a European tourist capital].org than a newer computer-generated page at a button-pusher site, if only because the latter is likely to consist of nothing but a headline and an invitation to submit a review.
| 6:21 pm on Mar 25, 2006 (gmt 0)|
I'd like to take the discussion back a couple of steps to the point where it sounded like putting good content out there should come first and the googles of the world will get it right eventually, although perhaps a long time off.
|I work very hard to provide the content the search engines like to use (for free) and for their own profit to supply their surfers with the answers they are looking for. I don't charge the search engines to spider my site and in fact I welcome them! |
Regarding my testimonial suggestion, Liane wrote:
|Actually, as much as I appreciate your viewpoint, I simply disagree with the whole "patting oneself on the back" sort of thing some websites include ... including "testimonials" from clients. |
I agree, but I still see an opportunity. Someone, who knows a bit about what you are offering, but has never done it, would probably appreciate reading something written by someone else, who has tried it. The added benefit of someone else's written account on a site is that new visitors to the site may feel that the site is more personal than a site that is completely written by the (often anonymous) webmaster.
I put my picture on my site as a way of building trust with my audience. I like to think it gives the site a warmer, friendlier feel. You'd be surprised how many people, who call me on the phone, ask me if that's really me in the picture. They sound appreciative when I say yes.
| 11:06 pm on Mar 25, 2006 (gmt 0)|
|If you must link to a bad neighbourhood to complete the user experience, do so with a scripted link, not an HTML link. You don't have to let Google do all the thinking. |
|Most of the time, I'd rather read a five-year-old page at [a European tourist capital].org than a newer computer-generated page at a button-pusher site, if only because the latter is likely to consist of nothing but a headline and an invitation to submit a review. |
Granted and your point is well taken EFV. I see this with some of the larger "review" type sites in our sector too. They are (almost) completely useless ... yet they seem to rank well. Sigh.
|I put my picture on my site as a way of building trust with my audience. |
Jeeze ... I don't want to scare off potential customers! (Just kidding) It's a good suggestion and after I lose about 20 pounds I will probably do it! ;)
|Someone, who knows a bit about what you are offering, but has never done it, would probably appreciate reading something written by someone else, who has tried it. The added benefit of someone else's written account on a site is that new visitors to the site may feel that the site is more personal than a site that is completely written by the (often anonymous) webmaster. |
Well ... you may have convinced me. Rather than redirect the thread, I have started a new thread about testimonials [webmasterworld.com].
| 12:13 am on Mar 26, 2006 (gmt 0)|
|Jeeze ... I don't want to scare off potential customers! (Just kidding) It's a good suggestion and after I lose about 20 poinds I will probably do it! ;) |
No need to wait, that's why they invented Photoshop! (trust me, I know, my photos on my site look much younger and thinner than the real life version)
|Well ... you may have convinced me. Rather than redirect the thread, I have started a new thread about testimonials. |
You're right about testimonials often being a bunch of BS. I looked at the new thread and people seem to agree, but the photo angle is another story...
What I'm talking about is more like an account of a cruising vacation adventure. People love reading them, and others love writing them. Check out a site called noonsite, a global site for cruising sailors, and read letters from Jimmy Cornell, or the accounts of Dee Caffari, a woman doing a solo sail around the world the wrong way right now, or recall the numerous articles about Ellen MacArthur making her record (she's in the news again along with Dee Caffari on sailing.org).
I'm sure your clients don't have life and death struggles in wild ocean conditions, but I bet that they have some great stories to tell, and even more importantly some great pictures to go along with them that you could display. Giving them a forum to show off all that good stuff might be a winner. You'd give them a way to show off their trip to their friends, and a way to generate some new business for you.
| 12:49 am on Mar 26, 2006 (gmt 0)|
|No need to wait, that's why they invented Photoshop! (trust me, I know, my photos on my site look much younger and thinner than the real life version) |
Yes, but does Photoshop have a "fun house" mirror type setting which automatically transforms the person in the mirror? ;)
|Giving them a forum to show off all that good stuff might be a winner. You'd give them a way to show off their trip to their friends, and a way to generate some new business for you. |
Ermmm, there are already several sailing and travel forums specific to my part of the world which have trip reports, photos, etc. I don't want to reinvent the wheel or appear to be stealing their thunder. There is one very active community in particular which always supports me but which would view that sort of thing with less than amusement.
I'd have to be very careful if I strayed into that arena. Not saying it's not possible ... just that it could get dicey.
| 1:42 am on Mar 27, 2006 (gmt 0)|
Nice post with plenty of food for thought.
My reflections: The Law of Buzzword Compliance dictates that a search engine will soon get with "Web 2.0" and start using user feedback to adjust search position. If they do it right, it can overcome many of the limits inherent in a machine algorithm and revitalize the idea that search rank reflects popularity. The original page rank algo took the position that a link was a vote; this made it too easy to stuff the ballot box.
The most useful function of user feedback would be to enable consumers to demote lousy sites that are good at playing SEO - since most casual users look at the top ranked pages, the feedback would be concentrated among the most valuable real estate. A 'rate this site' built into a toolbar would be very nice.
There are a few sites that could evolve into this (stumbleupon comes to mind).
side note: I don't think that 'we webmasters' or the "webmaster community" is to blame ... I sure didn't contribute to the problem. Not sure that anyone can be blamed, although it is easy to throw darts at black-hat SEOs. The financial rewards of getting traffic make it a system begging for abuse.
| 3:24 pm on Mar 27, 2006 (gmt 0)|
|The Law of Buzzword Compliance dictates that a search engine will soon get with "Web 2.0" and start using user feedback to adjust search position. If they do it right, it can overcome many of the limits inherent in a machine algorithm and revitalize the idea that search rank reflects popularity. The original page rank algo took the position that a link was a vote; this made it too easy to stuff the ballot box. |
|A 'rate this site' built into a toolbar would be very nice. |
Nice idea ... but wouldn't it simply become "positive user feedback stuffing" instead of "ballot box stuffing"?
Whether search engines or webmasters like it or not, links will always be a natural and key way of determining a website's popularity, weight, and importance.
I think the focus should be on improving the quality of the link algorithm: how a link is weighed and how its importance and relevance are analyzed.
For instance, I think that link age is a bit overrated. I don't agree that a site that has kept the same set of links for a long time should carry more weight, or have any advantage in importance over a site that gets a lot of new links. It really depends on how they are acquired, etc.
... not saying that the entire idea of user feedback should be disregarded either... ;)
| 4:14 pm on Mar 27, 2006 (gmt 0)|
Check out Google's patent filings.
They look at the staleness of content, of inbound links, and of outbound links.
They also look at the content itself.
So they consider more than just inbound links.
We shouldn't assume that Google does the same thing to all sets of results. They may have different strategies for different verticals. Some verticals, like 'real estate', are more prone to spamming. Some, like 'how to fly to the moon', may not be.
I know a law firm that's on the top 5 slots for two very highly competitive keywords in the industry. There are virtually no in-bound links to the site for those keywords. But the site is really and truly, genuinely informative on the subject.
Google is getting smarter.
| 4:45 pm on Mar 27, 2006 (gmt 0)|
|Check out Google's patent algorithm. |
Don't put too much stock into patent filings, as I'm sure Google is not going to put the most important of their ranking techniques in a public document available to all of its competitors.
| 5:00 pm on Mar 27, 2006 (gmt 0)|
If you think about the factors needed to determine the importance of a page, independently from Google, in a brainstorming fashion, you get more or less what they came up with.
Obviously, the part they have hidden is HOW those factors come into play or work in conjunction with each other. Like the example of the law firm: how did they decide that website deserved to be on top without inbound links?
Furthermore, on some subjects there are no inbound links; no one is linking to each other. In those areas, you have to consider more factors than simple links.
My advice is this: don't take PR too literally.
| 6:15 pm on Mar 27, 2006 (gmt 0)|
|Nice idea ... but wouldn't it simply become "positive user feedback stuffing" instead of "ballot box stuffing"? |
Not necessarily ... each vote could be tagged with an anonymous user ID. If someone gives schlock sites high votes, all votes made using that ID could be discounted or dropped altogether. New users would start with a low trust score and would not have much influence on the voting. As they ranked more sites and it was determined that they weren't stuffing the ballot box, their trust ranking (and the value of their vote) would be increased.
Of course, there will always be people who invest time to game the system, and they might succeed to a degree. And any formula will have limits. There will never be a perfect system. But there could be a better one.
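A minimal sketch of what such a trust-weighted voting scheme could look like. All class names, thresholds, and trust increments here are my own illustrative assumptions, not any search engine's actual system:

```python
# Hypothetical trust-weighted site voting, as a sketch.
# Each anonymous user ID carries a trust score; a vote's weight is
# that trust. Users whose votes track the weighted consensus gain
# trust; outliers (potential ballot stuffers) lose it.

from collections import defaultdict

class TrustVoting:
    def __init__(self):
        self.trust = defaultdict(lambda: 0.1)   # new users start low
        self.votes = defaultdict(list)          # site -> [(user, rating)]

    def vote(self, user, site, rating):
        """Record a rating in [0, 1] for a site under a user ID."""
        self.votes[site].append((user, rating))

    def site_score(self, site):
        """Trust-weighted average rating for the site."""
        pairs = self.votes[site]
        total = sum(self.trust[u] * r for u, r in pairs)
        weight = sum(self.trust[u] for u, _ in pairs)
        return total / weight if weight else 0.0

    def update_trust(self, site):
        """Reward users whose rating is close to the consensus."""
        consensus = self.site_score(site)
        for user, rating in self.votes[site]:
            if abs(rating - consensus) < 0.3:    # arbitrary threshold
                self.trust[user] = min(1.0, self.trust[user] + 0.05)
            else:
                self.trust[user] = max(0.01, self.trust[user] - 0.05)
```

With three honest raters and one ballot stuffer, the stuffer's trust drops after one update, so its future votes count for less.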
| 5:15 am on Mar 28, 2006 (gmt 0)|
|ranking a website based on inbound links is an outdated premise which must be abandoned. |
While I agree with much of what the original poster said, this one line sums up the problem(s) with search engines (mostly Google).
Nedguy's post, about the abhorrently bad user experience we are forced to endure simply because people don't know any better, is also right on the money.
The internet in 1999 was fun, exciting, wide open, challenging and not overcrowded. 7 years on, we have a spam-laden, corporate-controlled, overhyped pile of #*$!.
(I read a sports article on Yahoo which made me want to alternately puke and write letters to both the writer and his editor. Big surprise! No way to contact either. Yes, you will accept our garbage and not complain.)
But it's not just the internet that's suffering from mediocrity. Take a look around your world:
Do you think your government (local or national) is anything more than a feeding trough for useless political hacks?
How about the content of network television? I won't even go into the absurdly vacuous news content; I'd just like to watch a comedy that makes me laugh, or a drama that isn't another regurgitated cops-and-robbers plot with different characters.
I find the most interesting sites on the internet, not with much help from any of the search engines, but mostly from my own curiosity and following links from other good sites.
I find myself visiting the same sites more and more often --- shouldn't repeat traffic be just as or more important than links? Makes sense to me.
| 3:21 pm on Mar 28, 2006 (gmt 0)|
"I find the most interesting sites on the internet, not with much help from any of the search engines, but mostly from my own curiosity and following links from other good sites. "
| 3:53 am on Mar 29, 2006 (gmt 0)|
AdSense and Analytics are here to help Google to better index the web.
Say you are not the "authority" for something, with few inbound links and low SERPs. But Google knows, through AdSense and Analytics, that your visitors average more pageviews per visit and that they keep coming back.
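To illustrate how engagement signals like these could be folded into a ranking alongside links, here is a toy blend. The weights, function name, and normalization are purely hypothetical; nothing here is Google's actual formula:

```python
# Hypothetical blend of a link-based score with engagement signals.
# Weights and the pageview cap are illustrative assumptions only.

def blended_rank(link_score, pages_per_visit, return_rate,
                 w_links=0.6, w_depth=0.2, w_return=0.2):
    """Combine normalized signals into one score in [0, 1].

    link_score: classic link-based score, already in [0, 1]
    pages_per_visit: average pageviews per visit (capped at 10)
    return_rate: fraction of visitors who come back, in [0, 1]
    """
    depth = min(pages_per_visit, 10) / 10.0   # normalize to [0, 1]
    return w_links * link_score + w_depth * depth + w_return * return_rate
```

Under weights like these, a sticky site with few links can outrank a better-linked site that nobody revisits, which is the poster's point.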
| 3:59 pm on Mar 29, 2006 (gmt 0)|
A very interesting discussion is taking place here; it made me realize a few facts and cleared my head a lot about SEO.
My point of view will be a lot different than most of yours, since my site is more of a homepage than anything else. For businesses, site ranking means a lot more than it does for a non-profit site. But the satisfaction of seeing your work at the top of the search engine rankings is something common to everyone, and I am no different from anyone else.
Most of my site's traffic comes from motorcycle and travel forums and groups. But I have also seen traffic coming by word of mouth, with members of those forums going ahead and posting my site's link in another forum. Not because they are friends of mine or because I am going to give them something in return, but just because they liked what they saw.
I guess traffic from this kind of source is much more important than the traffic you will get from spam or other marketing tactics. Look at it this way: a friend of your customer is going to be interested in your product or service because he or she trusts your customer. That means turning a visit into a sale is going to be much easier than with a person lured in by false keywords.
But then again, we don't live in a perfect world, and most people will do anything they need to make that extra buck.
Just my 2 cents.
| 1:20 am on Apr 1, 2006 (gmt 0)|
Liane, very nice indeed.
I'd like to add.
Google's algorithm is flawed, and unfortunately we know it and spammers know it. Hence the ease of abuse. In 1999 the premise was that a search engine could rank sites based on links, just as books are ranked by the number of references to them. Unfortunately, making a reference to a book requires publishing another book (a daunting task), while it takes 30 seconds to create thousands of spam links. Flawed assumption.
Once spammers recognized it, there was no stopping them. Google can patch all they want, come up with filters and such, but the basic idea, the premise, is now DEAD. You can't distinguish a supermodel from an old mama based on the number of links.
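The "link as vote" premise being criticized here is the original PageRank idea. A minimal power-iteration sketch over a toy link graph shows both how it works and why it is cheap to game (this is a simplified version for illustration, not Google's production system; dangling pages simply leak rank here):

```python
# Minimal PageRank power iteration over a toy link graph.
# Each outbound link "votes" for its target. Because creating a
# link costs nothing, mass-produced spam links inflate a target's
# rank, which is exactly the flaw the post describes.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        # Every page keeps a small baseline (the "random surfer" jump).
        new = {p: (1 - damping) / n for p in pages}
        for p, outs in links.items():
            if not outs:
                continue            # simplification: dangling pages leak rank
            share = damping * rank[p] / len(outs)
            for q in outs:
                new[q] += share     # p casts equal "votes" for its targets
        rank = new
    return rank
```

On a graph where "a" and "b" link to each other and "c" links only to "a", page "a" collects the most votes, "b" inherits rank from "a", and the unlinked "c" is stuck at the baseline.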
Now, back to what you are trying to achieve. You want that #1 listing you think your site honestly deserves. Why? If it's just for pride, then I suggest swallowing it; the world is not perfect. If there's a substantial $ reward behind it, well ... you didn't assume that by reading Brett's outdated 16 steps you could become rich, did you? There would be many millionaires on this board then. Yet most of the folks here are habitual webmasters who make their living elsewhere. Why didn't the 16 steps help them? Because you need to get dirty to make $$$. How about you spam the hell out of Google for those 1,000 links that you need to be #1?...