| This 41 message thread spans 2 pages |
|Parked Domain Ranks #2 of 26,800,000 Results for Single Keyword|
Google Algo and Expired Domains
| 10:37 pm on May 9, 2003 (gmt 0)|
The #2 result on the main Google server for the single-keyword search "discount" is a generic Network Solutions page on an expired domain (a former discount broker) showing 0 backlinks. Shouldn't the algo have spit this page out already? As a user, this is an awful search result with 0 relevance.
Google serp shows:
|Web Page Under Construction |
Network Solutions - Original domain name registration and reservation services with variety of internet-related business offerings. Quick, dependable and ...
And it still ranks #4 on -sj server.
| 2:18 am on May 14, 2003 (gmt 0)|
I went and looked, and there are a number of higher PR sites that link to ndb, with the word discount in the link text... so in a sense, the results are not that surprising, when you think about it.
It is relevant because so many high PR sites link to it - regardless of the fact that it is a nothing site.
Googlebot needs to learn to recognize default server pages, under-construction pages, parked domains, and open directories, and list none of them. That would get rid of this little problem.
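The kind of detection being asked for doesn't need anything fancy. Here's a minimal sketch of such a heuristic, assuming a hand-picked list of registrar boilerplate phrases (the marker list and the 50-word threshold are illustrative, not anything Google actually uses):

```python
import re

# Hypothetical boilerplate phrases seen on parked/under-construction
# pages; a real list would be curated from observed placeholder pages.
PARKED_MARKERS = (
    "web page under construction",
    "this domain has expired",
    "domain name registration",
    "buy this domain",
)

def looks_parked(html: str, min_body_words: int = 50) -> bool:
    """Heuristic: flag a page when it combines registrar boilerplate
    with very thin body content."""
    # Crude tag strip and whitespace normalization.
    text = " ".join(re.sub(r"<[^>]+>", " ", html).lower().split())
    has_marker = any(marker in text for marker in PARKED_MARKERS)
    is_thin = len(text.split()) < min_body_words
    return has_marker and is_thin
```

Thin content alone isn't enough (plenty of legitimate pages are short), which is why the sketch requires both signals before flagging a page.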
| 2:34 am on May 14, 2003 (gmt 0)|
When nativenewyorker posted this thread on May 9th, the site in question had no backlinks in WWW and was number 2 for discount in www. Toolbar showed PR0.
This morning it's no longer in www, but it is still number 4 in www-sj, where it has 260 backlinks.
| 2:43 am on May 14, 2003 (gmt 0)|
|I don't exactly see how one bad result on the front page of a few SERPs is going to destroy google. |
The only people they matter to are those of you looking to find little faults, and the google hackers that are trying to make the best search engine they can.
I couldn't agree more. Google does a better job than anyone and you see WEBMASTERS complaining all the time.
What you don't see is USERS complaining. I offered a challenge before to find ONE example in the mainstream press of a NON webmaster complaining about google results being filled with useless pages.
Wow - there is a bad page when you search for discount brokers - big deal - anyone can see plenty of other discount brokers on the same page and anyone with half a brain wouldn't click on "index of donald so and so"
Oh my gosh - napster shows up as number 9 for mp3 - the sky is falling.
OH WAIT - #7 at MSN
NUMBER 11 AT LYCOS
NUMBER 14 at TEOMA
NUMBER 9 at ALL THE WEB
Plenty of search engines still list napster - yeah google could have done better - so could the others.
| 2:59 am on May 14, 2003 (gmt 0)|
>>I offered a challenge before to find ONE example in the mainstream press of a NON webmaster complaining about google results being filled with useless pages.
We are all users. We use Google daily to find stuff.
But I'll just mention myself here. Sometimes it takes me hours to find what I'm looking for on Google, and consider the fact that this is my 7th year on the net.
Google definitely has lots of room for improvement.
| 3:28 am on May 14, 2003 (gmt 0)|
I don't think Chris_R or I are saying there isn't any room for improvement. Those pages simply should not be there. What we are saying is that there is a tendency, by webmasters, to over-react to minor problems.
The SERPs would be better without ndb in there, but going out to page 5 I only find 2 problem results, including the ndb that has now been removed. <added>It is in fact one of the cleanest sets of results that I have ever seen in a commercial field.</added>
Read post 24 and apply our responses to that over-reaction.
| 3:46 am on May 14, 2003 (gmt 0)|
>Plenty of search engines still list napster - yeah google could have done better - so could the others.
It should be noted that, technically, Napster coming up page 1 for "mp3" IS a valid SERP. If you notice, Roxio still sells Napster T-shirts. There is an actual page up at napster.com. OK, it may not be much. However, Googlebot sees lots of pages linking to napster.com with "mp3" in the anchor text, and assumes that it is important to that page. There is NO way an algo could catch this. Only human editing of results would be able to handle this.
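The mechanism described above - inbound anchor text counting as evidence about the target page - can be sketched as a toy scorer. This is purely illustrative; the URLs are examples from the thread and the weighting is nothing like Google's actual (unpublished) formula:

```python
from collections import defaultdict

def anchor_text_scores(links):
    """links: iterable of (target_url, anchor_text) pairs.
    Returns, per target, a count of how often each anchor term
    appears in links pointing at that target."""
    scores = defaultdict(lambda: defaultdict(int))
    for target, anchor in links:
        for term in anchor.lower().split():
            scores[target][term] += 1
    return scores

links = [
    ("napster.com", "free mp3 downloads"),
    ("napster.com", "mp3 sharing"),
    ("example.com", "discount broker"),
]
scores = anchor_text_scores(links)
# napster.com accumulates weight for "mp3" even if its own page
# barely mentions the term -- which is why it can rank for it.
```

The point of the sketch: the target page's own content never enters the calculation, so a nearly empty page can still rank if enough links vouch for it.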
| 4:12 am on May 14, 2003 (gmt 0)|
Google PR people are not panicking and beating down the doors to WebmasterWorld on the "Google is broken" story because it's a beat-up.
The only people saying it are panicking, frustrated SEOs...
It should be very obvious by now to everyone that Google's main algorithm cannot successfully rank certain queries. It's true of many commercial or shopping queries. It's also true of queries that no "practical" searcher, as opposed to "theorist" searchers like us, would use. Such as "discount", "I never thought i would be Sadaams bagman", even "mp3". Hey, you get the results you deserve! What were you expecting when you typed in "mp3"? Surely you need to qualify that with at least an artist's name or a location?
When you think of it, of course an algo that is based on link patterns and on-page text analysis cannot keep up with the mass of curiosities, spam and smart SEO.
I don't see any evidence that Google is deliberately serving bad results to "game" webmasters into advertising on Adwords instead. It's just that some highly competitive, very vague, or isolated queries cannot be addressed effectively by Google's content-based, link-pop, "democratic" algo. Google's algo is great at ranking informational pages. That's what people go to Google for and that's what it does best. To expect Google to rank shopping sites is naive. That's why they have got Froogle. To expect Google to rank news items intelligently among lots of other info sites is also naive. That's why they have news.google.
In highly competitive or commercial cases Adwords provides a MUCH better solution to providing useful returns for such a query. As someone noted earlier, the Adwords results are much better than the main algo for that term.
Several years ago I thought that the time of the "mega" database was dead and we would have a web full of specialist search sites or "vortals", each with its own reputation - some for shopping, some for e-commerce, some for academic research, some for Upper Mongolia, for example. Google proved me wrong, but its move to Adwords as a more effective, and maybe even more cost-effective, method for webmasters to promote their sales sites was just a natural progression to both add a killer revenue stream AND provide better results overall to users with the combo.
I suggest that those who think "Google is broken" may just not have woken up to the fact that the Google algo will never be able to rank commercially oriented sites, which carry a large spam and SEO quotient, as well as it ranks information sites.
To state it clearly: we have to purge the notion, developed historically, that free SE listings are a standard promotion vehicle for primarily commercial or money-making sites. Online commerce is a big business now - not the cottage industry of 5 years ago, when SEO really WAS king.
It's very clear to users, I think. If you want news, go to news.google; if you want info, look on the left hand side; if you want to buy something, look on the right hand side; if you want to compare consumer product prices, go to Froogle.
They each use the most appropriate method to provide the best results in each case.
| 2:17 pm on May 14, 2003 (gmt 0)|
Chiyo, you are missing my points.
The site that interests me was an information site that told all about various online brokerage options and compared prices. Google has failed with the information on this site. There are thousands of other sites with empty directories still carried by Google.
The reason it has failed is because despite all the flip references to theming, Google has yet to implement on-page analysis to any significant extent. The entire game at the Googleplex appears to be link popularity plus anchor text.
I assume that once a page shows more than an empty directory, then word frequency and proximity, and font size, come into play. But it's an afterthought, and it clearly has a very low priority. How else do you explain empty directories that do well? How can you justify the restriction of content analysis to the anchor text on backlinks? Isn't this taking the easy way out? Does it even deserve to be called "content analysis"?
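For contrast, the kind of on-page analysis being asked for - word frequency plus proximity - is not exotic. A toy version might look like this (a sketch under my own assumptions, not anyone's actual ranking formula; font size is ignored here):

```python
def on_page_score(text: str, query_terms: list) -> float:
    """Toy relevance score: term frequency across the page, plus a
    bonus when two query terms occur close to each other."""
    words = text.lower().split()
    positions = {t: [i for i, w in enumerate(words) if w == t]
                 for t in query_terms}
    tf = sum(len(p) for p in positions.values())
    # Proximity bonus: inverse of the smallest gap between
    # occurrences of the first two matched terms.
    bonus = 0.0
    hit_terms = [t for t in query_terms if positions[t]]
    if len(hit_terms) >= 2:
        a, b = positions[hit_terms[0]], positions[hit_terms[1]]
        gap = min(abs(i - j) for i in a for j in b)
        bonus = 1.0 / gap if gap else 1.0
    return tf + bonus
```

Note that an empty directory page scores 0 under any scheme like this - which is exactly why it's surprising when one ranks.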
Compared to the several engines that can actually do clustering, Google is indeed broken. Well, I guess "broken" is the wrong word. It never appeared in "fixed" form to begin with, so technically it's not broken.
There's a massive amount of stuff that Google could be doing. I remember an engine a couple of years ago, which allowed you to paste in a paragraph, or more, into the search box, and it would analyze the content and return pages that were close. It worked very well. I can't even remember the name of that engine now; it was from Europe. Vivisimo and Teoma are doing useful clustering. Good stuff like this has a hard time making it when Google is so dominant.
Okay, I retract the word "broken." But Larry and Sergey are still spending way too much time on their Segway scooters. And while Google as an ad agency will make everyone there rich as soon as they file an IPO, I don't have to like it.
| 4:57 am on May 13, 2003 (gmt 0)|
Well, I agree that Google is no better or worse than other search engines in this regard, but it should be relatively simple (from a layman's point of view) to ensure that the link text is reflected somewhere in the page content.
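That sanity check - does the page body actually contain the terms its inbound anchors credit it with? - could look something like this sketch (function name and example strings are mine, purely for illustration):

```python
def anchors_supported_by_content(page_text: str, anchors: list) -> dict:
    """For each inbound anchor phrase, report whether every one of
    its terms appears somewhere in the page's own text."""
    body = set(page_text.lower().split())
    return {a: all(t in body for t in a.lower().split()) for a in anchors}

result = anchors_supported_by_content(
    "we compare discount broker rates online",
    ["discount broker", "mp3"],
)
# Anchors unsupported by the page text (like "mp3" here) could then
# be discounted, which would demote empty parked pages.
```

A real engine would need stemming and synonym handling before exact word matching like this was usable, but the principle is as simple as the poster suggests.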
| 5:27 am on May 13, 2003 (gmt 0)|
Hey, I haven't chimed in on this thread yet because people like Chris_R and chiyo are doing a very good job of making the points that I would make.
Is Google perfect? Of course not. There's still a lot of room for improvement in search. That's part of what's so exciting about being at Google to me--search isn't a "solved problem" by any means. With 200M searches a day, there will be some search results that are less than ideal. Personally, I think it's good to leave the result up for discount, because that result serves as a reminder that there are types of searches that we need to do better on. But I also don't think it's a disastrous result either, and we have to prioritize our efforts on what we think the biggest wins in quality will be.
The queries already mentioned are examples of stale info on the web. Stale info happens--I remember when doing a search for something like president of the United States brought up a Bill Clinton webpage a few months after Bush was elected. The perfect search engine would know that a page was stale for a query on election night and correct the results. Is any search engine anywhere near that? Nope. But that's no reason to stop trying.
So I wouldn't at all say that Google is standing still; we come to work every day thinking about how to make things better. We have to pick and choose what to work on, and I feel pretty good about the choices we've made already, and the things we're working on now at the Googleplex.
| 5:41 am on May 13, 2003 (gmt 0)|
I think this thread started out highlighting an interesting issue - and then degenerated.
As I've already said before in this thread: "when nativenewyorker posted this thread on May 9th, the site in question had no backlinks in WWW and was number 2 for discount in www. Toolbar showed PR0.
This morning it's no longer in www but is still number 4 in www-sj, where it has 260 backlinks."
i.e. you had to be there last weekend - we saw a page, just before it was removed, at a point where nothing made sense (how can you get to be #2 in www.google with PR0 on a competitive single word?). Once the page was removed, it made more sense to me.
This thread was never about this page's ranking in other search engines - where other search engines have links pointing at it and 'reasons', though outdated or stale, Google had already 'discounted' : ) the links. This was about 'why' - which was answered as the page got vaporised...
The page has now been gone from www. for a few days. Nothing more to see here.......
Keep up the good work GG - it must be a thankless job (especially around this time of the month) - so: