Forum Moderators: Robert Charlton & goodroi
complain about a real problem
Can't see the sandbox being a "real" problem, or a problem at all. It's there to stop the SPAM. If I start a new company called Mesothelioma Lawyers Ltd you reckon I should show up in the top 500 purely because that's my company name?
Sure, the "sandbox", whatever it is, hurts some people. It hurts those building sites for free traffic, many of whom are spammers and freeloaders. It also hurts others - particularly anyone starting a new business on a plan that relies on free SE traffic - and they are probably better off staying unemployed (or employed, if they can find a job). Any new site starting off on the premise that free traffic will sustain it deserves to fail.
So, if you remove the sandbox as a reasonable cause for complaint, and remove most of the other whining, we'd reduce this thread to one page, and those who can't even be bothered to read it would get a personal reply from Googleguy, because he owes them.
I realise ODP has its place, although it has not been all that the founders perhaps had hoped for.
My point is that while it may seem a good idea to hand-edit the SERPS, it is not a new one.
Yahoo! are doing it right now, and ODP of course
Are they on the right track?
Well, I think that the numbers will always be against them, but, more worryingly, human nature will also come into play, as with any kind of voting or voluntary editing system run by the public, and it will eventually cause more problems than it solves. IMO.
Google are in control of their SERPS at the moment and their algorithm (however flawed). I don't think they would (or should) hand it over to anyone.
The "outsourcing" idea some say Google are already using would not be a surprise; however, having been on the receiving end of a misguided Yahoo! editor recently, I really cannot support the idea at all.
A monkey hand editing the serps might do nearly as well as MSN.
It is not a big job to have a dozen or so quality inspectors looking over the top 100 results for a handful of the most popular queries in broad niches, correcting clear problems like hijackings and identifying webs of redirects or duplicate garbage. The engine that goes most strongly in this direction will become the dominant player in the market, if only because algos today are so weak.
Google usually has the best search results of the three engines, but far and away the best results on the Internet are the hand-ranked ones on Yahoo for a few specific terms.
I agree too, but that doesn't make a difference ;). We need a few G engineers to tweak the algo so innocent sites aren't banned (if you don't rank in the top 100-500 for your name.com, it's essentially a ban).
If G pushed these results live I believe 99% of the issues with this update would be solved. Is everyone else satisfied with the results being displayed on these dc's?
If G pushed these results live I believe 99% of the issues with this update would be solved. Is everyone else satisfied with the results being displayed on these dc's?
I agree 100%!
"The index I saw about 10 minutes ago at 66.102.7.104 is, I believe, the 'final' index in the process of being built. It is one of 3 different indices I was seeing as early as Tuesday and Wednesday (not counting a 4th much smaller and very elusive index)."
I still like it, not that that means diddly to Google.
But if too many folks like it too much, it will most likely change for the worse...
[edited by: BigUns at 10:08 pm (utc) on Feb. 13, 2005]
[edited by: illusionist at 10:16 pm (utc) on Feb. 13, 2005]
I see movement too... for the worse, though. Only 6 DCs left.
And I have no clue why g1smd keeps calling this the "big index", as it returns far fewer results for anything I check...
webmaster: 131 to 126
webhost: 1.35 to 0.696
widgets: 2430 to 1620
furniture: 71.2 to 41.2
www: 4680 to 4470
64.233.189.104 is a smaller index.
That search term has returned 41 million on google.com for at least several weeks. Only six months ago it was something like 12 million, and it has just grown and grown. It added 50% a few days before Google announced they had indexed 8 billion pages, and it has expanded in big jumps at least twice more since then.
I also use &num=100&filter=0 on most searches. Note that the reported numbers can vary quite a bit depending on whether you add this or leave it off.
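To illustrate what adding those parameters looks like in practice, here is a minimal sketch in Python that builds a datacenter query URL. The helper function and the plain-HTTP `/search` endpoint are my own assumptions for illustration, not a documented Google API.

```python
from urllib.parse import urlencode

def build_count_check_url(host, term, num=100, dupe_filter=0):
    # num=100 asks for 100 results per page; filter=0 switches off the
    # duplicate-content filter, which is one reason reported totals
    # can vary when you add or omit these parameters.
    params = urlencode({"q": term, "num": num, "filter": dupe_filter})
    return f"http://{host}/search?{params}"

url = build_count_check_url("66.102.7.104", "webmaster")
print(url)  # http://66.102.7.104/search?q=webmaster&num=100&filter=0
```

Dropping the keyword arguments gives the default (filtered, 10-result) view, so comparing the two URLs side by side shows where the count differences come from.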
"number of zeros: 49
average of nonzeros: 1.0"
I see movement too... for the worse, though. Only 6 DCs left.
Tee hee. No movement. I just found 4 new IPs and added them: 64.233.187.104, 64.233.187.106, 64.233.187.107, 64.233.187.99
Total is now 59 IPs on that tool. I don't use 64.233.189.104, and I can't use 64.233.161.104 because something is blocking it upstream of my server. That's all the IPs I know about.
[edited by: ciml at 11:04 am (utc) on Feb. 14, 2005]
[edit reason] See StickyMail [/edit]
Google is feeding these with the bigger index
This was not happening yesterday. Wishful thinking, maybe, that they are going to go live in the next few days with the better SERPS.
Maybe tomorrow Google will finally get a clean index with the 12 billion pages I suspect they are trying to roll out.
Maybe tomorrow an asteroid will strike the Earth making all this irrelevant.
Person A: Remember, it's always darkest before dawn.
Person B: What does that have to do with the Google Allegra update?
Person A: Absolutely Nothing.
I'm #1, #5 and #10 on one search for MyDomainName, with directories and an empty no-content page ranked higher on the latter two.
I like your joke.
and Google is probably thinking that they fixed the problems.
Yes, I agree, but I've already noted massive index differences in previous posts in recent days. I only wanted to note that some indices show me in the same weird situation as many others: directories that link to me are ranked higher.
One thing I'm hoping for when this update is resolved is for most of the spam / fake directories, link farms, and other Black Hat scum pages to be wiped out. As someone who is 99.99% searcher, I'd rather have a clean 4-billion-page index than a dirty 8-, 9-, or 10-billion one.
I just have a feeling that Google is trying for a minimum 10 billion index - twice the 5 billion* that MSN Search claims.
* Rumor has it that the MSN Search 5 billion figure is actually 1 billion external web documents + 4 billion spam emails Bill Gates has received in recent years ...
[66.102.7.104...] 1 (for mydomain.com)
[66.102.7.105...] 1
[66.102.7.106...] 1
[66.102.7.107...] 1
[66.102.7.147...] 1
[66.102.7.99...] 1
On the other ones I'm not found.
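The per-IP tallies above amount to a simple presence check across datacenters. A throwaway sketch of that loop, with the fetch stubbed out by canned counts so it runs offline - a real check would scrape the result count from each IP's response page, and the function names here are my own:

```python
# Datacenter IPs taken from the thread above.
DC_IPS = ["66.102.7.104", "66.102.7.105", "66.102.7.106",
          "66.102.7.107", "66.102.7.147", "66.102.7.99"]

def result_count(ip, domain, canned):
    # Stand-in (assumption) for fetching http://<ip>/search?q=<domain>
    # and scraping the reported total; here we just look up a stub value.
    return canned.get(ip, 0)

def presence_report(domain, canned):
    # True for each datacenter that returns at least one result.
    return {ip: result_count(ip, domain, canned) > 0 for ip in DC_IPS}

canned = {ip: 1 for ip in DC_IPS}  # mimics the "1 result on each DC" tally
report = presence_report("mydomain.com", canned)
print(report)
```

Swapping the stub for an HTTP fetch per IP would reproduce the kind of per-datacenter comparison posters in this thread are doing by hand.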
[edited by: ciml at 11:06 am (utc) on Feb. 14, 2005]
[edit reason] Examplified [/edit]
Size doesn't matter here: if one index contains more hijacking/scraper sites while the other is missing a similar number of hijacked/disappeared sites, comparing raw counts tells you nothing about the difference in index quality.
We should pool our resources and promote a new search engine....
G has us by the short and curlies... I don't like it, but it is what it is...
These new results... suck. I'm better off creating articles and spamming G with phony content than improving my actual site.
G,
you're a joke
These new results... suck. I'm better off creating articles and spamming G with phony content than improving my actual site.
Yeah, that's what really hurts.
My shiny new black hat arrived last week. I haven't taken it out of the box yet but I have read the instruction manual and it's getting very tempting... just to quickly try it on... try on the preciousss...
The "outsourcing" idea some say Google are already using would not be a surprise; however, having been on the receiving end of a misguided Yahoo! editor recently, I really cannot support the idea at all.
Are you completely sure that he or she was "misguided"?
We have to be honest here: I wonder what percentage of collateral damage is caused by Yahoo's manual editors compared to Google's algo?
I recently created a non-profit site for a club that was formed as a tribute to a poet. It featured for about three or four weeks, then Allegra placed it in its black hole. This would not have happened to a perfectly innocent and useful site under a manual editor.