Welcome to WebmasterWorld
Forum Moderators: bakedjake
Looks impressive so far indeed. I'm really curious about any increase/decrease in relevance, once there's a significant number of sites indexed.
A few things to note, most of which you probably know already:
If your site's robots.txt rules weren't being obeyed, that was a bug; it should work now.
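For reference, this is roughly how a crawler can honor robots.txt, sketched with Python's standard urllib.robotparser; the user-agent name, rules, and URLs below are made-up examples, not Gigablast's actual code.

```python
from urllib.robotparser import RobotFileParser

# A minimal sketch of checking robots.txt before fetching a page.
# In a real crawler the file would be fetched over HTTP with
# rp.set_url(...) and rp.read(); here we parse an inline example.
rp = RobotFileParser()
rp.parse("""
User-agent: *
Disallow: /private/
""".splitlines())

print(rp.can_fetch("Gigabot", "http://example.com/index.html"))     # → True
print(rp.can_fetch("Gigabot", "http://example.com/private/a.html")) # → False
```

If the crawler skips the `can_fetch` check (or caches a stale robots.txt), pages the site owner disallowed get fetched anyway, which is the kind of bug described above.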
I've put together a page of logos and page designs that I'd like everyone to view; if you feel moved to give positive or constructive feedback, please don't hesitate!
And thanks for all the comments so far, I've really found and fixed a lot of bugs!
BTW: I think your page is decent; not to say the poster's page design wasn't good. It was pretty too, better actually, but I still feel you should stick to the fast-loading one.
One question to better understand the numbers:
the current hardware I have should hold somewhere between 200-250 million web pages
On the Gigablast about page I read:
scales to 200 billion full pages
Is my assumption right that 200 billion is what the software itself can scale to? To reach it, you'd need "a little bit" more hardware, right?
Maybe it's for that reason that the only design on or linked from that page that I like is the second one; the red lettering with the gray and black in the background.
The first one is cool too, except the backwards "a" looks like another "b" that got blown up. If you left the "a" going the right direction, but tilted and lowered it slightly, it would probably look a lot better.
The lightning bolt ones look kinda cheesy, like they might appear on the box of a store-brand knock-off cereal. (That was the first thing that came to mind when I saw them.)
I hope that was helpful :)
And, yes, Gigablast does scale to 200 billion pages (200,000,000,000).
And, yes, I would need more hardware.
My current setup only goes to about 200-250 million pages, so I'd need 1,000 times the machines I have, which is actually very doable.
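The arithmetic above checks out as a back-of-the-envelope calculation:

```python
# Back-of-the-envelope check of the numbers above: if one setup holds
# about 200 million pages (the low end of 200-250 million), how many
# times that hardware is needed for 200 billion?
target = 200_000_000_000   # 200 billion pages
per_setup = 200_000_000    # ~200 million pages per current setup

print(target // per_setup)  # → 1000 setups
```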
Stickysauce is best because it's the slickest/quickest looking.
The lightning one is a no-no for me; nice work, but it doesn't suit a search engine.
The neon layout is good, but green = kiss of death, I think (despite dmoz).
Most shades of green are not appealing, plus the lightning bolt doesn't quite work.
I don't mean to be nasty - just critical of something wrong that may seem like nothing but I think makes a difference.
Yep, it is only MY opinion. Take it or leave it. I still like the engine and the simplicity, just like Google's; I give it a thumbs up.
Why is it so necessary for spiders to go past the index page. I can't think of many sites that need to be spidered so 'deeply' - all they do is clutter up engines with a multitude of pages making it more difficult to find the others. I clicked on one of the recent searches and was presented with an entire page of links to different pages for ONE site - very UNimpressive.
Why don't spiders take notice of meta tags? It really peeves me to make an effort to describe my sites and have it all ignored, with something quite irrelevant (or less meaningful) placed in the description area.
Is there such a thing as a human-edited search engine, as opposed to a human-edited *laugh* directory like dmoz used to be?
Maybe this board needs a discussion on what a search engine should be. Maybe someone will read it and build a new engine that people will actually enjoy using 100%.
>>I clicked on one of the recent searches and was presented with an entire page of links to different pages for ONE site
Matt has mentioned that clustering is not yet being done, but it will be.
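For anyone wondering what clustering means here: collapsing a run of results from one host down to a couple of entries so a single site can't fill the page. A minimal sketch (the sample URLs and per-host limit are made up; this is not Gigablast's actual implementation):

```python
from urllib.parse import urlparse

def cluster_by_host(results, per_host=2):
    """Keep at most `per_host` results per hostname, preserving rank order."""
    counts = {}
    clustered = []
    for url in results:
        host = urlparse(url).hostname
        if counts.get(host, 0) < per_host:
            clustered.append(url)
            counts[host] = counts.get(host, 0) + 1
    return clustered

results = [
    "http://a.com/1", "http://a.com/2", "http://a.com/3",
    "http://b.com/1", "http://a.com/4",
]
print(cluster_by_host(results))
# → ['http://a.com/1', 'http://a.com/2', 'http://b.com/1']
```

A real engine would typically also add a "more results from this site" link for the suppressed pages rather than dropping them outright.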
Regarding the logo and colors, you'll also see that he's asked for and received several suggested designs; likely the current one will be changed.
Remember that the site is still in the early development stages.
>>Why don't spiders take notice of metatags - it really peeves me to make an effort to describe my sites
Because while you may use them to accurately describe your sites, many other people have used them inaccurately to spam search engines.
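To be clear, reading the meta description is trivial; whether to trust it over the page body is a policy choice driven by that spam history. A minimal sketch of pulling the tag out with Python's standard html.parser (the sample page is invented):

```python
from html.parser import HTMLParser

class MetaDescription(HTMLParser):
    """Collect the content of a <meta name="description"> tag, if present."""
    def __init__(self):
        super().__init__()
        self.description = None

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            d = dict(attrs)
            if (d.get("name") or "").lower() == "description":
                self.description = d.get("content")

page = '<html><head><meta name="description" content="Tin can recycling tips"></head></html>'
p = MetaDescription()
p.feed(page)
print(p.description)  # → Tin can recycling tips
```

An engine that has been burned by keyword-stuffed descriptions may extract this and then ignore it, showing a snippet from the body text instead.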
My reasoning is that often after you make your first search you realize that you need to refine it somewhat...
2 more cents,
I just did a search which made me wonder about the relevance of gigablast searches because I was given as many irrelevant results as valid ones. On closer inspection I noticed my search term seemed to be giving me results for another term.
The terms are 'gay pics' and 'tin cans'.
Any reason this might be happening? Apart from that, though, the search results were OK.
Something missing from the Gigablast results: the ability to jump to a page 'deeper' in the pack like Google has, a dozen or more pages you can pick from instead of just 'next' or 'previous'.
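That "dozen or more pages" widget is just a sliding window over the page numbers. A rough sketch of how it could be computed (the window width of 10 is an assumption, not anything from Gigablast):

```python
def page_window(current, total_pages, width=10):
    """Return a Google-style window of page numbers centered on `current`,
    clamped so it never runs past page 1 or the last page."""
    start = max(1, current - width // 2)
    end = min(total_pages, start + width - 1)
    start = max(1, end - width + 1)  # re-clamp when near the last page
    return list(range(start, end + 1))

print(page_window(1, 50))   # pages 1 through 10
print(page_window(25, 50))  # pages 20 through 29
print(page_window(50, 50))  # pages 41 through 50
```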
I added the URL of my site. Within seconds it apparently had spidered my site (200 pages or so...) as well as a few hundred other related sites I have listed in my directory, judging from the "date spidered".