| 10:24 pm on Sep 30, 2005 (gmt 0)|
I just looked at the Linknz search engine.
My first question is to the developer: why did you choose to develop the software yourself rather than use existing products out there, namely Nutch, Lucene, and whatever other free search software is available?
Secondly, a more general question: has anyone here used it? If so, can anyone comment on it or review it, and how does it compare to the other local search engines?
| 10:29 pm on Sep 30, 2005 (gmt 0)|
|how come you chose to develop the software yourself rather than use existing products |
If all you do is to use existing products that anybody else can use, then how would you differentiate yourself?
The easier path you choose, the easier it will be not just for you, but also for your competitors.
| 10:29 am on Oct 4, 2005 (gmt 0)|
I've just searched for Christchurch. It returned 5081 results in 9.54 seconds. A bit slow.
The claim on the home page, that it has an index of 37 million NZ web pages seems wildly inaccurate. I'd estimate about 1 or 2 million.
It seems to be using PhpDig.
| 7:20 pm on Oct 6, 2005 (gmt 0)|
|It returned 5081 results in 9.54 seconds.|
That must have been during a busy period...
Yes, it does look like PhpDig, doesn't it? But it has been designed that way to throw off people trying to get hold of the code: I borrowed the headers from the PhpDig software and used those. Some other search engines come in with a web spider that doesn't even try to find the robots.txt file; they are simply trying to download some of the files, especially the ones implementing the page ranking I wrote for it, which is based on content and word meanings.
Just protecting my assets my friend!
I often get worried when the search results pages of some search engines start to include files that are excluded by the robots.txt file, sit on the server in a protected directory, and are not linked anywhere. So how did they do that?
Seems like a heap of people have gone down the same pathway trying to work it out. Perhaps they should ask me!
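The spider behaviour described above, agents that crawl pages but never ask for robots.txt, can be spotted from an ordinary access log. Below is a minimal sketch; the combined log format, the bot names, and the `suspicious_agents` helper are all assumptions for illustration, not Linknz's actual code:

```python
# Hypothetical sketch: flag user agents that fetched pages but never
# requested /robots.txt, by scanning a combined-format access log.
import re
from collections import defaultdict

LOG_LINE = re.compile(
    r'^(\S+) \S+ \S+ \[[^\]]+\] "(?:GET|HEAD) (\S+) [^"]*" \d+ \S+ "[^"]*" "([^"]*)"'
)

def suspicious_agents(log_lines):
    """Return user agents that requested pages but never /robots.txt."""
    fetched = defaultdict(set)  # user agent -> set of requested paths
    for line in log_lines:
        m = LOG_LINE.match(line)
        if m:
            _ip, path, agent = m.groups()
            fetched[agent].add(path)
    return sorted(agent for agent, paths in fetched.items()
                  if "/robots.txt" not in paths and len(paths) > 1)

log = [
    '1.2.3.4 - - [06/Oct/2005:19:20:00 +0000] "GET /robots.txt HTTP/1.0" 200 120 "-" "GoodBot/1.0"',
    '1.2.3.4 - - [06/Oct/2005:19:20:01 +0000] "GET /search.php HTTP/1.0" 200 5120 "-" "GoodBot/1.0"',
    '5.6.7.8 - - [06/Oct/2005:19:21:00 +0000] "GET /search.php HTTP/1.0" 200 5120 "-" "SnoopBot/0.1"',
    '5.6.7.8 - - [06/Oct/2005:19:21:02 +0000] "GET /index.php HTTP/1.0" 403 0 "-" "SnoopBot/0.1"',
]
print(suspicious_agents(log))  # ['SnoopBot/0.1']
```

Grouping by user agent rather than IP is a simplification; a real check would also watch for agents spoofing well-known bot names.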
Having spent a year developing the search engine from scratch, I am aware that people may start to wonder, but the claims you make are slightly off. The new website being designed for Linknz will enable people to check the database and see what's indexed, and webmasters will be able to see a site's results so they can change their code to get a higher score.
But thanks for trying the engine; it's getting around 67,000 searches a day, which isn't bad.
Just PhpDig file names with a few borrowed headers, plus added snippets of code for other web spiders to find when they come looking.
Oh, and the index this morning contains 38,101,959 entries, so the forty million pages should be completed within the next week.
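For what it's worth, that timeline can be sanity-checked with nothing but the numbers quoted in this thread: going from 38,101,959 entries to forty million in a week implies indexing roughly 270,000 pages a day.

```python
# Quick sanity check using only the figures quoted in this thread.
current = 38_101_959   # entries "this morning"
target = 40_000_000    # "the forty million pages"
days = 7               # "within the next week"

rate = (target - current) / days
print(f"{rate:,.0f} pages/day")  # 271,149 pages/day
```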
Heaps of regards
| 8:02 am on Nov 1, 2005 (gmt 0)|
"If all you do is to use existing products that anybody else can use, then how would you differentiate yourself?"
If my understanding is correct: crawler, algorithm, filter, classification ...
| 11:56 pm on Dec 2, 2005 (gmt 0)|
My partner would answer it by saying "If I can't do it on my own it's not worth doing"
But don't mention other things she can do on her own.. Grin..
Well, the software we are running has now indexed sixty million web pages, which is not bad considering we put it all together over a five-month period. We have made adjustments to it since, but nothing major.
It is always good to try to work something out yourself, provided it can be done and isn't outside one's knowledge.
| 8:26 am on Dec 5, 2005 (gmt 0)|
How do you get 67k searches a day?! I hardly get that many all year, though my site is a directory and that makes a world of difference... Still, with that many searches, something ain't right...
I don't know about you guys, but I've heard other owners brag about how much traffic they get; then I go ahead and get listed in their database and, lo and behold, I'm lucky if I see one visitor a month from them...
So much for the hype, the way I figure it.