Forum Moderators: bakedjake
A lot of people on these boards like FDSE (Fluid Dynamics Search Engine), and I'm one of them.
Spider your site when you want.
Design your own templates.
Once you pay for the script, there are no more fees.
The script is also available as a shareware version with a link back to the script's home page.
3000 pages is well within the limits of FDSE. I currently have it indexing 14,000 pages and it still works well.
Should you decide however to use a remote search service, I don't think you'll find one better than AtomZ.
My guess, though, is that they think that kind of offering would cannibalize customers from their more costly offerings - we know the free product is great, so chances are everyone except the biggest enterprise users would end up with their "low end" offering. Too bad...
I'm using their Prime Search service, which I could have sworn costs $2,400 now. It indexes up to 2,500 pages. It works well - better than Verity, which I use in places to get around the 2,500-page limit. But you can't just ignore it. It needs to be constantly tweaked, and if you want to customize the search results it can get pretty complicated because you have so much control over the output.
Excellent ASP service, but it's still an ASP, which means there are occasional problems.
Bart
Adding a search engine to our site is something I know I will have to do at some point -- probably next year. So I'm also looking out for some ideas and checking out my options. At this point I'm still thinking about what functions I want to be able to support. Currently, I'm thinking of:
* maintaining a database of links to sites of interest to my organization's members
* having these links classified as in a directory -- for browsing by topic
* also, having a "Search" interface for our users where they could retrieve relevant content from our website (publications, articles, forum threads, etc.) and also get the links relevant to their query [I suppose that this implies a parallel structure to the directory taxonomy and query keywords?]
* using the links database to drive a spider that would run periodically to validate links
* ultimately, having the spider detect page changes and be smart enough to pick out the associated text which would be presented to our site visitors as news
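The link-validation goal above is straightforward to prototype. Here is a minimal sketch in Python using only the standard library; the function names and the HEAD-request approach are my own assumptions, not anything phplinks or FDSE actually provides:

```python
# Sketch of a periodic link validator: pull hrefs out of stored HTML,
# then check each URL with a lightweight HEAD request.
import urllib.error
import urllib.request
from html.parser import HTMLParser


class LinkExtractor(HTMLParser):
    """Collect href values from <a> tags as they are parsed."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def extract_links(html):
    """Return all anchor hrefs found in an HTML string, in document order."""
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links


def check_links(urls, timeout=10):
    """Map each URL to its HTTP status code, or an error string on failure."""
    results = {}
    for url in urls:
        req = urllib.request.Request(url, method="HEAD")
        try:
            with urllib.request.urlopen(req, timeout=timeout) as resp:
                results[url] = resp.status
        except urllib.error.HTTPError as e:
            results[url] = e.code  # server answered, but with an error status
        except (urllib.error.URLError, OSError) as e:
            results[url] = str(e)  # DNS failure, timeout, refused, etc.
    return results
```

Run from cron (or Windows Task Scheduler) against the links database, flagging anything whose status isn't 200 for manual review. Detecting page *changes* for the news feature would be an extra step on top of this, e.g. hashing each fetched page and comparing against the stored hash.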
Any thoughts on these functional goals? Are they achievable? How? Also, isn't it true that directories and spiders are two different techniques for maintaining and organizing links? Isn't it best to use them in combination? How are they combined?
I ran into a solution today that looked particularly interesting: phplinks. It's open source and available at freshmeat. It seems to fit part of my requirements. Its claimed capabilities include:
full search capabilities
recursive multilevel site categorization
full referrer tracking
site reviews
site ratings
link validation
related categories
category searching
search term tracking
HTH and thanks for any comments,
Peter Hollings
If you add your site search page (find.html?id=) to your site map, it will be indexed by Google. Set up the page with some applicable content and a few links to your site.
This could be the easiest way to earn 5 or 10 fully customizable and risk-free PR8 backlinks for your site.
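For anyone trying this, a sitemap entry for such a search page might look like the following; the domain and query value are placeholders, and note that any `&` in the query string must be escaped as `&amp;` inside `<loc>`:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/find.html?id=widgets</loc>
    <changefreq>weekly</changefreq>
  </url>
</urlset>
```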