Forum Moderators: bakedjake
I have used the DMOZ RDF dump, parsing it and searching the parsed results...using my own criteria. The volume of use has risen over time to be quite profitable.
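For anyone curious how that parsing step might look: the DMOZ content dump is a large RDF/XML file, so a streaming parse keeps memory flat. This is only a sketch under assumptions — the element layout below (`ExternalPage` with `about`, `Title`, `Description` children) is simplified, and the tiny embedded sample stands in for the real multi-gigabyte file, so verify tag names against an actual dump.

```python
# Sketch: stream listings out of a DMOZ-style RDF dump with iterparse,
# so the whole file never has to fit in memory. Element names here are
# an assumption based on the dump's general shape.
import io
import xml.etree.ElementTree as ET

# Tiny stand-in for content.rdf.u8 (the real file is multi-GB).
SAMPLE = b"""<RDF xmlns:d="http://purl.org/dc/elements/1.0/">
  <ExternalPage about="http://example.com/">
    <d:Title>Example Site</d:Title>
    <d:Description>A sample listing.</d:Description>
  </ExternalPage>
</RDF>"""

def local(tag):
    """Strip any XML namespace prefix from a tag name."""
    return tag.split("}")[-1]

def iter_listings(stream):
    """Yield (url, title, description) for each ExternalPage element."""
    for _event, elem in ET.iterparse(stream, events=("end",)):
        if local(elem.tag) == "ExternalPage":
            url = elem.get("about")
            title = desc = ""
            for child in elem:
                if local(child.tag) == "Title":
                    title = child.text or ""
                elif local(child.tag) == "Description":
                    desc = child.text or ""
            yield url, title, desc
            elem.clear()  # release the subtree as we go

listings = list(iter_listings(io.BytesIO(SAMPLE)))
```

From here, "searching the parsed results using my own criteria" is just filtering that generator.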
I am now considering expanding to spidering the sites within the RDF dump...and adding a submit-a-site page...however, that worries me.
I have so far operated on the theory that all operations must be software controlled (this makes it possible for one little old fat retired guy to do everything). My software for adding sites gives me many controls on what can be added, but it still does not have eyeballs.
Does anyone feel that it is possible to place adequate software checks to prevent gross spamming?
Should I require a link or button?
My system works so smoothly now that I almost hate to expand.
BUT, I am empire oriented.
I can't help you with the software end of it since I do most things manually. But I do know a bit about this. I send out a confirmation email to everyone accepted. I ask, politely, for a link back and give them a URL to my "linktous" page. But I never require a link. Over time you will start getting a certain percentage of people either linking back or adding your searchbox.
Some people demand link-backs, and I think this builds resentment plus creates a review nightmare.
Also: if they feel they really got a lot of value out of submitting their URL to you, they are more likely to submit again. Give them some sort of free goodies as a thank-you. The more jaded pro web designers are unlikely to be moved by what you do unless you send them tons of traffic, but the amateurs can be incredibly loyal customers if you cater to them and help them out. Long term, they can send you traffic and exposure.
Last: copy the smaller directories and give them an *incentive* to put a button on their site. Like a vote button or rate this site.
Right now I give out a rate-this-site code that will very slowly improve a site's standing in the SERPs. It's a way for a ho-hum site to improve its placement and gain more exposure.
You and I are on the same wavelength, Dumpy.... nothing better than an empire.
Yes, I'd allow additions, but you are wise to be on guard for spam from the get-go. If I had the programming expertise, I'd add the ransom-note password, a second submit button on a proofread/confirmation page, and require the form to be submitted from your domain. I'd also disable HTML in the body of the post and scan for objectionable keywords.
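Two of those checks (no HTML in the body, keyword scan) are easy to automate server-side. A minimal sketch, with an obviously placeholder blocklist — the regex and word list here are assumptions to illustrate the idea, not a complete anti-spam filter:

```python
# Sketch: reject submissions that contain HTML tags or blocklisted words.
import re

BANNED_WORDS = {"viagra", "casino"}  # placeholder blocklist, tune for your site

def validate_submission(title, description):
    """Return a list of rejection reasons; an empty list means it passes."""
    problems = []
    text = f"{title} {description}"
    # Anything that looks like a markup tag disqualifies the post.
    if re.search(r"<[a-zA-Z/][^>]*>", text):
        problems.append("html not allowed")
    # Simple word-level scan against the blocklist.
    words = set(re.findall(r"[a-z']+", text.lower()))
    if words & BANNED_WORDS:
        problems.append("objectionable keyword")
    return problems
```

The ransom-note password and the same-domain referer check would sit in front of this, but those depend on your form handler.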
Brad's point about rewards is certainly valid. But, if you want to reward the savvy, just give them a good, solid, spiderable link.
<added>
On my directory, I also only allow the root index to be submitted. Deep-linking is allowed only with my approval. No free sites: Geocities, Tripod, etc.
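Both of those rules can also be enforced in software before a submission ever reaches review. A sketch, assuming a hand-maintained free-host list (the hosts named here are illustrative):

```python
# Sketch: accept only a site's root page, and reject known free hosts.
from urllib.parse import urlparse

FREE_HOSTS = {"geocities.com", "tripod.com", "angelfire.com"}  # illustrative

def acceptable(url):
    """True only for a root-index URL on a non-free host."""
    p = urlparse(url)
    if p.scheme not in ("http", "https") or not p.netloc:
        return False
    host = p.netloc.lower().split(":")[0]
    # Block the free host itself and any subdomain of it.
    if any(host == h or host.endswith("." + h) for h in FREE_HOSTS):
        return False
    # "Root index only": no deep path, query string, or fragment.
    return p.path in ("", "/") and not p.query and not p.fragment
```

Deep links that you approve by hand would simply bypass this check.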
This is tricky. You need to look at what market is there and what you want to go after.
For some directories, listing free-hosted sites is their very purpose - to get traffic to sites that are not likely to know much about search engine placement. But as rcjordan indicates, there is a lot of churn in free-hosted sites, so they do create some extra work.
Allowing deep linking is also tricky and depends upon the accepted practices for your market. (I like the ODP philosophy on that.)
<aside>
Of course, if enough SEs/directories refuse to list free sites or third-level domains, that can create a new market for an SE that will.
</aside>
I used the idea of only one site per domain...if you add another site, it just wipes out the one before it. I appreciate the suggestions.
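That one-site-per-domain rule falls out naturally from keying storage on the domain, so a resubmission silently replaces the earlier entry. A minimal sketch (the class and method names are my own, not from the poster's system):

```python
# Sketch: one listing per domain; a new submission overwrites the old one.
from urllib.parse import urlparse

class Directory:
    def __init__(self):
        self._by_domain = {}  # domain -> (url, title)

    def submit(self, url, title):
        domain = urlparse(url).netloc.lower()
        self._by_domain[domain] = (url, title)  # wipes any prior entry

    def listings(self):
        return list(self._by_domain.values())

d = Directory()
d.submit("http://example.com/", "First Try")
d.submit("http://example.com/", "Second Try")  # replaces the first
```

After both submissions, only the second listing for example.com remains.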