Forum Library, Charter, Moderators: Webwork & skibum

Directories Forum

How to eliminate the backlog!

 12:44 am on Oct 14, 2005 (gmt 0)


Any advice on how to keep the backlog of submissions in a free-submission directory from growing daily?

Revenue is not high enough to afford paid editors, and the small revenue would make it ethically ineligible to use volunteer editors.



 2:09 am on Oct 19, 2005 (gmt 0)

...the small revenue would make it ethically ineligible to use volunteer editors.

Not sure what you mean here.


 3:13 am on Oct 19, 2005 (gmt 0)

Let the software track how many sites are submitted each day. Set an upper limit, and when that limit is reached, don't accept any new submissions for the rest of the day.
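The daily-cap idea above can be sketched like this — a minimal in-memory version, where the limit is a made-up number and a real directory would keep the counter in its database:

```python
import datetime

DAILY_LIMIT = 200     # hypothetical cap; tune to what your editors can review
_counts = {}          # date -> number of submissions accepted that day

def accept_submission(url):
    """Return True if the submission is accepted, False once today's cap is hit."""
    today = datetime.date.today()
    if _counts.get(today, 0) >= DAILY_LIMIT:
        return False  # show "try again tomorrow" instead of queuing the site
    _counts[today] = _counts.get(today, 0) + 1
    return True
```

The counter resets implicitly because each new date gets its own entry.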


 3:36 am on Oct 19, 2005 (gmt 0)

And have some software to remove the junk listings...


 3:40 am on Oct 19, 2005 (gmt 0)

Hi skibum,
Thank you ;)


I think you can only justify not paying editors if your site is not for profit.

That would eliminate the backlog, but it wouldn't cure the underlying ill of not being able to process as many submissions as you receive. There has to be a better way. Given how popular directory submissions are, especially to free directories, I'm sure many directories have the same problem. I thought some brainstorming might help; I have thought about this quite a bit.
IMO the ultimate solution would be some sort of AI review. The decisions I make when reviewing a site seem technically possible to program into code. For example, semantics could be used to verify relevance to the submitted category and to look for bad words. In any case, I state up front that my backlog is a few months long, but people still choose to submit. I have put a lot of work into my directory, and it gives me the blues to see it turn into a losing battle.
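A crude version of the automated review imagined here — a bad-word filter plus a vocabulary-overlap test against the submitted category — might look like the following sketch. The blocklist and category keywords are invented for illustration:

```python
# Hypothetical automated pre-review: reject pages containing blocked words,
# and flag pages whose text shares no vocabulary with the submitted category.
BLOCKED_WORDS = {"casino", "viagra"}           # assumption: your own blocklist
CATEGORY_KEYWORDS = {                          # assumption: per-category terms
    "gardening": {"plant", "soil", "garden", "seed"},
}

def pre_review(category, page_text):
    """Return 'reject', 'wrong-category', or 'queue-for-editor'."""
    words = set(page_text.lower().split())
    if words & BLOCKED_WORDS:
        return "reject"
    if not words & CATEGORY_KEYWORDS.get(category, set()):
        return "wrong-category"                # ask submitter for a better category
    return "queue-for-editor"
```

The "wrong-category" outcome matches the idea of asking for a relevant category rather than silently rejecting the site.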


 1:14 am on Oct 25, 2005 (gmt 0)

I think the best solution would have to include both automated algo and human review.

We receive an average of 12,000 submissions per day. About 40% of them are eliminated simply by applying a filter to the submitted URLs. These are mostly easy-to-spot porn URLs.

We then "pre-crawl" the rest, eliminating nearly another 40% by applying a type of Bayesian filter.
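The post doesn't describe the filter's internals, but a "type of Bayesian filter" over pre-crawled page text is commonly a naive Bayes classifier trained on past accepted and rejected submissions. A minimal sketch, assuming you have such labelled data:

```python
import math
from collections import Counter

# Minimal naive Bayes spam scorer -- an assumption about how such a filter
# might work, not the poster's actual implementation.
class BayesFilter:
    def __init__(self):
        self.counts = {"spam": Counter(), "ham": Counter()}
        self.totals = {"spam": 0, "ham": 0}

    def train(self, label, text):
        for tok in text.lower().split():
            self.counts[label][tok] += 1
            self.totals[label] += 1

    def spam_score(self, text):
        # Log-odds of spam vs. ham per token, with add-one smoothing.
        score = 0.0
        for tok in text.lower().split():
            p_spam = (self.counts["spam"][tok] + 1) / (self.totals["spam"] + 2)
            p_ham = (self.counts["ham"][tok] + 1) / (self.totals["ham"] + 2)
            score += math.log(p_spam / p_ham)
        return score  # > 0 leans spam, < 0 leans ham
```

Anything scoring above a chosen threshold is dropped before editors ever see it.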

What is left is cached and human-reviewed, with several functions available to editors that can wipe out hundreds of submissions in one sweep when a pattern of someone abusing the system is spotted. We try to visit only about 1% of the submitted sites, since what editors review is usually a cache of the web page.

This results in about 2,500 URLs being approved per day, which I believe is a pretty respectable number and would take a small army to match without the automated procedures.


 1:59 am on Oct 25, 2005 (gmt 0)

Hi dataguy,
That is an impressive number; thank you for sharing the routine. I have just started caching submitted pages myself, and it saves review time by eliminating the image loads. I patch in automation ideas as they come along. My current objective is to prompt for a relevant category when an unrelated submission is detected. I don't have much of a problem with porn sites, probably because the submission form page states in large type that we do not link to such sites. I think once your submission routine is good enough to eliminate most of the unacceptable submissions, a random quality check and some ongoing fine-tuning will do.
One feature that has helped me (though not to the point of eliminating the backlog) is that the approval form is designed to let me easily browse the directory and file a site in the appropriate category.
Apologies for the bad grammar, but I think the directory owners will understand.


 6:01 pm on Oct 25, 2005 (gmt 0)

One feature that has helped me (though not to the point of eliminating the backlog) is that the approval form is designed to let me easily browse the directory and file a site in the appropriate category.

This is very important, but most of our submissions come from submission services, so the webmasters often never see our submission page. For example, we have a kids-only site for which 90% of the submissions are porn sites or non-English sites. Obviously those webmasters aren't seeing where they are submitting.


 11:52 pm on Oct 25, 2005 (gmt 0)

Are you using a captcha? I think the easiest way to cut down on the backlog is to eliminate automated entries.
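Besides a captcha, one cheap way to drop automated entries is a hidden "honeypot" form field combined with a minimum time-on-page check. A sketch — the field name and threshold are hypothetical, not anything a poster here describes running:

```python
# Two cheap bot checks (illustrative assumptions):
# 1. a "honeypot" field hidden from humans via CSS -- only bots fill it in;
# 2. a minimum delay between serving the form and receiving the submission.
MIN_SECONDS_ON_PAGE = 3   # hypothetical threshold

def looks_automated(form, seconds_on_page):
    """Return True if the submission looks like a bot's."""
    if form.get("website_url_2"):   # honeypot field (hypothetical name)
        return True
    if seconds_on_page < MIN_SECONDS_ON_PAGE:
        return True
    return False
```

Neither check inconveniences real submitters, which is the appeal over a captcha.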

You can put in word filters and domain filters, and use various methods to cut down on spam. But ultimately you might need to ask yourself if you are being over-ambitious. There are a lot of webmasters out there looking for listings, and many of them have good sites that are worth listing. If you don't have time to deal with them all, narrow your niche.

You can do this by excluding troublesome categories, or specifying a smaller region for your directory. You could make one part of the directory free, but all the other sections paid. Asking for money is easily the best way to stop most submitters in their tracks.


 6:05 am on Oct 26, 2005 (gmt 0)

When free submissions reach an unmanageable number, we simply switch to paid inclusion... that cuts down the workload and raises submission quality dramatically.

We found that by setting a decent listing price we get only quality businesses adding their sites - which means better-quality links and higher satisfaction for our users...

If your directory is that popular shouldn't you be asking for paid inclusion?

All trademarks and copyrights held by respective owners. Member comments are owned by the poster.
WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved