Forum Library, Charter, Moderators: Webwork & skibum

Directories Forum

DMOZ Submission Review Delays and Automating "Community" Approvals
Going forward would it make sense to change the system?
Webwork · msg:3184394 · 5:37 pm on Dec 10, 2006 (gmt 0)

One of the consistent gripes that has appeared in this forum takes the form of "I applied months ago . . . "

There is always an explanation or response.

Yet, I pause to wonder: Is there the possibility of a better way of processing submissions AND is that better way going to be programmed into the new (hopefully) release of the ODP?

Why not default submissions to approval and listing after 60-90 days?

Why not allow default listings to appear, after that time, with a marking that it "has not been reviewed"?

Why not allow "approved at large reviewers" to vote on default listings, which will automatically throw them back into the queue if enough at large reviews confirm that either a) the website is NOT what it purports to be; or, b) it doesn't fit in the category?

Isn't it time to move towards a Web 2.0 model of listings?
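To make the mechanics of the proposal concrete, here is a rough sketch, purely for illustration: the constants, class, and field names are all invented and reflect nothing in the real ODP. Submissions default to listed after a waiting period, flagged as unreviewed, and enough at-large "reject" votes throw a default listing back into the editor queue.

```python
# Hypothetical model of the proposal above. All names are invented.
from dataclasses import dataclass

DEFAULT_LIST_AFTER_DAYS = 90   # the suggested 60-90 day window
VOTES_TO_REQUEUE = 3           # at-large votes needed to force a re-review

@dataclass
class Submission:
    url: str
    age_days: int = 0
    reviewed: bool = False
    reject_votes: int = 0

    @property
    def status(self) -> str:
        if self.reviewed:
            return "listed"
        if self.reject_votes >= VOTES_TO_REQUEUE:
            return "queued"    # voted back into the editor queue
        if self.age_days >= DEFAULT_LIST_AFTER_DAYS:
            return "listed (not yet reviewed)"
        return "queued"

sub = Submission("example.com", age_days=120)
print(sub.status)              # default-listed, marked as unreviewed
sub.reject_votes = 3
print(sub.status)              # enough at-large votes: back in the queue
```

Whether defaulting to "listed" is wise is exactly what the replies dispute; the sketch only shows the mechanics, not a judgment on them.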

 

hutcheson · msg:3184432 · 7:18 pm on Dec 10, 2006 (gmt 0)

>Why not default submissions to approval and listing after 60-90 days?

Ah, this is an easy one. Because (1) over 90% of the suggestions are pure toxic sludge; and (2) editors cherry-pick the suggestion pool, so that as the sludge gets older, it gets more toxic.

So, if your intention was to find the collection of websites least likely to have significant value, that's the second-best idea that's ever been heard.

The only collection of websites more likely to be worthless than the, say, "180-day-old suggestion pool" would, of course, be the "rejected suggestion pool" (which has 99+% toxicity.) You haven't made the suggestion that rejected suggestions be given a special priority, so I won't dump on that idea.

hutcheson · msg:3184437 · 7:31 pm on Dec 10, 2006 (gmt 0)

>Why not allow "approved at large reviewers" to vote on default listings, which will automatically throw them back into the queue if enough at large reviews confirm that either a) the website is NOT what it purports to be; or, b) it doesn't fit in the category?

In the internal forums, we've been around and around on the idea of tracking the reputation of people who make suggestions. But it has always come back to the same thing: our term for "approved at large reviewer" is "editor."

And that really corresponds with reality. In my experience, there are very few people with an enthusiasm for the project mission who have done enough work to earn a reputation for good judgment in making suggestions from the outside and then haven't simply become editors.

gpmgroup · msg:3184533 · 10:16 pm on Dec 10, 2006 (gmt 0)

Perhaps the new DMOZ could include some simple stats at the bottom of each cat/page.

No of submissions waiting in the queue.
No of rejected submissions
+ the date each of these last changed

This would help the outside world see the pressures the editors in some sections are under as I'm sure the pressure/interest isn't homogeneous.

Webwork · msg:3184557 · 10:48 pm on Dec 10, 2006 (gmt 0)

I agree with gpm that a message "There are X# of submissions in the queue for review in this forum", plus the date of the last reviewed site being admitted, would be a considerate way of alerting the potential user/submitter that they may be wasting their time... or that a result may be a long time in coming.

[edited by: Webwork at 10:50 pm (utc) on Dec. 10, 2006]

vite_rts · msg:3184575 · 11:37 pm on Dec 10, 2006 (gmt 0)

@ Google editors

While I do not necessarily concur with all the suggestions in this thread, might I ask the ODP editors a number of questions:

1. Do editors as a whole believe the current ODP model is as good as it can be, ergo no improvements are conceivable?

2. If 1 above is not true, is it possible that a revised model for the ODP might come from outside the ODP hierarchy?

3. Would we be having this conversation if sitemap.org got a twin brother, say SearchEngineHumanReviewDirectory.org, appropriately financed, organised and controlled by the top 8 search engines?

Quadrille · msg:3184624 · 1:09 am on Dec 11, 2006 (gmt 0)

We've heard that 90% of submissions are sludge; probably a reasonable estimate; we all know that it isn't far out.

And let us not forget that 90% of that sludge knows, in its heart of hearts, that (a) it's sludge and (b) it hasn't got a hope.

So why even ask ODP to pander to these people? It's been stated repeatedly, here and everywhere else "submit once to the appropriate category, and get on with your life" - and reasonable people have no difficulty with this.

Why would WebmasterWorld seek to encourage the obsessive directory spammers of this world? I really cannot see how this assists anyone with an interest in directories.

As for 'default listing' such sludge .. please tell me you were joking - I thought you shared the dream of building quality directories ... How wrong could I conceivably be?

But if you were joking (please say you were), then I unreservedly apologise for my sense of humour bypass :)

gpmgroup · msg:3184650 · 1:48 am on Dec 11, 2006 (gmt 0)


> Why would WebmasterWorld seek to encourage the obsessive directory spammers of this world? I really cannot see how this assists anyone with an interest in directories.

It isn't about encouraging spammers; it's about being transparent. At the moment the spam issue is used time and time again to cloud any constructive suggestions.

ODP's goal is defined as


> The ODP's goal is two-fold: to create the most comprehensive and definitive directory of the Web, and to create a high quality, content rich resource that the general public considers useful and indispensable. In short, editors should select quality sites and lots of them.

In certain areas of the directory this hasn't been happening for a long time, and the spam issue is all too often used as a cover to prevent discussion of these failings.

hutcheson · msg:3184685 · 3:03 am on Dec 11, 2006 (gmt 0)

>Perhaps the new DMOZ could include some simple stats at the bottom of each cat/page.

>No of submissions waiting in the queue.
>No of rejected submissions
>+ the date each of these last changed

On the face of it, this seems like a reasonable suggestion. Indeed in the beginnings of our status check forum, some of us hoped that this would be a way of helping people form more realistic expectations; so some of us gave, at least, rough estimates of these numbers.

I suppose I shouldn't have been, but I was shocked and awed by the conclusions that were leapt to from that simple information. I never saw ANYONE leap to a correct conclusion! So, long before we gave up altogether on status checks, we very firmly STOPPED giving out those data points -- because, with our best will and efforts, we simply could not keep them from turning into misinformation the instant they went from our keyboard to the forum.

It's not likely we'll get over that shell-shock soon.

What I started recommending instead, and what I still think gives much better information from a much better perspective, is this:

Go to Google, and search for sites like yours. Count the number of sites, divide by ten thousand sites considered daily, and multiply by the half million categories those ten thousand sites are distributed over. You'll get the number of days required to review that topic thoroughly: assuming, of course, that that topic would benefit by that kind of thorough review.
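The arithmetic above can be written out as a worked example. The two constants come straight from the post; the count of 200 similar sites is a made-up input for illustration:

```python
# hutcheson's back-of-envelope estimate of how long a thorough
# review of one topic takes. Constants are from the post itself.
SITES_REVIEWED_PER_DAY = 10_000
TOTAL_CATEGORIES = 500_000

def days_to_review_topic(similar_sites: int) -> float:
    # divide by the daily throughput, multiply by the number of
    # categories that throughput is spread across
    return similar_sites / SITES_REVIEWED_PER_DAY * TOTAL_CATEGORIES

# e.g. a search turning up 200 sites like yours:
print(days_to_review_topic(200))   # 10000.0 days, roughly 27 years
```

In other words, each additional similar site adds about 50 days to the estimate, which is the point of the exercise: suggestion queues are dwarfed by the size of the web itself.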

But, you say, not all those sites have been "suggested." EXACTLY! Suggested sites don't have priority; they merely have persistence.

Anyone can do that: it requires no special data processing from the ODP; it depends on no special trust given to ODP sources; in fact, it is ultimately based on the same raw data that the ODP would have processed to give a meaningful answer to the question.

hutcheson · msg:3184701 · 3:36 am on Dec 11, 2006 (gmt 0)

> 1. Do editors as a whole believe the current ODP model is as good as it can be, ergo no improvements are conceivable?

> 2. If 1 above is not true, is it possible that a revised model for the ODP might come from outside the ODP hierarchy?

> 3. Would we be having this conversation if sitemap.org got a twin brother, say SearchEngineHumanReviewDirectory.org, appropriately financed, organised and controlled by the top 8 search engines?

1) I'm not sure what you mean by "model". What I would call the "model", I would say is locally-optimal (in a mathematical sense): that is, it simply could not be changed because any change that might be drastic enough to be an improvement would be suicidally disruptive.

What I would call the "processes", editors are constantly discussing ways of improving, and I can speak for the community when I say that we're sure that just because we do them better than anyone else ever has, doesn't mean we can't improve them further.

What I would call the "product", of course the mere act of editing is a testimony to the possibility of improvement.

2) I think that a revised "model" can ONLY come from outside the ODP community (whether from people with ODP experience is a separate question). Because of the risk, it isn't going to happen without a demonstration that it's possible: which means, basically, someone's going to have to do it and show that it can work. Therefore, it can only be implemented outside the ODP community (although there is no reason it couldn't include some people with ODP experience.)

But I think revised "processes" can ONLY come from INSIDE the ODP community, as the people who are doing the work recognize that aspects of their own work are inefficient, ineffective, or futile, and discuss ways of eliminating those aspects. People who don't KNOW the process aren't going to improve it, any more than someone who's never opened the hood of a truck is going to be telling General Motors how to make their diesel engines more efficient.

Improved "product" is something that anyone can help with. And here again I can speak for the editing community, that has worked so hard to provide feedback processes, or asked so urgently for the ODP techies to provide other feedback processes.

I think there's a feedback mechanism here: as I said, outsiders aren't going to be designing sane processes, it just won't happen. But outsiders absolutely CAN show (and have shown) systemic patterns of problems in the PRODUCT. And when that happens, at that point editors can begin to look at the PROCESSES to see what might be tuned or added to help keep that problem to a minimum. It was an outsider who forced us to look at the problem of "lead generator" sites -- not by whining vaguely about "bad sites getting listed", not by accusing editors of unspecified abuse, not by demanding changes in models or processes, not even by asking how our models or processes worked, and certainly not by threatening legal action.

How then did he do it?

He searched the ODP for fraudulent lead generator sites, made up a nice little list of the sites he found, described how he searched for them and how he identified them, and sent it to an editor.

I got one such note; I looked at the sites, said "wow, thanks!" and fragged them.

Next week he searched the ODP for FLGS, made up a NLL complete with analysis, and sent it to me.

I looked, said "wow, thanks!" and fragged them.

Next week he STO'ed for FLGS, made up a NLLCWA, and sent to me.

I said "wow, thanks!" and told other editors about it, looked and fragged them.

Next week he STO'ed for FLGS, and sent me a NLLCWA.

I said "WT", looked, and fragged.

Next week he STO'ed for FLGS, and sent me a NLLCWA.

The editors started talking about it more seriously. In the meantime I SWT, L, and F'ed.

Next week he STO'ed for FLGS, and sent me a NLLCWA. We talked some more. and I SWT&L&F'ed.

Next week ... Well, I don't know how long it'll take you to spot the pattern. We talked for months, and achieved consensus that LGS could not be considered listable sites--under our CURRENT MODEL. So we changed the guidelines (PROCESS) to warn editors about such sites. And we got more aggressive about doing our own quality checks on our own initiative (using, yes, some of the techniques that the outsider had described). Process changed -- because our understanding of the web had been changed as we learned exactly how our old process didn't suit our old model.

The user never tried to change our process: he couldn't have done that, as he simply didn't know what it was. He didn't claim that the ODP process was fallible--and therefore he was an infallible process designer. (We wouldn't have believed him if he had claimed that!) He simply showed us that our current process was continuing to give a product that didn't live up to our model. And that mattered to us, so we figured out how our process had to change.

A few months later, in one of the webmaster or SEO forums, I noticed someone saying that he used to be able to get all of his sites into the ODP, and now he couldn't get any of them in. And I immediately thought, "BINGO! I bet there's one of the FLGS spammers."

Are there other problems like that in the ODP? Could be. You'd be our friend for life if you would point them out. Of course, anyone who just says there "must be problems like that somewhere", gets written off as malicious, moronic, or mendacious (2 out of three, we can't be sure which two).

3) Yahoo! exists, and the conversation continues. Beyond that ... I'd prefer to stay closer to reality than to the more remote bounds of inconceivable hypothetica.

skibum · msg:3184781 · 5:55 am on Dec 11, 2006 (gmt 0)

> It isn't about encouraging spammers; it's about being transparent. At the moment the spam issue is used time and time again to cloud any constructive suggestions.


Maybe it's just hard to fathom how large the spam issue is at a major directory, especially a free one. Auto-approving submissions after a certain amount of time unreviewed would flood the live listings with spam. For a small niche site, it is probably much easier to deal with. It seems like any project that has grown this large does need more automated processes, at least to help weed out the flood of spam that comes in. But in order to assure that quality sites do <eventually> make it through, is pure human review the best way to go? Maybe so.

If automated filters were in place to filter out spam as it came in, then surely there would be a flood of complaints from people who got snagged by those filters but maybe shouldn't have.

Maybe an automated test in order to submit sites like the Zeal model had would help but then you'd have people complaining that they took the test, then submitted a bunch of "great sites" and after passing the limit x number of inappropriate submissions the submission account was blocked.

Without a profit motive <and IMHO it's nice to have a site out there that is not focused on that> it's harder to come up with a solution that will fix perceived problems instead of slapping a band-aid on them and having them pop up somewhere else in the chain of events.

Speculation can go on all day and night for years <and it has> but ultimately it's going to come down to what AOL does, what resources they decide to throw at it, and what can be done with those resources without alienating everyone who has put endless time and effort into building the ODP into what it is today.

gotvape · msg:3184871 · 8:38 am on Dec 11, 2006 (gmt 0)

I've been waiting two years now. When I came up for approval after six months, I was informed they had closed submissions.

pagode · msg:3185359 · 6:06 pm on Dec 11, 2006 (gmt 0)

Oh my, what a lot of misunderstanding.

> I've been waiting on Two Years now,
OK, although I can't understand why people are waiting for their site to be reviewed. This is not an abnormal waiting time with DMOZ (we - that is, some editors - would also like to see the time between suggestion and review shrink, but we also know it is not viable).

> when I had came up for approval after 6 monthes,
That is impossible. There is no time after which a site comes up for approval.

> I was informed they closed submissions.
We never closed submissions (unless you count the current technical problems).

I guess that you are referring to the status requests we used to answer at Resource Zone. You were allowed to ask for status every 6 months. After a few years we noticed that giving status updates was of no use to DMOZ, the editors, or the honest/real webmasters. So we decided to stop giving status updates.
But answering status requests never had any relation to the review process or the time between a site being suggested and being reviewed.

hutcheson · msg:3185376 · 6:28 pm on Dec 11, 2006 (gmt 0)

I think the real problem here is that same old attitudinal divide:

Some site suggestors think "I suggested a site: that is my demand for service, or my claim that I have an intrinsic right for service -- and now all the volunteers have got to service me on my choice of time frames. I'll be 'reasonable' and give them a few days/weeks/months, then I have a right to get angry."

Editors think, "We're ALL volunteers in here. Some people have offered credentials and been accepted as TRUSTED volunteers ("editors"). Some people ("outside suggestors") haven't offered any evidence of trustworthiness, but still volunteer to make suggestions. Nobody tells outside volunteers what categories to suggest to; nobody tells trusted volunteers what categories to work in. Each volunteer does what he thinks is most important."

So, if you suggest a site, and it's been six months and no listing, then (at best) you obviously have not been suggesting sites where the trusted volunteers wanted you to; or (at worst) you've been wasting the trusted volunteers' time making worthless suggestions.

So what should a site suggestor do? Must he go off and repent in sackcloth and ashes for, at best, not being sensitive enough to what other people think is important?

No, there is no need. And there is no need to feel guilty. We DON'T MIND if you haven't adjusted your priorities as a volunteer to correspond to what the trusted people think is important! It's really OK. You are really allowed to volunteer help on your own schedule, not ours. It's OK.

The only caveat is this: you MUST, absolutely MUST, allow other volunteers, EVEN THE VOLUNTEERS THAT HAVE OFFERED EVIDENCE OF TRUSTWORTHINESS, the same courtesy.

Webwork · msg:3185532 · 8:40 pm on Dec 11, 2006 (gmt 0)

Hmmm... maybe I've got to go back and re-read, but I'm left with the impression that the two germane questions remain unanswered:

1. Is there the possibility of a better way of processing submissions AND is that better way going to be programmed into the new (hopefully) release of the ODP?

2. Isn't it time to move towards a Web 2.0 model of listings?

It's my rough impression that, for want of an approach that would enable more people to have input on allowing, editing and removing submissions, DMOZ is getting hamstrung.

Isn't it time for DMOZopedia? Not quite like Wikipedia but close, at least in terms of enabling user inputs?

gpmgroup · msg:3185600 · 9:53 pm on Dec 11, 2006 (gmt 0)

> Maybe it's just hard to fathom how large the spam issue is at a major directory, especially a free one.

I agree scalability is a major issue and processes often need to be designed from the ground up with this in mind.

Our network peaked at 9,500 spam emails per hour in November. If we allowed even a fraction of that spam to land in end users' inboxes, our users would quickly become demoralised and inefficient. With careful management, an average of 1.5 spam emails lands in users' inboxes per week. The end user has no spam to distract, report or complain about, so focus, productivity and motivation are greatly improved.

Here are some suggestions on how you may be able to cut the deluge of spam in the first instance:

1) Add a “captcha” to the submission page (Human submitters for Human reviewers)
2) Send out an email to the submitter requiring confirmation
3) Allow submitters to have a unique username or even better (more coding) a submitters account.

This should instantly cause the volume of unsuitable site submissions to fall dramatically. It will also allow you to profile the submitters.

Some people who are not editors and who have a passion for their subject may turn up a batch of sites which would take an age to find or review individually.

If they could log in and see submission times for "their sites", you may be able to end up with a whole new class of "external submitters".

hutcheson · msg:3185611 · 9:57 pm on Dec 11, 2006 (gmt 0)

> 1. Is there the possibility of a better way of processing submissions AND is that better way going to be programmed into the new (hopefully) release of the ODP?

(1) I think the definition of "better" is going to be a sticking point here. "Better" to an editor means "wasting less time on the unprofitable parts and being able to focus on the most profitable parts", which, to a typical commercial site owner, reads as "suggestions exercise less influence on listings."

To focus on the "ideal" -- from the editors' point of view, if we are doing our job perfectly, the presence of a site suggestion will have absolutely zero effect on the site's speed of review, and on its chances of being listed. We are, clearly, VERY far from perfect, and so site suggestions do help a site get reviewed and listed. But movement towards the ideal process will be movement away from any dependence on processing suggestions.

Presumably, that is not what you think "better" means: you probably would define it something like "giving suggestions more influence over priorities and guidelines."

As for the new (hopefully shortly) coming release, I understand it is strictly an engine replacement, with no intentional changes to the decorative trim or the driver's control buttons.

> 2. Isn't it time to move towards a Web 2.0 model of listings?

I avoid buzzwords like "web 2.0" because they mean so many different things to different people that they end up meaning nothing.

But, to parse some of the possible ideas that might be involved:

(a) In terms of "mashups": The ODP, through its RDF, has since 1998 provided data that could be mashed by ANYONE. There's still nothing else equivalent in the directory niche. In this we're still leading the way.

(b) In terms of anonymous public participation, the limitations (specifically with respect to website promotion abuse) are being recognized in some of the more populist efforts. Wikipedia is turning back from allowing just anyone to add external links: it would be foolish of the ODP to follow their failed experiment down the wrong fork.

(c) In terms of "buzz": that's not the product of a beehive; it's merely sound pollution incidental to the work. We focus on the honey.

Whether it's time for a different model to build a directory: I don't know that time has anything at all to do with it. I tend to think a good idea isn't chronologically limited: all times are good for a new model. Google is already trying something that may be "web 2-ish" for some value of "2": more power to them. I won't form a preconceived notion of how it'll work, I'll watch and see.

I do NOT believe it's EVER a time for monopoly, for only one model to be permitted to exist. Even now, some projects (Wikipedia, Gutenberg DP) are trying to improve quality by encapsulating the old ODP concept of "trusted contributor." And that is a noble goal! But the fact that "web 2.0" is not universally applicable doesn't mean it is worthless. And the fact that it sorta works for parts of encyclopedias doesn't mean it would work at all for link lists.

I do NOT believe that the ODP will be moving towards more "untrusted" participation. I think we'll be looking harder for participants in places where trustworthy people might tend to gather.

And I do NOT believe that the ODP will make drastic changes in its model. I would definitely like to see other people trying to invent useful models -- but not just anyone! I want to see people with the courage of their convictions, who'll think of something and try it out themselves, rather than diluting it with snake oil and trying to sell it to people who already HAVE a model. Everywhere I go on the net, I see other ODP volunteers, volunteering elsewhere. I'm sure that if someone comes up with a REAL idea, an idea good enough to invest their OWN spare time in, they'll have some help from people with ODP experience. And we might all learn something. But nobody learns anything from uninformed speculation. What I believe really doesn't matter much. It's what really happens that matters. And I'll be excited to see what really happens.

hyperkik · msg:3185700 · 11:46 pm on Dec 11, 2006 (gmt 0)

> I do NOT believe that the ODP will be moving towards more "untrusted" participation.

Nor do I, but for different reasons. One unfortunate aspect of community-based Internet projects owned by for-profit corporations is that the corporation tends to view everything about the inner workings and business model of the project as proprietary, with no vision shared with the community and little to no interest in entertaining ideas from the community. Another is that corporate executives neither trust nor understand members of the community, and the higher you go up the corporate hierarchy the less trust and understanding you find.

AOL has a hard enough time trusting DMOZ editors under the byzantine rules that have proliferated since it acquired the project. It would take an inspired person within the corporation to convince them to follow a different, more trusting approach - and why would somebody do that? When a project is marginalized and all-but-forgotten within a corporation, the best workers will move on, voluntarily or involuntarily, to other projects or jobs. Why spend your time thinking about or improving a project which the corporation plainly regards as an afterthought?

Count the community-based projects owned by corporations which have been summarily terminated, then count those which, once recognized as dated or on a track toward obsolescence, have been carefully rethought and redeveloped into viable, ongoing ventures. I must be missing something, I hope, because I can't presently think of anything which falls into the second category.

hutcheson · msg:3185722 · 12:07 am on Dec 12, 2006 (gmt 0)

From AOL's point of view, there are three levels of trust:

(1) Trusting volunteer developers within their own network servers: this is the trickiest bit, but doesn't affect most of us. However, AOL has managed to let a few technically astute meta-editors behind the firewall -- a pretty impressive demonstration of trust, I think.

(2) Trusting volunteer editors: this is the slightest level, because it's just a matter of using forms to modify databases: with accesses logged, things can't get too bad (at least, so goes the theory.)

(3) Trusting the community to manage itself: this has always been a goal for the sponsors, and historically they've done very well moving towards it. I remember when the three founder/proprietors were the only people who accepted new editors. Now, volunteer admins select people (according to their judgment) who can in turn allow anyone access to editing permissions.

Currently, AOL seems to be investing time and money: and to be fair, the three-legged stool I've been talking about has FOUR legs: community, model, product, and SPONSOR. And the fact is, sponsors (corporate and individual) have been known to retire without allowing for a successor. And a new competitive model would need a sponsor in order to scale for growth -- the ODP server network involves about a dozen high-end Sun servers, IIRC, and that is not cheap to buy or run. A serious competitor could not get by with much less.

The good news here is that, although I don't know the details, the people who have dealt with the sponsorship issues apparently believe that sponsorship will not be an issue for the ODP in the foreseeable future.

Webwork · msg:3185742 · 12:36 am on Dec 12, 2006 (gmt 0)

Alrighty, so... it's not my project to manage... but what about a two-stage process to deal with the spamming issue?

Stage One: Submitted sites get dumped into (a) general queue(s) where they are 1) bot-checked for the presence of "bad words/stuff" (I know, cloaking, etc.) AND, 2) if the bot check comes out okay, they continue in the general queue for pre-approval by volunteers who don't quite qualify as (or want to be) editors, who give preliminary "thumbs up" or "thumbs down" votes. Pass this point and the submission moves to stage two. Fail at this point and it's off to heck for the domain.

Stage Two: Actual editor check and, if pass, into the directory.
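The two stages described above could be sketched as a toy pipeline. The bad-word list, vote threshold, and function names are all made-up placeholders, not anything the ODP actually runs:

```python
# Toy two-stage review pipeline: bot check + at-large votes, then editor.
BAD_WORDS = {"viagra", "casino"}       # stand-in for a real content filter
PASS_VOTES_NEEDED = 2                  # net thumbs-up needed to advance

def stage_one(page_text: str, thumbs_up: int, thumbs_down: int) -> bool:
    """Bot check plus at-large pre-approval votes."""
    if any(word in page_text.lower() for word in BAD_WORDS):
        return False                   # bot check failed: off to heck
    return (thumbs_up - thumbs_down) >= PASS_VOTES_NEEDED

def stage_two(editor_approves: bool) -> bool:
    """Final editor review; only stage-one survivors get here."""
    return editor_approves

def review(page_text: str, up: int, down: int, editor_approves: bool) -> bool:
    return stage_one(page_text, up, down) and stage_two(editor_approves)

print(review("Local widget club homepage", up=3, down=0, editor_approves=True))
print(review("cheap viagra deals", up=5, down=0, editor_approves=True))
```

The design question the replies raise is whether stage one's volunteer voters are meaningfully different from editors at all; the sketch just shows where each gate would sit.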

Either change the model to make it more accessible and participatory and get with the masses OR start to wither and die. It seems to me that right about now - with the server and the architecture being examined (I presume) - would be a good time to go with at least a test of 2 or 3 alternative submission protocols.

hutcheson · msg:3185763 · 12:57 am on Dec 12, 2006 (gmt 0)

>Either change the model to make it more accessible and participatory and get with the masses OR start to wither and die. It seems to me that right about now - with the server and the architecture being examined (I presume) - would be a good time to go with at least a test of 2 or 3 alternative submission protocols.

The server and architecture aren't being examined at this point. Getting it back together with the failsafe server wired in is surely a higher priority than anything else.

But you're still thinking in terms of "improvements" == "priorities and deadlines associated with suggestions". And the editors are, I think I can promise, not going to be thinking in those terms. The editing goal is "getting good sites listed" -- and that can be done efficiently without using suggestions at all!

The concept of "submission protocol" doesn't correspond to anything in the reality of the ODP process, so we can't be discussing alternatives to something that simply doesn't exist.

The ODP "protocol" is very simple, and almost infinitely flexible. It goes like this: editor looks for sites, editor picks site to review, editor reviews site, editor decides whether to list site, rinse and repeat. Because the protocol is so flexible, you don't need "suggestions" on improving it. Any editor can tweak his own protocol, optimising for efficiency or effectiveness or comprehensiveness or personal entertainment value or whatever. Any editor can demonstrate that his own protocol is more effective at optimizing for whatever; and when that happens, that protocol spreads in the internal forums--to any editor who's interested in it, that is. There's nobody (internally or of course especially externally) who has the right to impose a different way of working. Any way of working that builds the directory, and doesn't interfere with other directory builders, is good by definition. And nobody--not ODP admins, not metas or editalls or senior editors or anyone--has a right to tell anyone otherwise.

What intrinsic place does Google have in the protocol? None whatsoever, of course: Google didn't exist when the ODP started out. But many editors use Google (and have their own tricks for getting the most use out of Google).

So then, do we need a "universal mandatory Google protocol"? The very question is absurd.

What I can't figure out, is why the idea of a "submittal protocol" isn't everywhere recognized as equally absurd!

Webwork
msg:3185777
1:43 am on Dec 12, 2006 (gmt 0)

I smell death.

What I am reading is that "there is no better way; we have evolved as far as possible". When an entity doesn't evolve whilst the world changes around it, what often happens is a slow decline, even death, for that entity. Asserting that the submission and review process is - in essence - as good as it gets doesn't mean that the experiment - and evolution is an experiment - is over.

How could this be?

Well, the ODP "product" is under a GNU-style public license, isn't it?

I can envision the eventual full-on cloning of DMOZ as an open public project: A splinter group of editors takes the DMOZ dump, finds a backer and deploys the data to a different management system, one that does a better job of distributing and automating the submission and review process -> DMOZopedia style.

Mambo begot Joomla. Backers and participants of a founding premise - devoted people but of a different mind concerning implementation, deployment, other - choose the open source GNU way: "Okay, you don't own it, so we'll take it upon ourselves to present this premise in a new way".

I'll venture a guess that amongst the editors there is already something in the works. The sign of this will be member editors pitching some version of DMOZ2.0 internally and those who "know better" (the top dogs) arguing for all the reasons why it won't work. Anyone care to confirm that there is an internal dialogue and some fracturing into camps about how to take the project forward?

It's possible there are 7,000 editors of like mind. Is it possible that 63,000 ex-editors left out of frustration with the status quo? Only those who have participated know.

Experiment . . evolve . . or extinction? DMOZ and DMOZOpediaStyle?

Mambo and Joomla?

I know that volunteers and their ideas can be pretty hard to contain sometimes.

"Nothing is quite so powerful as an idea whose time has come".

I'm just guessing here, but the server issues might catalyze a new movement, in GNU style: one that takes the DMOZ idea but manages to deploy it in a manner and method that outdoes the original, Wikipedia style?

[edited by: Webwork at 1:23 pm (utc) on Dec. 12, 2006]

hutcheson
msg:3185806
2:16 am on Dec 12, 2006 (gmt 0)

The submittal process is about as fast as it can be, I think. Find the category, give the URL, title and description. No captcha foolishness, no Zeal test, no registration: Joe Friday couldn't have stripped it down any further: "just the essential facts." And even if you do that little bit wrong, the editors will correct it!

There just aren't any inefficiencies to strip out.

And -- for those of you interested in the site-suggestion backwater of the ODP ocean -- it's just as efficient on the inside. Whatever I decide to do with a site suggestion is just about as simple. Correct the URL, title, and description (we always assume that they're wrong, and we're always right), one click to add or move or not-add, type the bare minimum of information for THAT operation, and it's done. Again, no inefficiencies to remove.

The only way that could be improved would be by automated assistance in spotting likely spam candidates -- but where COULDN'T we use that?

The point is, someone who's interested in looking at site suggestions is not going to find, anywhere, a more productive way of doing it.

Our concern, our focus for improvements and changes, is going to remain making OTHER ways of finding sites more efficient, encouraging editors to use those OTHER ways, every step aimed at getting closer to that ideal, perfect state where all suggestions are irrelevant. Of course, the ODP isn't perfect and never will be, so I suspect that site suggestions will linger on, continuing to provide tiny bits of fragmentary ore.

And I hope that other paradigms for internet search are invented and developed: I'm sure that, if and when they are, ODP editors will be among the first users of them. And as we learn to use them, the current ways of finding sites (including both Google searches and outside suggestions) will become less important. And that will be progress, and I'll watch for it!

As for death: it comes to everything. The ODP needs food (websites). Fortunately, it's very adaptable: it can live without site suggestions forever. It can live without Google and Yahoo. It can live without magazines or business cards. All it needs is to be able to find food SOMEWHERE. I haven't died just because I stopped shopping at Albertson's grocery. (Albertson's DID die shortly thereafter, but I'm still alive.) I'll only die when I stop eating, not when I stop buying groceries at any particular grocery. So long as I can shop elsewhere, or mail-order food, or grow my own, any grocery is a minor convenience, not a necessity.

hutcheson
msg:3185857
3:43 am on Dec 12, 2006 (gmt 0)

Another source of apparent confusion: there are things that CAN be done, but that CANNOT be done by the ODP. That's why there are two organizations in the world: the ODP and Project Gutenberg and Wiki...THREE organizations, that's why there are ...

You get the point. If the ODP can't do something, then I'm certainly not going to criticize if someone ELSE wants to do it. I might even help, if I'm interested in doing that (Wouldn't be the first time. Or the second time. Or the THIRD ... well, you get the point.)

So, don't confuse what the ODP might be able to do, or might WANT to do, and what someone else might try to do, perhaps successfully. I can be a lot more confident about what the ODP can't do, than about what can't be done at all.

(Sometimes, all I know is what everybody hasn't done up till now.)

But note, some things are risky. Someone starting from nothing may be willing to risk it all for a small chance at an achievement; the ODP community would not (and should not be expected to) take that kind of risk with its assets. It would be extremely unreasonable to expect the ODP to make major changes on a speculation: no, UNLESS those changes were first demonstrated and proven practicable and effective and better than the current approach, it would be an abuse of the administrators' powers to force them on the community. And they know that.

And, if you haven't tried them out, chances are your ideas are not so good as Thomas Edison's -- and he had a 99% failure rate. What he did was set up a lab, and try all of them. He didn't try to force people to take untested ideas, nor did he throw them all away because chances are each one was individually worthless. He tried them.

It always comes back to this. If you think you have a good idea, test it. Assume we ALL came from Missouri. Show us. We don't want to hear your untested ideas, your fantastic speculations. It really doesn't matter whether I think it'll work or not. I could be just as deceived as you are. All that matters is: can you show us proof that it does work, on a large scale, in the face of determined vandalism. If you can show that, then it's worthwhile for us to think about. If you don't think it's even worth your own time to test, then ... I'm not contrary, I'll accept your opinion forthwith. If you do think it's worth testing, I'll bet someone will think it's worthwhile looking at the test results.

Proof, not speculation. The current ODP design has proven itself. Its proposed replacement will have to prove itself even better.

RichTC
msg:3186091
10:31 am on Dec 12, 2006 (gmt 0)

I agree with having stats on the dmoz pages about how many await review, how many have been rejected, etc. To clean up the poor image that DMOZ has, it needs to be vastly more transparent than it is currently.

If sludge and spam are such a problem for dmoz then they should make it a fee-paying directory imo. They only get zillions of submissions because it's free.

If a webmaster had to pay a fee for a review, in the same way that a webmaster has to with Yahoo.com and business.com, then that would kill spam and sludge submissions overnight. The site gets reviewed and is either included or declined for whatever reason, but it's processed all the same.

It would also make dmoz possibly a viable business rather than a liability?

The only thing I would add is that if they went down this route, they:

a) made sure that it was ONE entry only for a site, in the most relevant section of dmoz - this removes sites an editor favours getting multiple listings (we see too much of this, and it leads to questions over editor bias)

b) checked each sector, so that sites that no longer work, are under construction, or have outdated descriptions were updated, removed, or adjusted to improve quality control. If editors were employed following the introduction of charges, they could then do this.

c) had a policy of mutual respect for webmasters and worked harder at improving communications with them. i.e. some of the replies I've seen to webmasters in Resource Zone, for example, border on being insulting imo, and this isn't an image that a quality facility should be projecting, especially as it's part of the respected AOL/Time Warner group.

gpmgroup
msg:3186123
11:53 am on Dec 12, 2006 (gmt 0)

>The submittal process is about as fast as it can be, I think. Find the category, give the URL, title and description. No captcha foolishness, no Zeal test, no registration: Joe Friday couldn't have stripped it down any further: "just the essential facts." And even if you do that little bit wrong, the editors will correct it!
>There just aren't any inefficiencies to strip out.

Am I missing something here?

One of editors' biggest and most often repeated bugbears is the sheer quantity of spam in the site submission/site suggestion queue, which prevents them from constructively finding and adding new sites from the queues.

In which case you need to reduce the spam/noise/"toxic sludge" without impeding the worthy submissions.

As an outsider looking in….

1) Add a “captcha” to the submission page (Human submitters for Human reviewers)
2) Send out an email to the submitter requiring confirmation
3) Allow submitters to have a unique username or even better (more coding) a submitters account.

This should instantly cause the volume of unsuitable sites submitted to fall dramatically. It will also allow you to profile the submitters.

All easily coded and all totally automated.
Result - editors' task is easier and genuine gems get listed faster.

It seems too obvious and too simple? What am I missing?
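For what it's worth, step 2 above (email confirmation) could be sketched roughly as follows. This is a minimal illustration with a hypothetical in-memory queue; the names `submit`, `confirm`, and `review_queue` are my own inventions, not anything from real ODP code, and a real system would send the token by email rather than return it.

```python
import secrets

# token -> suggestion details; a real system would persist this.
pending = {}

def submit(url, title, description, email):
    """Record a suggestion and return the confirmation token.

    In practice the token would be mailed to `email` as a link;
    unconfirmed suggestions never reach an editor.
    """
    token = secrets.token_urlsafe(16)
    pending[token] = {
        "url": url,
        "title": title,
        "description": description,
        "email": email,
        "confirmed": False,
    }
    return token

def confirm(token):
    """Mark a suggestion confirmed. Returns False for unknown tokens."""
    entry = pending.get(token)
    if entry is None:
        return False
    entry["confirmed"] = True
    return True

def review_queue():
    """Editors see only confirmed suggestions."""
    return [e for e in pending.values() if e["confirmed"]]
```

Automated submitters that never complete the round trip simply fall out of the queue, which is the entire point of the proposal: the filtering costs the editors nothing.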

Webwork
msg:3186180
1:05 pm on Dec 12, 2006 (gmt 0)

Here's what the Directory Forum Charter says:

>Whilst we are willing to be host to threads that offer reasoned critical analysis or commentary about the ODP the days of members injecting posts into open threads that amount to little more than pejorative comments, adding little more than rancor to whatever value the thread had up to that point, are now over.

For those who are confused - the very purpose of this thread IS to open up the issue of the submission process to critical analysis. The Charter does not say there will be no critical analysis, nor have I. One thing I have stated here is that I will not suffer the unending injection of oblique or tangential criticism or complaints about the submission or approval process into every open ODP thread. For all those who have whined or complained about the ODP submission process, this is an example - and an attempt - to focus on that very issue in a critical, examining, proactive way. (Thank you hutcheson.)

Focus your comments on the contents of the statements or critical analysis of any other member, challenging by reason and probative fact whatever proposition they have stated or argued.

I will no more suffer blunt instrument or ad hominem thrusts or attacks directed at the ODP or its editors than I will suffer similar thrusts or attacks directed towards anyone else, including myself.

Please carefully and mindfully read the Charter and the WebmasterWorld TOS. Both apply here. Pursuant to the WebmasterWorld TOS, comments whose purpose is to address matters of forum moderation are to be communicated by stickymail.

[edited by: Webwork at 2:21 pm (utc) on Dec. 12, 2006]

Webwork
msg:3186198
1:31 pm on Dec 12, 2006 (gmt 0)

How is it that Wikipedia "does it" - publishes content that manages to maintain a fair bit of integrity - with a distributed system of editorial control?

What is wrong with the distributed model of Wikipedia that will cause it to fail, and therefore the model cannot be applied to an open source directory of web resources?

What is it that has led to Wikipedia's emergence and success? Can that be duplicated in the context of editorial control of a directory?

[edited by: Webwork at 1:31 pm (utc) on Dec. 12, 2006]

hyperkik
msg:3186287
3:09 pm on Dec 12, 2006 (gmt 0)

>Trusting volunteer developers within their own network servers: this is the trickiest bit, but doesn't affect most of us. However, AOL has managed to let a few technically astute meta-editors behind the firewall -- a pretty impressive demonstration of trust, I think.

They've let a tiny (statistically insignificant) number of volunteers who have dedicated hundreds, probably thousands of hours to the project behind the firewall, to make minor tweaks?

I suggest you visit Matt Cutts' blog, in relation to Google Answers and the killing of zombies.
>Products from years ago often need overhauls and rewrites or else the underlying code grows stagnant, and the Answers code launched in 2002.
Has AOL ever evidenced interest in letting its volunteers peek at the obsolete code that powers DMOZ, let alone improve it or (gasp) rewrite it? Giving a couple of people enough server access to keep the zombie alive, to me, sounds like a way to avoid paying for tech staff. Spurning every offer to improve the software? To me, that sounds like distrust.

flicker
msg:3186400
4:59 pm on Dec 12, 2006 (gmt 0)

>Why not default submissions to approval and listing after 60-90
>days? Why not allow default listings to appear, after that time,
>with a marking that it "has not been reviewed"?

I think that's an interesting tack for a directory to take, and I'd be interested in seeing the directory that resulted from it (as well as seeing how such a directory would be able to cope with spam and junk submissions.)

But it wouldn't be a human-edited directory at that point. What it would be is a more advanced model of a freeforall-style directory, with a certain amount of human oversight. Definitely not the niche the ODP is filling, nor anything the ODP would be likely to want to give up its niche to become. You'd need an entirely different type of volunteer editor for the Freeforall-Plus directory endeavor. But if you could figure out a way to keep it from being overwhelmed by spam, I say go ahead and go for it. :-)
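For concreteness, the "default listing after 60-90 days, unless at-large reviewers flag it back into the queue" model being debated in this thread could be sketched like this. The window length, flag threshold, and status names are purely illustrative assumptions, not part of any existing directory software.

```python
from datetime import datetime, timedelta

REVIEW_WINDOW = timedelta(days=90)  # illustrative: auto-list after 90 days
FLAG_THRESHOLD = 3                  # illustrative: at-large votes to re-queue

def listing_status(submitted_at, reviewed, flags, now=None):
    """Decide how a suggestion would be displayed under this model.

    A suggestion an editor has reviewed is simply listed. Otherwise,
    enough at-large flags throw it back into the review queue; failing
    that, once the review window passes it appears with a
    "has not been reviewed" marking.
    """
    now = now or datetime.utcnow()
    if reviewed:
        return "listed"
    if flags >= FLAG_THRESHOLD:
        return "queued"
    if now - submitted_at >= REVIEW_WINDOW:
        return "listed-unreviewed"
    return "pending"
```

The interesting design question flicker raises lives entirely in the `flags` input: without a trustworthy population of at-large reviewers, the "listed-unreviewed" state is exactly the spam amplifier hutcheson describes.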

