Home / Forums Index / WebmasterWorld / Community Building and User Generated Content
Forum Library, Charter, Moderators: rogerd

Community Building and User Generated Content Forum

Forum modification by membership 'at large'
Elements of a successful design - some ideas

 9:43 pm on Nov 27, 2004 (gmt 0)

I've often thought that modification by the membership at large would be a boon for forums - if it could work.

A few months ago I noticed that community moderation was being given a test drive by CraigsList. Now, if a community as diverse as CraigsList can make it work why not elsewhere? Yet, I don't think I've seen it in operation anywhere else.

Now, it isn't quite full blown moderation at CraigsList. Instead it's thread locking. Members can flag a thread for violation of CL's TOS. If enough members flag a thread then it's locked for review.

So, how would you integrate community modification into a website?

Me? If I was designing the system:

I'd only grant the flagging option to ripened members, which could mean as few as 50 posts.

I might like to have the flags coded - SpamURLDrop, Profanity, Flaming, etc. - to drop them into different review queues.

I might like to require more flags for posts by senior members and/or fewer flags for newbies.

I might have the system coded for the 'elder violator' to delete the flagged post him/herself and resubmit x1. I wouldn't offer this to newbie/spammers since it would likely encourage more.

I'd have the system lock the flagged poster's posting rights until reviewed.

Alternatively, I might send the post to a thread jail, where it's viewable by the poster only and only the flaggers can comment (capped at, say, 2 comments to kill any flame wars while still offering constructive criticism). The system might require an "off to jail" notice so flaggers can follow up.

I might limit the number of flags anyone can 'throw' in a given period, with the option of increasing the number of flags based upon a history of correct calls.

The system should autonotify the poster of a flagged thread.

What else would you build into such a system? What problems would you expect? How would you deal with those problems systematically or by TOS? Any coders out there: How hard would it be to code any element of such a design and how hard would the code hit system resources?
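For what it's worth, the core threshold logic above is cheap to code. A minimal Python sketch follows; the function names and numeric thresholds here are illustrative assumptions, not taken from any real forum package:

```python
# Sketch of the flag-threshold rules: only 'ripened' members may flag,
# and senior posters need more flags than newbies before a post locks.
# All numbers are made up for illustration.

MIN_POSTS_TO_FLAG = 50  # the 'ripened member' bar

def can_flag(member_post_count: int) -> bool:
    """A member earns the flagging option after enough posts."""
    return member_post_count >= MIN_POSTS_TO_FLAG

def flags_needed_to_lock(poster_post_count: int) -> int:
    """Fewer flags are needed against a newbie's post; senior
    members get the benefit of the doubt."""
    if poster_post_count < 50:
        return 2
    if poster_post_count < 500:
        return 4
    return 6

def should_lock(poster_post_count: int, flag_count: int) -> bool:
    """Lock the post for review once enough flags accumulate."""
    return flag_count >= flags_needed_to_lock(poster_post_count)
```

The logic itself is a couple of integer comparisons per post, so the hit on system resources should be negligible; the real work would be the review-queue UI around it.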



 3:23 am on Nov 28, 2004 (gmt 0)

You've built in lots of options, Webwork. I'd probably add the ability for an admin to override the automated functions, i.e., promote a newer (but trusted) member to mini-mod status, or take away that ability from a high volume member who demonstrates poor judgment.

I have a feeling that building in lots of control variables would be the way to go - that way, it could be adjusted on the fly once the system had been in use for a few weeks.


 4:10 pm on Nov 28, 2004 (gmt 0)

rogerd, I think the principal embedded question is this:

Why wouldn't such an approach work? And, given those conclusions/assumptions about why it won't, how could those problems be fixed or addressed to either eliminate or minimize them?

A police force is a component of a civilized society, yet large segments of society 'act civilized' each day without the intervention of the police. Forums are a society of a different sort: more anonymous, lots of drifters, great diversity, etc. I'm left to wonder if the 'police force' model of behavior regulation is the highest and best approach to problem behavior in a forum, or if anything else can be programmed into the communication infrastructure that helps to weave the society together.

For this thread I'd simply like to focus on software extensions that would play a role in behavior regulation or behavior modification.


 12:30 am on Nov 29, 2004 (gmt 0)

This is really an interesting thought. One thing that many forums do to involve the regular members is to include a "report this post" button. That's not as elegant as your solution, but it's a step in that direction. (This feature is built into most forum software; it sends the moderators a link to the post, plus the identity of the reporter if logged in.)

Your post caused me to look at vBulletin's user group manager. That software has very granular permissions for each usergroup with on/off controls for many functions. It seems like you could get some limited functionality by creating some custom user groups. What you really want to create are special "super mods" who can work across all forums but who have limited powers. You might still need some hacks to get more powerful functionality, of course.


 1:31 am on Nov 29, 2004 (gmt 0)

One of the previous iterations of yayhooray relied on community moderation. Like most attempts at such democracy, there's a critical mass beyond which it becomes too unwieldy to manage the moderators en masse, and the content slides downhill into vapid moderating battles.

Community moderation (ie, democratization) is a nice principle, but it still relies on supermoderators (typically site admins/owners) to police those would-be disrupters, and ultimately can prove to be more effort than the traditional line.


 3:27 am on Nov 29, 2004 (gmt 0)

WibbleWobble - if you come back to this thread, I'd ask for your ideas about solutions - or approaches to solutions - to the problem you described. Let's not leave it with 'this is insurmountable'. I'd ask that you describe in the greatest detail 'the problem', any attempts to address 'the problem' and how the solution(s) failed and how the problem might have been more effectively addressed.

It may be that mods in the situation you observed needed clearer guidelines of acceptable member and mod behavior. It may be that mods needed to be encouraged to branch out and start their own forum if they felt suppressed, restrained or antagonized. (It's hard for me to say since I wasn't part of what you described.)

Perhaps a forum isn't exactly a democracy, and therefore expectations of free speech exceed the expectations of the founder. I'm certain notions of 'free speech' give rise to many conflicts in forums when people wrap themselves in the idea that "I can say anything I please so long as it's not profanity, racism, sexism, etc." Not so when the TOS specifically restrict comments about other members, flaming, etc.

No doubt rules of conduct are interwoven with the issue of community moderation by technology, and I'm certain that the absence of clear rules (versus people's expectations or misguided ideas of their entitlements) fuels many of these conflicts, but for the moment I'd like to focus on the issue of designing into the forum software functions that would effectively allow/facilitate community moderation.

OBTW - I don't care for using the phrase 'community policing' since it may engender negative feelings.


 5:15 pm on Nov 29, 2004 (gmt 0)

I really like these ideas, and while building my forum software I may add in the ability to turn on such features. However, I feel that community moderation has a fatal flaw, which is that if you are looking for community moderation you are most likely lacking trained moderators. If you have a full staff of moderators, you would not need the entire community to help in the moderation. I really believe there is no substitute for good, trained moderators.

On that note, it's hard to find a full staff of moderators who will be available 24/7, looking over every inch of the forum. So a little community aid can't hurt. I am planning on placing the Report a Problem link in a noticeable area. When members report a problem they will be able to specify their particular issue with the post, and maybe give the report a priority based upon their level within the forum. Moderators should then review this queue whenever possible. But I think, at least for now, any action against the offending post will have to be taken by a trained and trusted moderator.

But I'd like to see whether community moderation that directly affects the post would work. Hopefully moderating the flag system would be less work than moderating the forum. It wouldn't be too difficult to implement a system with the features you suggested that could be toggled to either have an immediate effect or to wait for a moderator to take final action. If I do actually do this and get any results, I'll be sure to post about them.
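That toggle - immediate community effect versus waiting for a moderator - can be sketched in a few lines. A hypothetical Python sketch, with all names illustrative:

```python
# Sketch of the immediate-vs-queued toggle: a flagged post is either
# hidden at once, or merely queued for a trained moderator to review.

from dataclasses import dataclass, field

@dataclass
class Post:
    body: str
    hidden: bool = False
    flags: list = field(default_factory=list)

IMMEDIATE_EFFECT = False  # site-wide switch: act at once, or wait?

review_queue = []

def flag_post(post: Post, reason: str) -> None:
    post.flags.append(reason)
    if IMMEDIATE_EFFECT:
        post.hidden = True            # community action bites immediately
    else:
        review_queue.append(post)     # wait for a moderator's final call
```

Keeping the switch as a single configuration value means the same flagging code serves both policies, so the feature can be trialled and rolled back without rewriting anything.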


 11:21 pm on Nov 29, 2004 (gmt 0)

Webwork wrote:
> WibbleWobble - if you come back to this thread, I'd ask for your ideas about solutions - or approaches to solutions - to the problem you described. [...]
> It may be that mods in the situation you observed needed clearer guidelines of acceptable member and mod behavior. [...]

While I'd like to break down all the pitfalls involved in something like this, I really didn't pay attention to the version which allowed community moderation, so I'm not sure what rules were in place, and suchlike.

The ideas you have in place - flags for various moderations, etc. - all seem like the logical steps to take in engendering a good community-driven forum, as they build on and evolve the standards already in place, i.e. the 'report post' feature in most fora. But, for want of a better word, it still requires a 'police force' of entrusted higher-ups to oversee the review queue.

Unless you have an extremely mature set of users, there are bound to be disagreements, and the ability to directly influence a post is a tempting prospect in winning the argument.

I think it also requires a level of transparency that may not be in place in current fora. For instance, when a post is flagged for review, it ought to be displayed as normal with an addendum along the lines of "this post has been flagged # times". The inclusion of the flagger's name is debatable, as it may create unwanted rivalry and attention.
That allows the thread to continue as normal, so the poster doesn't feel persecuted, and it prevents false positives from having a negative effect on the community's access to content.

If a thread or post is deleted or closed - whatever you decide of its fate - I'd personally choose to disable the content of the offending item, with a note from the deciding moderator on the grounds. It stops people posting follow up threads going "omg where is my thread now pls?" and keeps an archive of what you did about it, and why.

It still boils down to needing moderators to oversee the queue though, especially if/when it starts to get larger, when you'll pick up detritus users and duplicate accounts for the sake of disrupting the system.

One (complex) option for creating moderators is to allow flagged posts to be voted upon, with a simple boolean true/false choice. If a certain percentage of the active users (active defined as logged in within X days/hours/minutes) vote true (yes), then the post is automatically moderated into destruction, and the original flagger's note appended. The original flagger would then get +1 to their 'score', if you will. If the active user populace continuously votes for the same result as the flagger, their score will increase. Once a user reaches a certain defined score, they can 'automoderate', whereby they bypass the voting process and are allowed direct moderation privileges (or perhaps they require fewer users to vote the same way as them for the moderation to go through).
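The vote-and-score scheme above might be sketched like this in Python. The pass fraction and automoderation threshold are made-up values, not from any real system:

```python
# Sketch of community voting on flagged posts, plus a trust score for
# flaggers whose calls the community keeps agreeing with.

VOTE_PASS_FRACTION = 0.6  # fraction of active users that must vote yes
AUTOMOD_SCORE = 10        # trust score at which a flagger can automoderate

def vote_passes(yes_votes: int, active_users: int) -> bool:
    """The post is moderated away once enough active users agree."""
    return active_users > 0 and yes_votes / active_users >= VOTE_PASS_FRACTION

def update_flagger_score(score: int, vote_matched_flag: bool) -> int:
    """+1 when the community agrees with the flagger, -1 (floored at 0)
    when it doesn't - so persistent bad flaggers never gain trust."""
    return score + 1 if vote_matched_flag else max(0, score - 1)

def can_automoderate(score: int) -> bool:
    """A sufficiently trusted flagger bypasses the vote entirely."""
    return score >= AUTOMOD_SCORE
```

One design note: flooring the score at zero (rather than letting it go negative) is a choice; allowing negative scores would let you also auto-throttle habitual bad flaggers.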

The latter idea is gleaned from audioscrobbler, which takes a similar approach to artist spelling moderation.

At its most basic, you'd need each post row to contain a column (or columns) in the DB for storing the moderation information. You might have "URLSpam" codified as "111", then append additional data - the number of times reported, or votes against the post, etc. - as "111:13". (It seems to me that you wouldn't have separate reporting and voting counts, because they are basically the same function; by reporting a post you are by definition voting yes to moderating it.) You could store all the various codes in the same column if you then split them into an array during forum-content delivery, or you could keep each moderation type in a different column, which keeps it clean and human-readable.
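The single-column encoding described above ("111:13", multiple codes packed together) could be serialized and parsed along these lines. This is a Python sketch and the flag codes themselves are hypothetical:

```python
# Sketch of packing per-post moderation data into one DB column:
# violation code + count, e.g. "111:13", entries joined with ";".
# The code table is invented for illustration.

FLAG_CODES = {"URLSpam": "111", "Profanity": "112", "Flaming": "113"}

def serialize_flags(flags: dict) -> str:
    """{'URLSpam': 13} -> '111:13'; multiple entries joined with ';'."""
    return ";".join(f"{FLAG_CODES[name]}:{count}"
                    for name, count in flags.items())

def parse_flags(column: str) -> dict:
    """Split the stored column back into {code: count} at delivery time."""
    out = {}
    for entry in column.split(";"):
        if entry:
            code, count = entry.split(":")
            out[code] = int(count)
    return out
```

Whether packing everything into one column beats one column per moderation type mostly depends on whether you ever need to query by flag type in SQL; the packed form is opaque to the database.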

I don't know that any of this was useful, but it's an interesting idea, so I'm keeping an eye on the thread's progression.


 1:59 pm on Nov 30, 2004 (gmt 0)

One challenge is separating bad behavior from disliked opinions. I've seen even experienced mods argue to ban a member who vigorously espouses an opinion contrary to their own; the reasoning is often that the member is being too argumentative and overbearing. In some cases that may actually be the case, but it illustrates that the perception of acceptable posts may vary; a member pushing an unpopular viewpoint may end up the victim of an irritated majority.


 11:50 pm on Nov 30, 2004 (gmt 0)

It's been my experience that you should let everyone have the power to flag inappropriate content. If you have to choose between tons of flagged posts to review and tons of inappropriate posts, you will choose the flags.

The way our system works, any user (registered or not) can flag a post for administrative review. They are allowed to include a note of explanation for the flag. The flag gets sent to a holding tank for administrator review - once the post is reviewed, the admin can either delete the post or clear the flag. Doing either will send a thank-you message to the flagger.

Community moderation will never be a substitute for administrators or mods - but it does help.

We have a couple of folks (non mods) that are methodical about reviewing every single post and notifying us of every possible violation of the TOS. Once your forum gets big, you will appreciate the help - even if it means sorting through a few inappropriately flagged posts.
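The holding-tank workflow described above - flag with a note, admin deletes or clears, thank the flagger either way - might look roughly like this. All names here are illustrative, not from any particular forum package:

```python
# Sketch of the holding-tank review flow: flags accumulate with notes,
# an admin resolves each one, and the flagger is thanked regardless of
# the outcome (which keeps the volunteer reporters motivated).

holding_tank = []
outbox = []  # stand-in for the forum's private-message system

def report_post(post_id: int, flagger: str, note: str) -> None:
    """Any user, registered or not, can file a flag with a note."""
    holding_tank.append({"post_id": post_id, "flagger": flagger, "note": note})

def review(flag: dict, delete: bool, deleted_posts: set) -> None:
    """Admin either deletes the post or clears the flag; the flagger
    gets a thank-you either way."""
    if delete:
        deleted_posts.add(flag["post_id"])
    holding_tank.remove(flag)
    outbox.append((flag["flagger"], "Thanks for your report!"))
```

Thanking even the flaggers whose reports are cleared is the key behavioral detail: it costs nothing and keeps the methodical volunteers reporting.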


 1:02 am on Dec 1, 2004 (gmt 0)

There is a big risk that "community policing" can switch very quickly to "mob justice" or "vigilante rule". Perhaps the heart of the problem is that democracy and forums don't tend to go well together. In societal terms, a forum is more like a gold-rush town in the old Far West - drifters and new arrivals, uneasy tensions and sometimes fierce competition. It's always better for the new arrivals to leave their guns at the sheriff's office, so the sheriff, his deputies and their guns can keep things in check.

Too much moderation power for too many individuals will also increase meta discussion exponentially, and users will start favoring discussions about the board itself rather than the subjects covered by the board. If everyone in the saloon has a say about what to do about the behaviour of every rowdy cowboy, you're more likely to end up with nothing more than a general bar-fight!


 3:10 am on Dec 1, 2004 (gmt 0)

Encyclo, I don't think a flag feature necessarily has to lead to mob rule. There's no reason that admins have to act on the flagging of users, and there is no reason that anyone needs to know about which posts are getting flagged and which aren't.


 10:48 pm on Dec 2, 2004 (gmt 0)

Just to append to this discussion:

In the course of working, I've happened across a phpBB mod that does this type of thing (i.e., allows reporting of various types of bad content), so I'd wager there's a vBulletin one too. Lord knows about voting and automoderation, which I thought was infinitely neater.

All trademarks and copyrights held by respective owners. Member comments are owned by the poster.
WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved