The decision to censor material because it is dangerous or highly offensive is a difficult one, but it's not uncommon. Community managers face situations like this often, and while the decision is subjective, it's never taken lightly. We weigh many things as we consider removing content, but mostly we look to see whether it violates community standards or guidelines. The grey areas are spelled out in the site's abuse grid.

Abuse grids and community guidelines are living documents. Since the Web is changing on a daily basis, these documents will be updated often, referencing new types of abuse or speaking to new issues that may need addressing. Not everyone needs a dedicated community manager, but every company needs to take responsibility for their online community.


Abuse: Clean It Up

The free exchange of information on the Internet is important, and most community managers understand and encourage this. We encourage open discussion but still must remove abuse when we see it.

By removing abuse, you create a safe environment; in doing so, you encourage creativity and give confidence to new members with something important to say. Cleaning your site of abuse actually encourages free discussion. Community members who have to trudge through a site overrun with personal attacks and sock puppets can barely express a coherent thought.

We strive to create an environment conducive to free discussion but free of bear traps.

Defining Abuse

The definition of abuse is determined by your community guidelines. Those guidelines are your external Dos and Don'ts for your community. Internally, you'll enforce those guidelines with the abuse grid. The abuse grid for your site is an internal document, and it covers every possible abuse situation you'll encounter. A leaked document purporting to be part of the Facebook community toolkit surfaced last week.

Community guidelines are focused on preventing abuse, so they are typically positive in nature. They are also vague, with more attention focused on encouraging good behavior, in the hope that most people will read them and happily comply. Abuse grids are focused on dealing with abuse, so they are typically negative in nature. They are written in an "if this, then that" format, with clear rules that define racism, sexism, homophobia, pornography, spam and more, according to your site's definitions. They also take a user through a series of disciplinary measures, or strikes, eventually resulting in the removal of posting privileges.
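The "if this, then that" structure described above can be sketched in a few lines of code. This is a minimal illustration only: the offense categories, strike weights, and action thresholds below are hypothetical assumptions, not taken from any real moderation toolkit.

```python
# Minimal sketch of an "if this, then that" abuse grid with strike-based
# escalation. All categories, weights, and thresholds are illustrative.
from collections import defaultdict

# Each offense type adds a (hypothetical) number of strikes.
STRIKES_PER_OFFENSE = {
    "spam": 1,
    "personal_attack": 2,
    "hate_speech": 3,
}

# Escalating disciplinary measures, keyed by cumulative strike count.
# The highest threshold the user has reached determines the action.
ACTIONS = [
    (1, "warning"),
    (3, "temporary_suspension"),
    (5, "posting_privileges_removed"),
]

strikes = defaultdict(int)

def report_offense(user, offense):
    """Record an offense and return the resulting disciplinary action."""
    strikes[user] += STRIKES_PER_OFFENSE.get(offense, 1)
    action = None
    for threshold, name in ACTIONS:
        if strikes[user] >= threshold:
            action = name
    return action
```

For example, a first spam report yields a warning, a follow-up personal attack triggers a temporary suspension, and continued abuse removes posting privileges. A real grid would, of course, be far more detailed and hold its definitions in plain language for moderators, not code.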

Censorship for Safety's Sake

But as the Web grows and changes, people use it for more types of engagement each day. As our communities change, so must our guidelines. In some cases that means redefining abuse. Recently, Tumblr began to censor self-harm blogs and instituted a program to bolster healthy self-image in response to thinspiration blogs. Reddit removed sexual content that featured minors, including the /r/jailbait subreddit. And Facebook, in conjunction with the National Suicide Prevention Lifeline, now sends a link to an online counselor to individuals who appear distressed based on their postings.

Increasingly, disturbing phenomena are being addressed, despite worries that it will lead to a less free environment. While I'm not saying you can't post insanity to the world via the Web, I am saying you can't do it on my site.
