Reddit and the Problem with Hands-Off Administration

Note: Over the course of this post, I’ll be making reference to, but not linking to, content that is likely to be shocking and/or offensive.

Free speech is a noble cause. Few, online or off, would say they oppose the ideal of a free press and free public discourse.

However, the First Amendment, the codification of free speech in the U.S. Constitution, only protects the public from the government. This means businesses, employers, individuals and other non-government entities can, and often do, place limitations on what can and cannot be said in certain places.

Most sites take advantage of this to some degree. While almost every site filters and blocks spam, it’s not uncommon for forums and comments to have rules on obscenity, personal attacks, acceptable topics and other limitations on free speech.

Some of this is an effort to prevent chaos and keep the legitimate, on-topic users from being drowned out by irrelevant content. However, much of it also centers around maintaining a sense of community and making the site the kind of place that people want to visit.

Still, there are some sites that take the idea of “free speech” even more to heart and publicly vow not to remove any content that they aren’t legally required to remove. This approach is tempting because it seems to be the most fair, removing the personal biases of the administrators from the rules that run the community.

However, as the popular social news site Reddit found out this week, that path is fraught with its own perils and can risk both legal issues and alienating the very people who were drawn to it.

The Story of Reddit and TheFappening

Reddit, at its core, is a social news site made up of thousands of smaller communities called “subreddits”. Any user can create a subreddit on any topic they desire, and each subreddit has its own moderator team that lays down rules specific to that particular community.

When it comes to offensive content at the administrative level, Reddit’s policy has always been very open about the types of content it allows. On its rules page, Reddit describes itself as a “pretty open platform and free speech place,” but notes it has rules against five things: spam, vote manipulation, posting personal information, child pornography and tampering with the site itself.

However, that “hands-off” policy got a test last week with the nude celebrity photo leak. While the leak may have gotten its start at 4chan, Reddit quickly became ground zero for people looking to both view the existing photos and post new ones. That effort was centered on a user-created subreddit entitled /r/TheFappening, which was dedicated to the leak. It quickly became one of the most popular, and easily the fastest-growing, subreddits on the site.

The subreddit remained up for about a week and became the center of public controversy and legal threats, especially as the backlash against the leaks grew. However, over the weekend two things happened almost simultaneously. First, Reddit posted an explanation to its blog about why it doesn’t ban controversial subreddits. Then, it banned /r/TheFappening.

Reddit admins would later post an explanation as to why the subreddit was banned, saying, in short, that the subreddit had become so onerous to manage due to traffic, copyright claims and spam that it was interfering with the rest of the site. As such, they shut it down. They also shut down dozens of other subreddits related to the leak, including clone subreddits of /r/TheFappening.

However, those who have followed Reddit will know that this isn’t the first such battle it has fought. In 2011, the site drew controversy over /r/JailBait, a subreddit dedicated to non-nude but sexualized images of underage girls. Reddit originally refused to remove the subreddit but, as outrage grew, including coverage on several mainstream media outlets, Reddit eventually took it down and its rules were amended to prohibit sexually suggestive material involving minors.

That cycle repeated itself a year later with /r/Creepshots, a subreddit dedicated to photos of women taken without their knowledge or consent, usually in public. That subreddit became a major news story after a substitute teacher was fired for posting images to the forum and, after the outrage, it too was banned.

In four years Reddit has faced three instances where it has been forced to ban subreddits that have become the center of national controversy. Sadly, this trend is only likely to grow as Reddit becomes more popular, putting even more strain on its policies and further testing its principles.

Why the Law is a Poor Guide for Communities

When it comes to using the law as a rule book for your community site, there are three key problems:

  1. The Law Isn’t Always Clear: In many of the subreddits that have been banned, Reddit’s legal obligations were dubious. The site might have been legally safe, but there have been no relevant court tests. The law is often ambiguous, especially when it comes to the responsibilities of intermediary parties, and few, if any, want to be the legal case that sorts these issues out.
  2. The Law Isn’t the Only Limit: In addition to the law, you have limited resources for running your site including time, money, people, computing resources and so forth. Even if you are on the correct side of the law, legal battles, public backlash and user backlash can sap all of those resources, making it impossible to continue. This is what Reddit has most directly struggled with.
  3. Pushing Boundaries: Finally, sites that don’t impose limits tend to become havens for people who don’t want such limits. Whether it is with copyright-infringing works, questionable pornography or racist/sexist material, users always try to push the boundaries of what is acceptable on a forum. In the case of communities that turn to the law for their standards, this means users will push the boundaries of the law itself, often with limited understanding of the laws they are testing, while putting the site in danger of legal action.

This says nothing about the image problem that can come with such a policy. If you’ve only read mainstream media stories about Reddit over the past five years, you could easily believe that Reddit is primarily a meeting place for people who like to look at photos of women, often underage, shared against their subjects’ will.

While Reddit holds to the ideal that it is a haven for free speech, limited only (or at least primarily) by the law, that ideal is not practical, as has been shown time and again.

Now, that impracticality has left it in a difficult position. Though /r/TheFappening has been removed, other subreddits that feature stolen photos of women remain active. The only thing separating them from /r/TheFappening is that these subreddits feature non-celebrity women, meaning there is no national news story and no army of lawyers filing takedown notices.

The same can be said for the deluge of racist, homophobic, misogynistic and other offensive subreddits that exist on the site, all of which are widely known but haven’t become national media stories yet.

In short, Reddit’s content policy isn’t governed by an internal process or even, as it claims, the law. Instead, it’s governed by public backlash, with the site quickly removing subreddits that attract too much negative attention to maintain. That makes the arbiters of what is acceptable on Reddit not its admins or users, but the public.

That is neither free speech nor an evenly-applied community standard.

Bottom Line

To be clear, I agree with the removal of all of the subreddits involved. Ethically, I am opposed to all of them and many were legally dubious for Reddit to begin with.

However, it’s clear that Reddit’s policy is not working well for the site. Three national controversies in four years is bad enough. But the most recent removal leaves it in an awkward position: standing by subreddits that encourage the posting of leaked or stolen sexual images of ordinary women while banning subreddits that do the same with images of celebrities.

Rather than simply punting on the issue and turning to the law as a guidebook for community administration, it’s important to have a conversation with the community and decide what type of place it wants to be. Administrators need to work with moderators and users to craft policies that not only steer the site away from legal trouble, but also nurture the type of community they want.

While Reddit is right when it says that it is a platform, platforms still have a choice in the type of culture they nurture.

Furthermore, Reddit is not the U.S. government. It is a private company with a private website. While free speech is a noble ideal, there is no law requiring that it, as a private company, provide it and, in many ways, it already doesn’t. After all, plenty of spammers think Reddit is anything but a haven for free speech.

In the meantime, though, Reddit is an interesting case study for other community administrators and a warning against being too hands-off when it comes to your content policies.

A counterexample might be Facebook, which has a lengthy list of unacceptable content types. While enforcement of that policy hasn’t always been perfect, the policy has largely kept the site from becoming the center of major controversies or being called hypocritical when it removes content that does spark outrage.

While there are certainly drawbacks to Facebook’s approach, it’s safe to say that the site has done very well for itself.
