But where there are people on the Internet, there is also horrible, negative commentary. For simplicity's sake, we will refer to communities that support racist or otherwise discriminatory viewpoints as hate speech communities. In the wake of the recent upheaval at Reddit, the new leadership has committed to helping moderators mitigate these types of speech. This is a great step, as combating it on our own is untenable.

The things people say online, with or without a connection to their real identity, provide a disturbing glimpse into some of the darker parts of humanity. Most moderators on our teams have been called disgusting and racist names, threatened with physical violence, and some have been subjected to doxxing attempts. Many times, our female moderators have endured rape threats in public comments and private messages (though it's not always limited to women). More than once, female moderators were told they needed to just "shut up, lay back, and be a good woman." When users are unsure of a moderator's gender, they tend to hurl rape threats or insults based on sexual orientation, often at the same time for added effect!

Social media platforms have deployed a few different strategies for coping with this; most revolve around the outright banning of content. Sites like Facebook and YouTube hire thousands of content moderators to keep the most unsavory elements out of their feeds. Instagram uses banned hashtags to mitigate pornography. Tumblr banned blogs that encouraged self-harm. But the fundamental issue with content bans is that they are inherently limiting and fail to kill off the "group think" that drives the content in the first place. The problem with controversial or hate speech is typically tied to how social media is built: users pick out viewpoints that reflect their own, exclude other ideas, and exist in echo chambers that amplify and reinforce their thoughts. Once a community like this hits a critical mass, the ideas it screams propagate across the site, and its users take great offense at dissenting or competing viewpoints. This is not limited to "hate speech" but applies to all ideas. The behavior gets noticed when ideas that would typically be frowned upon are suddenly being screamed through a bullhorn.

The ACLU recently did an AMA relating to the one-year anniversary of the Michael Brown incident in Ferguson. This AMA was linked to in some of the racist communities on Reddit and their related off-site chat rooms. As a result, the AMA was flooded with comments suggesting, among other things, that blacks have lower IQs than whites and are inherently prone to violence. These are not the kinds of comments you typically see in an AMA, yet due to the growth of these communities, the ideas are rapidly spreading across Reddit.

Reddit is navigating relatively uncharted waters now. And as a company, it tends to wear its heart on its sleeve: you can easily see the intent of any community-related policy or decision it makes. It appears there are two approaches Reddit has taken to dealing with "hate speech" communities.

The first is outright banning the community in question. Most recently, this approach was taken with the controversial subreddit r/fatpeoplehate. Almost every time a subreddit of decent size was banned, its community members made it a personal mission to flood other communities with its ideas in protest. This influx of hate-based speech on other forums did tremendous damage to Reddit's reputation among new members considering joining, who saw only posts related to the banning of some niche communities and had no way of knowing this wasn't business as usual. The outright banning of content tends to create a sense of martyrdom in the name of free speech, meaning some of Reddit's darkest ideas spread to unrelated communities.

Reddit's second approach, and its new method under Huffman, involves isolating communities and refusing to support them beyond the bare bones of the site's architecture. They are allowed to exist, but only in a small corner with no resources and no permission to aggressively promote an agenda across the site. This allows the site to choke out racists and bigots without giving them motivation to act as martyrs for free speech.