Online Moderation: Don’t Fan the Flames With Your Messaging
Moderating an online community is already hard enough. Don't make it harder by failing to follow your own rules. Just ask YouTube, still smarting after a recent painful misstep.
Content moderation is a harder job than ever, in no small part because its potential impact is no longer limited to the platform itself, especially when the figures at the center of a conflict have significant profiles of their own.
Last week, YouTube found its content moderation practices heavily scrutinized by the online public after it chose to “demonetize” the channel of a popular conservative commentator, Steven Crowder. Crowder had made a string of comments, widely seen as homophobic, about Carlos Maza, a video producer for the website Vox. Maza raised the issue on Twitter, a platform YouTube doesn’t own, so the bulk of the conflict took place outside of YouTube’s walled garden.
But it was still YouTube’s problem. The Google-operated video service was caught between a rock and a hard place: Both parties in the conflict are well-known figures with significant fan bases, so no matter what decision it made, people were going to be mad. The company’s rules on harassment explicitly ban “hurtful” language, but YouTube failed to stick to those rules when it initially sided with Crowder. That only drew more attention, and YouTube had to modify its response.
Today has generated a lot of questions and confusion. We know it hasn't been easy for everyone. Going forward, we'll be taking a closer look at our own harassment policies, with the aim to update them. Our thoughts and plans: https://t.co/sYJYK44djO
— TeamYouTube (@TeamYouTube) June 6, 2019
It did this through a series of responses on a Twitter account, @TeamYouTube, that at times seemed to fan the flames. Eventually, the company had to put up a series of blog posts promising it would rethink its approach to the problem. This gave the impression to some creators, including those critical of Crowder, that the company’s moderation standards were completely arbitrary.
Long story short: Nobody was happy, and YouTube’s response made matters worse—particularly as the blowup occurred during Pride Month, an event that the company was featuring in its own advertising.
A Messaging Problem
Now, to be clear, I don’t think associations will be dealing with issues quite this messy, nor on the scale of YouTube, one of the most active platforms around. But I do think the confusion created by YouTube’s own messaging was a big part of the problem. And that’s where an important lesson can be drawn.
“The problem for YouTube is that for rules to be taken seriously by the people they govern, they need to be applied consistently and clearly,” New York Magazine scribe Max Read wrote. “And YouTube has done a very bad job of that.”
If you run a community, you need to moderate consistently and objectively. That’s not easy if there’s too much flexibility, either in the rules themselves or in your organization’s interpretation of them. Whether you’re talking about a small community of a few hundred people or a large one of a few hundred million, poorly considered rules of the road are bound to create major headaches. And this conflict highlights just how hard the job really is.
But if you’re managing a community on the internet, it’s the job your organization signed up for. In an interview with The Washington Post, Microsoft researcher Tarleton Gillespie, an adjunct professor at Cornell University and the author of a recent book on content moderation, pointed out that building a platform comes with a kind of responsibility that nobody has on the open internet:
Platforms wish they could be the open Web done right: that they help us all join in, say what we want, find entertainment and community—and they get some ad dollars in the process. What they quickly found was that building community requires dealing with social tensions; that providing people a way to be heard means they will use that tool to their own ends, whether noble or cruel.
Often the solutions to moderation problems aren’t technical in nature; they’re all about communication.
Have a Plan and Stick to It
In many ways, this debate over what’s good or allowable online—whether in a website comments section, a private community, or a hosted forum on a secondary platform like Slack or Facebook Groups—is fundamental to the internet, and everyone’s going to have an opinion.
This discussion might lead you to tighten the reins, or even drop a community entirely. But if you do keep that community around, you have to take a close look at your moderation strategy. And once you have a basic framework built, stick to it, both as an organization and within your own community management team.
YouTube’s failure to stick to an existing framework made things a lot worse for everyone involved. There’s no reason the same thing has to happen to your communities. Follow the rules you set for yourself—and don’t let molehills turn into mountains.