Lessons From Facebook’s Hate Speech Controversy
After a boycott by women's groups over offensive content, Facebook responded with a full action plan to address their concerns. Both the campaign's structure and the company's response are instructive.
When Facebook faced an outcry from several groups over its handling of hate speech—against women in particular—it was an uncomfortable situation all around. But in the end, the advocacy campaign had the intended effect.
Here's what your association can take away from the controversy and apply when your own flare-ups arise:
The situation: In recent years, Facebook has struggled with how to handle censorship and hate speech: The company has been criticized for overzealously blocking content in some cases but being lax in others. Facebook has worked on creating policies to deal with these issues, but implementation has been inconsistent and often driven by surges in user reaction. In recent days, the issue flared up again, with women's groups noting that a number of Facebook pages containing sexist content and depictions of violence against women stayed online despite numerous complaints.
The backlash: The criticism of Facebook gained steam after the advocacy group Women, Action & the Media (WAM) wrote an open letter to the social network. “It appears that Facebook considers violence against women to be less offensive than nonviolent images of women’s bodies, and that the only acceptable representation of women’s nudity are those in which women appear as sex objects or the victims of abuse,” the letter argued. “Your common practice of allowing this content by appending a [humor] disclaimer to said content literally treats violence targeting women as a joke.” The letter, signed by numerous advocacy groups and nonprofits, asked Facebook users to contact advertisers regarding content deemed offensive to women. On a separate page, WAM posted graphic examples of content allowed to remain on Facebook despite its clearly offensive nature. More than a dozen advertisers removed their ads from the service as a result of the boycott, the Los Angeles Times reported.
The response: On Tuesday, Facebook agreed to modify its standards. “In recent days, it has become clear that our systems to identify and remove hate speech have failed to work as effectively as we would like, particularly around issues of gender-based hate,” Marne Levine, the company’s vice president of global public policy, wrote in a blog post. “In some cases, content is not being removed as quickly as we want. In other cases, content that should be removed has not been or has been evaluated using outdated criteria.” The company vowed to update guidelines, improve training processes, increase accountability, and work more formally with groups that have direct experience dealing with hate speech, such as the Anti-Defamation League. The site is also adding new degrees of accountability for users who post crude content, requiring them to use their real identities if they want their posts to stay online.
The lessons: There are two, one from each side of the dispute: how to wage a successful advocacy campaign (as WAM did) and how to respond to criticism constructively (as Facebook did). By drawing attention to the company, WAM convinced Facebook to react in a way that promised long-term change. And when Facebook offered a solution, it was more than a promise. It was a structured approach, allowing for outside input and checks from groups that have experience dealing with such issues. “This was a targeted, strategic, and ultimately apparently quite effective campaign to try to force a powerful company to do more than it has been doing. That is something we haven’t seen much of yet,” Susan Benesch, founder and director of the Dangerous Speech Project, told the Los Angeles Times.
Has your organization ever taken part in a boycott, or had to respond to one? If so, what did you learn? Let us know your take in the comments.
(photo by Johannes Fuchs/Flickr)