
How Can You Manage Misinformation in Your Online Community? Think Like a Journalist

Community managers face a tough job as misinformation, once the domain of mainstream social media platforms, begins to seep into private forums. An association pro says bringing a journalist’s mindset to the information shared in a community can go a long way.

Misinformation is running rampant on social platforms large and small. Even LinkedIn, a network for professionals, is no longer safe from discussions that stretch the definition of truth.

Inevitably, this will bleed into the platforms you manage, too, even private communities.

So What’s a Community Manager to Do?

Cecilia Sepp, a nonprofit expert and founder of Rogue Tulips, LLC, has been thinking about this issue; she wrote a blog post on the topic late last year. She suggested that community guidelines can set a foundation that keeps discussions from veering off track.

“Our community guidelines may need to be revisited, because we need a starting point to refer to about what should and should not be picked up,” Sepp said. “And a professional forum is not a place where you should be discussing or writing things about political issues on a personal level.”

While an association’s forum may be an appropriate space for discussing a political stance the organization has taken, that is not carte blanche to discuss political issues in general, she added.

“If you are in a community of your peers within a profession or trade or an industry, you need to be aware of the fact that it’s a different level of conversation,” she said.

What Is Truth, Anyway?

The rise of misinformation gives community managers a difficult job as they oversee how people discuss topics in professional forums. A “factual” assertion backed by no sources, or by sources that are themselves faulty, can directly misinform other community members, a problem that long predates the internet. Sepp pointed to urban legends, such as the widely circulated claim that Coca-Cola removes rust, as an example.

When participants in any conversation, online or otherwise, enter with an agenda, that alone may not amount to misinformation, but it can create a climate where misinformation is used.

“A lot of the bad behavior starts with people trying to impose their personal beliefs on other people without having permission from people to discuss that topic,” she said. “And I think that a lot of it [happens] since we’re trying to be more aware of people in the world and how people are sensitive to different issues.”

Even when something seems factual, if it’s poorly sourced or presented in a misleading way, it may be tantamount to misinformation. Sepp pointed to the way science-based data can be more fluid than it seems.

“We like to think science is very clear-cut,” she said, “but there’s a lot of science that really isn’t.”

Health science is a good example, as shown by the often complicated discussions around COVID-19: so little is understood about the disease that theories are often presented as fact, and information can be cherry-picked to support either side of an argument.

One approach she suggested for navigating this murky environment: ask users who share an opinion to make clear they are offering a point of view rather than stating facts.

“If we could train people to do that, boy, would the world be a better place,” she said.

The Role Moderators Should Take

Sepp emphasized that while moderators may have to take an active role in managing discussions where misinformation could be shared, their role, ultimately, is not to act as truth police, fact-checking every single piece of information users share.

Instead, it’s to act as an objective observer, managing a discussion so it remains positive for the community as a whole and curating it away from misinformation. (Some virtual event moderation tips remain useful for community managers online, too.)

Sepp suggested that community managers have a lot in common with journalists and might want to approach their roles as such. That applies not just to moderation but to the foundation of the community itself: she suggested organizations write guidelines that reflect journalistic integrity.

Beyond managing what appears in a forum, though, community managers should feel empowered to make a call when one is necessary.

“I think that the next step after guidance is to nurture community managers that are empowered and confident enough to put posts in moderation until some more research has been done on it,” she said.

Sepp suggested that one way community managers can play this role is by developing an instinct for when a discussion has run its course.

“It’s OK to end the conversation,” she said. “It’s OK to end the discussion, because it elevates that experience [when you don’t] let someone open that can of worms over and over again.”


By Ernie Smith

Ernie Smith is a former senior editor for Associations Now.
