
#Ideas17: A Scientific Method for Parsing Research

Are you making sound decisions based on the research you're considering? At ASAE's Great Ideas Conference Sunday, a panel suggested taking a systematic approach to analyzing information from both internal and external sources.

If the recent media coverage of the “fake news” phenomenon has taught us anything, it’s that accuracy is something the average person may struggle to assess.

There’s a related issue in the association space: the challenge of making decisions based on the best possible information while recognizing that all information has its weaknesses.

“If we had one takeaway, one message, it would be: Pause. Stop. Step away from the data. Take a deep breath. Think critically. And really, really look,” said Susan Crystal-Mansour, director of the Center for Association Services at Westat, speaking at the “Information Indigestion: Consume Research Responsibly” session Sunday afternoon at the ASAE Great Ideas Conference in Orlando, Florida. “It really doesn’t take that much to look.”

The speakers suggested using a process akin to the scientific method for analyzing data from both internal and external research and evaluating its strengths and weaknesses.

This approach gives an organization a basic framework for analyzing a piece of research or data, weighing its potential biases and sampling weaknesses, and assessing its potential value, Crystal-Mansour said. It also helps operationalize research within the organization, particularly in how association data teams respond to a report or a member survey.

“It’s an ongoing way of thinking about an organization,” added Natasha Rankin, chief operating officer at the American Counseling Association.

The session, which included an interactive element in which groups analyzed research scenarios using the method, touched on common weaknesses in research, including numerous kinds of bias (such as bias stemming from the sample, the responses, the data-collection method, the respondents, and what respondents can recall from memory). Marc Beebe, senior director of strategic research, public imperatives, and corporate development for IEEE, characterized bias as “a systematic error in the research.”

Speakers also explained the distinction between correlation and causation, with Crystal-Mansour referring to a TED Talk highlighting a correlation that seems to suggest ice cream causes drowning. (Hint: It doesn’t.)

David Krantz, vice president of research and knowledge management at the American Society of Interior Designers, said that when research seems to suggest cause and effect, the reader should see a red flag.

“When you see an implied causality, pause and take a look a little bit further,” Krantz said.
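To see why examples like the ice cream one fool people, consider a minimal simulation (not part of the session; all numbers here are invented for illustration): a shared confounder, hot weather, drives both ice cream sales and drownings, producing a strong correlation between two outcomes that never influence each other. Regressing out the confounder makes the correlation all but disappear.

```python
# A minimal sketch of a spurious correlation: two variables that never
# influence each other still correlate strongly when a shared confounder
# (hot weather) drives both. All coefficients are invented.
import numpy as np

rng = np.random.default_rng(0)
days = 365

# Confounder: daily temperature.
temperature = rng.normal(loc=20, scale=8, size=days)

# Both outcomes depend on temperature, not on each other.
ice_cream_sales = 50 + 3.0 * temperature + rng.normal(0, 10, days)
drownings = 1 + 0.1 * temperature + rng.normal(0, 1, days)

# The raw correlation looks alarming...
print("corr(sales, drownings):",
      round(np.corrcoef(ice_cream_sales, drownings)[0, 1], 2))

# ...but controlling for temperature (regressing it out of both
# series) leaves almost nothing, exposing the correlation as spurious.
def residuals(y, x):
    """Residuals of y after a least-squares fit on x (removes x's effect)."""
    slope, intercept = np.polyfit(x, y, 1)
    return y - (slope * x + intercept)

r_sales = residuals(ice_cream_sales, temperature)
r_drown = residuals(drownings, temperature)
print("corr after controlling for temperature:",
      round(np.corrcoef(r_sales, r_drown)[0, 1], 2))
```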

By Ernie Smith

Ernie Smith is a former senior editor for Associations Now.
