Center for Open Science Findings Raise Questions About Research Integrity
A reproducibility project carried out by the Center for Open Science found that more than half of the published psychology studies it reviewed could not be successfully duplicated. What can associations learn from the findings and apply to their own research?
Conducting research is an essential way that many associations serve their industry. It’s imperative, then, to ensure that the research is conducted fairly and responsibly. That means a study’s findings should hold up when the study is put to the test and reproduced.
That’s exactly what the Center for Open Science attempted to do in a four-year “reproducibility project” focused on psychology research. What COS found, however, was that of the 100 social science studies it re-ran—all of which were published in three leading psychology journals—more than 60 percent could not be successfully reproduced.
COS published the findings last week but emphasized that the failure to reproduce results didn’t necessarily mean that the original studies were incorrect.
“A replication team must have a complete understanding of the methodology used for the original research, and shifts in the context or conditions of the research could be unrecognized but important for observing the result,” Elizabeth Gilbert, a team member from the University of Virginia, said in a statement.
Role of Associations
Still, the results of the COS project raise a lot of questions about properly conducting research.
“Associations can, and do, play a role in bolstering the quality of research in their field,” Patrick Glaser, director of research at McKinley Advisors, said in an email to Associations Now. “Associations that have researchers among their membership should weigh how their members’ needs are being met and where the gaps are. Similarly, associations that fund, sponsor, or conduct research directly should consider their own role in the process. If they are funding research, are they promoting research integrity through their process of reviewing and selecting projects? Do they support integrity through their requirements? And what standards are they adhering to themselves?”
Glaser, coauthor of ASAE’s recently released report Responsible Conduct of Research, said groups conducting research can take steps to ensure that they understand what the process involves and, in turn, run a successful study.
“A survey, for example, will require expertise in designing questionnaires, substantive knowledge of the subject matter, an understanding of available online or offline survey data collection tools, and the knowledge of how to analyze and report on survey data. If the expertise cannot be found in-house, they can tap outside vendors that should be able to demonstrate competency in those areas,” he said. “It is also helpful to think about the purpose of the research. Is the research being conducted for the sake of informing the public, the field, or some other stakeholder group? The answers to these questions have implications for revealing whether any conflicts of interest exist.”
And when the research is completed, it’s important to look critically at the findings, Glaser said.
“The cornerstone of research integrity is scrutiny, and in some cases it may be necessary to find researchers that can judge independently as to whether the report is fair or not,” he said. “Associations have the authority to represent a field, and to bring its members together. They are sometimes the only source for critical information for members and the public. Thus, it is very important for associations to ensure they conduct research with the utmost integrity.”