
Results We Can Believe In

Shaping up psychological science.


Published findings aren’t always true

When we read in the New York Times or Psychology Today that a study reported in a top scientific journal showed surprising effect X, we tend to assume two things: that the study was conducted and analyzed rigorously, and that we can therefore safely conclude the effect is real. The vast majority of the time, the first assumption, about the rigor of the research, is a safe one; however, that does not mean the second assumption is true. A study can be run with all possible rigor and still produce effects that aren't reflective of reality. That is not the authors' fault. It is the fundamental nature of science and statistical inference.
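To see why, consider a back-of-the-envelope calculation. This is a minimal sketch with illustrative numbers I am assuming for the example, not figures from any particular field: even when every study is run honestly at the conventional 5% false-positive threshold, a meaningful share of "significant" results will not reflect real effects, especially when statistical power is modest and most tested hypotheses turn out to be false.

```python
# Illustrative calculation (all numbers are assumptions, not data from this post):
# how often does a "significant" finding reflect a real effect?

alpha = 0.05   # false-positive rate of the statistical test
power = 0.50   # probability of detecting a true effect when it exists
prior = 0.25   # assumed fraction of tested hypotheses that are actually true

true_positives = prior * power            # real effects that reach significance
false_positives = (1 - prior) * alpha     # null effects that reach significance anyway

ppv = true_positives / (true_positives + false_positives)
print(f"Probability a significant result reflects a real effect: {ppv:.2f}")  # ~0.77
```

Under these assumed numbers, roughly one in four "significant" findings would be a false positive before any questionable research practices even enter the picture, which is exactly why replication matters.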

Because of this, there are few greater reliefs as an academic than seeing one's own findings replicated, ideally by other labs. No matter what precautions we take, we know there is some probability that our own published findings might not replicate. In some cases, we replicate our own findings before publishing them, which increases everyone's confidence. Still, that doesn't rule out that something highly idiosyncratic was producing the effect in both cases (e.g., wearing a lab coat), so it's best to see replications coming from different labs.

Often, replications are not published with the initial findings, for a multitude of reasons. Scientists are afraid that if they wait for a replication before they publish, someone will 'scoop' them and publish similar findings first (being first counts for a lot in science). Many journals also have short word limits that discourage replications. Finally, for the kind of fMRI work I do, it is extremely expensive to replicate a study, and funding agencies will never fund a group simply to replicate a finding. As a result, many impressive findings live in a kind of limbo. In the absence of published replications (or failures to replicate), scientists are often skeptical of a finding, whereas the public assumes the finding represents the final word on the subject.

In reality, single studies are like anecdotes. They are suggestive and point in a particular direction, but they should be thought of as the first word, not the last. The solution is to run multiple studies and then conduct what are called 'meta-analyses,' which essentially test whether the effect shows up consistently enough across studies to warrant the more general conclusion that it is real (this is an oversimplification). There are meta-analyses that cover hundreds, even thousands, of studies of the same effect. The results of a well-conducted meta-analysis should be trusted far more than any single study, no matter how compelling the original finding is.
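For readers curious what combining studies actually looks like, here is a minimal sketch of the simplest version, a fixed-effect, inverse-variance average, using made-up effect sizes rather than data from any real literature. Actual meta-analyses also deal with between-study heterogeneity, publication bias, and much more.

```python
import math

# Hypothetical effect sizes (e.g., standardized mean differences) and their
# standard errors from five imaginary studies of the same effect.
effects = [0.42, 0.15, 0.30, 0.05, 0.28]
ses = [0.20, 0.12, 0.15, 0.10, 0.18]

# Inverse-variance weighting: more precise studies count for more.
weights = [1 / se ** 2 for se in ses]
pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
pooled_se = math.sqrt(1 / sum(weights))

print(f"Pooled effect: {pooled:.2f} "
      f"(95% CI {pooled - 1.96 * pooled_se:.2f} to {pooled + 1.96 * pooled_se:.2f})")
```

The point is not the arithmetic but the logic: no single entry in the list settles the question; the pooled estimate and its uncertainty do.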

Making copies: the Journal of Psychology Replications

Because journals tend not to publish strict replications and are loath to publish failures to replicate (since there are many reasons why effects don't replicate that are unrelated to whether the effect is real), what exists is a haphazard, uncontrolled set of unpublished attempts at replicating important effects. The best labs rarely run pure replications of others' studies because there are few incentives for doing so. I'd like to propose a two-part plan for remedying the situation. I'll focus on social psychology, but the ideas can be applied in at least some other areas.

First, the Society for Personality and Social Psychology should nominate 10 studies each year for which an accumulation of replications would be especially valuable. All the members could vote each year, like the Academy Awards. The idea would be to find studies that are really exciting if replicated (and important to discard if they consistently fail to replicate). All incoming social psychology graduate students would select one of these 10 studies, in consultation with their advisor, to replicate during the first two years of graduate school. They would enter their choice into an open-access registry, along with their own prediction of whether the effect will replicate. Authors of the nominated papers would be invited to provide additional methodological and statistical details so as not to be bombarded with questions from first-year graduate students all year long. Although such detail is traditionally thought to be in the original papers, there are almost always details missing. The authors could even upload photos of the configuration of things in the lab or on computer screens. Students could also choose to run a strict replication or more of a conceptual replication, but this choice would be registered upfront as well. This would be great for students because it would provide them with a careful training experience working closely with their new advisor.

Second, all of the results of these studies, regardless of the findings, would be published in the to-be-created, online-only Journal of Psychology Replications. Introductions and discussions would be minimal (250 words each), with the focus on methods and results. Thus, every first-year student would get a guaranteed first-authored publication. They would also be free to use the data as part of a more traditional publication. (This would work better in the U.S. than in some other countries where PhD programs are much shorter.) Critically, each of the 10 nominated studies would receive 30-50 replication attempts very quickly. Last but not least, someone would be invited to conduct a meta-analysis of all of the studies replicating a particular finding, to be published in the journal. This could work through a nomination/vote process as well.

I freely admit that I would be a bit anxious to have one of my own studies nominated for the year. However, if you make it through this gauntlet and the exciting effect you initially found holds up, it would serve as a much stronger testament to the quality of your research and insights. This process would allow both the public and the scientific community to have far greater confidence in our findings.

Social: Why Our Brains Are Wired to Connect - now available from Amazon http://amzn.to/ZlYMP3

Follow me on Twitter: @social_brains

Follow me on the Web: www.scn.ucla.edu
