Everyone at the Twitter water cooler is talking about pre-registration—the idea that scientists should submit their research plan and get it approved before collecting data.
There are two (or more?) sides to the debate. One side says that we have to stop researchers who selectively report data that tell a good story and conveniently don't report measures, or entire studies, that don't. The other side says preregistration simply doesn't work for some forms of inquiry and that it ties the hands of researchers who want the freedom to modify study 2 after looking at the results of study 1, etc. You can read the two sides here and here, and check out this Storify version of the debate here.
Preregistration is a textbook case of regulation. It's a set of rules designed to prevent people and/or institutions from engaging in practices that help them but hurt society. We don't want people to "cheat" with their data, so we set up regulations designed to prevent cheating.
I have no problem with regulation. I'd regulate the crap out of Wall Street and oil companies if I were in charge. But I do have a few concerns about preregistration.
1. Trust. Regulating psychologists is like going to APS and putting up a big fat sign that says "you people are not to be trusted."
Maybe it's true. Obviously some of us really aren't to be trusted, as well known recent scandals have shown. But I think most of us are trustworthy. When thinking about how to deal with the people we trust, maybe we can get by if we educate our students and ourselves about best practices of science. In my opinion, this is one of the points where preregistration is worth it. I like to be trusted, but I'd be OK giving up a little trust to force myself, and others, not to mess around with data.
What about the cheaters who we really can't trust--will preregistration help? Of course not. There are a couple of great ways around it. First, collect your data and when you get something good, preregister the methods, twiddle your thumbs for a while so it seems like you're collecting data, and then publish your lovely data. If a study produces bum results, just don't preregister it. Or if that's too much effort, take the traditional approach: fake the data.
2. Adversarial relationships produce adversaries. When you tell people that they're not to be trusted and that there's a new system designed to keep them straight, the system becomes the enemy.
I'd argue that right now we have a society of mostly honorable and idealistic scientists who want to find the truth and do the right thing. Again, it's a question of trust, but I'd hate to see scientists' feelings of moral responsibility decay because they see the regulation as responsible instead of themselves.
I worry that grad students coming up will see preregistration as an adversary and a pain in the neck. There are ways to game just about ANY system, and it'd be naive to think there won't be ways to game preregistration.
3. Selectivity is good. I do lots of studies. Some aren't great. Maybe they were dumb ideas to begin with or maybe they have boring, indeterminate results. If I go to Psychonomics and tell people about these studies, for some reason I can't fathom, they always seem to fall asleep immediately.
This is bad for me and my friends--that is, for authors and readers. First, if I preregister something that ends up being boring, I have to spend my time writing it up, and I don't have enough time to write up the good stuff as it is now, much less the boring stuff. I'm not trying to hide bad results; I just prefer to spend my time on results that might contribute to the literature.
Second, there are already way too many articles to read; the last thing I want is for preregistered articles to start coming out even when the results are boring. In addition, I'm afraid that people will start to form an association between "preregistered article" and "article with sort of boring results." Not that good articles won't be preregistered, but a lot of boring ones will be too.
All of which is to say, again, it's about trust. If you believe in me, let me concentrate on telling you about my best studies. Trust that I won't hide the ones that disconfirm my theories, I'll just hide the boring ones.
The counterargument to this last point is a good one: what if I do 50 unrelated studies and one shows a crazy social priming result, so I publish only that one? From a science standpoint, I'm basically Freddy Krueger. So maybe preregistration wins this section of the debate.
4. Lots of good studies have been done without preregistration. If you outlaw guns, only outlaws will have guns, and if you outlaw non-preregistered studies, only outlaws will do non-preregistered studies. In other words, let's not all decide that studies that aren't preregistered are bad. Neurocritic has a proposal for some new research badges.
5. Red tape. In order to preregister, you have to write up studies that might end up being boring, and you have to submit them, and depending on the situation, they have to be reviewed and approved before you're supposed to conduct the study.
In other words, if I want to do a study, I might face the following choice:
Arghh. Preregistration would take all the fun out of science!
That might sound infantile, but let's be honest: you probably hate lots and lots of administrative and/or bureaucratic rules that you think are just red tape. You have to report this to the administration or department; you have to clear that with the IRB. But many of these rules and procedures probably sounded worthwhile when they were proposed. Is preregistration going to end up seeming like more red tape in an already slow publication process? Again, it depends on how it works. But the danger is definitely there.
In sum, I'd say let's not start derogating studies that weren't preregistered quite yet. And let's recognize that preregistration might have unintended consequences, including a progressive reduction in trust and responsibility among scientists and a progressive increase in red tape.
But I don't know what I'm talking about. My recommendation? Let's all try preregistration and see how it works.