Bad Science Revelation of 2012: Reparative Therapy is a Lie
In 2012 we learned that "reparative therapy" is a lie based on bad science.
Posted Dec 27, 2012
Spitzer, now 80 years old, was no stranger to the controversy surrounding his chosen subject. Thirty years earlier, he had played a leading role in removing homosexuality from the list of mental disorders in the American Psychiatric Association's diagnostic manual. Clearly, his interest in the topic was more than a passing academic curiosity – indeed, it wouldn't be a stretch to say he seemed invested in demonstrating that homosexuality was changeable, not unlike quitting smoking or giving up ice cream.
Fast forward to 2012, and Spitzer is of quite a different mind. In April he told a reporter with The American Prospect that he regretted the 2001 study and the effect it had on the gay community, and that he owed the community an apology. And in May he sent a letter to the Archives of Sexual Behavior, which published his work in 2003, asking that the journal retract his paper.
Spitzer’s mission to clean the slate is commendable, but the effects of his work have been coursing through the homosexual community like acid since it made headlines a decade ago. His study was seized upon by anti-homosexual activists and therapists who held up Spitzer’s paper as proof that they could “cure” patients of their sexual orientation.
Spitzer didn’t invent reparative therapy, and he isn't the only researcher to have conducted studies claiming that it works, but as an influential psychiatrist from a prestigious university, his words carried a lot of weight.
In his recantation of the study, he says that it contained at least two fatal flaws: the self-reports from those he surveyed were not verifiable, and he didn't include a control group of men and women who didn't undergo the therapy for comparison. Self-reports are notoriously unreliable, and though they are used in hundreds of studies every year, they are generally regarded as thin evidence at best. Lacking a control group is a fundamental no-no in social science research across the board. The conclusion is inescapable -- Spitzer's study was simply bad science.
What’s remarkable is that this classic example of bad science was approved for presentation at a conference of the leading psychiatric association, and was subsequently published in a peer-reviewed journal of the profession. Spitzer now looks back with regret and critically dismantles his work, but the truth is that his study wasn’t credible from the beginning. It only assumed a veneer of credibility because it was stamped with the imprimatur of his profession.
Why this occurred is a bit more complicated than a mere case of professional cronyism. For many years before his paper on reparative therapy, Spitzer had conducted studies that evaluated the efficacy of self-reporting as a tool to assess a variety of personality disorders and depression. He was a noted expert on the development of diagnostic questionnaires and other assessment measures, and his work was influential in determining whether an assessment method was valuable or should be discarded.
Little wonder, then, that his paper on reparative therapy--which used an interview method that Spitzer recognized as reliable--was accepted by the profession. This wasn't just anyone claiming that the self-reports were valid; it was one of the most highly regarded diagnostic assessment experts in the world.
Reading the study now, I'm sure Spitzer is embarrassed by its flaws. Not only did he rely on self-reports, but he conducted the participant interviews by phone, which escalates unreliability to the doesn't-pass-the-laugh-test level. By phone, researchers aren't able to evaluate essential non-verbal cues that might cast doubt on verbal responses. Phone interviews, along with written interviews, carry too much guesswork baggage to be valuable in a scientific study, and Spitzer certainly knew that.
The object lesson worth drawing from this story is that just one instance of bad science given the blessing of recognized experts can lead to years of damaging lies that snowball out of control. Spitzer cannot be held solely responsible for what happened after his paper was published, but he'd probably agree now that the study should never have been presented in the first place. At the very least, his example may help prevent future episodes of the same.