James C. Coyne Ph.D.

The Skeptical Sleuth

Faux Evidence-Based Behavioral Medicine at Its Worst (Part I)

Review reflects politics, not a grasp of the management of cancer pain.

Posted Mar 20, 2012

In this next series of blog posts I will treat the recommendation with skepticism and suggest that we need to be wary of such recommendations coming from professional organizations. First, I will explain why, and then I will begin the critique of the Society of Behavioral Medicine meta-analysis, which will be continued in future blog posts.

We need to keep in mind that the branding of "evidence-based practice" is an important marketing tool for professional organizations. The branding legitimizes research funding, third-party reimbursement for services, and acceptance by policymakers and consumers.

Effective marketing of services as "evidence-based" means more and higher-paying jobs for members, which means more members and, together, a more powerful organization. Moreover, as professional organizations become more dominated by clinicians rather than researchers, the temptation grows to sacrifice a commitment to decisions based on best evidence to the narrower guild interests of clinicians.

Professional Organizations and Their Critics

Organizations must display some semblance of having followed the rules for determining what is an evidence-based practice, and they often marginalize critics and aggressively resist any suggestion that claims are not based on best evidence. The goal is to have the appearance of enough rigor to justify preordained conclusions and to avoid anyone raising doubts about the quality or quantity of evidence or about the integrity of the whole process.

This is why claims of professional organizations should be examined skeptically, recognizing that the organizations have as much conflict of interest as a pharmaceutical or dietary supplement company claiming its products are scientifically proven to be effective. Although it is more difficult when professional organizations are involved, the task remains the same as for industry: uncovering the vested interests and conflicts of interest behind the promotion of particular claims. The skeptic has to follow the money, uncover the funding source, and consider what financial interests are furthered by the claims being made.

Professional organizations need to preserve their appearance of detached, dispassionate consideration of the evidence, and they often capitalize on readers having lent them a credibility that industry does not have. After all, professional organizations are seen as serving the lofty goal of promoting science and human well-being, and they must downplay that they also serve guild interests with large financial stakes in promoting the practices offered by their membership as science-based. The skeptic who makes the connection risks the wrath of the organization and the label of "cynic." Only a cynic would doubt the lofty goals of the organization and bring up instead the distracting issue of guild interests. Skeptics who share the professional credentials of the organization's members are particularly dangerous, such as psychologists who raise doubts about the evidence for the efficacy of psychotherapy or psychiatrists who worry that antidepressants are being overprescribed, and so they are held in particular contempt, as traitors to their professions.

I could provide ample incidents from my own career, but let me cite a recent Psychology Today blog post by Allen Frances, the former chair of the American Psychiatric Association DSM-IV Task Force. He has become a harsh critic of the APA's DSM-5 Task Force deliberations. The APA stands to make millions from sales of the DSM-5 manual to clinicians, policy makers, and researchers, and its members have a stake worth untold millions of dollars over time in reimbursement for services for an expanded range of diagnoses. According to reports in Frances' blog and a Time magazine article, the APA has recruited a new public relations spokesman, who previously worked at the Department of Defense, and has declared Frances

"a dangerous man trying to undermine an earnest academic endeavor."

This provides a perfect example: a professional organization protesting that its efforts are "an earnest academic endeavor" while branding a detractor-critic "a dangerous man." I could not have invented a better illustration of my point.

Where seldom is heard a discouraging word.—Brewster M. Higley

Professional organizations have formal rituals for rolling out claims of being evidence based, while protecting themselves from scrutiny or criticism. Special issues of journals, presidential addresses, and invited symposia are designed to create credibility, while notably excluding critics or naysayers from having a voice. Yet seldom do these activities carry a declaration of conflict of interest related to their support from professional organizations. At best, there is a grateful acknowledgment that this support was provided, with no mention that it may have tainted the proceedings.

An excellent case example is provided by the Society of Behavioral Medicine's (SBM) recent declaration that psychosocial interventions to reduce pain in cancer patients are ready for systematic implementation. The announcement was presented as the conclusion of a meta-analysis published in the Journal of Clinical Oncology (JCO) on February 10, 2012, and of a closely related special issue of JCO that is edited by the senior author and contains articles by him. The meta-analysis is also highlighted in an April 2012 special symposium at the annual meeting of SBM in New Orleans. The special issue of JCO, as well as the meta-analysis, will be the focus of a series of blog posts.

I have not done a thorough exploration of every detail of the meta-analysis in JCO or of the studies that were entered into it, but when I present my critique in the next blog post, I think readers will agree that no further analysis is necessary to undermine the credibility of the meta-analysis. What I have found is sufficient to indicate gross deficiencies in its integration and interpretation of bad studies, deficiencies that can serve to mislead cancer patients, their family members, clinicians, and policymakers.

Bad Evidence

Just a tantalizing clue for skeptical readers who want to jump ahead and go directly to the meta-analysis: the largest effect size by far (ES = 1.69) was shared by two poorly designed studies, one comparing 10 patients who received hypnosis to 10 who received usual care, and the second comparing 20 patients who received relaxation therapy to 17 patients who received usual care. The smallest study compared 7 patients receiving pain education to 6 remaining in usual care. These studies are too small to have warranted inclusion in the meta-analysis. The largest study entered into the meta-analysis compared 202 patients who received telephone-based management of medication to 203 patients who remained in usual care and did not have access to the medication that the intervention patients had, much less the telephone management. That study does not fall under the umbrella of a "psychosocial intervention to reduce pain" and should not have been included in the meta-analysis. But there are many more discoveries if you keep looking.
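To see why a 10-versus-10 trial cannot be trusted to anchor a meta-analysis, consider the sampling uncertainty around its effect size. Here is a minimal sketch (my own illustration, not anything from the article) using the standard large-sample variance approximation for Cohen's d; only the effect size 1.69 and the sample sizes come from the studies described above.

```python
import math

def cohens_d_se(d, n1, n2):
    """Approximate standard error of Cohen's d (standard large-sample formula)."""
    variance = (n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2))
    return math.sqrt(variance)

# The hypnosis trial: 10 patients vs. 10 controls, reported ES = 1.69.
se = cohens_d_se(1.69, 10, 10)
lo, hi = 1.69 - 1.96 * se, 1.69 + 1.96 * se
print(f"SE = {se:.2f}, 95% CI = [{lo:.2f}, {hi:.2f}]")
```

The standard error comes out at roughly 0.52, so the 95% confidence interval runs from about 0.67 to 2.71: the study is compatible with anything from a modest effect to an implausibly enormous one, which is exactly why such tiny trials make poor inputs to a pooled estimate.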

What is strikingly missing from the Society of Behavioral Medicine meta-analysis is any analysis of this problem, or of how psychosocial management should be integrated with medication or even education about medication in the treatment of cancer pain. Are psychosocial interventions for cancer pain complementary or alternatives to management with medication? When and under what circumstances? Psychosocial interventions have a role in facilitating effective pain management for cancer patients with medication, but what is it and what is the evidence for their effectiveness?

Keeping the "War On Drugs" Out Of The Hospice

Think of it. The declaration of SBM is broad and makes no distinctions among various psychosocial interventions in terms of their efficacy, other than to indicate that in pooled analyses skills-based approaches are better, though not significantly more effective than pain education. One of the many flawed studies entered into the meta-analysis, the one with the third largest effect size (ES = -1.10), is a trial finding that cancer patients receiving soothing music actually had higher pain levels than patients in a control group. The SBM meta-analysis article does not call attention to this finding, but without comment keeps soothing music under its umbrella of approval of 'psychosocial interventions warranting systematic implementation.'
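Pooling is how a strongly negative trial like the soothing-music study can disappear into a favorable bottom line. Here is a toy inverse-variance (fixed-effect) pooling sketch; only the effect sizes 1.69 and -1.10 come from the article, while the variances and the fourth study are hypothetical numbers I made up for illustration.

```python
def pool_fixed_effect(effects, variances):
    """Fixed-effect pooled estimate: weight each study by 1/variance."""
    weights = [1.0 / v for v in variances]
    return sum(w * d for w, d in zip(weights, effects)) / sum(weights)

# Hypothetical mix: two large positive effects, the negative music trial,
# and one modest positive study with a small (hypothetical) variance.
effects = [1.69, 1.69, -1.10, 0.40]
variances = [0.27, 0.15, 0.20, 0.05]
print(round(pool_fixed_effect(effects, variances), 2))
```

With these made-up inputs the pooled estimate is about 0.57: comfortably positive, even though one study found the intervention made pain worse. A pooled number alone, without attention to the individual studies behind it, can quietly bury a harmful result.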

Be Skeptical About the Undocumented Opinions of The Skeptical Sleuth. Who Is He To Think For You?

I have now provided my negative, but as yet largely undocumented, assessment of this article, so treat my personal opinion with skepticism. Before I provide documentation, read the article and decide for yourself. Here are links to the meta-analysis, a widely accepted checklist for evaluating meta-analyses and systematic reviews, and a simple checklist from the Cochrane Collaboration with which you can evaluate the risk of bias in the studies included in this meta-analysis. Note that the meta-analysis sits behind a paywall; access can be obtained through a university library or by purchasing the article for $22. [I know, this is outrageous, but don't get me going on open access versus paywalled articles.] If you do not have such access, you can write the corresponding author and request a free PDF at paul.jacobsen@moffit.org.

Then, when you are ready, go to my next blog post and see how I quickly evaluated this meta-analysis and some of the studies that went into it. If you too get worked up and consider writing a critique to the Journal of Clinical Oncology, be advised that the journal has a strictly enforced policy of refusing to publish critical correspondence if the authors of the article in question decline to comment. The policy is not publicized. I found out about it when a colleague and I wrote a critique of a study that grossly exaggerated the rate of PTSD among cancer survivors, misinterpreting their continued physical health problems as psychological. The editor wrote back that the authors refused to reply and that he would not engage in further communication concerning the policy that governed his decision not to publish our critique. Effectively, authors of articles, no matter how flawed and misleading, have final censorship over any exposure of their faults. That is just the kind of protection that professional organizations seek.

Finally, bonus credit to the skeptical sleuths who follow further the money associated with the special issue of JCO. All of the articles in the issue state "the author(s) indicate no potential conflicts of interest." However, go to an article by the senior author of the meta-analysis in the November 2011 issue of Psycho-Oncology. Note the declaration of conflict of interest there, and note the similarity between the title of the Psycho-Oncology article and the topics of the special issue. Go to the senior author's curriculum vitae (CV) on the web and examine page 15. Note the $10 million from Pfizer that is associated with the Psycho-Oncology article. Is there a conflict of interest that should have been declared for the special issue of JCO? The plot thickens...

Stay tuned for my next blog post available Tuesday March 27, 2012 at 5 PM Eastern Daylight Time.

Postscript (March 23, 2012): I received a phone call this afternoon from Alan Christiansen, President-elect of the Society of Behavioral Medicine (SBM). He strongly objected to my description of this flawed meta-analysis as commissioned by SBM and noted that nowhere does the article indicate that it was commissioned by SBM. While that is true, I have an e-mail from the first author, Sherry Sheinfeld Gorin, in which she describes the meta-analysis as one of "the three SBM-sponsored meta-analyses (at least one of which is now published, Sheinfeld Gorin et al., Journal of Clinical Oncology, 2012: I can send you a copy if you'd like)".

Furthermore, the article does indicate that financial support was provided by Bonnie Spring and David Mohr. At the time the meta-analysis was initiated, Bonnie Spring was either President (2008-2009) or immediate Past President of the organization, and David Mohr was head of the Evidence-Based Behavioral Medicine Committee of SBM. So, if "sponsored" makes the leadership of SBM more comfortable than "commissioned," so be it. But rather than quibbling about whether the meta-analysis was "commissioned" or merely "sponsored," we should not be distracted from its poor quality or from concerns that its conclusions are misleading.