Portrait of Henrik Ibsen, the Norwegian playwright who wrote "An Enemy of the People." The painting is by Eilif Peterssen, 1895, in Oslo.
Source: WikimediaCommons.com/Public Domain

A physician suspects his town’s water supply is contaminated and sends a sample of the water to the University for a “precise chemical analysis.” When the results confirm his suspicions that the water is indeed poisoned and a menace to the townspeople’s health, the physician believes he has a moral duty to share the information with the town leaders. The town’s livelihood, though, depends on its water reserve: its baths have become its “little gold mine” as “crowds of invalids” flock to the town. Repairing this water source is estimated to take two years. To the physician’s astonishment, the Mayor, the physician’s own brother, essentially forbids him from releasing the report because the information would mean the financial ruin of the town. The physician finds himself treated, not as the hero he had expected to be, but as a pariah, “a real enemy of the people.” The play, of course, is “An Enemy of the People,” by the 19th-century Norwegian playwright Henrik Ibsen (1828-1906), and it depicts a typical conflict of interest scenario, i.e., a conflict between “public duty and private interest.” (Newton et al, BMC Public Health, 2016)

Lithograph of Ibsen by Frank Wedekind, 1898.
Source: WikimediaCommons.com/Public Domain

The Oxford English Dictionary defines conflict of interest as “a situation in which an individual may profit personally from decisions made in his or her official capacity.” Financial gain is often the focus of discussions of conflict of interest, particularly in scientific research, because it is more objectively quantifiable, but there are other types of gain. Researchers, for example, may champion a position or conduct investigations that will enable them, their colleagues, or even their institution to achieve professional recognition, advancement, and fame. (Cope and Allison, Acta Paediatrica, 2010)

A conflict of interest is a wicked problem. This term was popularized and defined in the 1970s, in the context of government planning, by Horst W.J. Rittel and Melvin M. Webber, Berkeley professors of design and city planning. (Rittel and Webber, Policy Sciences, 1973) Wicked (i.e., “malignant” or “tricky”) problems have certain distinguishing properties. They are “essentially unique.” Wicked problems have no stopping rule: one stops with the sense that “That’s the best I can do” or “That’s good enough,” and their solutions are neither true nor false but, at best, “better or worse.” There is neither any immediate nor any ultimate test of a solution to a wicked problem. As a result, there is no opportunity to learn by trial and error. Wicked problems are “incorrigible,” and they defy “efforts to delineate their boundaries and to identify their causes.” (Rittel and Webber, 1973)

“There are no official boundaries on what could be a reason for a conflict of interest,” writes Boston University Professor of Epidemiology Kenneth Rothman. (Rothman, JAMA, 1993) For example, Rothman makes the provocative argument that when a journal editor asks for disclosure of “any relationships” that might be construed as causing a conflict of interest, authors might need to disclose their religious beliefs or sexual orientation along with any financial considerations. For Rothman, this smacks of McCarthyism.

Rothman defined conflict of interest as “any situation in which an individual with responsibility to others might be influenced either consciously or unconsciously by financial or personal factors that involve self-interest.” (Rothman, 1993) He noted that the term often has a pejorative connotation—what he called a “pernicious ambiguity.” (Rothman, J. Clinical Epidemiology, 1991, supplement) The term, though, does not necessarily indicate any wrongdoing or even anything unethical. Kozlowski (Science and Engineering Ethics, 2016) defines it as “not being free to take an opposing position,” i.e., “…not completely free to adopt opinions that would be unsupportive of the opinions of their agencies, superiors, close colleagues…” A conflict of interest refers to a setting in which one’s conduct might be affected, and so could lead to a researcher’s work being biased. (Rothman, 1991) Says Rothman, “…no scientist is objective…each of us is buffeted by the winds of unaccountable biases, some overt and some hidden.” (Rothman, 1991)

"The Agnew Clinic" by Thomas Eakins, 1889, in the Philadelphia Museum of Art.  Medical research in the 19th century was much less complex than today. 
Source: WikimediaCommons.com/Public Domain

Bias in research has been defined, rather vaguely, as any deviation from the truth. It includes systematic errors, as opposed to errors by chance, that can occur throughout the course of research and severely compromise the research’s integrity. (Tripepi et al, Kidney International, 2008) David L. Sackett, one of the forefathers of evidence-based medicine, delineated 55 types of biases that he categorized by the stage of research involved, including conducting a literature review, choosing a population sample, and implementing the research’s design, measurement, interpretation, and even its publication. (Sackett, J. Chronic Diseases, 1979) More recently, others have described more than 70 subcategories of possible bias in scientific research. (Delgado-Rodríguez and Llorca, J. Epidemiology and Community Health, 2004)

Bias can result from a scientist’s deliberate alteration of his or her research findings, and that, regardless of motivation, is considered overt fraud. When there is an unconscious “tilting” of study design or data collection, for example, says Rothman, “that is incompetence.” (Rothman, 1991; Rothman, 1993) Adds Rothman, “No one works in a vacuum…and no one is truly free of pressures that might distort intellectual endeavors,” but those pressures notwithstanding, “The scientific work must be judged on its merits.” (Rothman, 1993; Rothman, J. Clinical Epidemiology, 2001)

That is unquestionably the conclusion of obesity researcher and biostatistician David B. Allison, PhD, with whom I spoke. Allison is a Distinguished Professor of Public Health and Director of both the Office of Energetics and the Nutrition Obesity Research Center (NORC) at the University of Alabama at Birmingham. Says Allison, “We should reject ad hominem reasoning when we judge the quality of evidence, including judging research by its funding source. In science, three things are relevant: the data, the methods that generated the data…and the logic that connects the data to the conclusions. We should do everything we can to strengthen the rigor and transparency of those three things, and any focus on ad hominem attacks is not only uncivil, but also unscientific.”

"Trouble Comes to the Alchemist," Dutch School, 17th Century, artist unknown. Painting at Chemical Heritage Foundation, Philadelphia.
Source: WikimediaCommons.com/Public Domain

The label conflict of interest, though, has frequently been used as an ad hominem attack, specifically with the intent to discredit someone personally and create guilt by association. Rothman emphasizes that it is “…essential to the fabric of science” to avoid denigrating scientific research with ad hominem attacks based on its funding sources, a practice that is itself clearly unethical. (Rothman, 1991; Rothman and Cann, Epidemiology, 1997)

It is important, though, to emphasize that funding for research has to come from somewhere, and any funding source, whether industry, government, public, or private, can result in the appearance of potential conflicts of interest. (Rowe et al, American J Clinical Nutrition, 2009) Though perhaps sometimes unfairly (Thomas et al, International Journal of Obesity, 2008), industry funding has become particularly suspect and maligned. For example, a recent study published in the journal JAMA Internal Medicine uncovered correspondence that seemed to reveal the sugar industry’s clandestine ties to research in the 1960s and 1970s that sought to demonize fat consumption and minimize the effects of sugar intake on cardiovascular disease. (Kearns et al, 2016) Another study demonstrated that industry-sponsored research on sugar-sweetened beverages and their relationship to weight gain was five times more likely to report inconclusive scientific evidence than studies independent of the food and beverage industries. The authors, though, acknowledged a limitation in their data: they could not rule out the possibility of publication bias in the studies that did not declare any conflicts of interest. (Bes-Rastrollo et al, PLOS Medicine, 2013) Publication bias occurs when a study is more likely to be published because of statistically significant results (i.e., its outcome) rather than its overall quality. (Allison et al, International Journal of Obesity and Related Metabolic Disorders, 1996)
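Because the point about publication bias is a statistical one, here is a minimal, illustrative sketch in Python, my own toy simulation rather than code or data from any of the studies cited above. It assumes many small, underpowered trials of a modest true effect and shows that if only “statistically significant” results reach print, the published estimates overstate that effect.

```python
# Toy simulation of publication bias (illustrative only; every number here is
# an assumption, not data from Bes-Rastrollo et al or any other cited study).
import random
import statistics
from math import sqrt

random.seed(42)

TRUE_EFFECT = 0.2   # assumed modest true group difference, in SD units
N_PER_ARM = 30      # small trials -> low statistical power

def run_trial():
    """Simulate one two-arm trial; return (estimated effect, is_significant)."""
    control = [random.gauss(0.0, 1.0) for _ in range(N_PER_ARM)]
    treated = [random.gauss(TRUE_EFFECT, 1.0) for _ in range(N_PER_ARM)]
    diff = statistics.mean(treated) - statistics.mean(control)
    se = sqrt(statistics.variance(control) / N_PER_ARM +
              statistics.variance(treated) / N_PER_ARM)
    return diff, abs(diff / se) > 1.96   # roughly p < 0.05

trials = [run_trial() for _ in range(5000)]
all_estimates = [d for d, _ in trials]
published = [d for d, significant in trials if significant]  # selective publication

print(f"True effect:                     {TRUE_EFFECT:+.2f}")
print(f"Mean estimate, all trials:       {statistics.mean(all_estimates):+.2f}")
print(f"Mean estimate, 'published' only: {statistics.mean(published):+.2f}")
print(f"Fraction of trials 'published':  {len(published) / len(trials):.0%}")
```

In typical runs of this toy model, the average estimate across all simulated trials sits near the true effect, while the average among the “published” (significant) trials is several times larger, which is the kind of distortion publication bias introduces.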

Conflict of interest was the subject of a comprehensive report in 2009 by the Institute of Medicine (Lo and Field, eds., National Academies Press) that emphasized that the “central goal” of any policy regarding conflicts of interest is “to protect the integrity of professional judgment and to preserve public trust.” It also noted that conflicts of interest often involve degrees that are “not directly quantifiable” and depend on context, i.e., the particular circumstances that create the risk rather than any single personal decision. Furthermore, those circumstances are seen from the perspective of people who do not necessarily have all the information needed to evaluate the motives involved, and as such conflicts can be only “perceptions or appearances.” Conflicts of interest, just like wicked problems, are not binary. In other words, they are “not simply present or absent, but rather more or less severe.” Adams, in criticizing the “over-simplified perspective” whereby researchers are divided into “those willing to accept funding and those who are unwilling,” describes a “continuum of moral jeopardy, stretching from those with minor involvements to those with unmanageable conflicts of interests.” (Adams, Addiction, 2007)

"Claude Bernard and his Pupils" by L-A. Lhermitte (1844-1925). (Wellcome Library, London) 19th century French physiologist Bernard is considered one of the fathers of the use of the scientific method in medicine.
Source: WikimediaCommons.com/Public Domain

Given the nature of wicked problems, and given that, as humans, we are all subject to biases, some conscious and some unconscious, are there ways to mitigate the appearance of conflicts of interest? Most sources have suggested that full transparency is required and that “adherence to the basic principles of good science like reproducibility, replicability, and reliable scientific reporting” is essential. (Binks, International Journal of Obesity, 2014) In other words, researchers should not only identify their funding sources with full disclosure (Rowe et al, 2009) but should also share their data publicly. (Allison, Science, 2009) Certainly, there is no place for ad hominem attacks that amount to professional bullying, or what Sagner et al refer to as “a fallacy of relevance that undermines scientific progress.” (Sagner et al, Progress in Cardiovascular Diseases, 2016) Only then is there the possibility that researchers can preserve their own professional judgment and maintain the public’s confidence.

The nineteenth-century French physiologist Claude Bernard, often considered the father of the scientific method in medicine, wrote in his 1865 An Introduction to the Study of Experimental Medicine, “A man of science should attend only to the opinion of men of science who understand him, and should derive rules of conduct only from his own conscience.” Adds Allison, “We need to pursue truth through science not as a job but as a discipline, a vocation, and a privilege.”

"Death in the Sick Room" by Norwegian artist Edvard Munch, 1893, in the National Gallery in Oslo.
Source: WikimediaCommons.com/Public Domain
