Brain Myths

Stories we tell about the brain and mind.

Is the Widespread Belief in Neuromyths Itself a Myth?

Ambiguous questionnaires can inflate estimates of how widely brain myths are believed.

Researchers interested in people’s beliefs about brain and psychology myths need to be careful they don’t create myths of their own by relying on unreliable research methods. Ambiguous questionnaires and imprecise scoring can lead to inflated estimates of how many people believe in myths like the idea that we only use ten per cent of our brains.

This issue was brought home to me earlier this year when I blogged here about research showing the high proportion of school teachers who endorse neuromyths. The respected health writer Maia Szalavitz over at Time got in touch to point out that some of the myths used in that study were in fact true. I took another look at the questionnaire and I could certainly see that some of the items were written ambiguously, leaving plenty of room for interpretation.


Now a trio of researchers has focused specifically on this issue of how rates of belief in neuromyths (they call them “psychological misconceptions”) can vary with the type of questionnaire used. Led by Sean Hughes at the National University of Ireland, the research is currently in press at the journal Teaching of Psychology and available on the author’s website (pdf).

Hughes and his colleagues, Fiona Lyddy and Robin Kaplan, looked at two main issues: whether the myths are worded ambiguously or precisely, and whether participants record their answers in a simple true/false/unsure format or on a 7-point scale running from Strongly Disagree to Strongly Agree, with Unsure as the midpoint.

An example of a myth presented ambiguously is: “People’s responses to inkblot tests can tell us about their personality”. Stated more explicitly it became: “People’s responses to inkblot tests provide us with valid and reliable means to assess their personality.”

Hughes and his colleagues gave one of four versions of a neuromyths questionnaire to 404 undergraduates in North America and Europe (most of them psychology students). Each questionnaire contained 30 myths and 10 true facts. The myths were written either ambiguously or unambiguously, and answers were collected using one or the other of the response formats described above.

The key finding is that the level of endorsement of the neuromyths varied with the type of questionnaire used. Both ambiguous wording and the 7-point scale independently increased agreement with the myths. This meant the highest level of agreement came from students who completed the questionnaire with ambiguously worded myths and answered on the 7-point scale (they agreed, on average, with 53 per cent of the myths). By contrast, students who completed the version with unambiguous myths and answered via the true/false/unsure format agreed with only 37 per cent of the myths on average.
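To see how the response format alone can shift the headline numbers, here is a minimal sketch in Python of one plausible way to tally "agreement". The function names and the scoring rule (counting a myth as endorsed when it is marked true, or rated above the Unsure midpoint on the 7-point scale) are my own assumptions for illustration, not the authors' actual analysis.

```python
# Illustrative sketch only -- not the study's analysis code.
# Assumes a myth counts as "endorsed" when marked "true" on the
# true/false/unsure format, or rated above the Unsure midpoint (4)
# on the 7-point Strongly Disagree (1) to Strongly Agree (7) scale.

def endorsement_rate_tfu(responses):
    """Share of myth items endorsed under the true/false/unsure format."""
    return sum(r == "true" for r in responses) / len(responses)

def endorsement_rate_likert(responses, midpoint=4):
    """Share of myth items endorsed under the 7-point agreement scale."""
    return sum(r > midpoint for r in responses) / len(responses)

# Hypothetical answers from one student to five myth items:
print(endorsement_rate_tfu(["true", "unsure", "false", "true", "unsure"]))  # 0.4
print(endorsement_rate_likert([5, 4, 2, 6, 4]))  # 0.4
```

The point of the sketch is simply that the reported prevalence depends on where the researcher draws the line between "agrees" and "doesn't", a choice the 7-point format makes less clear-cut than a plain true/false answer.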

As an aside, the North American students tended to agree with more myths than the Europeans (although the former used an electronic version of the questionnaire, which may have made a difference). First-year psychology students agreed with just as many myths as non-psychology students, but by the second year their endorsement rates were lower, and third-year psychology students performed better still.

The findings about the effects of questionnaire style could help explain why previous research on the prevalence of belief in neuromyths has produced such wildly varied results. Hughes and his colleagues urge researchers in this area to look closely at the methods they use and to provide a rationale for their choice of myths and response format. They also encourage researchers to explore alternatives to questionnaires, such as open-ended questions. "Our findings draw attention to the dual influences of language and response format when constructing and interpreting the findings of misconception questionnaires," they concluded. "We suggest that features of the procedures used … may inadvertently lead researchers and educators alike to overestimate their prevalence."

Taken together with another new paper that questioned whether brain scan images really do have a seductive allure, as is commonly assumed, this new research is a reminder to brain myth busters everywhere: beware you don't inadvertently allow yourself to become a myth-maker.

Christian Jarrett, Ph.D., is the editor of the British Psychological Society's Research Digest blog and a staff writer for their magazine, The Psychologist.
