I was pacing in front of the classroom, giving my last test in Statistics and Research Methods, when one of my advisees came running up to the door, grinning. I walked into the hallway.
"We were arguing about Research Methods!" Franny said, bopping happily.
I looked at her quizzically.
"Last night, I was hanging out with Tess and some other Psych majors and this friend of mine came up and said wasn't it interesting how people spend two weeks of their lives just kissing."
She went on to tell me how Snapple has a series called "Real Facts" printed under the lids of all their bottles of iced tea. Fact #29 is "On average a human will spend up to 2 weeks kissing in his/her lifetime." Her friend thought that was really neat.
The psych majors weren't so sure.
"What culture did they study? I bet that's really different in some countries than in others."
"How would you measure kissing? Is that duration of each kiss or kissing episodes? I mean, if you kiss someone and then break contact and kiss them again twenty seconds later, do you count the seconds looking into each other's eyes as part of the kiss episode?" (I'm willing to bet this was Tess, who is currently micro-coding affectionate touch in videotaped conversations of romantic couples.)
"And how could you measure it accurately? I mean, this would obviously have to be all retrospective data. Even if you kept 'kiss diaries', people wouldn't be able to accurately remember how much time they spent."
"And how reliable is that? I mean, I might be able to report on yesterday or maybe last week. But the last six months? Or the last year? No way!"
"And if you did it cross-sectionally, it would be completely confounded. I mean, older people and younger people could give you estimates, and then you could figure out maybe what a composite person would be like. But you probably have cohort effects, where people who were born in different years had different typical kissing experiences. So no way would it be accurate."
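There is one more angle the students could have taken: sheer arithmetic. Spread over a lifetime, how much kissing per day does "two weeks" actually imply? A quick back-of-envelope sketch (the 60-year kissing span is my assumption, not part of the Snapple fact):

```python
# Sanity check of Snapple "Real Fact" #29:
# "up to 2 weeks kissing in a lifetime"
weeks = 2
total_minutes = weeks * 7 * 24 * 60        # 2 weeks = 20,160 minutes
kissing_years = 60                         # assumed span, roughly ages 15-75
days = kissing_years * 365.25
minutes_per_day = total_minutes / days
print(round(minutes_per_day, 2))           # about 0.92 minutes per day
```

Less than a minute a day, averaged across six decades. Whether that makes the factoid plausible or suspiciously low is itself a nice class discussion.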
The whole story came tumbling out as Franny bounced up and down, grinning happily. Apparently her friend had NOT been thrilled with the discussion. She had liked the factoid - unadorned and just the way it was. It was kind of cool. She hadn't wanted it broken down and dissected.
I, on the other hand, was thrilled.
One of the side benefits of the complexity of psychology as a science is that psychology students spend much more time than many other majors explicitly studying critical thinking. Except we don't call it critical thinking. We call it research methods and statistics. Whereas a typical introductory biology textbook describes fascinating facts and critical experiments that tell a compelling story about the way things work, a typical psychology textbook doesn't.
Psychology textbooks have the sometimes maddening tendency of teaching the controversy - i.e. the process of science.
One set of theories and experiments suggests this . . .
Another set of studies points out major problems with this work . . .
The preponderance of evidence is consistent with this interpretation . . .
But that only holds for particular populations, in particular historical periods, in particular cultures.
Maddening, but deeply embedded in the way scientists really work and the kind of grappling with evidence that scientists struggle with in understanding complex phenomena among heterogeneous human beings living in diverse circumstances.
This is also why virtually all psychology majors take at least a semester - often two or more - of statistics and research methods. One of the goals of learning to design studies is that it forces you to think hard about issues of sampling (who was studied?), measurement (how were ideas actually operationalized and measured?), generalizability (what are the limits to our knowledge?), and confounding factors (are there alternative explanations for these findings?). All of these are critical in gathering data to help us understand the complexity of human behavior.
And all of that training is basic to critical thinking.
The fact that students who had taken a methods class with me a year before could habitually pick apart a 'fact' in terms of those issues while hanging out on their own time made me feel like perhaps some of how we train students can pay off - even if they never become research psychologists, run another ANOVA, or design a study in their lives. Even just as basic consumers of pop culture.
Perhaps we teach them something important after all. Skepticism.
I thanked Franny, and we talked about maybe using that example in class next semester.
When I told my husband this story over lunch, he got a faraway look in his eyes.
"Two weeks of their lives kissing? Didn't we do that in May of 1975?"
© 2010 Nancy Darling. All Rights Reserved