The question emerged: how could a neutral outsider decide the issue? We routinely trust people’s reports of their experience, particularly in the absence of ulterior motives. But the critics’ points sounded plausible as well. How may we know the truth?
Luckily, our civilization has developed a way to referee between competing testable claims. It’s called, of course, the scientific method. It’s one of humanity’s greatest achievements.
Many are tempted to see science as a list of answers, a parade of celebrated discoveries and innovations. But scientific discoveries are merely products of the real innovation, which is the method of inquiry itself. The unique contribution of science is its way of posing questions, of testing claims by following evidence rather than expectations, traditions, wishes, fantasies, intuitions or the whims of authority figures.
In other words, science doesn’t want to believe. It wants to know. And it knows how.
To find out whether the facilitated communications were in fact messages from the children or merely the facilitators’ musings, scientists designed a simple experiment: keep the facilitator blind to the question posed to the child. If the facilitators were merely facilitating the child’s own hand movement, then their knowledge of a question should be immaterial to the child’s ability to answer it. The results were conclusive. Autistic children could not answer a question that their facilitator did not know the answer to (or could not hear or see). Taking the facilitator out of the communication loop at the question phase eliminated any positive effects of the facilitation procedure at the answer phase. Facilitated communication lost its status as a legitimate therapy.
Still, not everyone lost their faith in the process. As shown in a recent article, facilitated communication is still alive, still practiced and taught.
This case illustrates, among other things, how scientific knowledge does not easily supplant faith; facts do not readily replace beliefs. Human beings are quite tenacious in their ability to hang on to beliefs, even in the face of contradictory evidence. This phenomenon is so common that it even has a name: the belief perseverance effect.
Professional skeptics and science geeks often assume that this tendency to latch onto faith, and the attendant inclination to persist in a belief in the face of contradictory evidence, are signs of people’s laziness and naïveté. But that view is, generally speaking, lazy and naïve.
Faith and facts appear to be adversaries; but, like two football teams, on a deeper level they actually cooperate in keeping the game going. The game, for humans, is survival. The need to believe and the need to know are both features of humanity's survival architecture. The dynamic tension between faith and knowledge is a manifestation of the rules, not a subversion of them.
For its part, belief advances our survival odds in multiple ways. First, ‘large belief,’ as manifested in a religious faith, serves to strengthen social organization. As the sociologist Randall Collins has well articulated, God is a symbol of our social existence. When we celebrate our God, we in fact celebrate our ability to get along, our shared values and bonds, and our robust and clearly delineated group boundaries. Large faith improves social unity and cohesion. Members of coherent, well-organized groups are more likely to survive. This is one reason that, as E.O. Wilson has observed, the human mind evolved to believe in the gods. It did not evolve to believe in biology.
At the same time, ‘small faith,’ for example my belief that my wife will not leave me, is necessary to sustain day-to-day social commerce. Given that no human being can be completely known and wholly predictable (to others or to themselves); given that the unpredictability of human beings is at least as dangerous to our survival and well-being as--and often more dangerous than--the unpredictability of natural processes or animals; and given that we--herd animals that we are--have to trust each other in order to survive and thrive, faith becomes a necessary scaffold, bridging the gap between the unknown that is ‘me’ and the unknown that is ‘you.’ Trusting someone is always a leap of faith.
Moreover, since belief can be formed quickly and easily, it often precedes knowledge. First on the scene, it helps us organize, tolerate, and persist in the quest for knowledge, which, for its part, moves slowly, haltingly, with many dead ends and wrong turns along the way. Faith allows us to take the first steps even if we don’t yet see the whole staircase, to paraphrase Martin Luther King Jr.
Thus when new knowledge finally arrives, it often emerges against a backdrop of preexisting beliefs and requires some of those beliefs to change. Resistance to such change is often seen as a form of, well, foolishness or laziness. But it need not be that at all.
In fact, resistance to change is a useful feature of any bounded system. A completely porous, infinitely elastic, endlessly agreeable system is no system at all. If change were too easy for us, our lives would become chaotic. A stubbornly inflexible system is still better than chaos, just as stubbornly inflexible parents are generally better than no parents.
Moreover, knowledge itself is often legitimately suspect. History is replete with ‘truths’ that were later shown to be neither ‘the whole’ nor ‘nothing but.’ No wonder old beliefs are reluctant to surrender to new knowledge. Belief, ironically, has good reasons to be skeptical about knowledge. It need not be apologetic.
At the same time, our desire for knowledge cannot be denied. Human beings possess a strong, foundational need to know, to sort among competing claims, to test hypotheses and verify facts--a.k.a., to figure shit out. Faith in this regard is wholly insufficient even when it is necessary. “A casual stroll through the lunatic asylum shows that faith does not prove anything,” said Nietzsche, implying that proving something was desirable. And it is. We desire proof. A baby looking at a ball may have a guess or a preference as to what will happen if she kicks it. But she will not settle for that. Invariably, given a chance, she will kick the ball to see what happens. The Galilean itch to build a telescope and check out the moon is as recognizably human as the tendency to tremble in awe before the grand mystery of the starry night sky.
We want to know. In knowledge, we gain power and control, which we crave for security and safety and peace of mind. And rightly so. Survival (and other) decisions based on fact will in the long run trump decisions based on (our own or others') hunch, hearsay, hope, expectation or guess.
An ironist could say that in belief we become human. In knowledge we become Godlike.
The consideration of how belief and knowledge fit together is not just an abstract intellectual exercise. At the small liberal arts university in the Midwest where I teach, many of my students are confused about what science is and why they should care about it. Faith is easy for them. On the ‘large faith’ level, my students mostly live with religion. They see many believers, but they meet very few scientists. On the ‘small faith’ level, the concept of belief is easy and useful in their lives. Faith requires little effort. The language of belief is socially adept. In my students' lives, faith often lets everyone in and lets everyone be. It keeps all positions worthwhile and valid. You believe what you believe and I believe what I believe. No need to fight. Faith puts people first, and it equalizes them.
But science, they find, is hard. It takes time and effort. And it makes judgments. It has winners and losers. It puts objective truth over the subjective person. My students often feel that science is something alien and harsh while faith is natural and kind.
As a teacher, my first task is to show them that they are in fact scientists already; that the scientific impulse is a part of their inherent human endowment, wired into the processes of their brains. I may try to make that point using the following example:
“A young woman is sitting at a café drinking her latte when she spots a young man across the room. She checks him out and thinks he’s cute. She may tell herself, ‘Being with that guy would be enjoyable.’ What did she just do? She’s created a testable hypothesis, the first stage in all scientific inquiry. Now she has to figure out a way for them to get together, a way to test her hypothesis. She may choose to make eye contact, or just walk up to him and say hi. Whatever strategy she chooses will be her study design. Next, she has to follow through. Design alone will not tell her anything. So she approaches him. They go on a date. That’s her data collection phase. Then she goes back home and thinks about what happened. Did he fulfill my expectations? Was he nice? Was there chemistry? That’s data analysis. She analyzes the information she has obtained and arrives at a conclusion: ‘My hypothesis was supported: I enjoyed his company.’ But she is not done, and should not assume that he’s the one just yet. She needs to go on more dates with him, to seek converging evidence and confirmation. She needs to replicate the study.”
“In other words,” I tell my students, “this young woman correctly followed all the steps of scientific inquiry. She is a scientist, as are all of you.”
This example is often useful in illustrating how scientific thinking is wired seamlessly into our cognitive architecture and used informally by individuals as they navigate their world. But it does not illustrate clearly why we need formal science, why we, as a society, need to invest in and support scientific education, literacy, tools and research. For that, I may use the following vignette:
“A student is found dead in his dorm room with a knife to the forehead. Suspicion quickly falls on his roommate. The roommate has a reputation as a jerk, a hothead. He is not well liked around campus. Many people in fact would like him gone. Most students believe the roommate was the killer; they expect it to be the roommate; they hope it’s the roommate. Then, in come the police investigators. They bring their tools of investigation. They look for evidence: fingerprints, DNA, surveillance videos, eyewitnesses. Slowly, a shocking picture emerges. It turns out the roommate was actually out of town on the night of the murder. A YouTube video showing him drunk outside a Vegas hotel goes viral within hours. It could not have been him. Instead, the evidence points in an unexpected direction: the dead man’s ex-girlfriend--a popular student whom everybody loves. Nobody wants it to be her; no one expected it; nobody guessed it; and no one hoped for it. Yet it’s her fingerprints on the bloody knife found in the bushes by her dorm room, it’s her DNA at the crime scene, surveillance video footage shows her sneaking into his room that fateful night with a knife in her teeth, next door neighbors remember her knocking on the door at midnight, and the dead man, in his last gasp, wrote in blood on the wall: ‘Why, Jennifer?’ She also finally confesses on Facebook, in a heart-rending video that instantly gets one million ‘likes.’”
“Now,” I ask my students, “do you want to live in a society that sends to jail the person we guessed and wished was guilty, or one that sends to jail the person who actually did it?”
Not surprisingly, they want to live in a society where the girlfriend goes to prison, not the roommate. They want to live in a society where competing claims are decided based on the evidence, a society that puts much of its faith in science.