Most parents provide their children some sort of religious education or experience, whether at home, in their place of worship, at a parochial school, or some combination. I expect that the great majority of U.S. parents also want their kids to be able to distinguish fact from fiction: to understand, for example, that President Obama is a real person and Spider-Man is not. According to a recent study, these goals may conflict: Young children with a religious education were much more likely than religiously uneducated children to think that fictional characters were real. If adults thought Sarah Palin and Princess Leia had equal odds of affecting political discourse, that thinking would be seriously delusional, and dangerous. And not at all the consequence the vast majority of parents seek (or, again, so I presume). So should parents lay off the religious training? Or change it somehow?
My own now-grown children regularly attended an Episcopal church, and we celebrated the major religious holidays at home as well, with the usual festooned trees, painted eggs, and glut of sweets. But when our eldest was around 5, I became convinced that perpetuating the Santa Claus myth was harmful—not only might it confuse him about what was real and what was not, it could also teach him to distrust my word, and expose him to disappointment and disillusionment. So I explained to him about department store Santas, and the physical and geographical impossibilities of reindeer flight and fitting down chimneys.
Problem was, he insisted that of course Santa Claus was real.
Why did he insist? It might have been that, like a typical 5-year-old, he accepted what adults told him. Except that I told him both that Santa Claus was real (before my decision) and that Santa Claus wasn’t real (afterward). The religious education study suggests another possibility: he might have been, in the authors’ phrase, “open to the impossible” as a result of attending church.
The basic outline of the study was that the researchers read brief stories to individual young children, who decided whether each story’s main character was real or not real. The children then explained their reasons for their answers. For analysis, the authors grouped the children according to whether they had no sustained religious exposure at home or at school (“secular children”), home or church exposure but not parochial school, parochial school but not home or church, or both types of religious exposure. The latter three groups were the “religious children.” (Notice a few caveats: The “religious” groups were all “Christian,” a designation that excludes other religions and potentially includes wide variation in creed and upbringing. Secular upbringings are also diverse. Plus, there is no advance guarantee that any particular child will conform to expectations of a religious or secular upbringing. So I will keep the terms “secular children” and “religious children” in quotes throughout to keep emphasizing these limitations.)
In the first test, the children were told explicitly religious, explicitly magical, or realistic stories. Result: the “religious children” were more likely to say a character who performed superhuman feats was real than were “secular children.” This did not surprise the authors; after all, the “religious children” had been taught that such events as parting the seas and walking on water were real. But the researchers wondered what, exactly, triggered the different response. Was it the familiarity of the feat, its resemblance to something the children had been taught? Or did children put God and magic in the same category? Or did the “religious children” have a different concept of what could happen than the “secular children”? So in the second test, the stories left out any specific references to God or religion, but included fantastical/miraculous events familiar to those with a Christian background, stories in which the fantastical/miraculous event would be unfamiliar, and stories with and without explicit references to magic. The researchers also varied the specific stories used and the order of presentation.
In this second test the “religious children” continued to be more likely to judge fictional characters as real. But it turned out that the “religious children” judged familiar and unfamiliar stories similarly: It was not just familiarity with Bible stories that led them to be more likely to judge fictional characters as real. Neither did the “religious children” lump magic and God together: References to magic led both “religious” and “secular” children to judge a character not real, and the “religious children” gave almost no religious explanations of their conclusions about any characters in the second test. Instead, the researchers’ last hypothesis seemed to hold: the “religious children” were more likely to think that extraordinary events can occur in realistic stories.
What to make of this? Possibly nothing, of course. It’s just one study, in need of caveats. But for the sake of argument, let’s say the finding holds up across various types of secular and religious upbringing. What then? Does it mean, for example, that children raised with religious background are raised to be gullible? Or that children with a secular upbringing are more likely to be critical thinkers? Or is it possible that “openness to the impossible” is a good thing?
The last possibility depends on just what “openness to the impossible” means in practice. It’s a very big jump from kids putting figures in a “real” or “pretend” box to adults doing anything, much less having a psychological or intellectual propensity for a particular attitude toward the unknown. But let’s say “openness to the impossible,” for a grown-up, means retaining hope for unexpected, positive change. That outlook could be a good thing. It could help motivate work for change, while lack of openness might support defeatism: Why fight insurmountable odds? However, if “openness to the impossible” means uncritical credulity—well, that’s a trait that gives us misguidedness and science denial in many hurtful forms. This is a bad thing for effective individual functioning, and even worse for directing policy.
One thing the phrase should not be taken to mean is that “openness to the impossible” is irrational. Both the “religious children” and the “secular children” could justify their conclusions to the researchers. Just as with adults, being mistaken and being irrational are not the same thing.
A second thing to notice is that “openness to the impossible” does not equate to dogmatism, the “bad guy” of every form of reasoning: religious, secular, scientific, practical, ethical, etc. Contrasting dogmatisms hold that X (an event, perhaps) is never possible and that X is always possible. (It’s never possible that a dying patient will recover; it’s always possible that a dying patient will recover.) If “openness to the impossible” stipulated the “always,” it would be dogmatic; equally, a “realism” that stipulates the “never” is dogmatic. Non-dogmatism, in either direction, recognizes that we might misunderstand X, and the construct “openness to the impossible” allows for this.
From all this I conclude that if a religious upbringing teaches something like non-dogmatic openness to the impossible, that, by itself, is probably not a bad thing. It may even be a good thing.
In any case, our son’s religious education does not appear to have done lasting damage. He’s now a data scientist with serious respect for facts and analysis, an equally sincere respect for human and scientific fallibility, and he’s neither gullible nor chronically disillusioned. He has a reasonable expectation of himself as a potential change-maker. And he no longer expects a be-sooted elf to fill his Christmas stocking.