We all use what the late, great Nobel laureate psychologist Herbert Simon (his prize was for his seminal contributions to economics, but his work was oh so psychological) and other cognitive psychologists referred to as external memory aids. Our collective national history is found in books and newspapers. Our individual histories are found in scrapbooks and photo albums. We jot down notes and keep to-do lists. Some people even write on their arms or hands, using such temporary "tattoos" to remind themselves of important facts, dates, and phone numbers. By tradition, students have taken notes in notebooks during class for later study. But what about the impact of technology on how we think and remember? Should we wonder, maybe worry a bit?
Admittedly, I love the fact that Facebook informs me when a friend's birthday rolls around, as I am the absolute worst at keeping track of such things. But that sort of memory aid is not the one I am worried about. Let me refer to an example that, as an educator and parent, I've been fretting over of late: a current television ad for a laptop computer deal. You may have seen it. In one shot, a professor lectures at the front of the room and the camera focuses on a post-adolescent slacker dude lounging back in his lecture hall seat while his laptop, using voice recognition software, takes verbatim notes for him. The implication is that one need not bother anymore with the heavy lifting of taking pen to paper (or even, it seems, key to screen); the software and hardware will do it all for you. Students need only access the saved transcript (like so many Watergate scholars) at a later date in order to make sense of Ezra Pound's poetry, Piltdown man, or the happy nuances of bipolar disorder. Who knows, perhaps the software will even read it back in a soothing voice like those found in corporate voice mail and GPS units, or maybe even, if one is so bold, in a tone akin to Stephen Hawking's speech synthesizer. Duck soup!
Or maybe not. As you may be guessing, I am not impressed. Rather, I am worried. Learning and retaining information takes effort. Ideally, of course, student engagement is also present; we learn better when we are motivated to do so. Not everything we need to learn will excite or intrigue us, as my (happily long) past experiences with chemistry, the Periodic Table, and balancing equations will attest, but even minimal effort is likely to help us recall something. (Let the record show that I remember that Fe stands for iron; somewhere Mendeleev, if not my high school chemistry teacher, Dr. Freas, is smiling.)
What about our now reflexive reliance, really, dependence, on the various search engines? A recent set of studies by Betsy Sparrow, Daniel Wegner, and Jenny Liu suggests some interesting possibilities about our online activities. Sparrow, who teaches at Columbia University, was interested in determining whether people are more or less likely to recall information when they know it can be easily retrieved from a computer. In one study, participants entered 40 pieces of trivia into a computer. Half of the participants were led to believe the data would be saved in the computer, while the other half thought it would be erased. Sparrow and her colleagues found that people were more likely to remember what they typed when they assumed they would not be able to access it later. In other words, people put less effort into remembering or learning the trivia if they thought they could access it later. Another study found that we may be better at remembering where trivial information is located than the information itself; we know, as it were, which book holds what we need to know, but not what we need to know.
So, if we know that what we need can be found quickly with our iPad, why bother retaining it? Well, I think we would do well to think about what this may mean down the proverbial road. We may not bother learning a variety of basic information and simple skills because we don't need to; they are only a few taps away. We may remember the gist of the thing, but not the thing itself (and we've not even considered how we misremember or distort; topics for another time). In psychology education, for example, learning to understand and apply statistical tests to data illustrates my concern. In the not so distant past, students would learn to perform statistical comparisons by hand; many students still do. But quite a few simply read conceptual descriptions of what the tests mean and "do" quantitatively before quickly retreating to error-free software packages that do the work and reveal the answers in a nanosecond or two. Just print the screen, circle the answer, and turn it in. Product, not process. Duck soup!
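To make the contrast concrete, here is a minimal sketch (my own illustration, not drawn from any particular course) of the kind of hand calculation those software packages hide: an independent-samples t statistic worked out step by step in plain Python. The two groups of quiz scores are invented for the example; the point is that each line corresponds to a step a student would once have done with pencil and paper.

```python
from math import sqrt

# Invented example data: quiz scores for two small groups of students.
group_a = [82, 75, 90, 68, 77]
group_b = [70, 65, 72, 60, 74]

def t_statistic(a, b):
    """Independent-samples t statistic, computed the long way
    (assuming equal variances, using a pooled variance)."""
    n1, n2 = len(a), len(b)
    mean1, mean2 = sum(a) / n1, sum(b) / n2
    # Sample variances: squared deviations divided by n - 1.
    var1 = sum((x - mean1) ** 2 for x in a) / (n1 - 1)
    var2 = sum((x - mean2) ** 2 for x in b) / (n2 - 1)
    # Pool the variances, weighting each by its degrees of freedom.
    pooled = ((n1 - 1) * var1 + (n2 - 1) * var2) / (n1 + n2 - 2)
    # Difference in means over the standard error of that difference.
    return (mean1 - mean2) / sqrt(pooled * (1 / n1 + 1 / n2))

t = t_statistic(group_a, group_b)
print(f"t = {t:.3f}")
```

A statistics package would produce the same number in a keystroke; working through the pooled variance by hand at least once is the "process" half of the process-versus-product distinction.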
So, is something lost when we don't learn to do some of that heavy mental lifting? Maybe. Note that I am not arguing that one should learn to do complex and time-consuming calculations by hand, but I think doing some basic ones is still a good idea (consider this: just because my surgeon may use a laser doesn't mean I don't want her to know her way around a traditional scalpel). And please realize that despite what some of my friends and readers may think, I am not being a Luddite here. I would be lost without my laptop and iPad, just as before I was helpless without my calculator (and no, I never learned to use a slide rule or an abacus, for those of you who are wondering; and if your age precludes you from knowing what those things are, by all means, Google them with dispatch). Duck soup!
*The American slang phrase "duck soup!" means a cinch, something easy to do. The immortal Groucho Marx explained it thus: "Take two turkeys, one goose, four cabbages, but no duck, and mix them together. After one taste, you'll duck soup for the rest of your life." You could search it online: Duck Soup!