Brave New Tech

How are children thinking about robots?

Posted Feb 19, 2019

Robot as best friend?
Source: Alan Beck, used with permission

A brave new world has stealthily infiltrated our daily lives. We humans, ever the uber social species, now live with, fantasize about, and obsess over a whole new class of creatures. These beings sometimes speak with us in disembodied voices. Increasingly, however, the creatures come in shapes and sizes echoing the carbon-based living beings we are familiar with. These “embodied objects” might be in the shape of a dog, a seal, or a cartoon-like version of a dinosaur. Or they might be designed to approximate a human child, teenager, or adult. Covered in shiny gray metal, soft faux fur, or skin-like fabric, animaloid or humanoid robots are designed not to deceive but to evoke. These robots hijack the social consciousness deeply embedded by human evolution. They look at us, we look back, and we cannot help but connect as if with another sentient being. They make sounds and movements we viscerally experience as signals of happy or sad emotions, and it feels natural for us to see the robots as happy or sad.

The proliferation of socially responsive robots is part of a tech revolution that has made rock stars out of start-up entrepreneurs. In some ways, this is nothing new. Since the industrial revolution, technological innovation has been viewed from a default lens of “cool,” “good,” and “labor-saving.” Each new product warrants another celebration of progress. Some voices, however, are more measured, cautioning that every new technological “advance” carries with it potentially unintended consequences, which may be negative as well as positive. At the very least, new technologies, such as interactive and “intelligent” robots, challenge us to explore critically how they are impacting human behavior and thinking.

It is particularly important to get inside children’s heads. Kids are increasingly growing up with socially interactive technologies, including “intelligent” robots. Education for both typically developing children and those with special needs is incorporating so-called “education robots.” Robots as caregivers and companions have moved from a fanciful concept to a real possibility.

Psychologists have been paying attention. The proliferation of this technology changes the social world of children. To understand current and future trends, child development scholars need to incorporate socially interactive technologies into their conception of children’s social relationships. In addition, the study of social robots opens up some fascinating questions about epistemology, cognitive development, and moral reasoning. As children distinguish between the living and non-living world, how do they make sense of things that “straddle the boundary between animate and non-animate,” in the words of Jipson and Gelman (2007)? What does it mean for something to have attributes of both living beings and artifacts?

While empirical studies of children’s relationships with social robots have not yet reached a critical mass, we now know a lot more about how children are thinking about and behaving with this technology, which itself is dynamically changing. Let us interrogate a (fictional) scholar of developmental robotics.

Do children view social robots as alive? No, is the simple answer, but the complete answer is far more complex. Even preschool-age children, 3-5 years of age, do not attribute biological properties to social robots such as the robotic dog I-Cybie. That is, when asked if I-Cybie can eat, grow, or have babies, most children say no. More directly, when asked if such a social robot is “alive,” the answer is “No.” However, when it comes to psychological properties such as thinking and feeling, most young children, and older ones too, see social robots as able to think for themselves and feel emotions. For example, in a study of 9-15 year-olds responding to the humanoid robot Robovie, 77% felt that the robot could be their friend, and 64% believed that Robovie could feel emotions like sadness. Studies of the robotic dog AIBO similarly find that children from preschool through adolescence see this gray metal animaloid robot as a potential social partner, with its own thoughts and feelings. Thus, in some respects, although not alive, such robots are viewed as having some of the properties of living creatures. We might say that, for many children, social robots are “alive-like.” For example, in the study of Robovie mentioned above, 38% of the children who played with the robot felt that it was “in-between” alive and not-alive.

What makes a robot seem more or less “alive-like”? Not all robots are created equal. When a robot has facial features, such as eyes, children (and adults) are more likely to grant the robot “alive-like” properties. But as a robot more closely approximates the look of a human, it may evoke a reaction called the “uncanny valley,” a feeling of creepiness or disquiet. When a robot behaves contingently, responding to human actions in ways that mimic living beings, it feels to us more “alive-like.” We begin to treat such robots as if they were living beings while recognizing that they are not.

Are there developmental changes in how children relate to robots that emulate living beings? The verdict is still out. Some studies find that older teens are less likely to see such robots as alive-like and are more attuned to their machine-like properties. This may reflect greater technological savvy as children get older. Over time, the (usually) middle-class children participating in robot studies may become habituated to interacting with ever more sophisticated technologies. It is also possible that readiness for pretense and fantasy shapes how children view robots. By the teen years, fewer children are likely to treat objects, such as stuffed animals or even interactive robots, as if they were somewhat alive. As children spend more time immersed in videogames and virtual reality populated by lifelike computer-generated characters, do these experiences color their perceptions of robots (and, by extension, of humans in the real world)?

In a future world where robots may proliferate and begin to take over some roles that humans and animals now play, how will these artificial emulations of the living world be treated? What will the moral standing of robots be? Studies of both humanoid and animaloid robots suggest that because they are not viewed simply as artifacts, children accord such robots some moral rights, although not to the extent of living humans or animals. Thus, the majority of children (54%) in the Robovie study felt it was morally wrong to put the robot in a closet (after it was programmed to protest it did not want to go there), but nearly every child condemned the idea of putting a human in the closet.

As children become savvier about how robots are constructed and programmed, will they begin to see them as less alive-like and, perhaps, less deserving of some moral regard? How will our perceptions of robots generalize, if they do at all, to living beings? If robotic pets and playmates become widely acceptable as substitutes for their living counterparts, will children become used to relatively impoverished but perhaps more readily controlled relationships? Could robots easily become targets of abuse and bullying? It is likely that children will take cues from the social context in which robots are encountered and from how adults structure (or fail to structure) such contexts. A context inviting aggression is likely to result in children hitting the same robot that they might hug in another context. This makes it all the more urgent for tech developers and consumers to consider the human dimensions of engagement with technology, particularly those robots that hijack our social instincts.


Jipson, J. L., & Gelman, S. A. (2007). Robots and rodents: Children's inferences about living and nonliving kinds. Child Development, 78, 1675-1688.

Kahn, P. H., Jr., et al. (2012). "Robovie, you'll have to go into the closet now": Children's social and moral relationships with a humanoid robot. Developmental Psychology, 48, 303-314.