- Deepfakes are seemingly realistic, AI-generated media that depict people or events so they look and sound authentic, even though they're not.
- Deepfake resurrections are manipulations using AI that revive a realistic yet fake existence of a deceased person.
- Prosocial deepfake resurrection shows potential for addressing domestic violence and drunk driving as life-threatening social problems.
Artificial intelligence has become increasingly advanced. Deepfakes are seemingly realistic, digitally manipulated videos that can depict people, events, and things to look and sound authentic, even though they are not. Deepfakes can be indistinguishable from reality, and they can spread virally online with the potential to affect us emotionally and behaviorally.
The use of deepfakes has sparked discussions of morality, ethics, and legality. Psychologists have recently also taken interest in the persuasive capability of deepfake resurrections, which are manipulations using artificial intelligence that revive a realistic but fake existence of a deceased person (Lu & Chu, 2023). We know that humans have varying feelings and beliefs about life after death, but what happens when people receive AI-based communication from a realistic but fake version of a deceased person delivering their own death narrative to promote activism and policy change?
Prosocial Deepfakes: Domestic Violence and Drunk Driving
Lu and Chu (2023) focused on drunk driving and domestic violence as common, life-threatening health problems. These are real-life social problems that few would likely disagree are important targets for prevention and intervention efforts, and the researchers identified them as important for social policy change and activism. They posed two research questions:
- What effect, if any, does exposing people to a public service announcement (PSA) video using deepfake resurrection have on viewers?
- What psychological processes account for potentially positive or negative effects of exposure to deepfake resurrection videos?
Lu and Chu were motivated to focus their research on what they call prosocial deepfakes because so much of the existing work on deepfakes focuses on the spread of disinformation and otherwise negative or malicious uses of artificial intelligence. They wanted to study how a deepfake PSA from an innocent deceased victim of drunk driving or domestic violence, telling the story of their death, might impact viewers. Could this sort of deepfake resurrection have a positive benefit as a PSA, or would some viewers, out of respect for the dead, instead react negatively to the video?
Biopsychosocial and Cultural Factors
It's important to research the underlying biopsychosocial and cultural factors that might account for our potentially positive or negative reactions to seeing deceased persons speak again about what happened to them (e.g., drunk driving, a mass shooting, domestic violence) in a computer-manipulated, fake but realistic image.
When I told my students about Lu and Chu's research on prosocial deepfake resurrections, they were visibly disturbed by it and disagreed with the use of such imagery of the deceased, even to deliver messages speaking out against drunk driving and domestic violence. I teach in the southern United States at a historically Black university with predominantly Black and Christian-identified students. I'm confident that culture matters to how my students' and other humans' frontal lobes and limbic systems might react to deepfake resurrection, no matter the type of message being delivered.
Lu and Chu did hypothesize that a culture of respect for the dead, rejection of exploitation of the dead, and beliefs against desecration of the dead could impact reactions to deepfake resurrection death narratives. Yet they also believed that the surprising nature of seeing a deepfake resurrection might motivate viewers to plan to take action aligned with the positive PSA message in the video.
Lu and Chu have advanced the study of deepfake artificial intelligence beyond the typical focus on its negative effects by examining how deepfake resurrection could be used for a prosocial objective. They found that the element of surprise at hearing and seeing a deepfake resurrection and death narrative can be effective in motivating viewers to want to get involved, via direct action or policy change, regarding domestic violence and drunk driving. However, further research is needed across a wider cultural representation of people with varying belief systems, particularly as deepfake manipulations of humans both alive and deceased continue to advance rapidly.
Lu, H., & Chu, H. (2023). Let the dead talk: How deepfake resurrection narratives influence audience response in prosocial contexts. Computers in Human Behavior, 145. https://doi.org/10.1016/j.chb.2023.107761