
Is Porn-Filtering Software Ineffective and Insecure?

Questioning the reliability and safety of porn accountability apps.

Key points

  • Software offers some ways to block and filter explicit sexual material online, with varying levels of accuracy.
  • So-called accountability software tracks and reports a person's Internet history to a third party to end "anonymous" Internet usage.
  • Unfortunately, such accountability software has significant security problems and exposes people to sexual shame.
Source: Lukas Bieri / Pixabay

Various groups, mostly affiliated with religious organizations, distribute software that allegedly blocks pornography and explicit sexual content. The underlying idea is a good one: people who don't want to encounter explicit images or videos shouldn't have to. Unfortunately, these programs have turned into something else: an ineffective, unproven treatment strategy for people who struggle to control their pornography use. Worse, they may lull users into overconfidence in the protection they offer, or even expose them to significant cybersecurity risks.

Since the early 2000s, pornography-filtering software has become big business, garnering significant attention and millions of annual downloads. Such filters have been extensively installed in libraries to limit the use of public computers for viewing sexual and illicit materials. This strategy raises some concerns about limits on free access to information but is generally regarded as a positive way to limit inappropriate public sexual behavior and protect library staff.

Past studies suggest that porn-filtering software on family computers generally reduces young children's exposure to inappropriate materials, though it is much less effective with children over 16, and that these strategies may sometimes block access to non-pornographic material, such as health education and information.

How It Works

Pornography-filtering software works in different ways. Many of these apps do not actually "filter" or proactively identify explicit sexual images or content; instead, they block access to specific websites that have been identified and listed as containing pornographic material. Thus, an attempt to visit a site such as PornHub might be blocked on public Wi-Fi at the airport, while pornographic content posted on Twitter remains fully viewable, with no images filtered or blocked.
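To make the limitation concrete, the blocklist approach can be sketched in a few lines of Python. This is a minimal illustration, not any vendor's actual implementation, and the domain names in the blocklist are hypothetical placeholders:

```python
from urllib.parse import urlparse

# Hypothetical blocklist entries; real products ship curated lists of
# thousands of domains.
BLOCKLIST = {"example-adult-site.com", "another-blocked-site.net"}

def is_blocked(url: str) -> bool:
    """Return True if the URL's host is a blocklisted domain or a subdomain of one."""
    host = urlparse(url).hostname or ""
    return any(host == d or host.endswith("." + d) for d in BLOCKLIST)

print(is_blocked("https://example-adult-site.com/video"))  # True: listed domain
print(is_blocked("https://twitter.com/some_user"))         # False: explicit media on
                                                           # an unlisted platform slips through
```

Because the check operates on the domain alone, any explicit content hosted on a general-purpose platform that is not on the list passes straight through, which is exactly the gap described above.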

Other pornography-blocking software works by measuring how much "skin" appears in images and videos, under the assumption that explicit sexual material depicts larger amounts of exposed skin. Color-based identification of skin pixels is the most commonly used strategy, as it is the most straightforward. Other strategies include deep-learning techniques and object identification, but all have advantages and disadvantages. For instance, skin-based detection algorithms can inadvertently filter out medical images, educational material, or even sports competitions.
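The color-based strategy can also be sketched briefly. The thresholds below follow a classic published RGB heuristic for skin tones, but they are illustrative: real products tune such rules and combine them with other signals, and the 40 percent flagging threshold here is purely hypothetical.

```python
def is_skin_pixel(r: int, g: int, b: int) -> bool:
    # Classic RGB skin-tone heuristic (illustrative thresholds).
    return (r > 95 and g > 40 and b > 20
            and max(r, g, b) - min(r, g, b) > 15
            and abs(r - g) > 15 and r > g and r > b)

def skin_fraction(pixels) -> float:
    """Fraction of (r, g, b) pixels classified as skin."""
    if not pixels:
        return 0.0
    return sum(is_skin_pixel(*p) for p in pixels) / len(pixels)

# A tiny sample "image": two skin-toned pixels, two non-skin pixels.
sample = [(220, 170, 140), (30, 60, 120), (210, 160, 130), (15, 15, 15)]
print(skin_fraction(sample))  # 0.5 -> above a hypothetical 40% threshold, so flagged
```

Note that the rule knows nothing about context: a dermatology photograph or a beach volleyball broadcast can score just as "high" as explicit material, which is why such filters over-block medical and sports content.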

Accountability Applications

At first, software around pornography filtering was primarily intended for the uses identified above – blocking or limiting intentional or unintentional access to such materials for families with children or in public environments such as airports or libraries. However, as social concerns about people overusing or being unable to control their use of pornography received more attention, a new kind of related technology began to proliferate. Referred to as “accountability apps,” these technologies purport to not only limit or restrict access to explicit sexual materials but also enlist the support of others in order to track and monitor one’s use of the Internet.

In the late 1990s, psychologist Al Cooper suggested that Internet pornography was likely to be addictive because it was "Anonymous, Accessible, and Affordable (free)," what he called the "Triple-A Engine." This theory is commonly invoked to explain the overuse of Internet pornography, though its components were not supported in the only empirical research that has tested the theory. Nonetheless, accountability software works to counteract the "anonymity" of Internet browsing: Would you look at that website or search for that video if you knew that someone in your life and social circle would know and confront you about it?

Thus, accountability software relies primarily on reporting one’s Internet usage to others and having those other people determine whether the websites or material are acceptable or not, presumably through some pre-established, agreed-upon criteria.

Wired Magazine covered these issues in a recent scathing exposé that found that religious organizations were using these accountability technologies to not only shame individuals for sexual interests or behaviors but to monitor wide-ranging behaviors unrelated to sexuality.

In the Wired investigation, they found that not only did the software rate a psychiatry textbook as “highly mature” material but that it specifically flagged any words, text, or images related to non-heterosexual orientations. In a novel and troubling finding, Wired revealed that these software applications were extremely vulnerable to security flaws and bugs. Essentially, installing these applications on one’s device opened the device up to many other analytic and tracking technologies and exposed user data to interception by hackers.

After Wired published its findings, Google Play took down some of these apps due to security concerns, and some of the apps' sponsors agreed to investigate and correct these security flaws. Unfortunately, the CEO of one of the apps identified by Wired responded to the controversy by saying, “To me, this attempt to slander our work at Accountable2You is evidence that we are in a spiritual battle, and our enemy, the devil, is real and active.”

In the Wired story, volunteers and staff at a church were required to install these accountability software applications on their devices in order to be monitored by church leaders. Other compulsory users of these technologies include probation and parole officers who often require individuals under their monitoring to install such software, ostensibly to prevent inappropriate use of the Internet, under terms of their legal restrictions.

Effectiveness of Accountability Software

Do these software applications work? At this point, there appears to be almost no objective evidence that they do. Technology researchers from New Zealand recently published an investigation of over 170 such applications and found that “content-blocking” (porn-filtering/blocking) was the most prevalent and marketed feature of these types of software.

Unfortunately, the researchers found that these features have inconsistent effectiveness and accuracy. None of the 170 applications reviewed offered empirical data supporting their effectiveness at helping people manage their pornography use. These applications do offer some potentially useful features, such as "panic buttons" for seeking immediate personal support when struggling with urges. But without evidence of safety and effectiveness, it remains uncertain whether such apps have real personal or clinical value.

The technology researchers also identified an often unexamined issue: “People who struggle with compulsive behaviors tend to search for new pathways to indulging in their behavior and may find ways to bypass multiple protective layers, mainly when triggered or under stress.” As many parents can confirm, when we institute a behavioral control, sometimes it can paradoxically increase the appeal of trying to overcome that control, whether for autonomy or pure obstinance.

In 2021, former reality-television star Josh Duggar was convicted of possessing child sexual exploitation material and sentenced to prison. During the trial, it was revealed that Duggar had installed accountability software on his devices so that his wife could monitor and track his Internet usage. However, Duggar then created a separate partition on his computer, inaccessible to that software, which allowed him to search for, view, and download illegal materials without the software, or his wife, detecting it.

Given that legal authorities such as probation officers are using these applications to monitor people's use of the Internet, these limitations, vulnerabilities, and security flaws raise very real concerns that people and authorities may be overconfident in the effectiveness of these pornography-blocking and monitoring applications. There is no objective empirical evidence that these forms of monitoring prevent crimes such as downloading illegal materials. These tools lack sufficient evidence that they protect public safety, and they may expose individuals to further privacy and security violations.

Addressing the Moral Conflict or Trying to Control?

The best research evidence to date indicates that the majority of people who report struggling with control of their use of pornography are experiencing a conflict between their sexual morality and their behaviors. As many of these pornography-accountability applications are designed, distributed, and monitored by religious organizations, it raises real concerns that these applications may actually exacerbate underlying moral incongruence and stigmatize healthy sexual behaviors such as homosexuality.

Rather than relying on such untested, unproven, and flawed software, I recommend that people engage in open discussions, both with themselves and with sexual health professionals, about their pornography use, their sexual desires, and the feelings that come up as they think about these issues. Therapeutic strategies that attempt to shift behaviors rather than stop them, and that examine the cognitions and feelings surrounding those behaviors, remain the most efficacious interventions at this time.

To find a therapist near you, visit the Psychology Today Therapy Directory.
