Technology itself isn’t good or bad. Yet there is an innate drive in our relationship with technology: “If we can do it, we should do it.” We charge ahead. We put new tools into action rather than slowing down and taking the time we need to evaluate their potential uses and consequences. The temptation to tailor the world to our desires leaves little room for thoughtful review.
Consider generative artificial intelligence (AI). Like Gutenberg’s printing press, it is a revolutionary tool, transforming how we operate at a fundamental level. It has many positive outcomes. Rebind, for instance, provides teachers and students with AI-generated expert commentaries on classroom texts, whether Kafka’s Metamorphosis or Jane Austen’s Pride and Prejudice. What student wouldn’t want a personal tutor in the palm of their hand?
The printing press made the Bible accessible and played a part in the Protestant Reformation, but we have largely forgotten its harmful consequences. It fuelled the spread of misinformation, disinformation and political instability, to name a few. Similarly, generative AI is already creating new risks and harms. A striking example is how it is changing the way pornography is produced and consumed.
To be clear, pornography has always been destructive. The Salvation Army’s international positional statement on pornography defines it as “print or visual material containing the explicit description or display of sexual organs or activity, intended to stimulate sexual excitement.” The intention of porn is to expose what is meant to be a private, intimate experience. It also exploits people. Individuals featured in porn are not depicted as having inherent dignity. They are commodities to be consumed. And online porn has long been used for revenge and blackmail.
Today, generative AI is amplifying these harms. It hasn’t changed the sinful intention behind porn. Instead, it has expanded the scope of exploitation to include unknowing victims: you and me.
What is deepfake porn?
Deepfake technology allows users to digitally alter what a person says, does or looks like. You might be familiar with deepfakes that have been consented to, such as the digital recreation of a young Luke Skywalker in The Mandalorian. Some deepfakes are non-consensual, such as the image of Pope Francis wearing a white puffer jacket. And other deepfakes are harmful, such as the deepfake porn images of pop superstar Taylor Swift. Any online photo or video of a person can be manipulated to create non-consensual pornography.
Deepfakes are created by splicing together body parts from online images down to the pixel. For instance, face-swap apps superimpose one person’s face onto another person’s body, or even onto an AI-generated body. It is possible to create realistic images of sexual encounters and positions acted out by avatars of different genders, ethnicities, ages and so on. And each can be tailor-made to the preferences of the consumer.
The rapid progress of generative AI is reducing the number of images needed to make a deepfake. It’s also making the “fakeness” of a given image harder to detect. Soon, the human eye will no longer be able to distinguish a real image from a deepfake. And deepfake porn is proliferating: a 2023 study reports that 98 percent of online deepfake images are pornographic, and that since 2019 their number has increased by 550 percent. This is disinformation at its most extreme.
Why aren’t we protected?
Many platforms have attempted to prevent the circulation of non-consensual deepfake porn by creating internal safeguards. Access controls limit viewership through mandatory user consent and age verification. Unfortunately, no digital safeguard can prevent the display or distribution of online porn without fail. Producers and consumers have proven their ability to work around digital safety nets.
When it comes to the law, publishing and distributing intimate images of a person without consent is a criminal offence in Canada. Most provinces and territories have similar laws. However, as of this writing, only British Columbia and Manitoba have expanded their laws to cover intimate images that are computer-generated or digitally altered. Federal law is still catching up. The Artificial Intelligence and Data Act, proposed to come into effect in 2025, remains unclear about regulations around generative AI pornography. Moreover, no law will prevent such crimes from happening. And for victims, legal redress takes time and energy.
Is there a silver lining?
Some argue that generative AI will soon be able to produce pornographic content that looks authentic without using images of real people. This, they claim, would eliminate the exploitation of real persons, whether or not they consented to having their images used.
I cannot take this claim seriously. It is hard to believe that pimps will lose their jobs. Or that consumers will lose their desire for sex with real human beings, whether in healthy or exploitative ways. But even in cases where consumers take in so-called non-exploitative deepfake porn, the act of consumption itself is harmful.
Some of these harms are not new. Studies have revealed a negative correlation between porn consumption and mental and physical health. For young people, porn has often served as a substitute for legitimate sexual education. It “teaches” them what sex looks like (or should look like) and how to treat one’s sexual partner.
Porn as sexual education is not limited to young consumers. When it comes to sexual intimacy in marriage and other sexual relationships, porn consumers sometimes treat their partners in the same degrading way they see porn actors treated. What you see is what you do.
Even when a consumer does not intend to degrade their partner, other harms persist. Consuming porn is often done independently of one’s partner, a pattern that can diminish the closeness of the relationship. Sometimes, a consumer will invite his or her partner to watch porn together. This experience can distance the partner, potentially causing feelings of personal and sexual inadequacy. What you see is what you are supposed to be.
Non-exploitative deepfake porn will aggravate these harms. Once upon a time, there were limits to what producers could make porn actors do. But when producers become able to create porn avatars without a pixel of human content, those limits will be gone. No matter how much mutual love and value is present in a relationship, a sexual partner is never tailor-made. No one can live up to consumer expectations.
I can also imagine a new harm that reaches beyond sexual relationships. Generative AI images are not limited to the porn industry. Imagine a virtual population of ideal humans in an ideal world. Will we choose artificial interactions over authentic relationships? This shift could fundamentally alter our understanding of the value of the human community and of the way God has designed us to live.
How then shall we live?
So, how can we live on earth as in heaven when earth’s landscape is rapidly changing? Technology may not be intrinsically good or bad. But as the Apostle Paul says, not everything we can do is good for us (see 1 Corinthians 6:12 and 10:23). Right now, lies spread faster than we can react, and the development of generative AI far outpaces the time we need for intentional ethical examination.
One thing we do need is robust international policies. There is little reason to think they will keep up with the current and potential harms of deepfake porn. However, they will at least establish standards for production and consumption, maintaining some kind of social norm.
A second thing we need is a meaningful conversation about these harms and how to address them locally. Let’s learn how to harness generative AI for positive purposes and safeguard individuals and communities from its harms. We can’t afford to wait any longer.
Dr. Aimee Patterson is a Christian ethics consultant at The Salvation Army’s Ethics Centre in Winnipeg.