Can nude and pornographic deep fakes ever be normalized? Should they be?

Debate is raging about how to stop creators of pornographic and other harmful deep fakes from making them.

Some states have laws that could make the creators liable (although there is no federal law prohibiting them). Ohio legislators just introduced legislation to outlaw deep fakes per se. I’d love to see Taylor Swift, who was recently a target of a deep fake, or another influential celebrity file suit, win, and set a precedent.

In the meantime, some of the big tech companies, such as Meta, Google, OpenAI and Microsoft, are trying to address the problem with so-called content credentials. This is essentially a technical standard that embeds data in AI-generated media to identify its source, as explained in this week’s Marketing AI Institute podcast (episode 84). That embedded data lets viewers determine whether an image is an AI deep fake. Meta is even putting the equivalent of a label on images to mark fakes.

The idea is to educate viewers on what’s fake and to deter misuse. But there are significant downsides:

  • It only works if everyone follows the standard.

  • The labels can be removed from images, making the effort useless.

  • It doesn’t stop deep fakes from getting out there in the first place.

With OpenAI’s recent announcement of its text-to-video product Sora, outrage has ratcheted up over yet another AI tool that can be used to create deep fakes.

Control vs. Normalization

Another debate on social media suggested a different approach: stop making such a fuss over nude deep fakes so they become more normalized and less of an attraction. It’s somewhat similar – on a lesser scale – to topless or nude sunbathing, which is acceptable and “normal” on public beaches in other countries but not in the U.S.

It’s an interesting debate, and one that has played out with other illicit activities. Drug use, for example, has become more normalized as many U.S. states have legalized marijuana and several European countries have decriminalized the possession of small quantities of drugs for personal use.

But to me, the deep-fake issue comes down to one thing: control. With legalized marijuana, you control whether or not you take it. Your choice. (I acknowledge there can be addiction issues, making “control” a bit of a misnomer.)

But with nude and pornographic deep fakes of a woman’s body, it is out of her control. Others are using her image, her likeness, and her body without her consent. It is not her choice.

So whether these images are normalized or not isn’t the point, in my view. It’s about whether she consents and controls how her body is depicted.

Another Threat for Our Daughters

Ask any mother, and her stomach will churn at the thought of her daughter’s body “AI’d” into a nude or pornographic image or video. Knowing it can be done so easily now is incredibly unsettling.

I tell my teenage daughter all the time not to post photos of herself or even share them with friends. That’s a bit like spitting into the wind and not expecting to get wet, I know. But the fewer photos floating around out there, the better.

At least until that blockbuster lawsuit or federal legislation offers greater protection. 

Note: The image for this blog was AI-generated. The writing is not.

Sue Valerian