
Law Enforcement Ponders the Role of Deepfakes in CSAM

Child sexual abuse material (CSAM) remains a major concern for law enforcement, and the penalties for possessing, disseminating, and creating it are higher than ever before. Part of law enforcement's role is identifying victims. Today, however, it is possible to generate a realistic face of a person who does not exist. Could producers of CSAM use this technology to produce illegal pornography? Absolutely.

But is it a crime?

What Remains Illegal?

Producing pornographic material depicting children remains illegal. The AI models that generate deepfake images are seeded with real images, and it is unlawful both to possess those real images and to create new images from them. However, the resulting images must still meet the legal definition of pornography, leaving a gray area that is making things difficult for law enforcement.

Law enforcement, however, is well known for turning its opponents' tools against them. Today, investigators are using deepfakes in sting operations to lure predators out of the dark corners of the internet and catch them in the act of committing crimes.

The Lauren Book Situation

Deepfake technology is also being used to humiliate women in positions of power, ex-partners, and others. While fake celebrity nudes have been around for as long as computers could create images, this is different. It does not involve people fantasizing about their on-screen crushes; instead, it involves individuals attempting to destroy their targets' psychological health.

It remains difficult for victims to prevent this material from being produced. While laws against cyberstalking and harassment cover some elements of such conduct, they do not protect anyone from the production of this material or even its dissemination.

Eventually, legislators will need to address deepfake technology directly and curb its use for retaliation or sexual exploitation.

Lastly, deepfake detection remains an obstacle for law enforcement, because detection methods can themselves be exploited to produce more convincing fakes. Each time law enforcement figures out a way to determine whether a video is a deepfake, the producers adapt. Still images are even harder to assess because they contain less information than video.

A recent deepfake detection competition brought together some of the best AI researchers in the world; their models correctly identified deepfakes only about 65% of the time. Our legislation has yet to keep up with these advances.

Talk to a Chicago Criminal Defense Attorney Today

David Freidberg represents the interests of those charged with sex crimes in Chicago. Call today to schedule an appointment, and we can begin planning your defense immediately.
