Minnesota Considers Law to Block ‘Nudification’ Apps That Use AI to Make Explicit Images

Minnesota is considering a landmark law to block widely accessible “nudification” technology and apps that use AI to create explicit images from regular photos without consent.

A bipartisan bill, currently under review, aims to target the companies that run these nudification websites and apps.

These apps — which have soared in popularity in the last year — allow users to upload an ordinary photo, which is then transformed to produce hyper-realistic nude images or pornographic videos.

According to a report by AP News, Democratic Senator Erin Maye Quade, the bill’s lead author, argues that stronger regulations are needed to combat deepfake pornography as AI technology advances at an alarming pace.

Maye Quade says the bill would require the operators of “nudification” sites and apps to block access for people in Minnesota or face civil penalties of up to $500,000 “for each unlawful access, download, or use.” App and website developers would be responsible for figuring out how to disable the feature for Minnesota users.

Maye Quade also plans to share her proposal with lawmakers in other states, noting that few people realize how easy this technology is to access and how quickly it can generate explicit images.

Some states have banned the distribution of sexually explicit deepfakes as a way to regulate AI-generated content. However, Minnesota’s groundbreaking bill focuses on stopping such material from being created in the first place — before it can spread online.

“It’s not just the dissemination that’s harmful to victims,” Maye Quade tells AP News. “It’s the fact that these images exist at all.”

The bill comes after San Francisco filed a first-of-its-kind lawsuit in August against several widely visited “nudification” websites, alleging they broke state laws against fraudulent business practices, nonconsensual pornography and the sexual abuse of children.

The San Francisco City Attorney’s office is suing 16 of the most frequently visited AI-powered “undressing” websites, often used to create nude deepfakes of women and girls without their consent. These platforms allow users to upload images of real, fully clothed people, which are then digitally “undressed” with AI tools that simulate nudity.

San Francisco City Attorney David Chiu says the targeted websites were collectively visited over 200 million times in the first six months of 2024 alone.


Image credits: Header photo licensed via Depositphotos.

