Minnesota Moves to Ban AI Software That Generates Fake Nude Photos



In short

  • Minnesota lawmakers have passed a bill banning AI tools that generate fake nude photos.
  • Violators face civil penalties of up to $500,000, and victims can win treble damages.
  • The bill preserves the protections of Section 230 and will take effect on August 1 if signed into law.

Minnesota lawmakers have passed a bill aimed at curbing the growing abuse of AI "nudification" tools by targeting the platforms that host them.

On Thursday, the Minnesota Senate voted 65-0 to pass the bill, sending it to Governor Tim Walz for his signature. The measure bars websites and apps from offering tools that create fake nude photos of real people.

Under the bill, companies that operate a website, app, or software service may not allow users to access or use such image-generation tools, or create the images on a user's behalf. Advertising or marketing such services is also prohibited.

The measure allows victims to sue individuals or companies that use or operate the offending services, such as websites, apps, or software that create fake nude images. People depicted in AI-generated nude images can seek damages, including for emotional distress, and courts can award up to three times actual damages, as well as punitive damages, attorney fees, and injunctions to stop the conduct.

The bill also gives the state attorney general authority to enforce the law, with civil penalties of up to $500,000. Under the bill, fines are paid into the general fund and then directed toward victim services, including assistance for survivors of domestic violence and child abuse.

The bill targets tools that require little technical expertise and are widely available, including to children. If signed, the law would take effect on August 1 and apply to new cases from that date forward.

Although the new bill does not single out any AI developer, it comes after several high-profile incidents at X, including one in August 2025 when Elon Musk's xAI chatbot, Grok, generated deepfake nude images of Taylor Swift. The pop star filed a trademark registration with the US Patent and Trademark Office in April, possibly in preparation for future AI disputes.

Musk is also facing legal pressure, including a federal class-action lawsuit filed on behalf of three Tennessee children alleging that Grok created child abuse material from their photos. Separately, a consumer protection case brought by the city of Baltimore alleges that the company knowingly shipped a system that produced and distributed sexually explicit material, including of children.

Public Citizen President Robert Weissman said the proliferation of these tools shows how quickly AI has lowered the barrier to creating nonconsensual images and expanded their reach.

“These programs are 99% targeted at women, more than 90% of whom are under the age of 18. It is a tool of intimidation and abuse of women with very negative psychological effects,” Weissman told Decrypt. “You’ve seen this across the country and around the world.”

Weissman added that state laws can work hand in hand with federal efforts, especially when it comes to enforcement. He said local authorities may be willing to act quickly on individual cases that federal agencies may not prioritize or pursue.

Minnesota’s law also comes amid an ongoing battle between President Donald Trump’s administration and the states over who should regulate AI. The Take It Down Act, signed by Trump in May 2025, criminalizes the distribution of nonconsensual intimate images and gives victims a way to demand their removal.

“I think having federal and state laws is good, at least in theory. We’re talking about different enforcement systems and enforcement agencies,” Weissman said. “So you may have a federal standard, but you may not have federal enforcement powers.”

Governor Walz’s office did not immediately respond to Decrypt’s request for comment.

