Minnesota's Legislative Push Against AI-Generated Explicit Content

Minnesota lawmakers are advancing legislation to combat AI-powered "nudify" apps and deepfake technology that can generate non-consensual explicit images. The bill would make it illegal to use artificial intelligence tools to create and distribute sexually explicit images of people without their consent, specifically targeting apps that can digitally undress people in photographs and produce realistic fake explicit content. The proposed law would allow victims to sue creators and distributors of such content for damages, with potential criminal penalties of up to three years in prison and $5,000 in fines.

The bill has gained bipartisan support and follows similar efforts in other states, including New York and Texas. Lawmakers emphasized the psychological harm and privacy violations these AI tools can cause, particularly to women and minors, and pointed to growing concerns about the misuse of AI for harassment and exploitation as these apps become increasingly sophisticated and accessible.

Tech experts testified about the rapid advancement of AI image-generation capabilities and the need for legal frameworks to protect individuals. The bill also includes provisions for law enforcement training and victim support services. This legislative effort is part of a broader national movement to regulate AI technology and protect individuals from its potentially harmful applications.

Source: https://abcnews.go.com/Technology/wireStory/minnesota-considers-blocking-nudify-apps-ai-make-explicit-119442849