Minnesota Tackles Deepfake Porn: New Legislation Aims to Curb AI 'Nudification'
Molly Kelly discovered that someone had used AI to create deepfake pornographic images of her, spurring a legislative response in Minnesota. The state is considering a bill targeting companies that enable the creation of such content. Amid growing efforts nationwide to regulate AI and deepfake technology, experts caution that the measure could face legal challenges on free-speech grounds.
Molly Kelly was shocked to learn that someone she knew used AI technology to create explicit deepfake images of her from family photos shared on social media. This disturbing discovery led Minnesota legislators to propose a bill aimed at curbing such AI-driven content.
The bill, which has bipartisan support, targets companies that facilitate the creation of explicit images through 'nudification' sites. Companies that allow Minnesota users to access such services could face penalties of up to $500,000. However, constitutional challenges on free-speech grounds are anticipated.
Other states and Congress are also pursuing regulatory measures against deepfake technology. While some legal experts warn that the legislation's terms may be overly broad, advocates emphasize the urgent need to rein in these harmful AI applications, which spread anonymous images online rapidly and irretrievably.