(KTTC) – The evolving world of artificial intelligence (AI) has made it an ingenious tool, but when the technology is abused, many dangers come with it.
AI has the ability to create deepfakes: manipulated photos, videos or audio of a real person, or of a person who does not exist.
Minnesota legislators are working to protect Minnesotans from non-consensual nudification.
Nudification refers to the use of AI technology to generate a pornographic image of someone.
As the technology advances, these images are becoming more realistic and easier to create.
In 2023, Minnesota enacted a sexual deepfake law, which was modeled after non-consensual pornography laws. It criminalizes the creation of sexual deepfakes.
This week, Senator Erin Maye Quade (DFL) and advocates expressed their support for a new bill that would require companies to disable access to nudification technology in Minnesota.
Maye Quade said the harm is not in the dissemination or distribution of these images; it is in their creation.
“Anyone could take a photo of our kids or grandkids online, at the grocery store, playing at the park, and put it into an app and create a very realistic pornographic video,” Maye Quade said. “These apps are available on every computer, every mobile phone, downloadable at any age. They are available in the Apple Store and on the Google Store.”
According to Maye Quade, these companies are already violating the ban on deepfake dissemination by allowing people to create these AI images.
She said they want companies to disable the technology in Minnesota.
To learn more about the bill, click here.
Find stories like this and more in our apps.
COPYRIGHT 2025 KTTC. All rights reserved.