Artificial intelligence-generated images remain in the spotlight. Microsoft recently called on government agencies to take action on this matter, arguing that laws are needed to regulate the practice. Now there is major news from the United States: websites dedicated to using AI to undress people are being sued.
Thanks to a recent tweet published by San Francisco City Attorney David Chiu, we have learned that a lawsuit has been filed against 16 websites dedicated to creating and distributing AI-generated images that depict adults, and even minors, as naked.
According to David Chiu, these sites were creating deepfakes and monetizing the content. In the first six months of this year, the 16 websites are estimated to have drawn more than 200 million visits combined, a staggering number.
The sites' operators, based around the world, are accused of violating numerous existing laws, with some accusations as serious as child pornography.
"The damage caused to consumers undoubtedly outweighs any benefits associated with these practices," says the San Francisco official. "This investigation has taken us to the darkest corners of the Internet, and I am absolutely horrified for the women and children who have had to endure this exploitation. This is a huge problem that, as a society, we need to solve as soon as possible."
As soon as there is more news about this lawsuit, we will keep you informed here at Softonic.