San Francisco City Attorney David Chiu is suing the most visited artificial intelligence (AI) deepfake sites, seeking to punish them for the unauthorized use of images to “nudify” their subjects without consent.
Ars Technica reported that while the FBI struggles to process cases amid a “flood” of AI-generated explicit child images, Chiu is determined to hold the sites’ operators accountable.
San Francisco attorney begins case against deepfake sites
Chiu made his stance on the deepfake sites clear and laid out his approach to prosecuting them at a recent news conference.
Chiu said he is suing these illicit operators on behalf of the State of California. “We are bringing this lawsuit to get these websites shut down,” he added, “but we also want to sound the alarm.”
Such “non-consensual, AI-generated pornography” has ranked highly in Google and Bing search results for some time. As we recently covered, Google has taken a more proactive approach to keeping deepfakes out of search results.
The world’s largest search engine said, “The updates we’ve made this year have reduced exposure to explicit image results on these types of queries by over 70%.
“With these changes, people can read about the impact deepfakes are having on society, rather than see pages with actual non-consensual fake images,” Google said.
Generative AI has become a wild landscape, with regulation around the use of the technology very much in its infancy. Several applications and sites have touted the ability of AI to act as a companion and serve romantic content to consumers, but the risk of the technology being applied to more illicit practices always looms large.
“Generative AI has enormous promise, but as with all new technologies, there are unanticipated consequences and criminals seeking to exploit them,” Chiu said. “We must be clear that this is not innovation. This is sexual abuse.”
Image: Pexels.