Study Finds AI Image-Generators Trained on Child Pornography

According to a new report by the Stanford Internet Observatory, produced in collaboration with the Canadian Centre for Child Protection and other anti-abuse charities, over 3,200 images of suspected child sexual abuse were found in the AI training database LAION.

LAION (Large-scale Artificial Intelligence Open Network), an index of online images and captions, has been used to train leading AI image generators such as Stable Diffusion. As reported by the Associated Press, about 1,000 of the images were externally validated.

Following the report’s release on Wednesday, LAION informed the AP that it was temporarily removing its datasets. 

In a statement, LAION said it “has a zero-tolerance policy for illegal content, and in an abundance of caution, we have taken down the LAION datasets to ensure they are safe before republishing them.”

Stanford Internet Observatory chief technologist David Thiel, who authored the report, explained that the problem is not an easy one to fix because multiple generative AI projects were “effectively rushed to market” and made widely accessible due to a competitive field.

“Taking an entire internet-wide scrape and making that dataset to train models is something that should have been confined to a research operation, if anything, and is not something that should have been open-sourced without a lot more rigorous attention,” Thiel said in an interview.

The Stanford report also found that although newer versions of Stable Diffusion have made it more difficult to create harmful content, an older version, released last year — which Stability AI says it didn’t release — is still used in other applications and tools and remains “the most popular model for generating explicit imagery.”

“We can’t take that back. That model is in the hands of many people on their local machines,” explained Lloyd Richardson, director of information technology at the Canadian Centre for Child Protection, which runs Canada’s hotline for reporting online sexual exploitation.

On Wednesday, Stability AI said it only hosts filtered versions of Stable Diffusion and that “since taking over the exclusive development of Stable Diffusion, Stability AI has taken proactive steps to mitigate the risk of misuse.”

“Those filters remove unsafe content from reaching the models,” the company said in a statement. “By removing that content before it ever reaches the model, we can help to prevent the model from generating unsafe content.”

German researcher and teacher Christoph Schuhmann, who created LAION, told AP earlier this year that one reason that the database was made widely accessible was so that a handful of powerful companies would not control the future of AI development.

“It will be much safer and much more fair if we can democratize it so that the whole research community and the general public can benefit from it,” he said.

Photo Courtesy: ©iStock/Getty Images Plus/Supatman


Milton Quintanilla is a freelance writer and content creator. He is a contributing writer for Christian Headlines and the host of the For Your Soul Podcast, a podcast devoted to sound doctrine and biblical truth. He holds a Master of Divinity from Alliance Theological Seminary.





