In a statement today, the White House said it has received commitments from several AI companies to curb the creation and distribution of deepfake porn, also known as image-based sexual abuse material. Engadget reports: The participating businesses have laid out the steps they are taking to prevent their platforms from being used to generate non-consensual intimate images (NCII) of adults and child sexual abuse material (CSAM). Specifically, Adobe, Anthropic, Cohere, Common Crawl, Microsoft and OpenAI said they'll be "responsibly sourcing their datasets and safeguarding them from image-based sexual abuse."
All of the companies above except Common Crawl also agreed to "incorporating feedback loops and iterative stress-testing strategies in their development processes, to guard against AI models outputting image-based sexual abuse" and to "removing nude images from AI training datasets" when appropriate. […] The notable absences from today's White House release are Apple, Amazon, Google and Meta.