In response to the growing use of artificial intelligence (AI) to create child sexual abuse material (CSAM), Rep. Langworthy introduced the Child Exploitation & Artificial Intelligence Expert Commission Act, which would establish a commission of experts charged with developing a legal framework to combat the problem. The National Center for Missing & Exploited Children (NCMEC) reported receiving 36 million reports of suspected CSAM in 2023 and noted that a small but growing portion was generated with AI. Major AI companies, including OpenAI, Anthropic, and Stability, have begun reporting such incidents to the center, and the FBI has warned that CSAM created with AI is illegal.
The FBI is warning the public that child sexual abuse material (CSAM) created with content manipulation technologies, to include generative artificial intelligence (AI), is illegal. Learn more: https://t.co/Ki0732vHD2 https://t.co/L5fHHQ1DWQ
Experts estimate 90% of online content will be AI-generated by 2026. Just think about that for a second. In just a few years, 9 out of 10 pieces of content you see online could be generated by AI 🧵
Beware: AI-generated child sexual abuse content is growing online as per a report from the National Center for Missing & Exploited Children. Worryingly, such content now includes AI-created images and videos. https://t.co/KXltzqrSRF
I just introduced the Child Exploitation & Artificial Intelligence Expert Commission Act to create a coalition of experts to develop a legal framework to combat the use of AI to create child sexual abuse material. As abusers evolve with the rapid advancements in technology, our… https://t.co/c9Vb4li4LT
New: The National Center for Missing and Exploited Children got 36 million reports of suspected CSAM in 2023. A small but growing volume of that child sex abuse material was made with generative AI. OpenAI, Anthropic, Stability are now reporting to NCMEC: https://t.co/jrhmLslY38