“We’re going to partner with Microsoft to figure out if there are ways using our archive of material to create a sort of gen AI detection system in order to counter the emerging threat that gen AI will be used for terrorist content at scale,” Hadley says. “We’re confident that gen AI can be used to defend against hostile uses of gen AI.”
The partnership was announced today, on the eve of the Christchurch Call Leaders’ Summit, a movement designed to eradicate terrorism and extremist content from the internet, to be held in Paris.
“The use of digital platforms to spread violent extremist content is an urgent issue with real-world consequences,” Brad Smith, vice chair and president at Microsoft, said in a statement. “By combining Tech Against Terrorism’s capabilities with AI, we hope to help create a safer world both online and off.”
While companies like Microsoft, Google, and Facebook all have their own AI research divisions and are likely already deploying their own resources to combat this issue, the new initiative will ultimately aid those companies that can’t combat these efforts on their own.
“This will be particularly important for smaller platforms that don’t have their own AI research centers,” Hadley says. “Even now, with the hashing databases, smaller platforms can just become overwhelmed by this content.”
The threat of AI-generated content is not limited to extremist groups. Last month, the Internet Watch Foundation, a UK-based nonprofit that works to eradicate child exploitation content from the internet, published a report that detailed the growing presence of child sexual abuse material (CSAM) created by AI tools on the dark web.
The researchers found over 20,000 AI-generated images posted to one dark web CSAM forum over the course of just one month, with 11,108 of these images judged most likely to be criminal by the IWF researchers. As the IWF researchers wrote in their report, “These AI images can be so convincing that they are indistinguishable from real images.”
Source: https://www.wired.com/story/generative-ai-terrorism-content/