Ukraine
Incidents Affected By
Incident 602 (7 reports)
Russia Using Artificial Intelligence in Disinformation Campaigns to Erode Western Support for Ukraine
2023-10-02
The Russian government has been stepping up its foreign influence campaigns by using artificial intelligence and emerging technologies to spread disinformation and sow distrust in policies supportive of Ukraine. Part of the strategy includes carrying out influence laundering operations by disseminating its messages to the American public via allies inside nominally independent organizations, according to a recently declassified analysis. This incident is being tracked as an evolving event.
Incident 585 (3 reports)
Kremlin-Linked Entities Allegedly Using Generative AI to Spread Russian Disinformation in Latin America
2023-10-26
Moscow-based tech firms and an industry association with links to the Kremlin are allegedly using generative AI to spread Russian disinformation in countries throughout Central and South America. According to the U.S. Department of State, the Russian companies rely on local writers to compose stories, which are then amplified across social media using artificial intelligence chatbots.
Incident 656 (3 reports)
Alleged Deepfake Disinformation Broadcast by Russian State TV Blames Ukraine for Moscow Attack
2024-03-23
Russian state media is reported to have broadcast deepfaked videos of Ukrainian officials, notably a fabricated video of the Secretary of the National Security and Defense Council of Ukraine admitting to orchestrating the Crocus City Hall terror attack in Moscow. The effort appears to be a bid to wrongly assign blame for the attack, for which ISIS-K has officially claimed responsibility.
Incident 774 (2 reports)
Covert AI Influence Operations Linked to Russia, China, Iran, and Israel, OpenAI Reveals
2024-05-30
In a report released by OpenAI, the company described how its generative AI tools were misused by state actors and private companies in Russia, China, Iran, and Israel to conduct covert influence campaigns aimed at manipulating public opinion and geopolitical narratives.