AIID editor's note: This report is significantly abridged. Please see the original source for the full findings.
Key Findings
- A coordinated network of more than 50 inauthentic X profiles is conducting an AI-enabled influence operation. The network, which we refer to as "PRISONBREAK," is spreading narratives inciting Iranian audiences to revolt against the Islamic Republic of Iran.
- While the network was created in 2023, almost all of its activity dates from January 2025 onward and continues to the present day.
- The profiles' activity appears to have been synchronized, at least in part, with the military campaign that the Israel Defense Forces conducted against Iranian targets in June 2025.
- While organic engagement with PRISONBREAK's content appears to be limited, some of the posts achieved tens of thousands of views. The operation seeded such posts to large public communities on X, and possibly also paid for their promotion.
- After systematically reviewing alternative explanations, we assess that the hypothesis most consistent with the available evidence is that an unidentified agency of the Israeli government, or a sub-contractor working under its close supervision, is directly conducting the operation.
. . .
Conclusions
Covert influence operations have been a feature of armed conflict dating back centuries, their characteristics evolving as new technologies open up opportunities for experimentation. Today, we are witnessing a new generation of covert influence operations mounted principally on and through social media and featuring the use of artificial intelligence and related digital tools and techniques. These operations benefit from the distributed nature of social media, their built-in algorithmic engagement properties (which help propel sensational content by design), as well as recent measures taken by many tech platforms to reduce the capacity of, or eliminate altogether, internal teams responsible for removing coordinated, inauthentic content. The growing sophistication and ease of use of AI tools have also helped power these campaigns, providing actors with an unprecedented ability to produce increasingly realistic-looking videos and images with far fewer resources than would have been required in previous eras.
In this investigation, we have identified a coordinated network of inauthentic X profiles that, since 2023, has conducted an influence operation targeting Iranian audiences. The objective of PRISONBREAK appears to be to foster a revolt against the Iranian regime among the Iranian population. One striking feature of this campaign is its synchronization with events on the ground: the content generated by PRISONBREAK appears to have been prepared in advance of and co-timed with military strikes undertaken by the IDF in June 2025.
Although we cannot attribute this to a particular entity, the advance preparation required and the timing of the coordinated, inauthentic posts suggest some kind of connection to the Israeli state. While it is technically possible, we believe it is highly unlikely that any third party without advance knowledge of the IDF's plans would have been able to prepare this content and post it in such a short window of time. Based on the data reviewed in preparing this report, the campaign we have documented was most likely undertaken either by an Israeli agency in-house or by a private entity contracted by the Israeli government. However, without additional information we are unable to conclusively attribute the responsible parties.
While attribution of any covert operation is inherently challenging because of the steps perpetrators take to conceal their tracks, attribution is made especially difficult today because social media companies restrict outside researchers' access to their platforms, and thus to the artifacts and other details that are critical to conclusive attribution. In spite of these restrictions, we were able to use a combination of qualitative and quantitative methods to conclusively determine that what we were observing was not spontaneous and organic, as the perpetrators hoped it would be perceived, but rather highly coordinated and inauthentic.
However, these methods are not readily available to the general public or to specific target populations, and they typically require considerable analytical effort, time, and access to special resources and data sources. Additionally, it is now generally accepted that in today's social media environment, sensational falsehoods spread quickly and are shared widely because they feed off of human emotions and are amplified by the engagement-driven algorithms of the platforms, all of which works to the advantage of those mounting covert influence operations. These dynamics and their potentially harmful effects are exacerbated in times of political crisis and conflict, such as the military confrontation between Israel and Iran.