
What are the DISARM Frameworks?

By Julian | Last Updated: 13 October 2025

DISARM (Disinformation Analysis and Risk Management) is a set of open-source frameworks and taxonomies for describing, analyzing, and responding to disinformation (or information manipulation) campaigns, often in the context of influence operations.

Origins & Purpose

  • DISARM was inspired by cybersecurity frameworks (in particular, MITRE’s ATT&CK) and seeks to adapt those ideas to the information domain (i.e. disinformation, propaganda, influence operations).

  • The idea is to create a common language and taxonomy so that different organizations (governments, NGOs, researchers, media fact-checkers) can document influence operations in a consistent way, share intelligence, and align countermeasures.

  • It is maintained by the DISARM Foundation as an open, community-led project.

Structure: Red & Blue Frameworks

DISARM comprises two complementary “sides” (analogous to offense/defense in cybersecurity):

Framework | Focus | Purpose
DISARM Red | Disinformation (creator) behaviors | Catalog the tactics, techniques, and procedures (TTPs) used by actors producing influence operations, disinformation, or manipulation.
DISARM Blue | Countermeasures / defensive responses | Catalog possible responses, mitigations, and interventions that defenders can take against influence operations.

Red side — tactics, techniques, procedures

  • Tactics are higher-level objectives or phases (e.g. “Develop Narratives”, “Maximise Reach/Exposure”) that an adversary may aim for in an influence operation.

  • Techniques are specific actions or methods used to further those tactics (e.g. creating deepfakes, flooding a platform, coordinating inauthentic networks).

  • Procedures are particular combinations or instantiations of techniques across multiple tactics—i.e. how a specific actor mixes methods in a campaign.

Thus, one can break down an observed disinformation incident (or campaign) into component tactics, the techniques deployed, and the way they’re combined (procedurally).
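To make that decomposition concrete, here is a minimal sketch (in Python, not an official DISARM tool) of how an analyst might record an incident as structured data. The tactic and technique identifiers and names below are placeholders written in a DISARM-like style, not verified entries from the framework.

```python
from dataclasses import dataclass, field

@dataclass
class ObservedTechnique:
    """One technique observed in a campaign, tied to its parent tactic."""
    tactic_id: str       # placeholder ID in a DISARM-like "TA..." style
    tactic_name: str
    technique_id: str    # placeholder ID in a DISARM-like "T..." style
    technique_name: str
    notes: str = ""

@dataclass
class Incident:
    """A disinformation incident decomposed into DISARM-style components."""
    name: str
    observed: list[ObservedTechnique] = field(default_factory=list)

    def procedures(self) -> list[str]:
        # A "procedure" is the specific combination of techniques across tactics.
        return [f"{o.tactic_id}/{o.technique_id}: {o.technique_name}" for o in self.observed]

# Hypothetical example: IDs and names are illustrative, not official DISARM entries.
incident = Incident(
    name="Election narrative push (example)",
    observed=[
        ObservedTechnique("TA-DEV", "Develop Narratives", "T-0001", "Craft misleading narrative"),
        ObservedTechnique("TA-MAX", "Maximise Exposure", "T-0002", "Coordinate inauthentic amplification"),
    ],
)
print(incident.procedures())
```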

Blue side — countermeasures

  • The Blue side provides a taxonomy of responses one might apply at different stages (or tactics) of a campaign.

  • It’s more normative: it contains a library of “what defenders could do” (e.g. public awareness, content warnings, deplatforming, algorithmic adjustments) rather than a prescription that every situation must follow (see the sketch after this list).

  • Caution is advised: because context, ethics, legality, and proportionality matter, not all countermeasures in Blue are suitable in every setting.
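As a rough illustration of how the two sides can be linked in practice, the sketch below maps placeholder Red-style technique IDs to candidate Blue-style countermeasures. Both the identifiers and the pairings are hypothetical; a real workflow would draw on the actual DISARM entries and weigh context, ethics, and proportionality.

```python
# Hypothetical mapping from Red-style technique IDs to candidate Blue-style
# countermeasures. IDs and pairings are illustrative only.
CANDIDATE_COUNTERS: dict[str, list[str]] = {
    "T-0001": ["Prebunking / public awareness", "Fact-check and label content"],
    "T-0002": ["Detect and disrupt inauthentic networks", "Rate-limit coordinated amplification"],
}

def suggest_countermeasures(observed_technique_ids: list[str]) -> dict[str, list[str]]:
    """Return candidate countermeasures for each observed technique (if any)."""
    return {t: CANDIDATE_COUNTERS.get(t, []) for t in observed_technique_ids}

print(suggest_countermeasures(["T-0001", "T-0002", "T-9999"]))
```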

Technical / Data Aspects & Integration

  • DISARM is designed to be compatible with information security / threat intelligence standards. For example, it provides STIX (Structured Threat Information eXpression) templates for DISARM objects, so that disinformation events, actors, techniques, etc. can be encoded in machine-readable form (see the sketch after this list).

  • It also integrates with platforms like OpenCTI (an open-source threat intelligence platform), allowing DISARM tags and datasets to be shared in collaborative systems.

  • The framework’s objects include not only tactics / techniques but also incidents, actor types, playbooks, and more—supporting both qualitative and structured analysis.
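As a sketch of what that machine-readable encoding can look like, the snippet below uses the Python stix2 library to express a single technique as a STIX 2.1 Attack Pattern with a DISARM external reference. The technique name, external ID, and kill-chain phase are illustrative placeholders, not the official DISARM STIX objects.

```python
from stix2 import AttackPattern, ExternalReference, KillChainPhase

# Illustrative only: the external ID, technique name, and kill-chain phase
# are placeholders in a DISARM-like style, not verified framework entries.
technique = AttackPattern(
    name="Coordinate inauthentic amplification (example)",
    description="Placeholder description of an observed amplification technique.",
    external_references=[
        ExternalReference(
            source_name="DISARM",
            external_id="T-0002",  # placeholder ID, not an official DISARM entry
            url="https://disarmframework.herokuapp.com/",
        )
    ],
    kill_chain_phases=[
        KillChainPhase(kill_chain_name="disarm", phase_name="maximise-exposure")
    ],
)

# serialize() produces STIX 2.1 JSON that can be shared or loaded into tools
# such as OpenCTI.
print(technique.serialize(pretty=True))
```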

Applications & Use Cases

  • Incident documentation & analysis: Analysts can take a disinformation event (e.g. a misleading narrative spread during an election) and break it down using DISARM’s taxonomy. This helps with clarity, comparability, and pattern detection (see the sketch after this list).

  • Comparative / shared intelligence: Because multiple organizations can use the same taxonomy, data collectors A and B can share “this actor used technique X under tactic Y” in a standardized way.

  • Designing countermeasures: After mapping out how an influence campaign is constructed, one can use the Blue framework to consider what mitigations or responses are viable.

  • Training, capacity building & policy: Organizations and governments can train their analysts using DISARM, and embed it into policy workflows or response protocols.

  • Hybrid threat / cognitive warfare analysis: DISARM is also used in academic or policy research to analyze the nexus of disinformation, cyber operations, and cognitive influence (e.g. in the context of foreign interference).
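To illustrate the comparability and pattern-detection point, here is a minimal sketch assuming several organizations tag incidents with the same (placeholder) DISARM-style technique IDs; a simple frequency count is then enough to surface recurring behaviors across the shared dataset.

```python
from collections import Counter

# Hypothetical incident records shared by different organizations, each tagged
# with placeholder DISARM-style technique IDs (not official framework entries).
shared_incidents = [
    {"source": "Org A", "incident": "Election narrative push", "techniques": ["T-0001", "T-0002"]},
    {"source": "Org B", "incident": "Health scare campaign", "techniques": ["T-0001"]},
    {"source": "Org C", "incident": "Astroturfed protest calls", "techniques": ["T-0002", "T-0003"]},
]

# Count how often each technique appears across the shared dataset; once
# everyone uses the same taxonomy, recurring behaviors become visible.
technique_counts = Counter(t for record in shared_incidents for t in record["techniques"])
for technique_id, count in technique_counts.most_common():
    print(f"{technique_id}: seen in {count} incident(s)")
```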
