Governance & Compliance · CISO · CIO

NIST AI Risk Management Framework (AI RMF)

The US National Institute of Standards and Technology's voluntary framework for managing risks associated with AI systems, built around four core functions: Govern, Map, Measure, and Manage.

What it means

The NIST AI Risk Management Framework (AI RMF 1.0, published January 2023) provides a structured approach to identifying and managing AI-related risks across the AI lifecycle. It is organized around four core functions: Govern (establishing the organizational culture and accountability for AI risk), Map (identifying and categorizing AI risks), Measure (assessing and quantifying those risks), and Manage (prioritizing and addressing risks with appropriate controls).

NIST AI RMF is voluntary for US organizations but is increasingly referenced by regulators, procurement requirements, and enterprise governance programs as a baseline for AI risk management maturity. Its companion document, the AI RMF Playbook, provides specific practices for each function.

The framework explicitly calls for operational controls — not just policies and documentation — as part of the Manage function. Runtime policy enforcement is a key operational control in this context.
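To illustrate the difference between a documented policy and an operational control: a runtime control evaluates each proposed action against policy before it executes, rather than recording it for later review. A minimal sketch, assuming a hypothetical rule structure (the rule names and fields below are invented for illustration, not Corules' actual API):

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    """A single policy rule: the predicate returns True when an action violates it."""
    name: str
    predicate: Callable[[dict], bool]

def enforce(action: dict, rules: list[Rule]) -> tuple[bool, list[str]]:
    """Evaluate an action against all rules BEFORE it executes.

    Returns (allowed, violated_rule_names). A documentation-only control
    would log the action and review it later; a runtime control blocks
    the violating action here, at the point of decision.
    """
    violations = [r.name for r in rules if r.predicate(action)]
    return (len(violations) == 0, violations)

# Hypothetical rule: block AI-initiated refunds above a dollar threshold
rules = [
    Rule("refund-limit",
         lambda a: a.get("type") == "refund" and a.get("amount", 0) > 500),
]

allowed, violated = enforce({"type": "refund", "amount": 900}, rules)
# allowed -> False, violated -> ["refund-limit"]
```

The key property for the Manage function is that the check sits in the execution path: a violating action never runs, and the violation list feeds directly into the metrics the Measure function asks for.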

Why enterprise executives need to understand this

CISOs and compliance teams are increasingly asked to demonstrate AI RMF alignment to boards, auditors, and regulators. The Manage function of AI RMF requires organizations to have active controls that respond to identified risks at runtime, not merely document them. Corules provides the runtime enforcement mechanism that puts the Manage function into practice for AI workflow decisions.

How Corules implements this

Corules supports NIST AI RMF alignment by providing: versioned policy governance (Govern function), structured context logging for risk mapping (Map function), violation metrics and escalation rates (Measure function), and deterministic runtime enforcement that prevents policy violations from executing (Manage function). The immutable audit log provides evidence for all four functions during assessments.
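A common way to make an audit log tamper-evident (and thus usable as assessment evidence) is hash chaining: each entry embeds a hash of the previous entry, so any retroactive edit invalidates every later hash. A minimal sketch of the idea, with illustrative field names rather than Corules' actual log schema:

```python
import hashlib
import json

GENESIS = "0" * 64  # placeholder hash for the first entry in the chain

def append_entry(log: list[dict], event: dict) -> None:
    """Append an event, chained to the previous entry's hash.

    Editing any earlier entry changes its hash and breaks the chain,
    which is what makes the log tamper-evident rather than merely append-only.
    """
    prev_hash = log[-1]["hash"] if log else GENESIS
    body = {"event": event, "prev_hash": prev_hash}
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    log.append({**body, "hash": digest})

def verify(log: list[dict]) -> bool:
    """Recompute every hash in order; returns False if any entry was altered."""
    prev_hash = GENESIS
    for entry in log:
        if entry["prev_hash"] != prev_hash:
            return False
        body = {"event": entry["event"], "prev_hash": prev_hash}
        digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if digest != entry["hash"]:
            return False
        prev_hash = entry["hash"]
    return True

log: list[dict] = []
append_entry(log, {"decision": "deny", "rule": "refund-limit"})
append_entry(log, {"decision": "allow", "rule": None})
assert verify(log)

log[0]["event"]["decision"] = "allow"  # retroactive tampering
assert not verify(log)                 # the broken chain exposes it
```

An assessor can re-verify the chain independently, which is what lets a single log serve as evidence across all four functions rather than requiring separate attestations for each.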

Frequently Asked Questions

Is NIST AI RMF mandatory?

It is currently voluntary for US organizations, but it is referenced in federal procurement requirements, and state-level AI regulations are increasingly aligning with it. For organizations that want to demonstrate AI governance maturity, alignment with NIST AI RMF provides a recognized baseline.

See NIST AI Risk Management Framework (AI RMF) in production

Corules implements every concept in this glossary. Join enterprise teams enforcing policy at runtime — no credit card required.

Request access