Cybersecurity professionals don’t struggle to find maturity models. We struggle with the downstream effects of inconsistent semantics, inconsistent scoring, and inconsistent mapping.
For years, I’ve been harmonizing and reconciling different cybersecurity maturity models, and this year I decided to put that information into the public domain. The goal is not to invent a fourth maturity model, but to harmonize the existing ones so they can be translated (the way NIST CSF is the common language between other cyber standards).
I’m calling it the OT Cyber Maturity Journey (OT CMJ): an approach that provides a harmonized maturity journey, including a pragmatic “Developing” (1.5) stage and an explicit cumulative rule (no rounding up) to improve repeatability and reduce score inflation. It was triggered by an ICS/OT use-case, but if you look closely it applies to any environment that uses CMMI, C2M2, NIST CSF, IEC 62443-2-4, etc.
What follows are concrete use-cases unlocked by having a maturity scale that is more standardized (shared definitions), more consistent (cross-walked), and more defensible (requirements-based and cumulative).
Details below.
Problem: Benchmarking is meaningless if maturity scales are not comparable. Even within the same framework, organizations often create local scoring rubrics, which defeats peer comparison.
How a harmonized maturity model helps: OT CMJ functions as a translation layer between commonly used public-domain models (e.g., CSF tiers, C2M2 MILs, CMMI semantics), enabling apples-to-apples comparison across sites, subsidiaries, and partners. The NIST Cybersecurity Framework was intended to be this translation layer between different cybersecurity standards, and the OT CMJ can be the equivalent translation layer for maturity.
Implementation:
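As a sketch of what such a translation layer could look like in code: the CSF tiers and C2M2 MILs below are real scales, but the OT CMJ level names (beyond the “Developing” 1.5 stage mentioned above) and the specific alignments are illustrative assumptions, not the published OT CMJ crosswalk.

```python
# Illustrative crosswalk: OT CMJ levels to NIST CSF tiers and C2M2 MILs.
# Level names and alignments are placeholders, not the published mapping.
OT_CMJ_CROSSWALK = {
    0.0: {"name": "Incomplete", "csf_tier": None, "c2m2_mil": "MIL0"},
    1.0: {"name": "Initial",    "csf_tier": 1,    "c2m2_mil": "MIL1"},
    1.5: {"name": "Developing", "csf_tier": 1,    "c2m2_mil": "MIL1"},
    2.0: {"name": "Basic",      "csf_tier": 2,    "c2m2_mil": "MIL2"},
    3.0: {"name": "Managed",    "csf_tier": 3,    "c2m2_mil": "MIL3"},
}

def to_c2m2(cmj_level: float) -> str:
    """Translate an OT CMJ score into the nearest C2M2 MIL, rounding
    down, consistent with the no-rounding-up rule."""
    eligible = [lvl for lvl in OT_CMJ_CROSSWALK if lvl <= cmj_level]
    return OT_CMJ_CROSSWALK[max(eligible)]["c2m2_mil"]
```

Keeping the crosswalk as data (rather than burying it in logic) makes it easy for the community to review and revise the alignments without touching the translation code.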
Problem: OT teams often can’t translate “maturity improvement” into risk reduction in a way ERM, finance, or insurers can consume.
How a harmonized maturity model helps: OT CMJ enables consistent assignment of control strength / effectiveness assumptions by maturity level, which can feed quantitative or semi-quantitative models. NIST explicitly frames cybersecurity as an input into enterprise risk decisions and emphasizes integration with ERM processes. (NIST Publications)
Two practical approaches:
Implementation:
Why it matters: If maturity scoring is inconsistent, your risk quantification inputs become arbitrary and the model loses credibility. And if the intent is to assign ‘effectiveness coefficients’ to each maturity level (e.g., L2-Basic is 50% effective), consistency becomes even more important. As more data becomes available to support the effectiveness values, the coefficients can be refined.
It also creates a starting point for the next iteration while staying in the public domain.
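A minimal sketch of how such coefficients could feed a semi-quantitative model: only the 50% figure for L2-Basic comes from the text above; the remaining coefficients are illustrative assumptions to be refined as data accumulates.

```python
# Illustrative control-effectiveness coefficients per OT CMJ level.
# Only the 0.50 value for L2-Basic is from the text; the rest are
# assumptions, to be adjusted as supporting data becomes available.
EFFECTIVENESS = {0.0: 0.0, 1.0: 0.25, 1.5: 0.40, 2.0: 0.50, 3.0: 0.80}

def residual_risk(inherent_risk: float, cmj_level: float) -> float:
    """Discount inherent risk by the effectiveness coefficient of the
    achieved maturity level (no rounding up between levels)."""
    achieved = max(lvl for lvl in EFFECTIVENESS if lvl <= cmj_level)
    return inherent_risk * (1.0 - EFFECTIVENESS[achieved])
```

Because the coefficients are keyed to shared level definitions, two assessors applying the same model to the same program should produce the same residual-risk input.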
Problem: Security standards and frameworks keep reinventing maturity models, or consultancies propose their own proprietary versions in their reporting. This fragments the community and reduces comparability.
How a harmonized maturity model helps: A shared baseline encourages the industry to evolve from “new model creation” to “improve the common model,” including:
This aligns with how NIST CSF positions itself: a common taxonomy for communication, profiles for tailoring, and informative references for mapping across standards. (NIST Publications)
Problem: “Raise maturity everywhere” is not a strategy—especially in OT, where constraints (uptime, safety, vendor support, lifecycle) matter.
How a harmonized maturity model helps: You can set target maturity by criticality and build an investment roadmap that is defensible and measurable. This matches the stated intent of C2M2: measure capabilities over time, set target maturity based on risk, and prioritize actions/investments to meet targets. (The Department of Energy's Energy.gov)
Implementation:
OT-specific nuance: “Target maturity” often differs by architecture zone (e.g., DMZ vs cell/area zone vs SIS zone). Use OT CMJ as the maturity dimension; use your architecture zoning model and criticality as the scope dimension.
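The two-dimensional idea above (maturity on one axis, zone/criticality scope on the other) could be sketched as a simple lookup; the zone names follow the text, but the target levels themselves are hypothetical examples, not recommendations.

```python
# Hypothetical target maturity by architecture zone and asset criticality.
# Zone names follow the text; the target levels are illustrative only.
TARGETS = {
    ("DMZ", "high"): 3.0,
    ("DMZ", "medium"): 2.0,
    ("cell/area", "high"): 2.0,
    ("cell/area", "medium"): 1.5,
    ("SIS", "high"): 3.0,
}

def maturity_gap(zone: str, criticality: str, current: float) -> float:
    """Gap between the current score and the zone/criticality target."""
    return max(0.0, TARGETS[(zone, criticality)] - current)
```

The gap values can then be ranked to produce the defensible, measurable investment roadmap the section describes.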
Problem: Many maturity assessments are workshop-driven and drift over time. The same program assessed by two teams yields two scores.
How a harmonized maturity model helps: OT CMJ’s requirements-based, cumulative structure supports more defensible scoring.
Where to take this next (and why it matters):
Implementation:
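The cumulative, no-rounding-up rule described above can be sketched as follows; the requirement identifiers are purely illustrative, but the scoring logic is the rule itself: a level counts only when every requirement at that level and all levels below it is met.

```python
# Cumulative scoring: a level is achieved only when ALL requirements at
# that level AND every level below it are met -- never round up.
# Requirement identifiers are illustrative, not OT CMJ content.
REQUIREMENTS = {
    1.0: {"asset-inventory", "named-owner"},
    1.5: {"basic-policy", "annual-review"},
    2.0: {"documented-procedures", "access-control"},
    3.0: {"metrics", "continuous-improvement"},
}

def achieved_level(evidence: set[str]) -> float:
    """Highest level whose requirements, cumulatively, are all met."""
    achieved = 0.0
    for level in sorted(REQUIREMENTS):
        if REQUIREMENTS[level] <= evidence:  # subset: all reqs satisfied
            achieved = level
        else:
            break  # cumulative rule: a gap blocks all higher levels
    return achieved
```

Note how satisfying some level-3 requirements does not lift the score past an unmet level-2 gap; that is exactly the score-inflation behavior the cumulative rule prevents, and it is why two teams assessing the same evidence get the same number.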
Problem: OT dependencies (OEMs, integrators, MSPs, remote support) create risk, but supplier requirements are often vague (“follow best practices”) or overly IT-centric.
How a harmonized maturity model helps: Use OT CMJ to express minimum maturity expectations for third parties in contract language:
NIST SP 800-161 Rev. 1 provides a structured approach for cybersecurity supply chain risk management (C‑SCRM), including strategy, plans, policies, and risk assessments—concepts OT CMJ can maturity-grade. (NIST Computer Security Resource Center)
The ISA/IEC 62443-2-4 standard focuses on security program requirements for ICS/OT service providers and is based on a CMMI-like maturity model for measuring their capabilities. The maturity model in Part 2-4 can be replaced with OT CMJ.
Implementation:
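As a hedged sketch of what a contract gate might look like: the capability names and minimum levels below are hypothetical examples of contract language, not OT CMJ or 62443-2-4 content.

```python
# Hypothetical contract gate: compare a supplier's assessed OT CMJ
# levels against contractual minimums. Capabilities and floors are
# illustrative examples of contract language, not standard content.
MIN_SUPPLIER_MATURITY = {
    "remote-access": 2.0,
    "patch-management": 2.0,
    "incident-response": 1.5,
}

def supplier_shortfalls(assessed: dict[str, float]) -> dict[str, float]:
    """Return each contracted capability where the supplier falls short,
    mapped to the size of the maturity gap (unassessed counts as 0.0)."""
    return {cap: floor - assessed.get(cap, 0.0)
            for cap, floor in MIN_SUPPLIER_MATURITY.items()
            if assessed.get(cap, 0.0) < floor}
```

Expressing the floor as a shared maturity level replaces vague “follow best practices” clauses with something both parties can assess the same way.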
Problem: Ransomware readiness tools exist, but organizations struggle to tie findings to long-term maturity roadmaps.
How a harmonized maturity model helps: Use OT CMJ maturity levels to structure ransomware resilience work into a progression:
CISA’s ecosystem includes ransomware readiness assessment approaches (e.g., via CSET’s ransomware assessment options) designed to help organizations evaluate readiness. OT CMJ can provide the common maturity “spine” for prioritizing and tracking those improvements over time. (CISA)
Problem: Individual organizations build maturity models and “target states” in isolation, then struggle to justify why their target is appropriate.
How a harmonized maturity model helps: Use OT CMJ to support community-driven baselining and target states, similar to NIST CSF Community Profiles: baseline outcomes created and published to address shared interests among organizations. (NIST)
Implementation:
As you can tell, I’m passionate about cybersecurity maturity; I’ve used it throughout my customer-facing OT cyber consulting engagements over the last 20+ years. I’ve seen the usefulness, the pitfalls, and the evolution of maturity models and their application to cybersecurity.
Many would argue that maturity levels are not technical enough. I would agree, if you’re the technical engineer responsible for patching, hardening, and other front-line cybersecurity operations. But the terminology used on the front line doesn’t translate well and isn’t well understood by senior management. Determining the maturity level of each cybersecurity function/category is close to how cyber insurers assess cyber risk.
Maturity modelling remains an important aspect of cyber risk management, reminding us that cybersecurity is a continuous journey of improvement, with the maturity requirements serving as guideposts for what we should do next.
Once again, the use-cases above are unlocked by having a maturity scale that is more standardized (shared definitions), more consistent (cross-walked), and more defensible (requirements-based and cumulative).