Data maturity models measure the extent to which organizations have developed their data capabilities. They focus on a set of dimensions that can include strategy, leadership, culture, people, governance, architecture, processes, and technology.

Each of these dimensions is typically measured along a continuum of four to six maturity levels. The lowest level is often referred to as “lagging,” “initial,” or “ad hoc,” while the highest is labeled “optimized,” “leading,” or “transformative.”

Figure 1 below presents an overview of the world’s leading models and the dimensions and levels they use.

Figure 1 – Overview of leading data maturity models, their dimensions, and their levels (Parts 1 and 2).

The Missing Dimension: Adoption

As you can tell from Figure 1 above, existing maturity frameworks and models focus on data capabilities, such as having a data strategy, reference architecture, or roles and responsibilities. They pay little attention to the extent to which these capabilities are adopted across the organization. As such, questions like these are not asked:

  • To what extent does the reference architecture represent the enterprise’s data landscape?
  • Are all data domains identified? How many of them abide by the identified roles and responsibilities?
  • What percentage of critical data sources are measured for data quality?

Indeed, adoption (or, in other words, implementation) is not called out as a separate (sub-)dimension. This is problematic: having the right tools, processes, and people is not enough unless you can provide evidence that you’ve actually applied them to your data.
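To make this measurable, here is a minimal sketch in Python of how adoption questions like the ones above could be turned into coverage metrics rather than yes/no capability checks. The domain names, fields, and numbers are hypothetical; in practice, these facts would come from a data catalog or governance tool.

```python
# Hypothetical inventory of data domains; in practice, sourced from a
# data catalog or governance tool rather than hard-coded.
domains = [
    {"name": "Customer", "roles_assigned": True,  "critical_sources": 12, "dq_measured_sources": 9},
    {"name": "Product",  "roles_assigned": True,  "critical_sources": 8,  "dq_measured_sources": 2},
    {"name": "Finance",  "roles_assigned": False, "critical_sources": 15, "dq_measured_sources": 0},
]

# Share of identified domains that abide by the defined roles and responsibilities.
role_adoption = sum(d["roles_assigned"] for d in domains) / len(domains)

# Share of critical data sources actually measured for data quality.
total_sources = sum(d["critical_sources"] for d in domains)
measured_sources = sum(d["dq_measured_sources"] for d in domains)
dq_coverage = measured_sources / total_sources

print(f"Role adoption: {role_adoption:.0%}")          # 67%
print(f"DQ measurement coverage: {dq_coverage:.0%}")  # 31%
```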

Existing models don’t ignore adoption altogether. For example, if you consider the DCAM model shown in Figure 2 below, you’ll find the question, “Are all critical business functions represented in the discussion?” which does relate to adoption.

That said, the question is rather vague and buried deep within the survey. And some of the other maturity models don’t reference adoption at all.

Figure 2 – A sample dimension from EDM Council’s Data Management Capability Assessment Model.

Why Does It Matter? Lessons from Traffic Rules and Governance

To understand why adoption should be given more (if not primary) attention, let’s consider the analogy of traffic rules and governance. A set of capabilities is critical to ensure minimum safety on public roads. 

As a first step, there should be clear, unambiguous rules regarding what side of the road to drive on, what kinds of vehicles are allowed, what the maximum speed is for each type of road, and where drivers may stop, park, or turn their vehicles. Penalties can be defined for drivers caught breaking the rules.

With this set of rules in place, there should be mechanisms to publish and socialize them so that every driver can reasonably be expected to be aware of, understand, and abide by them (for example, a driver’s license exam). 

A variety of people, processes, and technology components can then be mobilized to ensure enforcement. Traffic police officers can be trained to implement compliance processes, for example, pulling over and ticketing drivers who drive too fast. Speed detection cameras can be installed to automate the process of enforcement. The list goes on and on.

The critical insight here is this: what if these capabilities aren’t actually used, or at least not fully? What if traffic rules are only enforced in 10% of the country because traffic officers aren’t actually on the road enforcing the law, perhaps due to misaligned incentives, bad training, or budget cuts? What if speed cameras are only placed on a few roads (or turned off), or fines are set so low that drivers, by and large, just do whatever they want?

If we apply the aforementioned data maturity models to this hypothetical example, maturity could still be scored very high because many of the theoretical concepts are technically in place. Yet, in practice, traffic is largely ungoverned, too many accidents happen, and many people are not comfortable going out for a drive.

Creating Effectiveness in Both Design and Operations

A while back, I discussed this pattern in maturity assessments (overemphasizing data capabilities while ignoring how much they are actually used in practice) with my dear friend Jacklyn Osborne, currently a data strategy executive at Bank of America.

She coined the term “operational effectiveness” to refer to the extent to which data is actually governed, for example, as measured by the number of data domains, systems, or business processes that have data governance in place.

Operational effectiveness (the “work”) is enabled by design effectiveness (the “tools”), which refers to the extent to which foundational capabilities are established and in place. Both are required in equal measure to drive data management maturity. You cannot have done the work without the tools, but having the tools without doing the work is worthless, too — it’s like having 100 speed cameras but leaving them all unused in a shed.

Operational and design effectiveness are jointly required to drive data management maturity.
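To make the joint requirement concrete, here is a minimal sketch, assuming, as a deliberate simplification rather than a prescribed scoring method, that both dimensions are scored on the same 0-to-5 scale and that overall maturity is capped by the weaker of the two:

```python
def maturity_score(design_effectiveness: float, operational_effectiveness: float) -> float:
    """Both inputs range from 0 (absent) to 5 (optimized); the scale is illustrative.

    Taking the minimum captures the idea that tools without work
    (or work without tools) should not raise overall maturity.
    """
    return min(design_effectiveness, operational_effectiveness)

# 100 speed cameras in a shed: strong design, near-zero operations.
print(maturity_score(design_effectiveness=4.5, operational_effectiveness=0.5))  # -> 0.5
```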

Adoption and implementation matter not only at a macro level or at the highest level of maturity. They matter for all data capability components and should be integrated into their respective maturity evaluations.

Taking data quality as an example, across the four to six levels of increasing maturity, we should certainly confirm that policies and standards are clear, roles and responsibilities are defined, and a dashboard is in place.

But we also need to see increasing amounts of data quality actually being controlled, for example, through a certain number of systems or domains with critical data that have data quality controls in place.
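As an illustration, the maturity level for data quality could be gated on the share of critical systems where controls actually run. The thresholds below are hypothetical, not drawn from any published model:

```python
def dq_maturity_level(coverage: float) -> int:
    """Map the fraction of critical systems with active data quality
    controls (0.0 to 1.0) to a maturity level from 1 to 5."""
    thresholds = [(0.95, 5), (0.75, 4), (0.50, 3), (0.20, 2)]  # illustrative cut-offs
    for minimum, level in thresholds:
        if coverage >= minimum:
            return level
    return 1

print(dq_maturity_level(0.60))  # -> 3: controls cover 60% of critical systems
```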

Or, if we go back to our traffic governance analogy, we should not just ask whether fully automated speed cameras are in place; we also need to know if they are working, what percentage of drivers actually speed, how many tickets are issued, and how many of those actually reach the corresponding drivers.

Enhanced Data Maturity Frameworks

This brings us back full circle to the maturity frameworks we discussed at the beginning of this article.

Data can only be governed if operational and design effectiveness are both present, and data maturity frameworks should reflect this by accounting for both. Either that, or we’re OK with letting all those speed cameras waste away in an abandoned shed.

About the Author
Willem Koenders is a global leader in data strategy at ZS Associates with 10 years of experience advising leading organizations across all continents on how to leverage data to build and sustain a competitive advantage. He is passionate about data-driven transformations and a strong believer in "data governance by design." His views are his own.

