
Decision test drives for Performance Management

Richard M. Adler

Updated: Nov 7, 2019

My book, Bending the Law of Unintended Consequences, describes a method for “test driving” critical decisions. The method uses simulations to practice decisions by projecting their likely outcomes against diverse assumptions about possible futures. Such analyses enable companies to avoid “train wreck” decision options and to refine more promising alternatives to improve their outcomes.


One chapter in the book describes a test drive solution called CALM for practicing strategies for managing disruptive change. CALM enables organizations to validate or refine plans for minimizing the internal turbulence that often arises when implementing dramatic business decisions such as executing mergers, starting new lines of business, or adopting new technology platforms. However, CALM can be adapted to test drive a much broader set of operational decisions.


The CALM test drive solution is based on a design pattern for modeling, simulating, and analyzing decisions about enabling change. This pattern is broadly applicable to managing performance in key business functions such as security, data quality, and procurement. The term Organizational Performance Management (OPM) refers to how businesses apply resources to these vital operations. OPM methods generally follow four steps (a minimal code sketch follows the list):


1. Measure an organization’s performance against relevant metrics.

2. Diagnose shortcomings and set new performance targets.

3. Develop plans to achieve those goals.

4. Execute those improvement plans.
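
To make the cycle concrete, here is the promised sketch of the four activities in Python. All names here (the metrics, scores, and helper functions) are invented placeholders rather than part of any OPM standard:

```python
def measure(state, metrics):
    """1. Score each metric for the current organizational state (1-5)."""
    return {m: state.get(m, 1) for m in metrics}

def diagnose(scores, target=4):
    """2. Flag shortcomings: metrics scoring below a target level."""
    return {m: target - s for m, s in scores.items() if s < target}

def develop_plan(gaps):
    """3. Order remediation work, largest maturity gap first."""
    return sorted(gaps, key=gaps.get, reverse=True)

def execute(plan):
    """4. Roll out the improvement plan (here, just report it)."""
    for step in plan:
        print(f"Improve capability: {step}")

# Toy data: three metrics and the organization's current ratings.
metrics = ["data quality", "metadata", "retention"]
state = {"data quality": 2, "metadata": 1, "retention": 3}
execute(develop_plan(diagnose(measure(state, metrics))))
```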


This is precisely the pattern underlying the CALM test drive solution. The similarity should not be surprising: change management can be viewed as a special kind of OPM, namely managing performance during periods when organizations face disruptive change. As with change management, the key to other types of OPM is an effective system of measurement. Without metrics, it is difficult, at best, to assess an organization’s current state and define unambiguous goals, much less to design improvement plans to achieve those goals, test them, and monitor progress.


Fortunately, a well-established approach to measuring performance, the capability maturity model (CMM), supports the first two OPM activities. A CMM defines a set of metrics for measuring organizational competency or maturity in terms of a set of recognized best-practice processes, systems, and skills. Metrics are organized into categories and quantified on a performance scale, much as CALM defines a set of readiness metrics for its dimensions of change. Rating criteria allow organizations to benchmark their performance against these maturity metrics.


CMMs were originally developed by the Carnegie Mellon University Software Engineering Institute to improve the management of software development. CMMs have since been developed for other process domains, including product acquisition, data governance, and cybersecurity.


Corresponding to CALM’s dimensions of change, a CMM for OPM typically defines a set of categories that group core functions or capabilities. Each capability identifies a best practice, for example, to ensure some aspect of data quality or data security. Typically, a best practice involves one or more of the following: documented procedures or a process (e.g., for verifying data values or integrity); databases and application software systems; staff with suitable experience, training, and certification; and designated management and executive oversight.
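
To make this structure concrete, here is one way the category/capability hierarchy might be represented in Python. The field names and the toy data-governance entries are invented for illustration, not drawn from any published CMM:

```python
from dataclasses import dataclass, field

@dataclass
class Capability:
    """One best practice: the elements that evidence maturity."""
    name: str
    procedures: list = field(default_factory=list)  # documented processes
    systems: list = field(default_factory=list)     # supporting software
    staffing: list = field(default_factory=list)    # skills/certifications
    oversight: str = ""                             # accountable role
    maturity: int = 1                               # rated 1 (Initial) to 5

@dataclass
class Category:
    """A CMM category grouping related capabilities."""
    name: str
    capabilities: list

# Toy entries for a data-governance CMM category.
lifecycle = Category("Information Lifecycle Management", [
    Capability("Data retention", procedures=["retention schedule"],
               oversight="Chief Data Officer", maturity=2),
    Capability("Archival and disposal", systems=["archive platform"],
               maturity=1),
])
print(lifecycle.name, [c.name for c in lifecycle.capabilities])
```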


A CMM also provides a rating system, driven by criteria or a questionnaire, that allows an organization to assess its current state or level. For example, Carnegie Mellon scores its process maturity metrics on a five-point scale (a code sketch of one such rating scheme follows the list):


1. Initial: lowest or starting level; process is unpredictable, poorly controlled, and reactive.

2. Managed: process is characterized for projects and is often reactive.

3. Defined: process is characterized for the organization and is proactive.

4. Quantitatively managed: process is quantitatively measured and controlled.

5. Optimizing: focus on continuous process improvement.
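
Here is the promised sketch of a rating scheme built on this scale. The yes/no criteria and the scoring rule (the highest level whose criteria are all satisfied) are assumptions for illustration, not requirements of CMMI:

```python
from enum import IntEnum

class Maturity(IntEnum):
    INITIAL = 1
    MANAGED = 2
    DEFINED = 3
    QUANTITATIVELY_MANAGED = 4
    OPTIMIZING = 5

# Hypothetical yes/no criteria per level for a single capability.
criteria = {
    Maturity.MANAGED: ["process planned per project", "results tracked"],
    Maturity.DEFINED: ["org-wide standard process", "proactive tailoring"],
    Maturity.QUANTITATIVELY_MANAGED: ["quantitative control limits set"],
    Maturity.OPTIMIZING: ["continuous improvement program in place"],
}

def rate(answers):
    """Return the highest level whose criteria are all answered True."""
    level = Maturity.INITIAL
    for lvl in sorted(criteria):            # ascend levels 2 through 5
        if all(answers.get(q, False) for q in criteria[lvl]):
            level = lvl
        else:
            break                           # levels must be earned in order
    return level

answers = {"process planned per project": True, "results tracked": True,
           "org-wide standard process": False}
print(rate(answers).name)                   # -> MANAGED
```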


OPM frameworks such as CMMs are diagnostic tools that are inherently static. They are valuable because they provide reference standards for gap analysis: businesses can assess their current status and identify the gaps between that status and aspirational best-practice targets.
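
As a concrete illustration, a gap analysis over CMM ratings can be as simple as comparing two score tables. The metrics and numbers below are invented for illustration:

```python
# Current vs. target maturity ratings for a handful of invented metrics.
current = {"data retention": 2, "archival": 1, "metadata quality": 3}
target  = {"data retention": 4, "archival": 3, "metadata quality": 4}

# The gap analysis itself: where do we fall short, and by how much?
gaps = {m: target[m] - current[m] for m in current if target[m] > current[m]}

for metric, gap in sorted(gaps.items(), key=lambda kv: -kv[1]):
    print(f"{metric}: {current[metric]} -> {target[metric]} (gap {gap})")
```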


However, CMMs offer no explicit support for the key back-end activities that drive performance remediation: (1) formulating plans to improve performance levels; (2) testing improvement plans prior to roll-out to validate or refine them; and (3) monitoring execution results (and changing conditions) to confirm progress and make mid-course adjustments to plans as required. In sum, they fall short as actionable process improvement methodologies.


The CALM solution extends and “animates” a static diagnostic framework developed by John Kotter for assessing change strategies. CMMs can be extended and animated similarly to support the full OPM lifecycle. Such dynamic CMMs follow the same Model-Simulate-Analyze process used by CALM to support performance improvement strategies (a toy simulation sketch follows the list):


1. Assess current business state based on the criteria defined by the OPM framework.

2. Identify a desired goal state for the business in terms of a set of target values for OPM units.

3. Identify relevant environmental influences and estimate their likely impact on OPM metrics.

4. Define a performance improvement plan. Such plans specify activities and tasks to implement or enhance OPM units. Plan elements are defined in terms of estimated schedules, costs, and expected contributions toward achieving target OPM unit values (and improving performance ratings).

5. Simulate the improvement plan’s projected outcome, and compare it against alternative plans and against scenarios involving different assumptions about future conditions and plan impacts.

6. Select a performance improvement strategy and re-apply the process periodically during the strategy’s execution to detect and correct emerging problems.
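
How CALM itself computes projections is beyond the scope of this post, but the flavor of steps 4 and 5 can be conveyed with a toy projection. Everything below (the task durations, maturity gains, impact factor, and noise model) is an invented simplification, not CALM’s actual logic:

```python
import random

# Each plan element: (metric, start week, duration in weeks, expected gain).
plan = [
    ("data retention", 0, 8, 1.5),
    ("archival", 4, 12, 2.0),
]

def simulate(plan, baseline, horizon_weeks, impact_factor=1.0, seed=0):
    """Project maturity scores at the horizon under one scenario."""
    rng = random.Random(seed)
    scores = dict(baseline)
    for metric, start, duration, gain in plan:
        # Fraction of the task completed by the horizon.
        done = min(1.0, max(0.0, (horizon_weeks - start) / duration))
        noise = rng.uniform(0.8, 1.2)       # crude execution uncertainty
        scores[metric] = min(5.0, scores[metric]
                             + gain * done * impact_factor * noise)
    return scores

# Compare an expected scenario against a pessimistic one.
baseline = {"data retention": 2.0, "archival": 1.0}
for scenario, factor in [("expected", 1.0), ("pessimistic", 0.6)]:
    print(scenario, simulate(plan, baseline, horizon_weeks=24,
                             impact_factor=factor))
```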


Figures 1 and 2 depict results from a test drive of an OPM improvement plan for a dynamic CMM for data governance. Figure 1 displays a spider (or radar) plot of the metrics making up a performance dimension called Information Lifecycle Management. The inner (green) pentagon depicts the initial maturity scores, while the outer (red) pentagon shows the projected maturity improvements six months later. Figure 2 summarizes the same results in tabular format. (A minimal plotting sketch follows the figure captions.)



Figure 1. Spider Chart of Test Drive Performance for Data Governance Maturity Plan

(Info Lifecycle Management dimension, green = maturity levels at week 1, red at week 24)


Figure 2. Before/After Maturity Levels for Info Lifecycle Mgmt Maturity Improvement Plan
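
For readers who want to reproduce this style of chart, here is a minimal matplotlib sketch. The metric names and scores are invented placeholders, not the assessment data behind Figures 1 and 2:

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical maturity scores for five metrics, before and after the plan.
metrics = ["Retention", "Archival", "Classification", "Disposal", "Lineage"]
week1  = [2, 1, 2, 1, 3]
week24 = [3, 3, 3, 2, 4]

# One angle per metric, with the first angle repeated to close the polygon.
angles = np.linspace(0, 2 * np.pi, len(metrics), endpoint=False).tolist()
angles += angles[:1]

fig, ax = plt.subplots(subplot_kw={"projection": "polar"})
for scores, color, label in [(week1, "green", "Week 1"),
                             (week24, "red", "Week 24")]:
    values = scores + scores[:1]
    ax.plot(angles, values, color=color, label=label)
    ax.fill(angles, values, color=color, alpha=0.1)

ax.set_xticks(angles[:-1])
ax.set_xticklabels(metrics)
ax.set_ylim(0, 5)        # the 1-5 maturity scale
ax.legend()
plt.show()
```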


As Figures 1 and 2 make clear, decision test drives offer robust process improvement methods for OPM that are dynamic and fully actionable.


Suggested Reading

Christopher Alexander introduced the notion of design patterns for architecture in his book A Pattern Language: Towns, Buildings, Construction. His concept was later widely adopted in software engineering, most notably in Design Patterns: Elements of Reusable Object-Oriented Software by Erich Gamma et al. For information on CMMs, see https://en.wikipedia.org/wiki/Capability_Maturity_Model and www.cmmiinstitute.com. The National Institute of Standards and Technology has developed a CMM-like model for cybersecurity, while the Enterprise Data Management Council (EDM Council) developed a CMM-style benchmarking model called the Data Management Capability Assessment Model (DCAM).
