
Managing Variety: Part 3 of Harnessing Variability for Patient Safety

A series on reliability thinking in patient safety

[Image: landscape orientation shot of orange flowers]

Nature has introduced great variety into the landscape, but man has displayed a passion for simplifying it.

Rachel Carson, Silent Spring, 1962

Introduction

Recent issues of The Human Stream have weighed up the pros and cons of reliability-oriented thinking within patient safety work. We also planted some flags along the way to caution against the potential for deleterious effects from the over-application of these tools.

In the final instalment of this three-part mini-series on ‘harnessing variability’, we introduce the concept of variety and contend that having sufficient variety within systems is critical to responding effectively to emerging issues and to maintaining problem-free operations.

We will make the case that the over-application of linear variation-reduction methods leaves healthcare systems brittle and more prone to failure - much in the same way that the over-application of industrial production methods in farming and fishing led to the degradation of ecological systems (embodied in the above quote by environmentalist and author Rachel Carson).

Let’s look at why variety matters and what we might do differently in improvement practice to avoid losing variety that might be critical to success.

Cybernetics & Ashby’s Law of ‘Requisite Variety’

We all understand what the concept of ‘variety’ entails in a general sense, but much of the formal scientific work on the topic comes from the field of cybernetics.

While the term ‘cybernetics’ might conjure up images of villainous machine men^ to the uninitiated, it is an influential cross-disciplinary field concerned with the study of communication (with an emphasis on feedback loops and control).

^OK, maybe that is just me. But in my defence, I did grow up watching the original Doctor Who series, and the Cybermen episodes always gave 7-year-old me the worst nightmares!

Despite its relative obscurity today, cybernetics at its peak (in the 1950s and 60s) heavily shaped an array of disciplines, from computation, biology, ecology, sociology, economics, complexity theory, control theory, cognitive science and cognitive systems engineering through to human factors engineering and architecture.

Among its various important contributions, cybernetics offered a single, clear vocabulary and set of concepts for describing behaviour and control across a variety of systems (living, mechanical and social). One such idea was the law of requisite variety.1

Only variety can absorb variety.

Stafford Beer, The Heart of Enterprise (1979)

The Law of Requisite Variety (LORV) was first introduced by Ross Ashby (1956)1 but was restated slightly differently by Stafford Beer2 some two decades later (as quoted above). The meaning of the statement can seem a little opaque at first glance, but it is rich with implications.

From a cybernetic standpoint, variety is understood as a measure of complexity in a system, determined by the number of distinct states the system can be in or the number of distinct actions it can take (in other words, the range of behaviour it can display).3
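To make that definition a little more tangible, here is a minimal sketch in code. The devices and their settings are invented purely for illustration; the only point is that variety, in this counting sense, is simply the number of distinct states or actions available.

```python
# Toy illustration only: variety as a simple count of distinct states or actions.
# The devices and their settings below are invented, not taken from any real product.

def variety(states):
    """Variety, in the crudest counting sense: how many distinct states or actions."""
    return len(set(states))

pump_rate_settings = ["off", "1 mL/hr", "5 mL/hr", "10 mL/hr", "50 mL/hr"]
thermostat_states = ["heating", "idle", "cooling"]

print(variety(pump_rate_settings))  # 5 - five distinct behaviours available
print(variety(thermostat_states))   # 3
```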

The role of variety in reliable clinical performance and safety

Ashby’s law argues that for a system to succeed, the amount of complexity (or variety) within its control system must equal or exceed the amount of complexity (or variety) of the actions the system needs to produce. By extension, the law also implies that the amount of complexity within a system (a team or an organisation, for example) must match or exceed the complexity of the demands it experiences from its environment.

This is expressed visually in the diagram below. We can say that V1 ≥ V2 ≥ V3 (where ‘V’ represents variety) if the system (in blue) is to succeed in its environment.

S Chari 2024
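Expressed as a rough sketch in code, the same comparison looks something like the snippet below. The sets of demands and responses are entirely hypothetical, chosen only to show that the law compares counts of distinct demands against counts of distinct available responses.

```python
# Hypothetical sets, purely for illustration - not drawn from any real clinical system.
environment_demands = {"rate too low", "rate too high", "line occluded", "air in line"}
system_responses    = {"speed up", "slow down", "pause", "raise alarm"}
control_actions     = {"speed up", "slow down", "pause", "raise alarm", "recalibrate"}

def has_requisite_variety(responder, demands):
    # Ashby's condition in its crudest counting form: the responding part needs
    # at least as many distinct options as there are distinct situations to absorb.
    return len(responder) >= len(demands)

print(has_requisite_variety(system_responses, environment_demands))  # True: 4 >= 4
print(has_requisite_variety(control_actions, system_responses))      # True: 5 >= 4
```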

The implications of this seemingly simple idea are profound. Consider the following example. If I asked you how much variety would be required within a software control system for an infusion pump*, how would you answer?

*If this medical device is unfamiliar to you, consider a bowser at a gas station, which operates on a similar principle.

If you concluded that the control system would need sufficient internal variety to manage the breadth of tasks (external variety) that an infusion pump needs to achieve, then you would be on the right track. Logically, if the pump needs to deliver 60 drops of a medication every minute, then the control system needs to be able to manage a state change every second.

Now if the control system was unable to dispense a drop every second for any reason - say it lacked the variety to do so and could only deliver a drop every five seconds - then this would lead to instability in the outcome of interest (constant overdosing and/or overcorrection).

When such oscillations are within acceptable tolerances (the quality limits we define), this is acceptable variation - in fact, this type of oscillating behaviour is typically how simple control circuits manage a variety of things, through the implementation of sensors and feedback mechanisms (a thermostat, for instance).

However, Ashby’s Law would suggest that if we encounter unacceptable variation in the outcomes of a system, we cannot manage it by merely tightening the process.

Instead, we need a different set of enhancements: better sensing of the current state, better information flows, and more variety (complexity) rather than less. You increase complexity within the system to deal with variability in performance!

This might seem somewhat obvious when pursuing a more dynamic line of thought, but it can be quite counter-intuitive when thinking purely in reliability-oriented terms.
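A rough numerical sketch can make this concrete. The model below is deliberately crude - the numbers are made up and it is not a model of any real drug, pump or patient - but it shows how a controller that can only act every few seconds, in coarse fixed steps, produces exactly the sort of overshoot and overcorrection described above, while a controller with a finer repertoire of adjustments, acting every second, holds the level close to the target.

```python
# Rough numerical sketch of the infusion example. All numbers and the simple
# 'drug level' model are invented for illustration only.

TARGET = 10.0      # drug level we want to hold
CLEARANCE = 0.08   # fraction of the drug the body clears each second
GAIN = 0.1         # how strongly an ideally calibrated controller would respond to error

def simulate(allowed_adjustments, act_every_s, seconds=60):
    """Run a toy drip for `seconds` steps and return the drug level over time.

    `allowed_adjustments` is the controller's repertoire of rate changes (its variety);
    `act_every_s` is how often it gets to act at all.
    """
    level, rate = 0.0, 1.0
    history = []
    for t in range(seconds):
        level = (level + rate) * (1 - CLEARANCE)      # infuse, then clear
        if t % act_every_s == 0:                      # the controller can only act this often
            desired = GAIN * (TARGET - level)         # the 'ideal' calibrated response
            # ...but it can only pick the nearest action it actually has available
            adjustment = min(allowed_adjustments, key=lambda a: abs(a - desired))
            rate = max(0.0, rate + adjustment)
        history.append(round(level, 1))
    return history

# Low variety: only two coarse actions, and it can only act every five seconds.
coarse = simulate(allowed_adjustments=[-2.0, 2.0], act_every_s=5)

# Higher variety: a finely graded repertoire of adjustments, applied every second.
fine_grid = [round(-3 + 0.1 * i, 1) for i in range(61)]
fine = simulate(allowed_adjustments=fine_grid, act_every_s=1)

print("coarse controller, last 10 s:", coarse[-10:])  # swings well above and below the target
print("fine controller,  last 10 s:", fine[-10:])     # settles close to the target of 10
```

In this toy example, simply making the coarse controller’s corrections bigger only widens the swings; what narrows them is a richer repertoire of responses and more frequent sensing - which is precisely the point the law is making.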

The (rudimentary) animation below attempts to capture this relationship even more clearly, using the example of the infusion pump.

Watching the red outcome node for a few seconds clearly illustrates the kind of system variability that can arise when a control system is unable to maintain adequate control (due to information delays and an inability to respond in a sufficiently calibrated manner - ie a lack of variety).

In the real world, the link between the patient and the readjustment of dosage would likely involve a clinician and/or other factors that influence actual performance. Regardless, it is not hard to see how fundamental deficits in the design of this ‘archetypal’ system could underpin many error traps, inefficiencies in work practices and sources of potential harm to patients.

S Chari 2024
