Work in this area by Karl Weick and Kathleen Sutcliffe [2] suggests that five key elements contribute to what they describe as a state of ‘mindfulness’:
1. Preoccupation with failure
2. Reluctance to simplify interpretations
3. Sensitivity to operations
4. Commitment to resilience
5. Deference to expertise
At first, many of these processes appear to be self-defeating on multiple levels. But as Weick explains why each is necessary if a high reliability organization is to be successful, their validity becomes increasingly apparent.
Preoccupation with failure
HROs, like most organizations, celebrate their successes, but Weick [3] also notes that “a chronic worry in HROs is that analytic error is embedded in ongoing activities and that unexpected failure modes and limitations of foresight may amplify those analytic errors.”
Reluctance to simplify interpretations
Most organizations are happy to handle complex issues by simplifying and categorizing them, thus ignoring certain aspects. HROs, however, take nothing for granted and support cultures that attempt to suppress simplification, because simplification limits their ability to envision all possible undesirable effects as well as the precautions necessary to avoid them. HROs pay attention to detail and actively seek to know what they don’t know. They endeavor to uncover those things that might disconfirm their intuitions, however unpleasant, uncertain or disputed. Skepticism is also deemed necessary to counteract the complacency that many typical organizational management systems foster.
Sensitivity to operations
Weick describes sensitivity to operations as pointing to “an ongoing concern with the unexpected. Unexpected events usually originate in ‘latent failures’ which are loopholes in the system’s defenses, barriers and safeguards whose potential existed for some time prior to the onset of the accident sequence, though usually without any obvious bad effect.” [4]
Management focus at all levels on normal operations offers opportunities to learn about deficiencies that could signal the development of undesirable or unexpected events before they become an incident. HROs recognize each potential near-miss or ‘out of course’ event as offering a ‘window on the health of the system’ – if the organization is sensitive to its own operations.
Commitment to resilience
HROs develop capabilities to detect, contain, and bounce back from those inevitable errors that are part of an indeterminate world. The hallmark of an HRO is not that it does not experience incidents but that those incidents do not disable it. Resilience involves improvising workarounds that keep the system functioning, and keeping errors small in the first place.
Deference to expertise
HROs put a premium on experts: personnel with deep experience, skills of recombination, and training. They cultivate diversity, not just because it helps them notice more in complex environments, but also because rigid hierarchies have their own special vulnerability to error. As highlighted by the work of James Reason and the Human Factors Analysis and Classification System (HFACS), errors at higher levels tend to pick up and combine with errors at lower levels, exposing an organization to further escalation.
HROs consciously invoke the fundamental principle of risk management – that ‘risk should be managed at the point at which it occurs’. This is where the expertise and experience needed to make the required decisions quickly and correctly will be found, regardless of rank or title.
Unfortunately, most organizations do not work at this level, preferring to manage risk through the introduction of standard operating procedures, policies and work instructions. While these undoubtedly have their place, and can help people to make quick and consistent decisions, a significant body of research also indicates that the blanket application of these controls can reduce individuals’ ‘mindfulness’ and personal responsibility, thereby contributing indirectly to increased operating risk.
Other lessons from HROs
Other lessons from HROs include strong support and reward for the reporting of errors, based on the recognition that the value of remaining fully informed and aware far outweighs whatever satisfaction might be gained from identifying and punishing an individual.
The Icarus Paradox
Many experiments have shown that people who succeed on tasks are less able to change their approaches even after circumstances change (the ‘hammer and nail’ syndrome). Starbuck and Milliken, in their analysis of the Challenger disaster, wrote: “Success breeds confidence and fantasy. When an organization succeeds, its managers usually attribute success to themselves or at least to their organization, rather than to luck. The organization’s members grow more confident of their own abilities, of their manager’s skills, and of their organization’s existing programs and procedures. They trust the procedures to keep them apprised of developing problems, in the belief that these procedures focus on the most important events and ignore the least significant ones.” [5]
This level of complacency is a breeding ground for inadequate or ineffective organizational risk management and needs to be fully considered when reviewing the internal context and the risk management context.
==============================
[1] Rochlin, Gene (1996), “Defining ‘High Reliability’ Organizations in Practice: A Taxonomic Prologue”, p. 15 in Roberts, Karlene (ed.), New Challenges to Understanding Organizations, Macmillan Publishing Company, New York, USA
[2], [3], [4] Weick, Karl & Sutcliffe, Kathleen (2001), Managing the Unexpected: Assuring High Performance in an Age of Complexity, Jossey-Bass, San Francisco, USA
[5] Starbuck, W. H. & Milliken, F. J. (1988), “Challenger: Fine-tuning the Odds Until Something Breaks”, Journal of Management Studies, Vol. 25, pp. 319-340