Systems Thinking for Safety — Principle 3. Just Culture
People usually set out to do their best and achieve a good outcome
Adopt a mindset of openness, trust and fairness. Understand actions in context, and adopt systems language that is non-judgmental and non-blaming.
Systems do not exist in a moral vacuum. Organisations are primarily social systems. When things go wrong, people have a seemingly natural tendency to compare work-as-done against work-as-imagined and find someone to blame. In many cases, the focus of attention is an individual close to the ‘sharp end’. Investigations end up investigating the person and their performance, instead of the system and its performance. This is mirrored and reinforced by systems of justice and the media.
The performance of any part of a complex system cannot neatly be untangled from the performance of the system as a whole. This applies also to ‘human performance’, which cannot meaningfully be picked apart into decontextualised actions and events. Yet this is what we often try to do when we seek to understand particular outcomes, especially adverse events, since those are often the only events that get much attention.
‘Just culture’ has been defined as a culture in which front-line operators and others are not punished for actions, omissions or decisions taken by them that are commensurate with their experience and training, but in which gross negligence, wilful violations and destructive acts are not tolerated. This is important, because we know we can learn a lot from instances where things went wrong but there was good intent. Just culture signifies the growing recognition of the need to establish clear mutual understanding between staff, management, regulators, law enforcement and the judiciary. This helps to avoid unnecessary interference, while building trust, cooperation and understanding of the relevance of the respective activities and responsibilities.
In the context of this White Paper, this principle encourages us to consider our mindsets regarding people in complex systems. These mindsets operate at several levels – individually, as a group or team, as an organisation, as a profession, as a nation – and they affect the behaviour of people and the system as a whole. Do you see the human primarily as a hazard and source of risk, or primarily as a resource and source of flexibility and resilience? The answers may take you in different directions: seeing people primarily as hazards leads down the road of blame, which does not help us understand work.
Basic goal conflicts drive most safety-critical and time-critical work. As a result, work involves dynamic trade-offs or sacrificing decisions: safety might be sacrificed for efficiency, capacity or quality of life (noise). Reliability might be sacrificed for cost reduction. The primary demand of an organisation is very often for efficiency, until something goes wrong.
As mentioned in Principle 2, knowing the outcome and sequence of events gives an advantage that was not present at the time. What seemed like the right thing to do in a situation may seem inappropriate in hindsight. But investigation reports that use judgemental and blaming language concerning human contributions to an occurrence can draw management or prosecutor attention. Even seemingly innocuous phrases such as “committed an error”, “made a mistake” and “failed to” can be perceived or translated as carelessness, complacency, fault and so on. While we can’t easily get rid of hindsight, we can try to see things from the person’s point of view, and use systems language instead of language about individuals that is ‘counterfactual’ and judgemental (about what they could have or should have done).
For all work situations, when differences between work-as-imagined and work-as-done come to light, just culture comes into focus. How does the organisation handle such differences? Assuming goodwill and adopting a mindset of openness, trust and fairness is a prerequisite to understanding how things work, and why things work in that way. When human work is understood in context, work-as-done can be discussed more openly with less need for self-protective behaviour.
- Reflect on your mindset and assumptions. Reflect on how you think about people and systems, especially when an unwanted event occurs and work-as-done is not as you imagined. A mindset of openness, trust and fairness will help to understand how the system behaved.
- Mind your language. Ensure that interviews, discussions and reports avoid judgemental or blaming language (e.g. “You should/could have…”, “Why didn’t you…?”, “Do you think that was a good idea?”, “The controller failed to…”, “The engineer neglected to…”). Instead, use language that encourages systems thinking.
- Consider your independence and any additional competence required. Consider whether you are independent enough to be fair and impartial, and to be seen as such by others. Also consider what additional competence is needed from others to understand or assess a situation.
View from the field
Alexandru Grama, Air Traffic Controller, ROMATSA R.A., Romania
"Sometimes it seems that organisations expect perfection from their imperfect employees; imperfect performance is considered unacceptable. As a result, individuals are reluctant to come forward with their mistakes. These become obvious to everyone only when serious incidents or accidents occur, but by then it is already too late. Punishing imperfect performance does not make the organisation safer. Instead, it makes the remaining individuals less willing to improve the system. Just culture enables the transition from ‘punishing imperfect individuals’ to a ‘self-improving system’. It supports better outcomes over time using the same resources, based on the trust and willingness of individuals to report issues. Through just culture we can look at the reasons why decisions made sense at the time. It is a continuous process that allows an organisation to become safer every day by listening to its employees."
Source: Systems Thinking for Safety: Ten Principles. A White Paper. Moving towards Safety-II, EUROCONTROL, 2014.
The following Systems Thinking Learning Cards: Moving towards Safety-II can be used in workshops, to discuss the principles and interactions between them for specific systems, situations or cases.