Conditions for Safety:

Fostering psychological safety to navigate ethical dilemmas

Sarah Anne Freiesleben
Sep 6, 2021

Daily Choices

Lene, whose name has been changed for anonymity, led a large team responsible for the finance modules of an ERP implementation at a pharmaceuticals company in Denmark. Since pharma is one of the most heavily regulated industries, any development within a system involving patient-relevant data was regulated by pharmacovigilance laws and held to the tightest compliance standards. Because of this, before each release of new changes to the system, Lene needed to sign off on the system documentation. This act was carried out with a special pen, a procedure that added an air of formality reflecting the seriousness of her responsibility.

Lene appreciated her responsibility and had always prioritized ensuring that system documentation reflected reality, even when working in less regulated environments. She saw it as an exercise in ensuring one's current actions respected future needs. "Rettidig omhu" always came to her mind, recalling the story of A.P. Møller trusting his son to expand the family business that eventually grew into the world's largest shipping line. She always loved the phrase because it could not really be translated into English. It is rendered as "constant care," but that is not quite right; there is a nuance to it that is specifically Danish, a special magic that is lost in translation. She loved nuance but did not really know why. It felt connected to her deepest values.

But Lene faced a dilemma. She had just started in this role and had inherited many documents that had already been ceremoniously signed off by her predecessors, and they were not accurate. Perhaps they had been at the time of sign-off, but things had changed, and they were wrong now. Yet there was no procedure for changing the documents: once signed off, they were locked. She had organized meeting after meeting with compliance officers and senior leaders, warning that a solution would need to be found before her own sign-off came due, because she would not put her name on documents she knew were wrong. She would gladly sign off on the new sections, which her team had ensured were accurate, but not if they were bundled together with the inaccurate ones.

All this effort from Lene seemed pedantic to her colleagues. Couldn't she see there were bigger fish to fry? "This is not a priority," they implored. The implementation was not going well, people were working overtime, and here she was worrying about documentation. But then the day came. The implementation partner came into her office, a shared space that included her team of 28 dedicated colleagues. He presented her with the complete package of documentation and handed her a pen. She opened it and saw the complete bundle, with all its inaccuracies. She calmly told the program lead she could not sign. He began to argue with her, and things escalated. He began to yell that she was delaying the whole project and did not deserve to be in her role. She became upset and shaken as she continued to try to explain her position, referring to her previous efforts to improve the compliance process.

Eventually, as she could tell her emotions were getting the better of her, she rose from her desk, visibly shaken, and walked with the papers over to the compliance office. She entered the much calmer, library-like space and approached two men who were strangers to the chaos of ERP implementations. They appeared surprised by her intensity as she approached. With tears in her eyes and a shaky voice, she held the documents up to the head of compliance and said, "In the name of compliance, you are asking me to lose my integrity; what do you expect me to do?" "Don't sign," he said. "It is your job to make this choice."

Lene was shocked by the simplicity of this. She realized in that moment, that she had done what she could. She had used her voice and had fought for what she thought was right and now the choice was in fact very simple. She did not sign. And her reputation never recovered. Despite all the good she had done on the project, she was now seen as “the difficult one.” The project found a way to work around the problem without her signature and without updating the documents. But Lene did not survive this ethical dilemma. After other similar but increasingly worse incidents, she eventually made the difficult choice to leave the company with her virtue intact.

The complexity of Ethics

This story may sound simple, and solutions for it easy to outline in isolation, but choices like this happen frequently, especially in heavily regulated environments. In fact, the situation of being "stuck between a rock and a hard place" arises often when rigid, predefined rules and procedures meet real life. In this case, the rule not to change the documents became more important than the information the rule was meant to protect. Put another way, the pressure to meet project performance criteria became more important than the quality of the project itself. Ironically, the e-learning about pharmacovigilance standards was prioritized, and Lene was reminded to take it when she fell behind on the deadline; but that, too, was easy to measure: one had either completed the test or had not.

Ethics seem straightforward enough when working with binary rules, but what happens when contexts become more complex? How can companies guide and support their employees to do what is right, even when the opposite choice could also be considered right from a different rule or perspective?

A job posting for Head of Ethics was recently published by a large green energy company located in Copenhagen. The role, anchored in the compliance department, stated that the successful candidate would identify rules, roll them out, and enforce them. There is no doubt that the leadership of the company is sincere in its desire for its employees to behave ethically. But philosophy and complex adaptive systems theory indicate that setting and enforcing rules is not enough.

To foster an ethical organization, one must not only establish the rules of compliance, but also foster an environment where employees can act according to their values. People must feel psychologically safe enough to do the right thing, even when a rule designed for a different context stands in the way.

This article will look at why companies should set and enforce rules for the simple things and build a culture of openness and psychological safety to help with the complex ones. It does not outline a recipe for mindset change or a five-step plan, for, as with most human-related change ambitions, the cultural element is easy to describe but difficult to do. Rather, this article is rooted in empirical observations from the natural world, which indicate that the "end" state of complex change (that is, evolution) cannot be predicted; a more feasible ambition is therefore to acknowledge the realistic nature of the dilemmas that abound and work with purpose to influence systemic change for good. I will offer some insights on how one could embark on this journey, but each organization will have to find out for itself how it can create this space.

Rules and Values

Ethics as a discipline generally includes three somewhat overlapping and somewhat conflicting paradigms. Drastically simplified, they are as follows:

Ethics of Duty: Rule based, where there is a predefined right and wrong (Kant)

Virtue Ethics: Values based, focusing on character-building to adhere to those values contextually (Aristotle)

Utilitarian Ethics: Service based, with the greater good of the whole in mind (Bentham and Mill)

This article will zoom in on the boundary between Ethics of Duty and Virtue Ethics, as setting rules and working according to values pulls from these two very separate, sometimes conflicting, ethical paradigms.

Setting up rules for things that are known or knowable is an ordered process that can be done easily in simple areas or with applied expertise in more complicated ones. But things get trickier when organizations face complex situations with high levels of integration, varied perspectives, and multi-faceted relationships. In situations of high complexity, organizations need to rely on human judgement in addition to knowledge of the rules to ensure decisions are made with integrity. And here, both definitions of the word "integrity" are at play: "the state of being whole and undivided" and "the state of having high morals and being honest." For you cannot ensure the latter if you do not focus on the former. And being able to focus on being whole and undivided means that a person may sometimes have to grapple with paradox or dilemmas.

Fortunately, people generally have values and a natural ability to navigate complex contexts. But, as Peter Senge points out in The Fifth Discipline, “Organizations cannot build visions until people can build visions.” Likewise, an organization cannot have values unless the people in them feel comfortable living theirs. Organizations cannot treat values as aggregated “rules” that live in a context free vacuum.

The Space Between

Many would argue that, evil villains aside, most people and organizations want to behave ethically. So it can easily be passed off as "obvious" when one tries to make the point that rules and values need to be able to live together. It also sounds perfectly reasonable to simply add up rules and values as two things to consider.

But what happens when they come into conflict with each other; namely, what happens when a dilemma occurs? Should employees be expected to navigate, amid the stress of their daily work, ethical dilemmas that philosophers have not solved in centuries? Can the overlapping of paradigms sometimes produce dilemmas where there is no "right" solution, or at least not one that an individual can be expected to find?

Interesting insights into navigating these dilemmas can be found by looking into complexity science and constraint theory.

In “Managing complexity (and chaos) in times of crisis. A field guide for decision makers inspired by the Cynefin framework,” Dave Snowden describes some tradeoffs between governing and enabling constraints. In this context, governing constraints would be the definable rules and enabling constraints the more context-specific values. He writes that governing constraints “give a sense of stability but are sensitive to change,” while enabling constraints “provide guidance while allowing for distributed decision making.” In the guide’s diagrams the two do not look much different, but note how the enabling constraint gives more leeway for making a u-turn.

Organizations are often so heavily reliant on governing constraints, since they give the illusion of control, that there is no space for humans within them to apply judgement for contexts where the rules do not work.

Responsibility and Courage

Rigid and flexible constraints have a complicated relationship. According to a conversation with Snowden, “the two types of constraints cannot live together harmoniously unless boundary conditions are defined.” This means that if we expect the people in our organizations both to follow rules and to make contextual decisions, we are giving them a big responsibility without necessarily giving them the associated power. This responsibility may even require courage to confront a rigid constraint, or rule, when there is a dilemma. And that creates a psychologically unsafe double bind.

Double Bind Theory was established by Gregory Bateson, a pioneer of cybernetics, in his 1956 work “Toward a Theory of Schizophrenia,” as “a situation in which no matter what a person does, he ‘can’t win’.” Being in a double bind is one of the most psychologically straining experiences a human can have, and often the people who are not experiencing the double bind do not see or acknowledge its existence, leaving the person experiencing it highly vulnerable to gaslighting.

Since noticing dilemmas can lead to so much mental stress, it is no wonder that humans often resort to willful blindness, “a term to describe a situation in which a person seeks to avoid liability for a wrongful act by intentionally keeping themself unaware of facts that would render him or her liable or implicated” (Wikipedia). It is important to note, as Margaret Heffernan points out in her book “Willful Blindness: Why We Ignore the Obvious at Our Peril,” that willful blindness is the rule, not the exception. Only an estimated 15% of people are willing to confront difficult truths if they must be the first to speak up. Heffernan writes, “As long as it (an issue) remains invisible, it is guaranteed to remain insoluble…You cannot fix a problem that you refuse to acknowledge.”

Setting the conditions for Psychological Safety

Courage and psychological safety go hand in hand. If someone is in a double bind, they need to have courage. But they also need to feel safe enough to use their courage to bring their ideas or concerns forward to a receptive audience. For an organization to work with both rules and values, there needs to be a culture that supports openness and makes people feel safe.

Often leaders of companies decide they want a certain cultural trait, so they roll out a top-down initiative stating the culture they want to see. But wanting something is not the same thing as working to achieve something. Trust is not something that a person with power can declare into existence. To do this work, the leaders of an organization must first be open themselves to hearing the narrative of the people in the organization. Reality must be faced before trying to change it. This is not a one-time exercise or a checklist; this work is qualitative and progressive in nature.

The collective narrative of the people at all levels of an organization is vital because the day-to-day experiences people have in making decisions and interacting with others carry important signals about patterns. And patterns are the best “real time” indication of what is emerging, both destructively and creatively. Paying attention to them is important for all cultural change, but understanding the patterns in the following areas is of particular interest for creating the conditions for psychological safety around ethical dilemmas: Do people ask interesting questions? Do people speak up when they disagree? Is there conflict? What happens when there is conflict?

Stories allow us to tap into these patterns, and once the patterns are known, you know what to work to influence. Do women in the organization feel their voices are not heard? Are there power struggles between certain departments? Are people generally holding back because they have been burned before for raising their voices? With this work, leaders can decide which patterns need to change to create an open environment where people can speak up. Once the desired changes are known, one can design experiments and ways to find out whether the narrative is changing, amplifying the actions that work and dampening the ones that do not.

Conclusion

In Denmark, where Lene found herself in a psychologically unsafe situation, many organizations take pride in their awareness of mental health and have even established psychological hotlines for people experiencing stress. This can certainly be helpful in cases of acute stress. But it does not make sense to send people who are having legitimate reactions to incompatible constraints to a doctor to look for a pathology within them to “treat.” That is just a way of avoiding conflict, a kind of “mental-health-washing.” Taking time off from logical errors does not resolve the errors. If organizations wish to help their employees’ mental health, they need to adjust the conditions of the organizational ecosystem.

Organizations are becoming increasingly complex due to the increase of integration points in systems architecture and development initiatives. Data is becoming more distributed and the relationship between data sources and data uses more removed. The ethical dilemmas organizations face are becoming more severe and frequent. For the things that are known, compliance rules should be established and rolled out. But to navigate the more complex areas, there needs to be a space to allow for people to speak up without being afraid of the consequences.

Employees need to feel safe to explore hunches if they see something that goes against their personal values, even if they cannot “prove” there is a problem and even if there is a related rule. It must be understood that rules may be created for one context but not work in another. There needs to be space for these conversations if companies are to be thorough in their efforts to be truly ethical. This will not only help the company make better decisions; it will also help the people they purposely recruit for their courage and intellect thrive psychologically while building the best solutions for the future.

References

Snowden, D. and Rancati, A., Managing complexity (and chaos) in times of crisis. A field guide for decision makers inspired by the Cynefin framework, Publications Office of the European Union, Luxembourg, 2021, ISBN 978–92–76–28843–5, JRC123629.

Bateson, G., Jackson, D. D., Haley, J., and Weakland, J. (1956). “Toward a Theory of Schizophrenia.” Behavioral Science 1(4): 251–254.

Heffernan, M. (2011). Willful blindness: Why we ignore the obvious at our peril. Toronto: Doubleday Canada.
