
Failures that Teach: Psychological Safety and Organizational Learning

Image: F-35 pilots

Some time ago, I lived through an experience that illustrates, in a very concrete way, how small failures in complex systems can reveal profound lessons about leadership, organizational culture, and institutional learning.

I was accompanying my father during a hospital stay. One night, while he was still in the intensive care unit, a nurse entered to administer several medications. As I observed the procedure carefully, I noticed that he had set aside a small plastic bottle labeled OtoCinalar. I am neither a physician nor a nurse, but I was familiar with the medication: it is an otologic product, typically prescribed to treat inflammation or infections of the external ear. Yet at that moment, the nurse opened the bottle and, apparently without realizing it, moved toward my father’s eyes to administer it. I had to intervene immediately. The nurse stopped and realized the mistake before it occurred. The moment, however, raises a series of inevitable questions:

  • What consequences might that action have produced?

  • What if the medication had been something else—perhaps injectable or with greater risk potential?

  • What if I had not been present that night?

  • What if another family member had been there, someone unfamiliar with the medication’s intended use?

These questions matter because when a failure occurs within a systemic process, the goal should not be to find someone to blame, but to learn from the error. The essential task is to identify its causes and eliminate them permanently. Searching for a culprit is always the easiest path. Once someone is blamed, the questions stop. And when the questions stop, learning stops as well.


Failures in Complex Systems

That night I witnessed a failure in a hospital environment: a system that is highly structured and governed by protocols. Some failures are benign; others can be catastrophic. Even systems widely considered models of operational discipline remain vulnerable to error. Commercial aviation provides a clear example. Every aircraft departure is preceded by a rigorous checklist performed jointly by the pilot and copilot. Yet even in such disciplined environments, failures can occur when procedures are followed mechanically rather than thoughtfully. A well-known example is Air Florida Flight 90, which departed from Ronald Reagan Washington National Airport bound for Fort Lauderdale. It was January 1982.

Before departure, the standard checklist was completed:

Copilot: Pitot heat system.

Pilot: On.

Copilot: Anti-ice.

Pilot: Off.

Copilot: APU (auxiliary power unit).

Pilot: Operating.

Copilot: Power levers.

Pilot: Idle.

At first glance, the exchange appears routine. Yet on that particular day, winter conditions were severe, with ice accumulating on the aircraft. Accustomed to operating in warmer climates, the crew automatically maintained their habitual configuration—with anti-ice systems turned off. The checklist was performed. But reflection did not occur. Routine had replaced attention. The aircraft departed with ice on its wings and compromised sensors. Shortly thereafter it crashed into the Potomac River. Seventy-eight people lost their lives. The failure was not due to technical ignorance. It resulted from something far more common—and far more dangerous: the automation of routine.


The Paradox of Good News

Now consider the everyday organizational environment. During a typical workweek, what do you hear more often?

  1. Good news: progress, alignment, “everything is fine.”

  2. Bad news: problems, disagreements, requests for help.

At first glance, the first scenario seems to indicate a healthy and efficient team. The atmosphere is lighter and morale appears higher. Paradoxically, however, this can be a warning sign. In any complex work environment—characterized by uncertainty, pressure, and interdependence—it is statistically improbable that no problems, disagreements, or operational difficulties arise. When failures stop being mentioned, they rarely stop existing. They simply become invisible.


Organizational Silence

Why do people avoid reporting failures?

The answer is simple—and deeply human. No one wishes to experience embarrassment, exposure, or judgment. Often there is fear of retaliation, damage to one’s career, or loss of professional reputation. Under these circumstances, employees frequently choose an individual solution: they solve the problem locally without informing the organization. On the surface, the problem disappears. But only on the surface. The structural causes remain. The system does not learn. The error may reappear—sometimes on a larger scale. Those who solve a problem locally rarely possess a complete systemic view, which means that local solutions may conceal deeper organizational vulnerabilities.

In organizations that search for culprits, the sequence typically follows a predictable script:

1. A failure occurs.

2. A responsible party is identified.

3. A sanction is applied.

4. The case is closed.

The real consequence, however, is that fear takes root. And fear is one of the most powerful drivers of silence. When failures are no longer reported, organizations lose their most valuable source of learning.


The Culture of Psychological Safety

In organizations that pursue operational excellence and genuine cohesion—what we might call a Culture of Extreme Teamness—failures are treated differently. Every employee, at every level of the hierarchy, must possess psychological safety in order to report problems.

Psychological safety means being able to say:

  • “There is an error here.”

  • “I need help.”

  • “This procedure may be wrong.”

And to say so without fear of punishment, exposure, or retaliation. This does not mean tolerating negligence or irresponsibility. Rather, it recognizes a fundamental reality: in complex systems, collective learning depends on individual transparency. Organizations that learn are not those that eliminate failures—an impossible task—but those that make failures visible quickly and learn from them. Such a culture does not emerge from manuals or institutional speeches. It is built daily, in the relationship between leaders and their teams. When mutual trust exists, people speak. When people speak, systems learn. And when systems learn, organizations become safer, wiser, and more resilient. Ultimately, the true strength of a team lies not in the absence of errors, but in the magnanimous capacity to confront them together and grow from them.


Conceptual reference: Amy C. Edmondson, Right Kind of Wrong: The Science of Failing Well.



by Asfene G. Macciantelli

The Author of EXTREME TEAMNESS — The Culture of Magnanimous Cohesion