
How Bias Shows Up in Teams: Seeing the Patterns Before They Catch Us

In collaboration with: Heather Tuffin, Kirsty Cohen, Brian Halse & Kaleb Lachenicht



The Problem with Being Human

“I’m the problem, it’s me.”

We laughed as we all identified with that statement. It was the kind of tired, knowing laughter that comes from professionals who have lived enough chaotic shifts to recognize themselves in the joke. “Past-me” keeps making promises that “present-me” can’t keep. Optimism is a kind of bias too, a tiny everyday act of self-deception that gets us through the week.


That’s the thing about bias: it isn’t an intellectual flaw reserved for careless people; it’s the scaffolding that lets us move through a complex world without burning out.

System 1 is that fast, intuitive mode of thinking; it keeps us alive and mostly functional. It’s only when it runs the show in the wrong setting that things unravel.

Bias isn’t the enemy. Being completely unaware of our bias is.


What We Mean by “Bias”

In patient safety and human factors, we often use the term cognitive bias, but the more useful phrase is cognitive disposition to respond. It reminds us that bias is simply a tendency, a default direction of thought. There’s an emotional twin, too: affective bias, where feelings such as frustration, fear and fondness shape what we notice or ignore.

System 1 thinking gives us shortcuts: pattern recognition, automatic judgments, gut feelings. System 2 is the slower, analytical process that checks the work. The art of safe practice lies in knowing when to toggle between the two — and when fatigue, pressure, or emotion have quietly locked us into the wrong one.


The Biases That Sneak Into Our Work

1. Anchoring and Framing: “Geography is Destiny”

Where a patient lands often determines how we see them.

A case in a resuscitation bay gets urgent attention; the same presentation in a gynaecology cubicle might wait an hour. The first piece of information (where, how, or by whom the story is framed) becomes the anchor around which every decision orbits. How we “label” patients to other healthcare workers does the same framing work.

Awareness comes from asking, “What would I think if I met this patient somewhere else?”


2. Search Satisficing – The Mosquito Story

Once we find a cause, we stop looking. Heather calls it the “mosquito rule”: kill one, turn off the light, and the next one bites you. In clinical reasoning, that’s stopping at the first abnormality: the fractured wrist while missing the carpal dislocation, the aspirin overdose while overlooking the paracetamol. We find the first problem and assume it’s the only one.

It feels efficient. It’s often premature closure.


3. Psych-Out Bias

When someone carries a psychiatric label, their new symptoms too easily get folded into that story. In one example, a patient presented with loss of sensation in her arm; the GP, glancing at her mental-health history, suggested psychiatric admission before even checking for a neurological cause. We all do this in small moments, every single day.

Patients with mental illness often have more physical comorbidities, not fewer. Yet our cognitive framing tells us otherwise.


4. Fundamental Attribution Bias – The Traffic Lesson

When someone cuts us off in traffic, they’re a reckless idiot. When we cut someone off, we’re just tired, late, distracted.


The same bias creeps into clinical teams: we interpret others’ mistakes as moral failures but excuse our own as circumstantial.


Recognising this means pausing before judgment and asking, “What story might explain this behaviour if I assume they’re doing their best?”


5. Halo and Horn Effects

Once we decide someone is brilliant, we unconsciously justify whatever they do. The trusted senior’s questionable call is “probably fine.”

The reverse happens too: once someone irritates us, even their good decisions feel wrong.


Trust is essential, but blind trust is dangerous. True respect means double-checking each other anyway.


6. The Superman Bias

Emergency clinicians are notorious for this one: the belief that “I can do everything at once.”


Competence and overconfidence share a border that only humility can patrol. The safest leaders are the ones who know exactly how much they don’t know.


When Groups Get It Wrong

Humans are wired for belonging. That wiring saves us, and sometimes blinds us.

In the 1950s, psychologist Solomon Asch showed that people would knowingly give a wrong answer to match the group about a third of the time. Later, the Milgram experiments revealed how far ordinary people would go under perceived authority. We like to think we’d be different. We probably wouldn’t.


I remember a story passed down through my family for years, of my great-grandmother, who lived in Nazi-occupied Europe. She was deeply opposed to the regime, yet found herself at a rally, arm raised in salute with the crowd. In the story told by my own mother, it was clear that my great-grandmother was deeply uncomfortable with how easily she had given up her identity when the energy and expectation of the crowd were so powerful.


That’s how easy it is. Group emotion is contagious, for good or bad. In medicine, we see it when a resus room rallies behind a single diagnosis, or when an enthusiastic leader drives consensus without realising it’s conformity.


The System 1 / System 2 Dance

“System 1 keeps us alive; System 2 keeps us right.”

We toggle between them all day long. The challenge is noticing when we’ve stopped toggling, and realising it before we are too far down a flawed decision pathway to get back onto the right track without harming the patient.

Under stress, hunger, fatigue, or time pressure, System 2 collapses, and we fall back on the quick patterns of System 1. We go back to what we know, what is comfortable and what feels like it makes sense - even when the decisions are not right. That’s when cognitive forcing strategies become essential: deliberate pauses, structured checklists, prompts that slow thinking down.


Here are some examples of cognitive forcing strategies that we could use in our real lives:

🟥 Red / Blue Team

Deliberately assign someone to argue against the plan. One side builds the case; the other tests it. When challenge is expected, curiosity replaces defensiveness.

🔹 Rule of Four

Before committing, name four possible explanations or options, even the unlikely ones. It breaks the “first-answer wins” habit and surfaces what we almost missed. This also creates the expectation of challenge in each diagnosis, meaning we will hunt a little more and look a little deeper.

⏸️ Pause Button

Anyone, at any time, can call a two-minute stop. Everything freezes except lifesaving actions. The team asks, “What’s changed? What’s our priority?”

Thirty seconds of clarity can prevent thirty minutes of chaos.

📋 Checklists & Cognitive Offloading

External memory saves mental bandwidth. Use quick read-do or challenge-response checklists so the system carries the routine, and the humans can think about the complex.

None of these tools are fancy. They’re simple, repeatable acts of humility that slow the rush to be right and give space for better awareness — the real currency of patient safety and team performance.


Error, Harm, and the Myth of Perfection

We spend so much time trying to eliminate error that we forget: error is inevitable; harm is not.

The aim isn’t “no error.” It’s no preventable harm.

That mindset shift is the foundation of a just culture. It replaces blame with curiosity: What conditions allowed this decision to make sense at the time? What caught it? What could have caught it sooner?

Teams thrive when they expect error and build systems to catch it — not when they pretend it doesn’t exist.


The People Problem — and the People Solution

Group biases show up in quieter ways too: the bystander effect, where everyone assumes someone else will act; the authority gradient, where junior voices fall silent; groupthink, where harmony replaces honesty.


The antidote in these situations isn’t procedure; it’s permission. Permission to question, to pause, to say “I might be wrong” or “What else could this be?” Sometimes it’s about creating permission for those in the team to stand up and counter the accepted direction, plan or decision.


The safest teams are not the ones with the most checklists. They’re the ones that have normalised curiosity.


Awareness Over Perfection

Growing isn’t about stopping bias. It’s about shortening the time between the bias and noticing it.


On reflection, I am moving closer to understanding myself in these high-risk, challenging spaces. I realise that maybe we won’t ever be better, maybe there is no moving closer to perfect; maybe there is only a better awareness that we will never get there, and that is enough to make us safer practitioners for our patients.

In the end, human factors is less about erasing the human and more about understanding the patterns that make us so wonderfully, dangerously human.

We are, after all, the same species that forgets lunch, mislabels results, builds helicopters, and saves lives, sometimes in the same hour.


Take-Home Reflections

  • Bias is inevitable; blindness is optional.

  • System 1 helps us cope; System 2 helps us correct.

  • Awareness grows in teams that speak up, check each other, and place the patient over ego.

  • The goal is not “no error”; it’s no harm (or at the very least, as little harm as possible).

  • The work of safety is the work of staying curious.

 

This summary comes from a conversation on patient safety and human factors. We enjoyed the conversation and hope you have enjoyed the read!

 


