A conversation with Tom Geraghty. Safety Culture, Organisational Strategy & Transformation @ Psychological Safety.

I think this aspect of psychological safety is probably one of the most interesting to think about. So first of all, most of us learn management from our managers. We learn what management looks like, what management is, through our own managers. And because of the way time and careers work, most of the managers that we have, certainly early in our careers, are themselves fairly inexperienced, fairly junior managers. So there’s this sort of feedback loop of inexperienced managers teaching inexperienced people how to be inexperienced managers. And so we don’t always get exposed to the really, really competent managers and leaders until later on in our careers. And that’s when we realize, “Oh no, that’s not the way we should do things.” – Tom Geraghty

In this episode of Control the Room, I had the pleasure of speaking with Tom Geraghty about his journey helping make the world of work a safer, higher performing, more inclusive and equitable place. He starts with reflections on microaggressions in the workplace. Later, Tom shares why he helps organizations embrace emergence and experimentation. We also discuss simple tips for finding the sweet spot between productivity and exploration. Listen in for thoughts on how and why embracing discomfort and disagreement leads to higher performing teams.

Show Highlights

[1:30] How Tom Got His Start 

[12:35] Examining Acquired Organizational Trauma

[22:22] A Definition Of Psychological Safety

[31:20] Finding The Psychological Safety Sweet Spot

[41:28] Modeling The Behaviors That Create Psychological Safety

Tom’s Personal Website 

Tom on LinkedIn

Tom on Twitter

Psychological Safety
https://psychsafety.co.uk

Amy Edmondson’s 1999 research on clinical teams: https://journals.sagepub.com/doi/abs/10.2307/2666999

Psychological safety, diversity and team performance: https://hbr.org/2022/03/research-to-excel-diverse-teams-need-psychological-safety

Sidney Dekker talking about DevOps and aviation safety: https://www.youtube.com/watch?v=pmZ6wtOmTZU

About the Guest

Tom is the founder of psychsafety.co.uk. His first job title was “Experimentalist”, which set the tone for the rest of his career. He then moved into technology, and many subsequent years of leadership roles have made Tom passionate about psychological safety and generative leadership.

Outside of work, Tom spends as much time as possible outdoors, and is studying for a Master’s degree in Global Health and Humanitarianism.

About Voltage Control

Voltage Control is a change agency that helps enterprises sustain innovation and teams work better together with custom-designed meetings and workshops, both in-person and virtual. Our master facilitators offer trusted guidance and custom coaching to companies who want to transform ineffective meetings, reignite stalled projects, and cut through assumptions. Based in Austin, Voltage Control designs and leads public and private workshops that range from small meetings to large conference-style gatherings.

Subscribe to Podcast

Engage Control The Room

Voltage Control on the Web
Contact Voltage Control

Full Transcript

Douglas: Welcome to the Control the Room podcast, a series devoted to the exploration of meeting culture and uncovering cures to the common meeting. Some meetings have tight control and others are loose. To control the room means achieving outcomes while striking a balance between imposing and removing structure, asserting and distributing power, leaning in and leaning out, all in the service of having a truly magical meeting. Thanks for listening.

If you’d like to join us live for a session sometime, you can join our weekly Control the Room Facilitation Lab. It’s a free event to meet fellow facilitators and explore new techniques so you can apply the things you learn in the podcast in real time with other facilitators. Sign up today at voltagecontrol.com/facilitation-lab. If you’d like to learn more about my book, Magical Meetings, you can download the Magical Meetings Quick Start Guide, a free PDF reference with some of the most important pieces of advice from the book. Download a copy today at magicalmeetings.com.

Today I’m with Tom Geraghty, founder at psychsafety.co.uk, where he is on a mission to make the world of work a safer, higher performing, more inclusive and equitable place. Welcome to the show, Tom.

Tom: Thank you so much. I’m super pleased to be here. I really appreciate it.

Douglas: Yeah, it’s so great to have you. And as usual, let’s get started with hearing a little bit about how you got your start in this work of psychological safety.

Tom: Yeah, so my background is in tech. I started maybe 20-ish years ago as a sysadmin doing operations stuff, and then ended up in a larger operations team, heading up operations for a big European motor industry organization. And I’m being intentionally vague so no one can identify who it was. My manager there was a very stereotypical finger-pointy, shouty, screamy, blamey boss, the sort of boss that has a vein pulsing over their forehead when they get angry. We had a big open plan office where everyone worked together, and he had this fancy glass cubicle in the corner. Every now and then he would come out of that glass cubicle and just rip someone apart for some minor mistake, some minor infraction, or anything, really. In fact, I remember him coming out and having a go at someone once for laughing.

And so he created this culture of fear. He managed through fear, he led through fear. And yet, out of work, he was a nice guy. He thought that leading through fear, managing through a culture of fear, was the right thing to do, that it was the way to a high performance team. But it was fairly obvious to me and many others that what was actually happening was that it was calcifying and slowing down the organization.

People were afraid to do anything, afraid to try anything. People had to plan and plan and plan for even the most basic of tasks, in case something went wrong, so they could always say, “Look, I planned it. This is all the planning I did.” And so technology got out of date. Stuff wasn’t patched and updated, because it was more interpersonally risky to do something than to do nothing. The safest thing to do was nothing.

And it really held the organization back. You could see we were getting out-competed by competitors who were innovating, or in fact not even innovating, just keeping pace with the market. I saw all this, and I didn’t know what the name for it was. I didn’t know the terms safety culture or psychological safety. But I knew from that point on that I wanted to do something different. From that point on, I was leading teams and building teams, and I ended up in CIO and CTO roles.

And I would often think: what’s the opposite of what he would do? I’m trying to do that. I’m trying to build a safe culture. Then, only a few years ago, maybe five or six, I discovered the term psychological safety. And it was a proper light bulb moment, a proper epiphany: this is the thing I’ve been trying to do. Suddenly all the doors opened, and all the knowledge and information flooded in, because I was able to hang all this stuff on a term. That’s also what sparked the creation of psychsafety.co.uk, because I wanted a place to put all the stuff I’d created and share it with the world.

Douglas: Amazing. I can totally relate to that story because, as most listeners know, I was a software developer early in my career and became a CTO. My journey was always this funny combination of being interested in the tech, but also being really curious about the people: how we support them, how we help them grow. So it sounds like we got to similar points. And I wanted to touch on something you mentioned about the boss, because it was similar for me as well. Not only was I interested in people and growing them, but there were examples of folks that were these anti-patterns, where I thought, “I don’t want to be that,” like the boss that came in and just made sweeping decisions about how everyone’s desk needed to look or be set up. Just crazy things.

And as I was hearing your story, I was brought back to some of those moments and got a little bit sympathetic with these folks, because a lot of these are learned behaviors. The world of work has conditioned people to behave and think they need to be that way, especially when you’re talking about him being a nice person outside of work. So I’m curious, in your coaching and the work you’re doing, how have you approached helping people understand that it’s learned behavior on both sides? Looking at the boss and saying, “Wow, maybe they’re just echoing or parroting things they’ve seen.” And then likewise, helping dismantle the behaviors for the folks that are doing those things.

Tom: That’s a really interesting point. I think this aspect of psychological safety is probably one of the most interesting to think about. So first of all, most of us learn management from our managers. We learn what management looks like, what management is, through our own managers. And because of the way time and careers work, most of the managers that we have, certainly early in our careers, are themselves fairly inexperienced, fairly junior managers. So there’s this sort of feedback loop of inexperienced managers teaching inexperienced people how to be inexperienced managers. And so we don’t always get exposed to the really, really competent managers and leaders until later on in our careers. And that’s when we realize, “Oh no, that’s not the way we should do things.”

But to add to that, and probably the more interesting point, is that so much of psychological safety and safety science and safety cultures is counterintuitive. Think about mistakes: a manager who wants their team to make fewer mistakes has a fairly noble aim; it’s a sensible thing to try and achieve in a team. We don’t want mistakes to happen, so let’s punish people for making mistakes, because if we punish people for making mistakes, they’ll make fewer mistakes. That’s pretty logical. But what actually happens in the real world is counterintuitive. Say a manager starts blaming and punishing mistakes. Over the next few weeks and months, they’ll see fewer mistakes in the team and they’ll think they’re really successful: “I’m doing a great job, the rate of mistakes is going down, we’re reporting fewer mistakes. This is excellent.” Until the big one blows up in their face.

But that’s because by blaming and punishing mistakes, all we’re doing is hiding them. They’re still there in the system. Especially in tech: someone leaves a bug in the system, or makes a mistake and covers it up, and it’s still sitting there, ready to explode or cascade into failure. In fact, this is what Amy Edmondson’s research from 1999 showed in clinical teams. The clinical teams that talked about their mistakes were able to do something about them, putting in processes and mechanisms so that those mistakes didn’t happen so much in future and could be mitigated. Those were the high performing teams. The low performing teams hid their mistakes, so the same mistakes kept happening again, and the impact was so much greater.

Sidney Dekker has seen the same thing with airlines. The airlines that have the most incidents on their books are actually the airlines with the lowest passenger mortality rates. So we see this at the team level and the organizational level as well.

Douglas: You mentioned the importance of not punishing mistakes. And I’m curious if you’ve seen organizations where the punishment of mistakes isn’t quite so obvious, but people still feel punished. It’s socialized in an awkward way; it’s not like, “Hey, go stand in the corner because you made a mistake.” What are some of the subtle ways you’ve seen people get punished for mistakes?

Tom: In fact, this is far more common than overtly punishing people for mistakes. We see this all the time. In lots of workshops and sessions I run, as we do some self-reflection, lots of leaders and managers realize, “Oh man, I’m doing some of those things.” It’s things like when someone gives the boss bad news, whether it’s a mistake or just bad news, like the project is overrunning or we’ve broken through the budget, and the boss’s reaction is a sigh or an eye roll or some other little physical or verbal tell that they’re a bit disappointed. Sure, that’s not as bad as actually punishing someone and sending them to the naughty corner or something.

But over time, that builds; it coagulates into a team where they don’t want to give the boss bad news, and they’ll only tell the boss bad news if they really have to, if the consequences of not telling them would be greater than the consequences of telling them. So yeah, it’s all those little verbal and physical tells, like the eye roll and the sigh and the arms crossing. As leaders and managers, in fact as anyone on a team, we should be really, really careful: if someone tells us bad news that we need to hear, or that is useful to hear, we should thank them for doing so, even if inside we’re thinking, “Ah no, that’s my weekend ruined.” Thank them, because it took guts.

Douglas: That reminds me of how people might be bringing in past trauma as well. Because if they’ve been blamed a lot, they may feel blamed even when the culture at this organization is one where we’re just trying to understand what’s happening. We’re not trying to find somebody to blame; we’re just trying to find the truth. I remember running into this with a startup I was advising as a kind of fractional CTO. They asked me to come help them diagnose things from time to time. There was this one moment where things were going a little sideways and we were trying to understand it. So we had a meeting to dive into it, and I was speaking with this contract developer, asking some questions, trying to understand what happened.

And I could tell they were getting really defensive. So I started to ask about that and get curious. And they were like, “Well, it seems like you’re just trying to point out what I did wrong.” And I was like, “Wait, this isn’t about you. This is about us understanding how we can improve and what we could put in place to avoid this.” I’m glad I named it, because then we could talk about it almost in this meta way. I’m curious if you’ve seen that, or have any thoughts on those kinds of dynamics, of people coming in with this past trauma?

Tom: Yeah, I really do. In fact, I was doing a workshop the other week, talking to a team and their manager. The manager was talking about the culture their team has. And they did have a really good, psychologically safe culture, with a bunch of good practices. And the manager thought that, therefore, everything was fine.

But we dived into what I sometimes call the psychological safety backpack: this rucksack we carry from all our previous experiences, previous trauma, previous managers who screamed and shouted at us for some mistake. If the same sort of situation occurs in this new team, however safe we feel in it, there’s going to be a little of that muscle memory, that reactionary memory, that makes us feel like we need to put up a mask or a shield and be a bit more careful about how we interact, even though there are no indications that the team is unsafe at all. But we’re carrying this backpack.

And I think particularly in tech, like you were just saying, we’ve put together quite a lot of good practices, things like incident response and incident analysis. We started off with the five whys, and the five whys are a really effective tool. But we’ve sort of moved on from there a bit, and we’re moving towards blameless retrospectives, or blame-aware retrospectives, because we’re now realizing we can’t really be blameless. We’re humans, and we’re wired to try and find blame and find cause. That’s how we’ve evolved: we try to point the finger at things, to find the cause of things, the root cause. But there’s no such thing as one root cause.

So John Allspaw talks about the infinite hows instead of the five whys. And this is kind of what you were just talking about: asking how did this happen, how did this event occur, is far more powerful than asking why did this happen, or why did you do that? Asking why tends to veer towards blame. People sit there and think, “Well, why did I do that?”

Douglas: Why elicits a justification. How is more expansive. We start to contemplate how did that happen?

Tom: Yeah, yeah, yeah. We’re talking about the real world and the real things that happened, not just why did I decide to do that. Sidney Dekker talks about some really, really powerful stuff in this area. He talks about there being no such thing as human error, and he’s kind of right. In tech we can envision this more easily than in other domains: if we deploy something to production that then blows up, something should have caught that. Some process, some tool, some gate should have caught it, should have stopped us, prevented us from making that mistake. And if it did get into production, it’s not us at fault. We didn’t intend to release something that caught fire into production. So yeah, where’s the human error in bringing production down?

Douglas: I see that applying to everything we do at work. In tech there is oftentimes more documentation than we might see in other roles. Also, software is in itself somewhat self-documenting, because it’s procedural: it does a thing because we programmed it to do a thing. And if you look at it, you can say, “Oh, this is how it works, or this is how the test works.” So these structures we create are a record of the process themselves. But when the process is more organic, or there are handoffs across teams, that doesn’t mean we can’t write it down. And if we do write it down, then we can look at it and say, “Oh, we set that up wrong.” It’s not that someone did it wrong; they followed the procedure, but the procedure failed us.

Tom: Yeah. In fact, I’m a big fan of documentation. I think it’s super powerful. One of the things I really like to see is teams where an engineer, a developer, or whoever will record a little video or a bit of audio just briefly explaining why they made a certain decision: why they architected it this way instead of that way, why they coded it this way instead of a number of other ways.

Because documentation is really effective at telling us what this thing does, what it should do, what the input is and what the output should be. What it doesn’t often tell us is why it’s like that. So much of the code and so many of the systems we look at, even stuff we built ourselves six months ago, we look at and go, “Why is it like this? I wouldn’t build it like that.” And then you make a change, you rebuild it, and then you realize, “Ah, no, it’s built like that because of this thing way over here.” So we’re back to the whys. Those whys are really powerful.

Douglas: What comes to mind for me, and this shows up in our work a lot, is the storytelling that happens when you get on video. Narrative is different from technical writing. Documentation tends to be written with a technical writer hat on. When we’re making a video, we’re more in storytelling mode; it’s like campfire time. So we’re likely to get more into the purpose, the why, and we’re going to have more passion. Our cares and our deepest desires come out more when we’re on video and talking, and people can connect to that.

Tom: Yeah, the motivation and the why and the rationale, really powerful.

Douglas: So you were talking about hiding mistakes earlier, and you kind of hit on this in the intro too. I like to think about how sometimes safe teams can look quite messy and a bit chaotic, whereas unsafe teams might look orderly and seem to have everything together. And it’s because they’re hiding things. They’re giving it curb appeal: everything’s whitewashed and looks clean. I’m wondering if you’ve noticed that pattern, and maybe some of the telltale signs that, hey, this chaos is good chaos.

Tom: Yeah, and I think a lot of this comes back to Agile principles, DevOps principles, and things like that. Most psychologically safe teams will work more in experiments than in well-architected, waterfall-y plans. A heavily planned project or team looks very controlled, but often a lot of that plan and a lot of the output is actually just fiction. It’s what we’re reporting on, not what we’re really doing. We also might not be moving that quickly at all; we’re doing the things that are safe to execute on. Whereas the psychologically safe team is thinking of ideas, making bets, identifying the things that are safe to fail on. Some things aren’t safe to fail, but we can design experiments that are small enough and focused enough that we can accept or embrace failure, and learn from it.

I talk a lot about experimentation and psychological safety, because the only failed experiment is the one we didn’t learn from. I’m always reminded of Elon Musk’s rockets, all the ones they tried to land backwards, so many of which blew up on the platform. There’s some great footage of one landing and blowing up, where you can hear and see him saying, “That was great, that was great. We learned loads from that. That’s a great test. Well done team, brilliant work.” And it’s just blown up on the platform. It’s a huge rocket exploding.

So in another team, in a different world, that would be a massive failure. But in a team that embraces failure and learns from it, it can look more chaotic, it can look like they’re failing all the time, but actually they’re learning all the time and getting better at what they’re doing all the time. So yeah, a more psychologically safe team could look more chaotic. It can certainly look more unsure, more unpredictable, more iterative.

Douglas: They’re embracing the emergence. And so the emergence becomes more obvious and more transparent.

Tom: Yeah, yeah, yeah.

Douglas: I wanted to bring up something from the intro: you said it was safe to do nothing in that organization. I was planning on bringing it up, and then you just said some teams do only the safe things, and you talked about what is safe to do in a psychologically safe team. So it came full circle, because it dawned on me that safe is what you define it as. What do you make safe in the organization? Psychological safety is one lens for looking at how we create safety for people to speak their minds. But another lens could be: what are the activities, the things we want to encourage and allow people to feel more confident doing? How can we define what we want to be safe?

Tom: So that’s a good question. I think to some degree this comes back to Amy Edmondson’s research that I mentioned earlier. It was 1999, and she was studying clinical teams, looking at the number of mistakes they made. Through that research, and then some qualitative research into the teams themselves, she came up with the definition of psychological safety: that it’s about someone feeling safe in a team to ask questions, raise concerns, admit mistakes, and take interpersonal risks. And I think it’s really important to surface those different things. Our ideas are normally in our heads, where they’re quite safe, because no one can get at them, no one can criticize them while they’re still in our heads. But if we take them out and put them on the table, they’re vulnerable for all to see and criticize and change. That’s a good thing, but if we’re not safe to do that, we’re never going to come up with new ideas.

What we actually want to foster, what we want to create, is a world where it’s safe to suggest unformed, as yet incomplete ideas. We talked about mistakes earlier: we definitely want to create environments where people can admit their mistakes and surface them, because if we don’t admit our mistakes, that results in disasters, disasters like Chernobyl, the Volkswagen emissions scandal, N1, the global financial crisis, all sorts of things. And if we don’t create an environment where people can lay their ideas out on the table, even the unformed, immature, incomplete ones, the ones they’re not yet sure about but just think there’s something there, then we’re not going to innovate. We’re not going to do anything new.

And this is another thing I like about psychological safety: whatever type of organization or team you’re in, whether you’re in a risk-averse, highly safety-critical world like finance or nuclear power or healthcare, or at the other end of the spectrum in the highly innovative, fail fast, move fast and break things kind of world, the presence of psychological safety contributes to both those goals. It helps us avoid and mitigate mistakes, but it also helps us innovate faster, do more interesting things quicker, and try out the experiments to see if they even work.

Douglas: When we were chatting preshow, we talked a little about misperceptions and myths, and this reminds me of that, because you’re sharing these definitions and nuances around psychological safety. Expanding on that definition, it might be interesting to highlight some of the myths or misperceptions: where do people get stuck, or get it wrong?

Tom: If we go back to the definition, a shared belief that the team is safe for interpersonal risk taking, where you feel safe to raise concerns, admit mistakes, and ask questions. When people hear “interpersonal risk taking” and those sorts of definitions, in some cultures they can feel that psychological safety is a bit like wrapping people in cotton wool, a soft, snowflake-y sort of approach to team building or leadership. And there are some leaders out there, many managers, I suspect, who feel that, no, theirs is the iron fist in a velvet glove kind of approach. But I think it’s worth diving into this, because it is a bit of a myth that psychological safety means comfort, or sheltering the team from the real world, wrapping them in cotton wool. It really doesn’t. There’s a lot of great work being done by the US Army, the British Army as well, and special forces, on psychological safety in their squadrons.

And so maybe another myth is that hierarchy is bad for psychological safety. That’s not necessarily the case. Hierarchy can be bad for psychological safety; it depends on the structures and how the hierarchy is used. The army, for example, is incredibly hierarchical. You’ve got ranks and divisions and squadrons and all the rest of it, and you have to do what your commanding officer tells you to do.

But in military squadrons, psychological safety is high, it’s engendered to be high because you need to know that your squad have your back, have your six. But it’s in part that hierarchy that actually builds that psychological safety. It enables people to understand exactly what their role is, what their responsibility is, what their boundaries of decision-making and authority are. It’s that predictability within the unit that helps foster psychological safety.

But going back to the cotton wool thing: those squadrons are not wrapped in cotton wool. They’re being sent into incredibly dangerous environments and asked to do incredibly challenging, dangerous things. And they’re able to do so through that great deal of psychological safety. There are more examples from mountaineering and climbing and other incredibly dangerous endeavors, where a high level of psychological safety is the thing that allows people to face such existential danger, such a lack of real-world safety.

Douglas: That’s interesting that you bring up hierarchy, because hierarchy gets the blame a lot when it’s really about how the hierarchy is instrumented and what people are doing up and down the chain: what behaviors they’re reinforcing, and how they’re taught to act in their position and support behaviors up and down.

Tom: Yeah, I’m reminded of the power-over versus power-to dichotomy. In many hierarchies, people consider it power over: I have power over these people. In a well-designed hierarchy, a generative culture, a progressive culture, the hierarchy is used to provide power to the people who need to do the thing. That’s the difference: whether you are empowering or taking power from. And hierarchies can do both, just like you say. So it’s how it’s used or abused.

Douglas: Yeah. It also reminds me of the messiness I brought up. It’s not necessarily about, to use your words, the cotton wool. Sometimes it might be uncomfortable, sometimes it might be scary and not feel great. But if I can speak my mind and let people know things are happening in an unexpected way, then we’ll all benefit, and we might save all our lives, in the military example.

Tom: Yeah, exactly. One of the examples I use quite often is that of Nimsdai and his team of sherpas who, I think, last year completed their mission to summit the 14 highest peaks in the world in seven months, when the previous record was seven years. Part of the reason they were able to do that was because they were a tight-knit team of incredibly accomplished, experienced, expert individuals. But a big part of the reason is that psychological safety.

For example, when they were climbing up the mountain, if one of them spotted what they thought might be an avalanche risk over to their side, they felt psychologically safe to raise that and point it out. Whereas maybe on a different team, they’d be told, “Oh, don’t worry, you’re just being negative,” or something like that. That safety in being able to point out dangers or raise concerns, like “I’m getting a headache, it might be a cerebral edema, I think we need to go back down 500 meters,” is what enables teams to do such incredible things. It’s not about wrapping people in cotton wool; it’s about the safety to be able to face the danger.

Douglas: Two things come to mind, and they’re related; they’re basically two ends of a spectrum. On one end, there’s what I think you called weaponizing psychological safety in the preshow chat. On the other end, we’re so safe and maybe so tolerant and so curious that we’re prioritizing conversations or listening to things that aren’t providing business value. So how do we find the sweet spot? Take the mountaineering example: let’s say someone’s so afraid of avalanches that every little noise might be an avalanche. That’s not going to be a healthy mountaineering group. Any tips or advice to avoid either end of that spectrum?

Tom: Yeah, I think that’s a good call, actually. Because in a psychologically safe team, other members of the team will feel safe to surface the behaviors and dynamics of everyone in the group. So if there is someone in the group who’s behaving in a way that’s detrimental to getting the job done, or even detrimental to the psychological safety of the group itself, then members of the group feel safe to call that out without fear of repercussion, without fear that they’ll be punished for doing so.

So in, say, a software development team, maybe you’ve got one person in the group who always turns up to standups five minutes late. And they do so because they feel safe, they feel comfortable, they feel like there’s an environment where they’re not going to get punished, where there are no consequences. But a truly psychologically safe team doesn’t mean zero consequences for bad behavior. In fact, it almost means the opposite: we more rigidly and more explicitly talk about those boundaries of behavior, what we expect and don’t expect. And that evolves over time. We’re not going to set a social contract in year one and have it stay true and immovable for the next few years. It has to evolve. We’re going to discover boundaries and behaviors and things we need to add to or remove from this contract, as it were.

Douglas: So thinking about the other end of the spectrum, the weaponized psychological safety, or psychological safety at all costs. I think you even mentioned this notion of positive vibes. I’m curious to hear your thoughts on that, whether you’ve encountered teams like that, and how you coach them.

Tom: Yeah, one of the things I do come across is this positive vibes only, good vibes only environment. It comes from good intentions, from people trying to do the right thing and create really nice cultures, really good cultures. But what it can result in is a team where any concern, any criticism, anything that’s considered bad news or negative, is shut down. And it’s shut down because “you’re destroying our vibe. We’ve got a good vibe here.”

And again, psychological safety doesn’t mean freedom from discomfort. It means embracing the discomfort, talking about it, and surfacing those things. Say someone has a genuine concern about a project that’s running over time and over budget, and maybe they think we should just cancel it: this is not going anywhere, we should can it now. In a good vibes only team, we don’t want to have that discussion. Let’s just carry on, it’ll be fine. But actually the sensible thing might be to can it. So we need to have those discussions. And if someone’s being overly cautious, overly concerned, then a psychologically safe team will also be able to have that discussion.

Douglas: Yeah, it’s making me think of General Patton’s quote that if everyone’s thinking the same, nobody’s thinking. Or Sloan, who canceled one of his meetings because there were no points of disagreement. The thing is, a psychologically safe team is going to have points of disagreement; they’re going to have different points of view. So if everything’s smooth sailing and no one’s interjecting with ideas that go against the grain, either we don’t have a diverse enough team or there’s probably a lack of psychological safety.

Tom: And I’m really pleased that you just said diverse enough team, because this is a key point. There’s some really interesting research by Amy Edmondson and others looking into diversity, psychological safety, and team performance. Her research, and loads of other research, shows that psychological safety is harder to achieve in a diverse team. The more diverse the team is, the more difficult it is to build a degree of psychological safety, because we all come with different behaviors, different contexts, different experiences that we bring into the room.

Psychological safety is easier to achieve in a very homogenous team, where we’re all the same, all bringing the same experiences. You can imagine a room full of Silicon Valley bros getting together who’ve never met each other before; put a few of them in a room and they’re probably going to feel psychologically safe the moment they walk in. But put together a team of really diverse individuals from all around the world, speaking different languages, from different socioeconomic backgrounds, with different neurodiversities, and it’s going to require more work to build that psychological safety.

However, what’s really interesting is that once we’ve put in the work to build psychological safety in that more diverse team, its potential performance is way higher than the homogenous team’s. It’s an accelerant, a catalyst: psychological safety is a catalyst for performance in the presence of diversity in teams. So yes, diverse teams do perform better; they bring more ideas to the table. But we need psychological safety in order for them to feel safe putting their ideas on the table and speaking up.

Douglas: One thing I wanted to hit on before we wrap today is this idea of long-lived versus short-lived teams, and how our tactics or approach for psychological safety might differ. This is something we spoke about in the preshow chat, so for our listeners, I’ll take a moment to explain what we mean. Short-lived being: hey, we want to create a space, we might be doing some participatory design with the community, and these people are coming together for a few days or even just a few hours. We want to create an environment where people can speak their minds and feel comfortable. And then there’s long-lived safety, which is a team, a sports team or a company, where we want to foster that teamwork over months and even years. So I’m curious how that’s surfaced for you, as far as different techniques or considerations.

Tom: Yeah, I think this is a really interesting challenge, because they’re very different cases, just as you said. Take the long-lived sports team, a team that plays every weekend and has been going for years. They know each other really well. They know what each of them is like, their strengths and weaknesses, how people behave on the team, which member is the introvert and which is the extrovert. They understand each other. They innately, implicitly understand what it means to be on that team: the boundaries, the expectations, the limits of behavior, and what’s expected of them.

Now if we come together as a short-lived group, it’s much more difficult to build that psychological safety. Because one way of looking at psychological safety is that it’s about predictability, interpersonal predictability. And generally, the bigger a group is, the more difficult it is to build psychological safety. If we’ve just got a group of three or four people, that’s going to be much easier than 3,000 or 4,000 people.

But if we’re coming together and convening just for a short amount of time, what we need to do is create shortcuts, if you like, to that predictability. We might not have the time for every single person to really deeply get to know each other, but what we can do is put systems and scaffolds in place to aid that predictability. A social contract, even a temporary social contract for the space, is really, really powerful. If you’re on a remote Zoom call, that might be agreeing whether cameras are on or off, and what the mechanism is for speaking up: whether you raise your hand, just speak up, or press a little button. All those little things, anything that matters to people, increase the knowability of the space we’re in and of the people within it.

And so we get closer to knowing what the consequences will be if I speak up with an idea, ask a question, admit a mistake, or raise a concern. Getting closer to knowing those consequences vastly aids psychological safety and, of course, aids the outcome of whatever the group is trying to achieve in that session. So all these scaffolds help: social contracts, great facilitation techniques, really good workshop practices, even tools like Miro or Jamboard, because they help with the mechanism of putting our ideas on paper, so to speak.

Douglas: As you were talking, a few things came to mind from facilitation fundamentals. You mentioned the why earlier, and I love the idea of the infinite hows. It can be important to clarify purpose, because if people are united by a shared purpose, that can make them feel more safe; it’s something they believe in. And if the facilitator is really attentive to signals, they might see where someone’s holding back or not speaking much. It also helps if the facilitator models the behaviors; I’ve even seen people model mistakes early on, so it creates a space where it’s okay to make mistakes. It’s almost like treating the container like a Petri dish.

To your point, these norms or behaviors we want to agree to today, people can’t learn for the long term, but we can certainly try to make them happen in the next two hours. A lot of people hate on icebreakers, and I think that’s because people throw them around like candy. If you really want safety, why not choose an icebreaker that models safety? We can create that. It probably won’t stick permanently, but it might; there’s a half-life to it. But it’ll stick around long enough for that two-hour workshop.

Tom: So admitting your own mistakes is a really, really powerful way of doing that. Celebrating failure. There are all sorts of practices and ceremonies you can do around that. I’m even aware of an event called Fuck Up Nights, a meetup people go to where they share all their massive mistakes, and it’s such a powerful thing to do. And particularly for tech teams, though really for any team, when the more experienced engineers, the engineers with years or decades of experience behind them, admit their mistakes and talk about the big outages they’ve caused, the ones that are named after them, that’s such a powerful way of breaking down the fear of admitting mistakes and creating a safe space for other people on the team to admit theirs, particularly the junior folks, who are probably still convinced that if they deploy one bug to production, their career is over.

Douglas: I remember a friend’s startup. We were actually at the same conference when this happened, both of us CTOs, sitting next to each other at one of the keynotes. All of a sudden his pager starts going off; this is back when we used to have pagers. And he’s like, “Oh.” I knew something was bad, because he was at a conference and they knew not to bug him. Someone had basically deleted the whole database, or at least enough of it to cause major problems. It took him away for pretty much the rest of the conference. They restored from backups, maybe lost an hour of data; it wasn’t a huge loss. And I remember he shared the story on some online forum, and the majority of the comments said, “Don’t let her anywhere near the database again.” And I remember his comment back to them: “I put her in charge of all future database updates.”

Tom: Yeah. Damn right.

Douglas: Someone who goes through that experience and that level of trauma is going to be the most cautious person on production from that day forward. And not punishing a mistake at that level sends a really strong message.

Tom: It does, it does. And it recognizes that we innately learn a lot more from mistakes than we do from successes, because it’s so hard to pinpoint what went right in a success. There’s a concept known as Safety-I and Safety-II, which is really, really interesting. Safety-I looks at what went wrong: examining what went wrong and what factors made an incident occur. Safety-II looks at what went right: how do we make what went right happen more often? And we need both approaches. But yeah, absolutely right. Those folks who have the battle wounds from the big incidents, the big outages, are the people you want running your high-availability systems, for sure.

Douglas: So we’re coming up on the end here, and I want to make sure to give you an opportunity to leave our listeners with a final thought.

Tom: Yeah, a final thought around psychological safety, and if I was going to ask people to do one thing, it would be this. The mission of our business, psychsafety.co.uk, and everything around psych safety, is to make the world of work a safer, higher performing, more inclusive and equitable place. That’s our north star; that’s how we work out what we should and shouldn’t be doing. If other people feel the same, if that feels like a good mission, a good thing to try and do, head over to psychsafety.co.uk. You can sign up for the newsletter, which goes out every week with a whole bunch of stuff around psychological safety: research, various subjects, and other things.

There’s an online community of, I think, more than 500 people now, and there are regular meetups. The last meetup was great; in fact, all the meetups have been great. Our last one was Sam Newth at Red Hat talking about neurodiversity, psychological safety, autism, and that whole sphere. That was fascinating. There’s also a whole bunch of resources, tools, articles, and other information available at psychsafety.co.uk. So if there’s anything you want, head over there. And if there’s anything I haven’t covered, or any other question, you can email me at tom@psychsafety.co.uk and we can do some cool stuff together.

Douglas: Awesome. We’ll make sure those links are in the show notes, along with the research you mentioned from Amy Edmondson, so people can dive in deeper and reach out to you. I want to say I appreciate all the work you’re doing. It was a pleasure chatting today, Tom.

Tom: I really enjoyed it. Thank you so much. Thanks for having me on.

Douglas: Thanks for joining me for another episode of Control the Room. Don’t forget to subscribe to receive updates when new episodes are released. If you want to know more, head over to our blog where I post weekly articles and resources about radical inclusion, team health, and working better, voltagecontrol.com.