The Safety of Work

Ep. 126: Is it time to stop talking about safety culture?

Episode Summary

Welcome to our first episode for 2025. Is it time to rethink the traditional notion of "safety culture" in today's organizations? Join us as we explore this provocative question, inspired by the article, “Seeking a scientific and pragmatic approach to safety culture in the North American construction industry” by Fred Sherratt, Emi Szabo, and Matthew R. Hallowell in Safety Science Volume 181, January 2025.

Episode Notes

In this discussion, we dissect various models of safety culture, scrutinizing how organizations perceive, measure, and manage these concepts. From artifacts like management systems to individual attitudes and behaviors, we delve into the inconsistencies and challenges of these models. We also revisit historical perspectives, such as Dov Zohar's work, to understand their influence on contemporary safety paradigms. Our conversation critically examines the missteps of industries like nuclear and aviation, which have mandated the management of ambiguous concepts without solid scientific grounding. We advocate for a shift from vague cultural mandates to actionable strategies, offering insights into enhancing clarity and effectiveness in both regulatory practices and organizational improvements. This episode aims to inspire a reevaluation of safety culture, pushing for a more scientifically grounded and practical approach to safety science.

 


Quotes:

“The paper itself is very, very stylish and self-aware, and that's important not just for readability but for the state that this conversation is in...it's got all of these references that show that they're very aware of the landmines that people keep stepping on, in just even trying to write and untangle safety culture.” - Drew

“When someone uses the term ‘safety culture’, it's very common for them to be thinking about everything from commitment of people, compliance with procedures, level of resources, the balancing of goals, safety communication, leadership. All of these individual things just get lumped together into this term ‘safety culture’.” - David

“The moment you start trying to turn it into practical actions, that's when everything starts to crumble, when there aren't good, agreed definitions.” - Drew

“You can't just wander into a company and say, ‘I want to study company culture.’ That's like a marine biologist going into the ocean and saying, ‘I want to look at things that live in the ocean’...Be precise, be narrow, be specific about what it is that you actually want to look at.” - Drew

 

Resources:

Seeking a scientific and pragmatic approach to safety culture in the North American construction industry


Ep.44 What do we mean when we talk about safety culture?

Dov Zohar’s Published Research

The Safety of Work Podcast

The Safety of Work on LinkedIn

Feedback@safetyofwork

Episode Transcription

David: You are listening to the Safety of Work podcast, episode 126. Today we’re asking the question, is it time to stop talking about safety culture? Let’s get started. 

Hey, everybody. My name’s David Provan. I’m here with Drew Rae, and we’re from the Safety Science Innovation Lab at Griffith University in Australia. Welcome to the Safety of Work Podcast. In each episode, we ask an important question in relation to the safety of work or the work of safety, and we examine the evidence surrounding it. 

Drew, welcome to 2025, our first podcast for the year, and fortunately it’s going to be coming out still in January. I think I dropped the ball a bit last year, but I’m committed that we’ll get back into a nice routine in 2025 for our listeners, so they can stop hassling you and me about when the next episode is going to come.

Drew: Yeah, it’s certainly not the case that people have stopped publishing interesting safety research through 2024. We do have a little bit of a backlog of some interesting papers (I think) to have a look at this year.

David: And on average, it hasn’t been too bad. I think the podcast turned five three or four months ago, given the pre-COVID 2019 start that we had. So 25 or so episodes a year on average isn’t too bad.

Drew: We are no longer a young puppy just yapping for attention. Slowed down a little bit, but hopefully still decent in quality.

David: A fun topic to kick the year off: safety culture. Drew, do you want to introduce the start of this episode?

Drew: This is one of our favorite sets of authors that we have talked about before. As usual, the Construction Safety Research Alliance has written one of those papers that we were meaning to get around to writing ourselves and never did. They got in before us, and probably did it better than we would have done. 

But I think this is going to be a fairly short episode. The question we are asking is, is it time to stop talking about safety culture? The short answer is yes, absolutely. We hope this episode has been thought-provoking and useful in shaping the safety of work… In case listeners would like slightly more elaboration than that, let’s get into the paper and talk about it.

David: During episode 44, which was 2020 by the way, we did do an episode titled What Do We Mean When We Talk About Safety Culture? We used one of the literature reviews, the meta-analysis, and we talked a lot about some of the issues that we’ll talk about in this paper, about definitional issues, research issues, practical issues with the term safety culture. Let me introduce the paper and then we’ll dive into what’s said in this paper. 

The title of the paper is Seeking a Scientific and Pragmatic Approach to Safety Culture in the North American Construction Industry. Our Construction Safety Research Alliance friends are in Boulder, Colorado. The authors are Fred Sherratt, Emi Szabo, and Matt Hallowell. 

Big shout out to Fred and Matt for the work that they continue to do in the US industry. I guess it’s great to see that there’s a partnership extended through Helen Lingard at RMIT in Australia, to work together through the new Shine Network here in Australia as well, so some great industry academic collaborations are happening.

The paper’s actually a 2025 paper, even though it was accepted (I think) in about August 2024. Drew, do you want to talk about what type of paper we’re looking at today?

Drew: Sure. This is (I guess) you’d call it a critical literature review. There’s a little bit of original data in the paper in order to flesh out one of the arguments they wanted to make. They did a brief survey and we might talk about that when we come to it. 

Basically, it’s selectively picking out literature in order to explain the broader picture of what’s going on, with probably as much commentary by the authors of this paper as there is summary and discussion of the literature. It’s a fairly common style when you’re trying to tackle a difficult topic like safety culture, where just summarizing or synthesizing the literature wouldn’t really move the conversation forward. It needs a fair bit of sense-making. 

For this work, you really care about how recent it is so that it is currently contributing to the debate. You care about where it’s published because you want to be sure that it has been thoroughly peer-reviewed for how they represent the literature. But also you need to trust the authors themselves because they’re going to be very selective. 

There are thousands of papers about safety culture. They’re going to pick out a few and try to make some broad arguments, so you want to trust that they are going to represent that broader body accurately and be informed about how the debate has evolved. 

Fortunately in this case, we can easily tick all three boxes. These are authors who have a track record in doing exactly this type of thing on other topics. We’ve talked about them doing it before on episode 55, talking about drift and injury rates. But this is something that they do a lot of. 

It’s published in Safety Science. There are questions about how good the peer review is; peer review is always up and down. But as far as safety journals go, it’s still the flagship journal; it’s the best we’ve got. And it’s certainly recent; the date on the paper is actually in the future as we’re talking about it.

David: This is a strong call-to-action paper, making a fairly bold recommendation about a way forward. It reminds me a little bit of our manifesto paper. 

I think we recorded that around episode 20, where we tackled the state of safety science and safety research as a whole: the state of play and what the directions for moving forward might be. These types of papers are really important pieces of work, but this one is also a really fun read. 

The good thing about the CSRA, the way that it works, and the way that Matt has set that up is all the papers are open access. So any of our listeners, we’ll link it in the show notes. You can go grab this paper and have a read, and it’s a fun read. 

Drew, do you want to talk a little bit more about this paper? Because you mentioned at the start that it’s a paper that we’ve often talked about writing, but the guys did a good job of it.

Drew: Before we get too much into the flattery, let’s just point out that the CSRA does not know how to write paper titles. Absolutely boring title. But I guess everything in safety culture has already been said. All of the neat metaphors, all the neat turns of phrase have already been done in the titles. But the paper itself is very, very stylish and self-aware. That’s important not just for readability, but for the state that this conversation is in. 

One way to think about the academic literature is that it’s not just each individual paper doing a bit of research. A lot of what’s written is authors in conversation with each other. We as readers are eavesdropping on this conversation that’s gone on for decades. 

It’s really quite frustrating actually when you read papers as an academic, and the authors don’t even seem to be aware that they’re part of that conversation. They think they’re the first people to have written about the thing that they’re writing about. 

The great thing about this paper is that as you go through, it’s got all of these references that show that they’re very aware of the landmines that people keep stepping on in just even trying to write and untangle safety culture. They step around the landmine, then they point it out to their readers. Hey, look. This is where we stepped around this bit. 

We’ll get into it a little bit more in the meat of the paper, but in reading anything about safety culture, you need to be aware that safety culture has been around, prevalent as a concept ever since the IAEA report into Chernobyl. That’s the first time the idea was used and became prominent. 

Right from that very first start, they used the term safety culture, and then had to republish another entire report trying to explain what they meant by safety culture. Ever since, we’ve had more work trying to explain what we mean about safety culture than we have about almost any other topic in safety. 

Every now and then, people come along and they try to rejuvenate the topic, or clear things up, or tidy it up. They do a meta-analysis of everything that’s been written about safety culture. They try to put it into buckets and explain what the different meanings are. They try to separate things out: oh, this is safety climate and this is safety culture and this is why they’re different, or this is why we need to get back to basics. 

It’s gotten to the point where so many people have tried to fix it, that even the fixes are now just a mess of different fixes. Basically, I tell students if they come along wanting to do a project about safety culture, just don’t. You do not go into that bog. You are never going to come out again. It’s a very brave thing to wade in. I think they handle it really well.

David: I think some of the motivation around this paper, and we’ll talk a little bit about the philosophical approach the authors have taken, comes from seeing more and more regulators getting involved. 

We know we’ve had safety culture regulation in aviation and nuclear for a while now, but it’s creeping into the energy and construction sectors, where increasingly regulators are getting involved and industries need to have answers to a question that they don’t even understand. We’ll talk about (I guess) maybe some urgency around a way forward on this topic. 

Philosophically, I think the approach that the authors are taking is (I guess) consistent with the research institute itself, the Construction Safety Research Alliance, because really the purpose of these researchers (Matt, Fred, [...]) and the team is to basically help industry be better at safety. So really, what they’re looking at is how able the safety science is to provide practical direction and advice to people who work with hazards and risks every day. 

Drew, do you want to talk a little bit about research philosophy then? Maybe because you are closer to how researchers come at what they’re trying to achieve with their research than I am.

Drew: I guess there’s nothing unique about wanting your research to help industry. The trouble is most people who have a very heavy outcome focus from their research go too heavy on the helping industry, and they abandon science along the way. 

What the CSRA does is basically use ‘how is this useful for industry?’ as an ongoing test at each stage of the research: have we framed this question in a way where the answer is going to help industry? Have we defined this construct or this idea in a way that’s going to be useful for industry?

That’s an ongoing test, but they’re not abandoning doing the scientific process because they believe that for something to be useful for industry, it has to be scientifically grounded. They’re not just producing new techniques or new methods, which is an easy way to influence industry, just not necessarily for the better. You see that in each step is this joint test of is this scientifically meaningful and does this help industry? 

When you turn that to safety culture, safety culture is often grouping together a bunch of different things under a single concept. That’s the question they largely ask in this paper: is it scientifically meaningful to group these things? Are our reasons for grouping them for industry and our reasons for grouping them scientifically consistent? Because otherwise you’ve really got two different groupings. As you’re going to see, that’s going to lead to their explanation for why safety culture is such a mess.

David: I guess in part of this paper there were some subtle and not-so-subtle digs at safety science, and even at the broader safety practitioner community, because at one point in the paper they talk about how every safety culture paper starts the same. You start with an opening sentence referencing Chernobyl, and then you talk about the fact that it's very incoherent and poorly defined. 

Then they say, what every paper does next is go: but we now have a way, and this is what it is, or this is a method for doing it, or this is the answer. Where this paper (I guess) takes a different direction is it goes: because we’re in this situation, we actually need to think differently about what we’re trying to achieve and go in a different direction.

To your point there, which is really interesting, I really like the way they go through explaining the implications of that lack of definition and that fragmented, incoherent body of literature: basically, people have different views of what safety culture is, but most people agree that it’s made up of a whole bunch of things. Then some people even get to the point that it’s anything that’s got anything to do with safety. 

Just to give an example of that, because the paper doesn’t give a broad one: when someone uses the term safety culture, it’s very common for them to be thinking about everything from commitment of people, compliance with procedures, level of resources, the balancing of goals, safety communication, and leadership. All of these individual things just get lumped together into this term safety culture. 

Drew, do you want to talk a little bit about (I guess) how that’s a problem in research and in practice maybe?

Drew: If we start at the research end, it’s a problem because if you just have these very vague ideas about what something is, then everyone who researches it is going to be researching a different part of it. We are not assembling those different parts into progress. 

This isn’t a perfect metaphor, but it’s sometimes useful. If you imagine research is like building a wall. No one gets to just build the wall. Good research is about building a really, really good brick, and placing that carefully onto the wall so that the wall gradually gets higher. 

When you don’t have good definitions about the thing that you are researching, that’s like we all just go to the building site and we all just create our own thing that might be a brick and it might be a plank and it might be a hammer, and then we just like throw it towards the building site. 

You might start to make a little bit of progress as you throw the first couple of things on and gain a bit of elevation, but eventually the pile’s just going to start collapsing with each new thing that you throw on, and it’s never going to look anything like a house. 

If you are then on the industry side trying to use that, you might go onto that site and pick up one of these bricks and think, oh, this looks okay. But then you realize you’re talking about different things to other people, even about something that seems nice and clear, and sometimes people make safety culture sound super clear. 

Culture is just the way we do things around here. That is just so compelling and easy, but what do you do with that? How do you use that to get better? The moment you start trying to turn it into practical actions, that’s when everything starts to crumble, when there aren’t good, agreed definitions.

David: I agree. I think the industry component of that is exactly as you just mentioned then. I was actually thinking of Hopkins’ definition there about the way we do things around here, because organizations are looking for simple, understandable, and actionable things that they can do. You get a very complex topic and industry will naturally simplify it, so they can do something about it. 

One of the challenges in industry is for every complex issue, there’s a simple solution that’s wrong or unhelpful. I think that applies here on safety culture. People going, oh, we’ve got this really broad topic and our safety culture is X. Therefore the action we need to take is Y. That’s not (I don’t think) helping industry make any real progress.

Drew: Just to clarify, there are lots of things in the world that we can only define by how we use them. A classic example we use when we are talking about ontology is games. You cannot define a game by just giving a set of clear rules for what is a game, or a set of properties for what is a game. A game is a category that is clear in the middle and vague at the edges, and you really only define it by saying what is and isn’t in the category. 

But that nebulous category isn’t really what we are talking about here. What we are talking about is some actually fairly well-defined things that are individually quite useful, that we are grouping together because we think that that whole is more than the sum of its parts.

If you talk about what values or beliefs people have, that’s fairly clear. That’s fairly well-defined. That can be fairly easily investigated. You talk about whether a company has a safety policy or what’s in the safety policy that’s fairly well-defined. You talk about the priority of safety. You talk about how the organization is structured. You talk about leadership behaviors. All of those things are relatively well-defined. 

The question is, what do you get by putting them together and unifying them? That’s where people have really quite different concepts about how they fit together, let alone about the usefulness of fitting them together.

David: Like you said, we just keep adding and aggregating. People prioritize and focus on different elements of what they might think of as safety culture based on their own either personal views or the researchers that they align most with. 

But I like the way that in this paper the authors lay out these three potential definitions of culture, coming from three different (like you said) ontological perspectives: is safety culture something that’s social? Is safety culture something that’s more of an average of individuals? Or is safety culture something that’s more organizational? 

To touch on all of those points you made there, Drew, the paper doesn’t answer it, and I don’t think there is an answer. But the implications for research were quite fascinating to me, because if you think culture’s a social process, then you need to research it in certain ways. If you think it’s an individual behaviors, attitudes, and beliefs thing, then you research that in a few different ways. If you think it’s organizational, then you use interpretive research. 

I actually thought that without the definition, you actually can’t research it. If you can’t research it, you can’t understand it. Then industry can’t do anything. There are a lot of implications of not actually having any agreement about what something is.

Drew: Let me just give a really simple example of that. A lot of people, their justification for why safety culture should be a separate concept and why it’s a useful concept has to do with the way that values and beliefs are self-perpetuating and can sustain and influence. Even as we change individuals as an organization, even as we try to change our structures and our processes, the underlying culture will still persist and will still influence that. 

Let’s say that’s your idea of culture. That has really serious implications then for how you are going to measure culture and how you are going to manage it. Because if you really believe that it’s those underlying beliefs and attitudes and they’re stable despite the organizational features, then you can’t ask questions about the structure of the organization to measure the culture. You can’t ask individual attitudes to measure the culture because you’ve claimed that these things go beyond structures and go beyond individuals.

On the other hand, if you think that culture is an aggregate of all these different things (structures, behaviors, and attitudes all just working together), then you’ve got to ask, why don’t we just treat those things individually? Why have the concept of culture at all? 

Your different explanations for why culture matters or what culture is lead to very different ways of managing it. These things are often really quite inconsistent once you get down to looking at what people say culture is versus how they actually research it, or how they measure and manage it in their organization.

David: You mentioned the different views there. I quite liked the layout of those three, what the paper calls models of culture. I think it’s good for our listeners, because you may not have thought deeply about your own position on what you think safety culture is. 

It was a good way for me to test the things that I’ve always carried in my mind when I thought about this topic. I think one of the practical takeaways we’ll get to at the end is that what you mean when you say safety culture is unlikely to be the same as what anyone else means when they use the term. 

I actually quite liked these models: the description around each one, how you would understand it, and what you’d do with it. How did you find that part of the paper?

Drew: I’ve read too many safety culture papers that try to do this exercise of neatly dividing up the literature, and I’m glad they didn’t spend too much time on it because the closer you look at this, the more you realize that even those three things, they’re not internally consistent and they start to fall apart a bit. But as a rough way of looking at it, you can definitely see that research tends to fall into one of these three buckets. So we describe them.

The first one is the idea that you can see culture through the artifacts that it creates. Often the people who do this research will say that there are underlying beliefs and attitudes, but everything that they do, everything that they research, everything they measure is looking directly at the artifacts rather than trying to find some underlying representation.

Do companies have mature management systems? Do they have rules? Do they have policies? Do they have procedures? A typical thing you might see is something like a cultural ladder, which says that a poor cultural organization is very disorganized, they don’t have anything for safety, they don’t talk about safety, they don’t have systems, they don’t have processes, they don’t have people appointed in safety positions, they don’t have safety responsibilities, they don’t have clear lines of accountability. 

A very mature organization is very structured. They have a safety system, they have people to operate it, they have competent people, they have clear lines of accountability and authority: all safety artifacts. Hopefully, most of our listeners are familiar with those scales or ladders of culture that go from disorganized to very mature.

David: This reminded me of (I guess) a lot of the work on safety climate, and some of the initial work of Dov Zohar in the late 1970s and 1980, a decade before the safety culture term, where he was looking at: does an organization have an audit program for safety? Does it have dedicated resourcing for safety? Some of that early climate work reflects (again) this idea that you can engineer safety management into an organization.

Drew: I don’t want to pick on poor Dov Zohar, but he’s responsible for a lot of the confusion here because he described it as one thing and then started measuring the other thing. Lots of people have followed in his footsteps in doing that.

I think this confusion is very clear when we look at the second model, which is the idea that culture is mainly something you see and measure through attitudes. This is where a lot of researchers will make a distinction between safety climate and safety culture. They’ll say safety culture isn’t well-defined, but safety climate is. They’re often picking out this model too and saying, this isn’t even culture at all. 

This is where people do things like safety culture surveys or safety climate surveys. They’ll ask staff about their attitudes to safety. They might supplement that with doing interviews or observations, but not usually. It’s usually done by surveys. The idea is that the culture is the sum of all of the individual attitudes. 

David: And it leads to this idea about, I need people to care more. I need people to take more accountability. I need people to take more ownership. That idea that if every individual’s thinking about safety in the right way, then collectively we’re in a good place.

Drew: A lot of the research you’ll see on this pairs safety culture with some other concept. They’ll ask, does more engaged leadership improve safety culture as measured by the attitudes? Does having certain organizational structures, or having an accredited system, lead to better safety culture? They use it as a comparison to see whether the other things make the culture better or worse. 

This is also the model that’s pushed towards a lot by the people who are trying to regulate safety culture. Because they already regulate all of the model-one stuff about structures and rules and processes and policies. Then they additionally say, oh, you’ve also got to have culture, by which they mean you’ve also got to have some way of measuring and monitoring these attitudes and beliefs.

David: And then third, Drew?

Drew: I think the authors reveal a little bit of their own research positionality here, which I’m very sympathetic to, and I’m very interested in how people who don’t share it read this paper. They basically say the third one is the only one that really makes coherent sense: if culture is this underlying, nuanced thing, supposedly nebulous, supposedly separate from structures and attitudes, then the only way to look at it is immersive research. You need longitudinal studies, you need ethnographic work, you need anthropology. You need to get back to where the idea of culture started, which is the social science of organizations. 

But they say the trouble with that is it’s not actually easy to do, and it’s certainly not directly applicable to workplaces. It’s a concept researchers could use, but it’s not something where an organization can say, oh, let’s improve our safety culture by doing a multi-year ethnographic study of our safety culture. What’s the actual utility of that as a safety concept for organizations trying to manage? There isn’t really any.

David: And I think that’s (for me) what this paper has done really well: saying this cannot get better for us, and it makes no sense to continue down this path. Because even if we can agree on those three different categories of culture, or ways of thinking about culture, the idea that you need to go and really study the third one with an ethnographic type of approach, which is much closer to (I guess) where culture’s origins are, like you said, in psychology and social science, means you don’t end up with generalizable findings. You can go and study one organization, but it doesn’t necessarily tell you anything about the next organization. 

Towards the end of the paper, they talk about how even if we did go out and do all these really deep cultural studies (maybe like what Diane Vaughan did with NASA after the Challenger accident), it doesn’t necessarily help industry move forward. For these researchers who really want to help industry, I guess they’re saying that culture just isn’t a helpful thing for researchers to worry about, because it’s not going to ultimately be able to help industry.

Drew: And if you pick any of those models, there are other better alternatives now. If you like the idea that culture is revealed through structures, policies, procedures, and rules, then they make the argument you could just study each of those things individually. Lumping them together doesn’t give you any extra utility as an organization. It doesn’t give you any extra way of measuring or improving, and it doesn’t help you as a researcher.

The second one, attitudes and beliefs: putting those together might help, but that’s what safety climate is now for, and it’s doing it more clearly. Or there’s the whole so-called psychosocial area, and how we’re trying to manage that is probably a more useful way of thinking about attitudes, beliefs, and perceptions. 

Then if you’re an organization that’s interested in some of that more nebulous underlying stuff, well, partly that is just researcher space, not directly helping the organization. To the extent that it is helpful, that’s where some of the more learning-oriented tools, some of the Safety Differently and Safety II approaches, are generating more useful tools for an organization to do that ethnographic work themselves.

David: Should we talk about the survey that they did in the paper?

Drew: They wanted to double down. The academics don’t know what safety culture is, so let’s ask industry what safety culture is and see what they do. I love this because it’s the best survey ever. It’s just one question, what is safety culture?, with a free-text response. Five hundred and sixteen respondents, and about half of them, 56%, are safety professionals. Because this is construction, most of the others are construction roles: construction managers, 20%; field supervisors, 10%; other construction professionals, 30%. Not quite sure why they’ve got oil and gas thrown in here as well.

David: The CSRA (I think) just announced recently they signed up or they’ve received funding from companies like Amazon. I know there are a few. I think they’re increasingly getting a lot of interest from outside the construction industry. 

The way they did this convenience sample was they just emailed their list of all of their contacts within their member organizations, and posted on LinkedIn as well to people who follow the CSRA. It was a self-selecting convenience sample. Drew, that’d be why there are a few others in there.

Drew: So this is not intended to be like, what does everyone think?

David: It’s not a [...] study or anything like that.

Drew: But it is grab a bunch of industry people and just see how safety culture is used. Then they did a very rough coding, basically taking each answer and saying okay, what is this answer basically about? You have things like, this answer is basically about practices, this answer is basically about actions, this is about buy-in, this is about commitment, this is about leadership, this is about norms, this is about values, beliefs. 

They reckoned that about 40% of the answers in total were about tangible things: practices, actions, and behaviors. Sixty percent were about intangible things: attitudes, perceptions, values, beliefs. And 6% avoided the idea of culture entirely.

David: But the way they did this, from my understanding of the method, is that every time something was mentioned, they counted it. They say 2% of the 500 people mentioned buy-in, 2% mentioned accountability in their answer. They just pulled each term out and literally counted how many times it got mentioned.

Most are just really small amounts. I guess that just comes down to how individual people write their answer to that question. Some people write four words, some people might write a paragraph. But if you look at the significant ones, the four most significant were attitudes, values, beliefs, and behaviors. I guess it goes to show that the common industry idea of what safety culture is, out of this research, is for the most part that combination of values, beliefs, attitudes, and behaviors. Which I think we’d say isn’t what culture is.

Drew: Did you say which isn’t what—

David: Which we’d say, I guess, from the perspective of broader cultural research, not safety culture. When we think about what cultural research going back 100 years from a sociological perspective is all about, we would say that’s not so much what culture is. Somewhere along the line between the safety culture research and the industry adoption, it’s just become this very (I think) individual attitude, behavior-type definition.

Drew: They talk a little bit about different ways to redeem the situation. One of the things they mention is the idea of trying to create frameworks for culture that try to take the various elements and assemble them. They point out that there’ve been numerous attempts to do this, and none of them have succeeded in shifting the research landscape. 

What tends to happen is the research will come along and propose a framework, but we don’t then get a whole body of other research filling in the little bits of the framework and showing how they stick together. We just get the next researcher coming along and saying, it’s all really confusing. Here’s my new framework to try to make sense of the landscape.

We have people who try to say culture is really one of these three things, and try to pull the whole field towards one of those directions. But none of that has succeeded in getting any consensus to build a foundation and actually make progress with the research.

I think the fundamental point is that other people have said safety culture is terrible, let’s abandon it. What they actually say is, there is no need to continue trying to fix it. Safety culture research has done some interesting things, but it’s done them along the way. Those things have spun off into more useful concepts that are now building on their own.

Maybe that was the utility of the idea. It’s an interesting metaphor. It’s an interesting concept. As we think about it, we get a few other ideas. We build on those other ideas. We don’t need to bring everything back into that single concept. It’s not offering anything to do that unity. Instead, abandon the concept and just go and follow all of those rabbit trails that we generated along the way.

David: I like that there’s an example in there about rather than just using some cultural climate survey to try to understand supervision in your organization, go and explore supervision. 

Go and look at what your supervisors are doing, how well trained they are, what they’re spending their time on. What impact that’s having on the workers, and developing a better understanding of what your supervisors do, why they do it, how they do it, and what the results are, rather than just do some broad culture survey, and then try to jump to conclusions about what this means for supervisors. 

They’re saying that, from an industry perspective, what researchers should be doing is giving industry some really useful, actionable things to do.

Drew: One thing they don’t talk a lot about in the paper, though they do hint at it (I think) in a couple of places without being aggressive about it, is the more cynical explanations for why safety culture has stuck around in research. I think part of that is when we try to improve safety, we want to be able to measure the improvement. We want to be able to say, okay, we improved this thing. Did that really have an effect on safety?

We’ve put in a new leadership program, but was that good for safety? We put in a new policy, we put in a new procedure, we put in a behavioral observation program. Was that good for safety? As these exact same authors have shown in other work, we can’t reliably measure that by the number of injuries going up or down. 

One thing that safety culture does do is offer us an apparently easy way to measure whether safety has gone up or down. Because we could say safety culture has improved, based on measuring safety culture using a survey. I think a lot of researchers and organizations fall into that lazy trap of needing to prove that things have improved, and this is something that will show that.

David: I want to extend that a bit, because I think there is a limitation in this paper that was quite pronounced in the last few pages, where the authors did talk about evaluating all of these individual pieces for their effect on “safety performance.”

Now, I’m going to use those two words in quotation marks because I think safety performance is a concept that is exactly the same as safety culture in a sense of we don’t have an agreed definition of what it is. There are a lot of different ways of thinking about, is performance injury rates? Is performance capacity? All these things. I think this paper did fall into the trap of what it was arguing against at the end of the paper by saying, we need to evaluate all these individual pieces on performance. 

I would’ve really liked this paper, and if I was peer reviewing this paper, I would’ve actually gone back to these authors and asked them, you now need to actually say what you mean by safety performance, or everyone’s going to run off and fall into the same trap, redesign a safety culture something, and call it a safety performance something, and start the whole cycle again. I think that was a gap for me because these authors have already argued mathematically that we can’t count injuries and understand the effectiveness of these safety practices. 

For me, the answer here is if I want to do a program to improve leadership, I need to have a way of evaluating whether leadership has improved. If I want to do a policy, why do I have a policy? Because I want policy to inform action. I need to have a way of evaluating whether or not I’ve improved my informing action in the organization. 

I think that you talk a lot, Drew, in research about the mechanism, and that’s really important because we can’t just say we’re going to evaluate auditing against safety, leadership against safety with no step in-between.

Drew: I’m with you entirely there, David. I think researchers have this little bit of utopia of, we can prove that we’ve actually improved safety. It’s too nebulous a concept, too undefined, to do that. Putting in replacements, saying we’re not going to claim we’ve improved safety, we’ve improved safety culture, doesn’t fix it. It just replaces one undefined concept with another one.

There is a risk that if we get rid of safety culture, people are still going to have that ambition to prove this ultimate effect, that we’ve made the world better. Then we recreate safety culture under another name.

David: I like the way you said the ‘we.’ We run around and we talk about safety capacity, or in this paper they talk about safety performance. If you want to mention safety performance, these authors should have then gone and said, by safety performance, this is the definition that we mean, because I think that’s a bit of a gap at the end. I’m sure [...] and mad about [...].

Drew: We might get into this a bit in the takeaways, but the trap with safety culture was trying to aggregate things, to put things together in a cluster when there was no utility or scientific reason to put them together. That’s the mistake we need to avoid. Not just safety culture itself, but that attitude of, let’s aggregate things for the sake of having a single metric, a single performance measure, even though they’re not the same thing that we’re adding together.

David: In terms of that definitional piece, I was reflecting on a few papers that we’d written, like the safety clutter paper. The very opening of safety clutter says, this is the definition when we use this word, and I think we did it with safety work and safety of work as well. In the papers that we’ve written together, and it’s your influence, we’ve always clearly defined the thing that we’re talking about.

Drew: That’s part of my attitude of be wrong clearly. Every time we’ve done that, I’ve come back later and said, I’m not happy with that definition I used. There are problems with it. But I would rather have said clearly what I meant and be wrong, than have hidden that wrongness inside me being vague.

David: Are we ready to talk about some takeaways?

Drew: Struggling a little bit with how to structure these, I’ve actually thrown in (I think) our first meta takeaway that applies to all of the takeaways. Whenever you attempt to use the term safety culture, and it doesn’t matter whether you’re a researcher, a regulator, or a practitioner, catch yourself using that word ‘culture’ and just go one step narrower.

When I said culture, what did I actually mean? Pick one thing that you actually meant, one specific thing, and then keep the conversation going about that one specific thing.

First takeaway, if you’re talking about culture in your organization, talk specifically about what you’re referring to. If what you really mean is leadership priorities, then talk about leadership priorities. If what you really mean is you want people to care more about safety, then say you want people to care more about safety. If what you mean is you want more conversations about safety, say you want more conversations. 

If you are talking about culture instead of those things, at best it’s just a metaphor which is making things more vague. But I suspect at worst it’s really a euphemism to avoid the conflict of saying, what I actually mean here is bad leadership. Or what I actually mean here is bad behavior. Be specific about what you want to say. If you’re uncomfortable saying that, maybe you’re saying the wrong thing. But don’t just use culture as a bandaid to make it more vague.

David: I think adding an extension to that, I just thought, never ever have it as a cause or a corrective action for an incident in your business.

Drew: Oh yes, we need to have a better culture. Sorry, the very first accident report I ever wrote in anger spent its first paragraph blaming bad culture. We talk about utility to business. I didn’t have any. I helped no one by using that as an explanation.

David: Explaining away, yeah. Number two, Drew.

Drew: This one is for researchers. Stop researching safety culture. If you’re tempted, read this paper and then go and read our own manifesto of reality-based safety science, which offers some alternatives for how you can still study what you want to study while getting away from thinking that the starting point is culture.

One suggestion might be, ask yourself what data are you actually going to look at? Because if you’re genuinely serious about studying culture, then the only way to do that is with deep anthropological or ethnographic methods. You can’t study culture by doing surveys. You can’t study culture by looking at structures and policies. If you don’t want to do the deep work, then don’t say you’re studying culture, say what you want to study. 

If you do want to do the deep work, then you still need to be more specific. You can’t just wander into a company and say, I want to study company culture. That’s like a marine biologist going into the ocean and saying, I want to look at things that live in the ocean. Or an engineer doing a PhD about making stuff. Don’t define your research starting with vague words like that. Be precise, be narrow, be specific about what it is that you actually want to look at.

David: Then for our friends in our regulatory offices?

Drew: I noticed, David, you’re making me say all of these.

David: Oh no. You just fleshed them out so good.

Drew: I’m happy to say this directly to CASA, our regulator here for aviation safety. I’ve been doing work in the aviation space at the moment and ran into one of these requirements, and a company trying to meet it. If you are a regulator and you are trying to regulate culture, stop it. It should be a priority to get those words out of the regulation, because it is not the job of a regulator to try to lead the state of knowledge in the requirements they give to companies. That’s not how regulations work.

It was a dumb thing to do, to take a concept that had not yet been scientifically defined or studied and do what the nuclear regulator did, and then the aviation regulators did, which is to mandate companies to measure and manage something that the regulators couldn’t even define. 

Now, it’s an understandable mistake, because it’s very easy to think you’ve got a grasp on something, to think you’ve defined it, and then later realize that actually you hadn’t. This is one of those cases. It looked clear, it looked simple, it looked like it was well-defined, it looked like it was important, but we do not understand it enough to manage it, let alone to regulate how people manage it. We just can’t.

David: And I think that’s a really good point. There’s a comment in the paper about the assumption benefit of doubt, assumption to regulators, that they actually assumed that the topic was probably more well-defined and more advanced than it actually is. 

Drew, the question that we asked this week was, is it time to stop talking about safety culture?

Drew: Yup.

David: Okay. That’s it for this week. We hope you found this episode thought-provoking, and ultimately useful in shaping the safety of work in your own organization. Send any comments, questions, or ideas for future episodes to feedback@safetyofwork.com.