In this episode, we dig into conflicting research, question the role of data, and offer practical insights on transforming safety from a compliance burden into a strategic asset. Instead of a research paper, we look at an upcoming September-October 2024 Harvard Business Review article, "Safety Should Be a Performance Driver: It's more than just a compliance issue," by Vikas Mittal, Alessandro Piazza, and Sonam Singh.
We challenge the notion that high injury rates are punished by market forces, as we dig into this article that posits the opposite: that safety should be a performance driver. Our analysis dives deep into the credibility and methodologies of the article, emphasizing the critical role of peer review and the broader body of knowledge.
We'll also scrutinize the use of data as rhetoric versus evidence, focusing on the transparency and rigor of research methods when interviewing executives about safety practices. Is safety merely seen as a compliance issue or a strategic investment? We dissect the methodologies, including participant selection and question framing, to uncover potential biases. Finally, we critique a proposed five-step process aimed at transforming safety into a competitive advantage. From aligning on the meaning of safety to incentivizing employees, we expose significant gaps in academic rigor and alignment with established safety literature.
This conversation serves as a powerful critique of superficial analyses by those outside the safety science domain, offering listeners critical insights into the complexity of safety management and its potential alignment with organizational goals.
Quotes:
“The trouble is, then we don't know whether what they're referring to is published research that might be somewhere else that we can look for the details, or work that they did specifically for this article, or other work that they've done that was just never published.” - Drew
“We've got to be really careful…this is using data as rhetoric, not using data as data.” - Drew
“I wouldn't be surprised that most people see safety as both a cost and as an outcome.”- Drew
“So you've got two-thirds of these companies that don't even have any safety metric, like not even an injury metric or anything that they monitor.” - David
“So we kind of assume business performance means financial performance, but that in itself is never clarified.” - David
Resources:
The Article: Safety Should Be a Performance Driver
Episode 121: Is Safety Good for Business?
The Safety of Work on LinkedIn
David: You're listening to the Safety of Work podcast episode 124. Today we're re-asking the question, is safety a key value driver for business? Let's get started.
Hey, everybody. My name is David Provan, and I'm here with Drew Rae. We're from the Safety Science Innovation Lab at Griffith University in Australia. Welcome to the Safety of Work Podcast.
In each episode, we ask an important question in relation to the safety of work or the work of safety, and we examine the evidence surrounding it. Drew, I think that might be the first time in 124 episodes where we've changed that introduction and deliberately mentioned re-asking the question. Would you like to give a bit of context to that change?
Drew: Sure. In episode 121, we talked about a paper which analyzed whether good safety performance improved the viability of a business. I'll be frank, we were fairly complimentary about the paper. The authors got back to us on LinkedIn, and were pleased with how we talked about the paper. But just as a reminder, the conclusion of that paper was that market forces on average tend to punish rather than reward companies with a low number of injuries.
That's not a definitive answer, and there's still some remaining uncertainty about why that might be the case. We used that paper to make the argument, or at least to call into question this common idea that good safety is good for business. Of course, the fact that it is a common idea means that there are papers out there that make the exact opposite argument. Pretty soon after we talked about this paper, which says that maybe good safety is not good for business, we saw this article drawing attention saying the exact opposite.
The article itself doesn't quite make the argument that safety is necessarily good for business, but that's certainly the way it's being talked about, as if this paper is evidence that safety is good for business. As we go through the paper, we'll get into a bit of nuance about what they say and don't say, and what they directly claim versus what they imply.
Given that this is a paper that is drawing attention to say the exact opposite to a paper that we'd reviewed, we thought it was a good opportunity to talk about how we select our own papers, how we critique papers, and how you might look a little bit skeptically at the evidence.
When you have these contentious issues where different people are saying different things, what does good evidence look like? Bottom line up front, this episode is a cautionary tale, but we'll try to be fair as we go through it and explain just how we look at papers, how we see papers, why we're happy to stand by episode 121, and why you might be a bit skeptical about this new paper making the rounds.
David: Drew, let's introduce the paper and then let's talk a little bit about it. There are a bunch of things that we want to talk about today, about scientific research, publication, research methods, and those types of things. This popped up in my LinkedIn feed about 4 or 5 times in a 24-hour period and drove a lot of really positive commentary around it. It's an article that's due to be published in the September-October edition of the Harvard Business Review.
The HBR we may all be familiar with, at least in the Western professional world. It's essentially a professional magazine for business leaders and business professionals. I don't actually know their publishing process or whether they have any kind of peer review, but we've featured research before that has been published in professional journals through the safety professional associations and so on, such as Professional Safety in the US, published by the ASSP.
So this is a Harvard Business Review. The title of the paper is Safety Should Be a Performance Driver. Three authors, Vikas Mittal, Alessandro Piazza, and Sonam Singh. They're all marketing and strategy academics based in the US, so they are professors and researchers in academic institutions. They have done some research in safety in the past, at least in terms of some of the studies that they talk about and call into as evidence in the paper.
It's going to be published soon, and you might've seen it doing the rounds. Today we're going to talk a little bit about the paper and maybe what it can and can't tell us about whether safety should be a performance driver.
Drew, is there anything else you want to say before I add to the high level structure, just about authors and research publications?
Drew: Okay. A couple of things I'll say here. When we would normally do this episode, we would be talking a little bit more in detail about the author's background, what they had published, and why that might give credibility to the types of arguments they make. This is going to be a little bit of a counterexample because this is not work which is published in a safety journal. It's not in fact even work that is published in a management journal.
The particular paper we're looking at is not work that's been peer-reviewed at all. It's based on some work that has been peer-reviewed, but not peer-reviewed in a safety venue. The risk of that is that there may be concepts that are well understood in the discipline of safety that these authors aren't aware of. Not because they're not respectable researchers, not because they don't have distinguished careers, not because they're not very good in their own fields, but simply because they're unaware of what the safety literature says, and maybe that draws into question some of the ways they're going about their research and the types of evidence they're selecting.
The reason authors matter and peer review matters is that peer review is not there to screen against nefarious people. It's not there to screen against dodgy people. It's there to make sure that the right things have been thought about, that the work is grounded in and embedded within a wider body of knowledge that should inform these sorts of questions. That is obviously not the point of the Harvard Business Review. It's meant to be presenting stuff for executives, but it should be based on the best research in its field.
This is a bad job by the Harvard Business Review, picking these particular authors to talk about this particular topic and present it to business executives. They are not doing a service to academia, to these authors, or to the executives by drawing attention to this particular work and putting it in front of company executives.
David: I think to highlight one of the points here, for our listeners who clearly have an interest in safety science if you're listening to the Safety of Work podcast: when we get to what the paper suggests as the action plan to make safety a strategic value driver for business, you might see where the authors' potential gaps in knowledge of the safety science literature have produced what we think, or what I think at least, and I'm interested in your thoughts when we get there, Drew, is a fairly weak and transactional compliance-oriented safety program.
Drew, there are three parts to this paper. It's designed to be accessible reading in almost an essay-type format, as people familiar with the Harvard Business Review will know. There are no citations and no references, but the authors do quote numbers and certain conclusions from research without actually saying where that research is or how it was done.
The paper is split into three parts. The authors lay out the current problem for safety in organizations, then they lay out the evidence that connects safety to business performance to try to make that really clear. The third part is a five-step action plan to essentially get there, to make safety that strategic value driver for business performance. Drew, do you want to start with the background and we'll talk a little bit about the current problem?
Drew: Sure. I'm going to start with a little bit of a quote from the paper. "And yet considerable evidence suggests that most companies mismanage safety. For all the policing and investment in compliance, products are frequently recalled and workplace accidents continue to happen.
In 2023 in the United States, 3300 recalls affected more than 135 million products, a high not seen since 2016. US employers reported 2.8 million injury and illness cases in 2022, a 7.5% increase over 2021. The United States saw 5486 fatal work injuries in 2022, the equivalent of one death every 96 minutes. Why aren't companies doing better on safety?"
David, peer reviewer, go. How many statements in there would have you, as a peer reviewer, with your eyes lighting up, thinking, nope, you can't say that?
David: We're conflating lots of things here. The first sentence about companies mismanaging safety needs a considerable number of citations to make that claim that most companies mismanage safety, and then it needs to lay out a very strong argument. That would be a paper in itself, trying to validate that one sentence. We start talking about product safety and product recalls. We then start talking about injuries, or minor injuries and illnesses, and then we start talking about time-bound fatalities.
Drew: In your experience, do companies manage consumer product quality using the same systems and processes as they use to prevent worker injuries, for the type of consumer products that get recalled?
David: Very unlikely. Product safety is often much more a quality function disconnected from the workplace health and safety function in an organization. Particularly what we'd be talking about here, which is manufacturing companies, there would be regulatory compliance and quality teams that would be managing those aspects of product safety and compliance. Why that's part of the workplace safety paper, I'm not sure.
Drew: Any particular reason you can think of that there might have been more injuries in 2022 than in 2021?
David: People might have still been at home with Covid, although most of the US was back at work at that point in time. We saw a whole bunch of things happening in industry: shifts going on with turnover of workers, and different supply chain arrangements. Even if we look at some of the other papers we've covered, like the work on the statistical invalidity of TRIR from Matthew Hallowell and the CSRA at the University of Colorado Boulder, we know that that 7.5% increase is meaningless. We know that it doesn't carry statistical significance.
Drew: We're being a little bit mean and unfair here, because this is a business review article. The purpose of this introductory paragraph is not, in fact, to present peer-reviewed evidence. It is to make a rhetorical point to introduce the article. But this is what we've got to be really careful of when we start sharing these sorts of things on LinkedIn and using them as evidence that we're putting in front of managers: this is using data as rhetoric, not using data as data.
In the introduction, when you're setting the scene and you're just trying to say, hey, safety is a problem, we could do better, there's not a lot of harm caused by relying on rhetoric, by cherry-picking your evidence, by making it all sound good. But the deeper we get into the paper, the further this argument gets pushed and starts trying to influence the way people think about things, the bigger the problem with this use of data as rhetoric becomes. I just wanted to point it out here, because the introduction is just really egregious in terms of things that peer reviewers would not let you get away with.
David: Drew, let's talk a little bit about methodology so we know what we're talking about. There is research presented in this paper, as well as a bunch of other studies that are referenced. These authors wanted to understand how companies manage safety.
To do that, they interviewed 76 executives from about 30 companies across different industries, including distribution, education, healthcare, facilities management, oilfield services, finance, IT, and manufacturing.
We don't know what those 30 companies are, and we don't know the mix across industries. We don't know how many executives came from the same company, whether it was one from each of 29 companies and then 50 from the 30th company. We actually don't know anything more about it. I imagine that IT and finance companies are very different from oilfield services companies when we're talking about workplace safety.
Drew: I was a little bit struck by the "about 30." Again, no peer reviewer lets you get away with "about 30." What was the number? That's nitpicking. One of the difficulties in reading this article—this is not the fault of the authors at all—is that Harvard Business Review articles don't include a reference list. Guardian articles include reference lists, but Harvard Business Review does not.
The trouble is we don't know whether what they're referring to is published research that might be somewhere else that we can look for the details, or work that they did specifically for this article, or other work that they've done that was just never published. In a few cases, I've tried to dig up where it is. In the case of these interviews, I cannot find anywhere that they've been published.
One of the authors recently published a book that is based on CEO interviews, but that wasn't about safety. If safety came up in those interviews, it was not the purpose of the interviews.
This is what you really need to know to understand data coming from an interview. How did they pick these people? How did they conduct the interviews? Who conducted the interviews? What were the topics and questions of those interviews? How was the data analyzed afterwards?
If you're going to make a claim based on interviews, 76 interviews is a lot. You could do a PhD thesis out of that if you did it properly and reported it properly. But we don't even know if these interviews were about the management of safety. They could have just been about other stuff, where they threw in one question or a couple of the interviews touched on this.
This is why when we review papers, we look at the methods and we care about the methods. Without knowing how the interviews were conducted, you just have to assume that all of the data was prompted by the way the researchers conducted the interviews and does not reflect what the CEOs actually believe, let alone how safety is actually practiced.
Interviews can be good evidence, but as a critical reader, when someone says they've done a lot of interviews, unless they tell you the details, you cannot believe what they say about those interviews. The verification is not in the claims; the verification is in how they did it, explaining how they did it, and having peer review around those processes.
David: Drew, I think that's a nice cautionary note. When we don't know much about the methods, or we have big question marks over the methods, then I think the starting position needs to be to discount any conclusion that has been drawn, because you can't validate how those conclusions were arrived at. You need to be really cautious about being confident in the conclusions that are being presented.
The starting point for this paper, in terms of the current situation based on these interviews, is that the authors suggest that currently, at least for these executives interviewed, they frame safety as a compliance issue, not as a strategic business issue. They see safety as a cost rather than as an investment in improved business performance. They treat safety as an abstract value for the organization rather than a really deliberate strategic lever.
What tends to happen is that organizations and executives respond to safety crises and incidents with financially and managerially unsustainable measures that are aimed at managing public image more than resolving the underlying issue.
Pretty harsh assessment of the current state of play in terms of these executive interviews. I'd be incredibly surprised if executives were talking like this, Drew, but that's just my own experience. Maybe the executives that I talk to talk a bit differently about safety than this.
Drew: I was going to ask you about that, David. How they got to this point is questionable, but this doesn't strike me as immediately implausible. It's certainly not very nuanced. Framing something as a compliance issue could mean lots of different things. I'm sure there are times when executives do in fact look at safety through that compliance, are-we-meeting-our-responsibilities, are-we-governing lens. That could range from a genuine concern about governance to nitpicking about compliance. It's hard to tell.
I wouldn't be surprised that most people see safety as both a cost and as an outcome. I'm pretty sure most organizations do in fact report injury rates up to board level and are concerned about those going up and down, which immediately is more than just safety as a cost. Certainly, I'd be willing to accept the underlying point that they're really trying to make. People don't necessarily see safety as a driver of other types of performance.
David: Drew, I'll just jump to some conclusions here and I'm keen to test this with you a bit. I'm going to make an assumption that these executives are all based in the US, although the paper doesn't specifically say that. We know that compliance is really important for US-based businesses and US-based executives. That makes sense. It does surprise me that an executive would say that we focus on managing our public issues rather than fixing safety problems in our business. I'm not sure that would be how executives would talk about this.
The last point I'll make is about another claim that's made later in the paper. If you were going out and trying to find executives to talk about safety, would you go to finance, IT, and education consulting businesses? Because there's another stat later that says only about 30% of these companies actually measure one or more safety performance indicators.
You've got two-thirds of these companies that don't even have any safety metric, not even an injury metric or anything that they monitor. It just doesn't seem like the people you'd want to talk to if you wanted to understand this issue. There are just a few assumptions I'm making here that seem strange.
Drew: That one about safety metrics makes me very suspicious that the interviews that they're talking about are actually from an unrelated study.
David: I think this is a business strategy set of interviews. Again, if you ask an executive what some of the most significant costs are that they have to manage as a business, or the most significant compliance issues, they might start talking about safety, and then you go, oh, they're framing safety as a compliance issue. I've asked them what their big compliance issues are and they've said safety. This is where I think this data is coming from, in an indirect way.
Drew: Likewise, you ask them about what are the big PR issues that you're concerned about. What would be a big PR crisis for your company? They answer safety. Then you report back, oh, they're framing safety as a PR issue.
David: They don't even have to mention safety. They might mention something like product recalls. The product recall becomes a public image issue, product recalls become conflated with workplace safety by the authors, and all of a sudden safety in general is a PR issue. Yeah, I don't know where to go with the starting point. In any case, should we move on a bit?
Drew: Yeah. Just before we do, I do want to reinforce the point here that we don't like to pick up papers just to dunk on them. Really, the message that I would like to get across is that this is what you should be doing when you read articles like this.
In your own head, think about what question would have produced this answer. The way to get a healthy degree of skepticism about some of the claims in these articles is by playing the game that David and I are playing now, which is trying to speculate: What might they have done? How might they have conducted these interviews? Who might you have asked that would realistically give you these answers and these numbers? It's what a skeptical reader should do.
David: The quote in the paper is "To help companies get out of this rut, we present evidence that safety can be a key driver of performance." That becomes the main claim. Safety can be a key driver of performance. Drew, do you want to talk a little bit about the key claim of this paper?
Drew: Okay. The first thing to note here is that if you take it literally, this is an incredibly weak claim that no researcher would actually research. Safety can be a key thing. Okay, all you need there is just one example where safety once was a key driver of performance, and that statement would be literally true.
But that's not what the authors want you to take away from this article. What they want you to take away, and what they specifically lead you towards, is that they think they've got a recipe you can follow under which safety will be a key driver of performance, and moreover a positive driver of performance.
You've got to be careful. Do they support the claim that they're making? Is that really the claim that they want you to believe, or are they actually trying to sell something that is much harder and deserves a much higher standard of evidence: that following a particular formula will lead you to better safety and therefore to better performance? That is a big claim that needs really solid evidence, directly testing that process across a range of companies, to be able to make.
David: Interestingly, Drew, as well, the paper never once says what they mean by performance. It's in the title; they say performance. At times they talk about customer satisfaction performance, at times sales growth performance, at times profit margin performance, at times shareholder return performance, and at other times company valuation performance. They talk about at least five different potential aspects of business performance, but they never once clarify which kind of performance they're actually saying safety connects to. We assume business performance means financial performance in some way, but that in itself is never clarified.
Drew: Contrast that with the paper we reviewed in episode 121, where they really pinned it down and they said, okay, this is exactly what we mean by safety, this is how we're measuring safety, and this is exactly what we mean by performance. This is how we're measuring performance.
You might disagree with either of those. You might disagree with how they measured safety, which was purely by outcomes. You might disagree with how they measured performance, which was purely by short-term and long-term survivability of the company. But because they've made really concrete claims, they can really back up those claims. You can ask those skeptical questions. These slippery, undefined claims that slide all around the place are really hard to test and make it really easy to cherry pick evidence against.
David: I mentioned just before the 30%-35% of companies that monitored safety metrics. They also conclude that of the people they interviewed, 94% claim that their company views safety as a core value. Only 17% or 18% of those people said that safety was explicitly part of their company strategy.
Again, we've got only about one in five executives saying safety is part of their company strategy. If you're conducting a safety interview, and only one in five people say safety is part of our company strategy and only one in three say we actually monitor safety performance, it feels to me as a researcher, Drew, that if you started finding out these things, you'd be going, maybe I'm speaking to the wrong people if I want to understand the connection between safety and business performance.
Drew: Hey, David, what's 94% of 76 interviews?
David: It says companies, so it's 94% of the "about 30" companies. But what happens if you're interviewing two executives from the one company, and one says safety is a core value and one says it isn't? Is that one person out of 76 who doesn't?
Drew: It's not a whole number. You can't get those statistics out of the number of people they claimed that they interviewed.
David: Seventy-three out of 76 people or 74 out of 76 people? I don't know.
Drew: Something like 71-⅓ people said it.
David: Excellent. They're rounded somehow. Seventeen percent said that safety was part of their company strategy, and 35% monitored safety metrics. Okay.
Drew: Yeah. These are things that peer review would pick up and ask you to pin it down. Maybe there is a reasonable explanation for this. Maybe there are particular decisions that they've made, interviews that they've included or not included, people that they didn't count, maybe they deliberately didn't double count if they had two people from the same company. Or maybe they just made these statistics up, and there's no way to verify which it is when you don't have the level of detail.
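Drew's whole-number check is easy to reproduce. As a rough illustration (the sample sizes of 76 executives and "about 30" companies come from the article; everything else here is our own back-of-the-envelope arithmetic, not the authors' method), a few lines of Python show that 94% cannot be an exact fraction of 76 interviewees, but could plausibly be a rounded fraction of roughly 30 companies:

```python
# Back-of-the-envelope check of the percentages quoted in the article.
# The sample sizes (76 executives, "about 30" companies) are from the
# article; the function and tolerances are our own hypothetical sketch.

def whole_number_counts(percent: float, n: int, tolerance: float = 0.5) -> list[int]:
    """Integer counts k out of n whose percentage falls within
    `tolerance` points of the quoted figure."""
    return [k for k in range(n + 1) if abs(100 * k / n - percent) <= tolerance]

print(94 * 76 / 100)                      # 71.44 -- not a whole number of people
print(whole_number_counts(94, 76, 0.05))  # [] -- no count out of 76 gives exactly 94%
print(whole_number_counts(94, 30, 1.0))   # [28] -- 28 of ~30 companies is ~93.3%
```

If the "94%" is computed per company rather than per interviewee, the rounding roughly works out, which is consistent with Drew's suspicion that the statistics don't line up with 76 interviews.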
David: There's a bunch of other research that gets pulled in here. Drew, one of the studies they talked about was a media study. You know how these media companies do their own polling and surveys. They called on this media study that found that 97% of workers thought that physical safety at work was important, but only 54% of those workers believed that their employers shared the same opinion.
This is what makes me think that two of these authors are marketing professionals. They go to these media studies, marketing studies, and things like that, but we know nothing about this. We don't know if they were speaking to teenagers working at McDonald's or whether they just did a random survey of 50 people who answered their mobile phones. We don't know anything about this research.
Again, it's not citing peer-reviewed studies. I don't even know why they needed to bring this claim into the paper. I think they were maybe trying to reinforce the current situation they were describing.
Drew: I don't have an answer to that. I wasn't sure exactly which study they were referring to at this point, and it really didn't seem to support their claim that safety improves performance: that satisfied customers result in increased sales, and supposedly better safety results in more satisfied customers.
But they're measuring safety in that case by product recalls. If your company doesn't produce defective products that have to be recalled, that results in more sales. Okay, I can believe that, but it says nothing about whether safety, and investment in safety, improves company performance.
David: There's a bunch of things here to make this performance claim. They talk about one study where safety was associated with a 9% lift in customer satisfaction. That becomes very dubious, because we talk a lot on this podcast about what's the mechanism. Of all the variables that could cause a 9% lift in customer satisfaction, how has that study isolated a change in safety as the variable that's associated with it? And associated, not caused. That's a loose claim.
They go on to say that customer satisfaction creates a 13% increase in sales, that companies pick contractors with better safety training, and then they make this generic claim that we know that customer satisfaction is strongly associated with sales growth, we know that sales growth is associated with bigger margins, and we know that bigger margins are associated with shareholder returns. Therefore, if you do safety, you just become a better business. I don't think there's anything that compelling in that whole part of the paper that would say clearly that safety is a driver of business performance.
Drew: I'll be honest, I reckon the paper we reviewed, episode 121, the one that claimed that safety was not a driver of business performance, actually outlined the argument that it was a driver of performance better than this paper makes the argument, even though it's arguing in the opposite direction rather than in the same direction. Of course, there is an argument that safety is a driver of business performance, and there is an argument that it's not. Both arguments are plausible. That's why you need good studies to disentangle them. You can't just cherry pick evidence at each step along the argument, and therefore claim that you know what the answer to that difficult problem is. You've got to actually test the question that you're claiming to have an answer to.
David: I think at this point, personally, and I think I mentioned this in episode 121, I'd really like to believe that there are strong and clear ways that doing the things that improve safety are also going to be useful things to improve operational performance and business performance. I genuinely believe that.
My part of the critique of this paper is not necessarily about trying to defend a position from 121, about don't do safety because you'll go out of business. I just think that this isn't the research that makes that case. I don't want to go on record as "David thinks safety isn't good for business." It's not necessarily what I think; it's just that this paper doesn't scientifically demonstrate that it is.
Drew: My intuition is fairly similar to yours and I think very aligned with the authors of this paper, actually. I think that it can go either way, that depending on how we do safety, sometimes it will be good for business and sometimes it won't be. It is important that we think about how we do safety, not just that we do safety.
In terms of that broad conclusion, my intuition goes absolutely this way and absolutely against episode 121. But if we're going to be evidence-led, then we need to use evidence to actually securely build our arguments and business cases, not as a rhetorical trick.
David: Drew, if we go into the final bit of the paper, we might reinforce some of the practical takeaways. This article in the Harvard Business Review proposes a five-step process to turn safety from a constraint into a source of competitive advantage. I didn't like the lack of academic rigor of this paper, but I could generally agree with how the authors were laying out some of the current state issues and some of the ways that safety might be connected to business.
When they called this five-step process that I'll talk about now as a fundamental rethink of safety, this is where we start to really see the gap in the understanding of the body of literature in safety.
They say there's a five-step process to make safety a strategic advantage. Number one is align on the meaning of safety. The example they use is, being clear to your organization that safety means zero injuries, for example.
The second is agree on metrics. They don't give you specific examples. They say don't have too many because it's hard to manage more than a few things.
The third is you need to anticipate and prevent problems. The examples they use for anticipating and preventing problems are about compliance programs: doing more inspections, defining behaviors and monitoring compliant behaviors before they become incidents, giving nurses torches so they can see bed sores on patients, and giving people some safety equipment. Very compliance-driven and behavioral.
The fourth is target training, giving people training. One example they use is giving people online compliance training that's relevant to the behaviors and tasks they need to perform in point three.
And number five is to incentivize employees. One of the examples they give is a company that went around identifying people who were wearing compliant PPE and entering them in a draw for a monthly gift card reward.
Drew, that's the five-step process in the Harvard Business Review to make safety a source of competitive advantage and a fundamental rethink of workplace safety.
Drew: David, I almost feel like putting this out as a quiz with a prize for our regular listeners: which episodes line up with these things, which of these claims are true, which are ambiguous, and which ones the evidence is squarely against? Because some of these are topics that we have covered, and in every case, at the very least, the literature is deep and complex around something that they are trying to make out as having a simple answer that they know. In at least two of those cases, the deep and complex literature also points squarely in the opposite direction to what they are recommending.
David: Yeah, in one of the early episodes, I think it was 12, we talked about whether zero injuries as a vision is good or not good for safety, around the zero accident vision. We also did an episode on the meaning of safety culture and how divided we are around that. We've definitely done some metrics episodes on whether leading indicators are better than lagging indicators.
We've definitely done things about compliance, behaviors, and rules. We did the manual handling training one, which looks like it's going to be referenced in the Australian Institute of Health and Safety body of knowledge on manual handling training, showing that the training isn't that good. We did one episode on rewards as well, about rewards actually driving performance down. We may have episodes that suggest that none of these five steps is a useful thing to do if you want to take an evidence-based approach to safety.
Drew: David, there is one that I wanted to pull out. I said I didn't really want to dunk on the authors. This one was actually peer-reviewed and published by these authors. We do know the detailed methods behind their claim that compliance training is effective. David, I don't think you've read this one, so I want your live reaction to how they did this study.
What they did is they noticed that when a company has a major accident, that major accident is followed by an uptick in compliance training. Therefore, they say, this is an opportunity for a natural experiment. We have an environment that is not affected, they claim, by anything else, except for the fact that naturally, we have some situations where training has gone up. What they noticed is that when you have a major accident, it is followed by an uptick in training, followed by a decrease in hazard reporting, therefore, the logical explanation is that compliance training improves safety.
David: I saw those numbers in the paper about a 10% increase in training resulting in a 6%–10% decrease in hazards in the workplace or something. I didn't go dig it out, but Drew, that is a nonsensical line of reasoning. I'm sure any one of our listeners could pull that apart.
Drew: Yeah. The point I want to make here is that it is a fairly thorough paper. They have done their best as marketing researchers to disentangle the other explanations, to think through other things that might have affected this, and to rule out other ways in which an accident might influence hazard reporting after the accident.
They've done that in isolation from, and in ignorance of, everything we know in the literature about how organizations respond to accidents, the relationship between hazard reporting and actual safety, the relationship between hazard reporting and accidents, and the effect that training has on hazard reporting, all of these connections that are discussed deeply and in depth in the safety literature and that they just don't know about because it's not their field. They're dipping their toes into deep waters they're simply unaware of, because they're experts in something different.
This is why peer review exists. This is why we have disciplines and why research gets siloed and isolated. There are problems with peer review, there are problems with research getting siloed. It's great to have interdisciplinary work.
Much as I like to dunk on safety science sometimes, safety science is a field of expertise, and it's filled with people who have spent years and years studying these problems. It's just not so simple that you can come in from the outside, unaware of all the work that's previously been done, and expect an easy answer.
David: Yeah, Drew, that's good. Is there anything else you want to say about the paper before we move on? I've got a couple of practical takeaways I wouldn't mind bouncing off you.
Drew: No, absolutely. Let's get on to practical takeaways.
David: I think the first one has probably been pretty clear through this episode: as professionals and practitioners, we need to critically review any literature, even literature that we'd really like to believe. We all have our own biases, so we may even need to be more critical when we read something that immediately aligns with something we'd like to believe. We still can't be lazy in our critical review, particularly of methods, results, and claims.
How was the research done? How were the results of that research analyzed? What conclusions have the researchers drawn as a result? That needs to be robust and rigorous. You can do that yourself in some ways, or, as Drew said, you can lean more on the peer-reviewed literature by authors who are experienced in the field they're researching.
Drew: I'd certainly agree with that. I would say be particularly careful of this gray space between popular press literature and academic literature. At first glance, this looks like academic work. Harvard Business Review sounds academic, and lots of journals are called such-and-such Review, but this is not a journal. This is a glossy magazine for industry professionals.
But you might be deceived, because the authors are all published researchers. They're all professors at reputable places. They're just not writing in the genre of an academic article. This article has lots of literature-style near-citations. It looks like it's based on research, but again, you don't have the references. You don't have the opportunity to check it out for yourself. It's a space where it's easy to look and sound academic, to look and sound evidence-based, but it just isn't.
David: Drew, the second one, thinking about when I read this, is to be cautious of really big claims in the literature. This paper claims that safety is currently being mismanaged and that safety is a driver of business performance. These are really big and bold claims. In my experience, and I'd like to test this with you, our scientific understanding evolves a lot more piece by piece. Particularly in the social sciences and the organizational sciences, these big claims have very rarely been proven comprehensively.
Drew: I'll go further. It's not even piece by piece like building a wall where each brick makes the wall higher. It's like we're chucking stones at the wall, and over time it gradually builds up. Any given claim might be contradicted, then remade, then contradicted, have evidence that indicates it, and then evidence that indicates the other. It's a gradual accumulation of evidence, not even just a bit by bit, this is true, this is true, this is true, this is true.
David: The third one here, Drew, is that we know workplace safety, or managing safety, is a complex undertaking. If it were something that could be done with a five-step process, then for all of the effort, resources, and attention that companies have put into safety, it would be largely solved by now. Beware of any three-step or five-step linear process for achieving anything complex; it should probably be immediately discarded. I don't think we can write a five-step process for turning safety into a strategic driver of business performance.
Drew: If all of these companies are supposedly getting safety wrong, I'm pretty sure a lot of them have got five-step processes. What is the chance that this latest new shiny one is just going to suddenly do the job? And this one isn't particularly new or shiny; these are ideas that have been batted around for a long time. Turning them into a five-step process doesn't make them any more true.
David: I'm with you on that. Having a vision, having a metric, having a compliance and behavior program, giving people online training, and giving people gift card rewards every month, that's not a five-step process for safety.
Another one that I don't have written down here, Drew, is to make sure you've got good relationships with the executives in your organization, because your executives are going to read this stuff, and they're going to read it even less informed than the authors who wrote it. They're going to think it's the next answer for your organization. They may even come to you and say, we need to do what's in this paper. You need to have the relationship, the confidence, and the ability to talk your leaders through where this thinking might be insufficient.
Drew: The ideal case is you've already got a good relationship with your executives. They're going to come to you and ask you, hey, I saw this article, what do you think? It may be that you agree with a lot of the suggestions, or you agree with the general message, but just be careful about when you use dubious evidence to support things that you want to say anyway, because it's going to come back and bite you next time you need to use evidence to try to change someone's mind.
David: Any other takeaways, Drew?
Drew: I know we've got a lot of researcher listeners and a lot of early career researcher listeners. I don't want to say, stay in your lane. Instead, I'd rather say, approach opportunities to share your work and to publicize your work with the appropriate level of confidence and humility based on knowing what your expertise is. Be proud of the work that you've done, and don't be afraid to put yourself out as an expert in a space where people knowing less than you are going to hold themselves out to be experts.
Equally, sometimes you just have to turn opportunities down, even if it's an exciting opportunity, even if it's a big publication, even if it's a big conference, and just say, I'm not the best person to do this. Maybe I can direct you to someone else who knows what they're talking about in this area and can give you what you need, but with more credibility.
David: Good point, Drew. So the question that we re-asked this week was, is safety a key value driver for business?
Drew: After one more paper, the short answer is we still don't know. We've had a good paper and we've had a bad paper. We do have a strong intuition, probably more an intuition than knowledge, that some of the things we do to improve safety do have strategic value for our company, and that some of the things we do to improve safety might not even improve safety, let alone have other strategic value for the company. But we can't make the strong claim that good safety is good for business, or that we know how to neatly align safety with business success, because we just don't know the answer to that.
David: That's it for this week. We hope you found this episode thought-provoking and ultimately useful in shaping the safety of work in your own organization. Send any comments, questions, or ideas for future episodes to us.

David: You're listening to the Safety of Work podcast episode 124. Today we're re-asking the question, is safety a key value driver for business? Let's get started.
Hey, everybody. My name is David Provan, and I'm here with Drew Rae. We're from the Safety Science Innovation Lab at Griffith University in Australia. Welcome to the Safety of Work Podcast.
In each episode, we ask an important question in relation to the safety of work or the work of safety, and we examine the evidence surrounding it. Drew, I think that might be the first time in 124 episodes, where we've changed that introduction and deliberately mentioned re-asking the question. Would you like to give a bit of context to that change?
Drew: Sure. In episode 121, we talked about a paper which analyzed whether good safety performance improved the viability of a business. I'll be frank, we were fairly complimentary about the paper. The authors got back to us on LinkedIn and were pleased with how we talked about it. But just as a reminder, the conclusion of that paper was that market forces on average tend to punish rather than reward companies with a low number of injuries.
That's not a definitive answer, and there's still some remaining uncertainty about why that might be the case. We used that paper to make the argument, or at least to call into question this common idea that good safety is good for business. Of course, the fact that it is a common idea means that there are papers out there that make the exact opposite argument. Pretty soon after we talked about this paper, which says that maybe good safety is not good for business, we saw this article drawing attention saying the exact opposite.
The paper itself doesn't quite make the argument that safety is good for business, but that's certainly the way it's being talked about, as if this paper is evidence that safety is good for business. As we go through the paper, we'll get into a bit of nuance about what they say and don't say, and what they directly claim versus what they imply.
Given that this is a paper that is drawing attention to say the exact opposite to a paper that we'd reviewed, we thought it was a good opportunity to talk about how we select our own papers, how we critique papers, and how you might look a little bit skeptically at the evidence.
When you have these contentious issues where different people are saying different things, what does good evidence look like? Bottom line up front, this episode is a cautionary tale, but we'll try to be fair as we go through it and explain just how we look at papers, how we see papers, why we're happy to stand by episode 121, and why you might be a bit skeptical about this new paper making the rounds.
David: Drew, let's introduce the paper and then let's talk a little bit about it. There are a bunch of things that we want to talk about today, about scientific research, publication, research methods, and those types of things. This popped up in my LinkedIn feed about 4 or 5 times in a 24-hour period and drove a lot of really positive commentary around it. It's an article that's due to be published in the September-October edition of the Harvard Business Review.
The HBR we may all be familiar with, at least in the Western professional world. It's essentially a professional magazine for business leaders and business professionals. I don't actually know their publishing process or whether they have any kind of peer-reviewed papers, but we've featured research before that was published in professional journals through the safety professional associations, such as Professional Safety in the US, published by the ASSP.
So this is in the Harvard Business Review. The title of the paper is Safety Should Be a Performance Driver. There are three authors, Vikas Mittal, Alessandro Piazza, and Sonam Singh. They're all marketing and strategy academics based in the US, so they are professors and researchers in academic institutions. They have done some research in safety in the past, at least in terms of some of the studies that they talk about and call in as evidence in the paper.
It's going to be published soon, and you might've seen it doing the rounds. Today we're going to talk a little bit about the paper and maybe what it can and can't tell us about whether safety should be a performance driver.
Drew, is there anything else you want to say before I add to the high level structure, just about authors and research publications?
Drew: Okay, a couple of things I'll say here. When we would normally do this episode, we would talk in a little more detail about the authors' backgrounds, what they had published, and why that might give credibility to the types of arguments they make. This is going to be a bit of a counterexample, because this is not work that is published in a safety journal. It's not, in fact, even work that is published in a management journal.
The particular paper we're looking at is not work that's been peer-reviewed at all. It's based on some work that has been peer-reviewed, but not peer-reviewed in a safety venue. The risk is that there may be concepts that are well understood in the discipline of safety that these authors aren't aware of. Not because they're not respectable researchers, not because they don't have distinguished careers, not because they're not very good within their own fields, but simply because they're unaware of what the safety literature says, and why that maybe draws into question some of the ways they're going about their research and the types of evidence they're selecting.
Why authors matter and why peer review matters is that peer review is not there to screen against nefarious people. It's not there to screen against dodgy people. It's there to make sure that the right things have been thought about, and that the work builds on and is embedded within a wider body of knowledge that should inform these sorts of questions. That is obviously not the point of the Harvard Business Review. It's meant to present material for executives, but it should still be based on the best research in its field.
This is a bad job by the Harvard Business Review, picking these particular authors to talk about this particular topic and present it to business executives. They are not doing a service to academia, to these authors, or to the executives by drawing attention to this particular work and putting it in front of company executives.
David: I think to highlight one of the points here, for our listeners who clearly have an interest in safety science if you're listening to the Safety of Work podcast: when we get to what the paper suggests as the action plan to make safety a strategic value driver for business, you might see where gaps in the authors' knowledge of the safety science literature have potentially produced what we think, or what I think at least (I'm interested in your thoughts, Drew, when we get there), is a fairly weak, transactional, compliance-oriented safety program.
Drew, there are three parts to this paper. It's designed to be accessible reading, in almost an essay-type format, as people familiar with the Harvard Business Review will know. There are no citations and no references, but the authors do quote numbers and quote certain conclusions from research without actually saying where that research is or how it was done.
The paper is split into three parts. The authors lay out the current problem for safety in organizations, then they lay out the evidence that connects safety to business performance to try to make that really clear. The third part is a five-step action plan to essentially get there, to make safety that strategic value driver for business performance. Drew, do you want to start with the background, and we'll talk a little bit about the current problem?
Drew: Sure. I'm going to start with a little bit of a quote from the paper. "And yet considerable evidence suggests that most companies mismanage safety. For all the policing and investment in compliance, products are frequently recalled and workplace accidents continue to happen.
In 2023 in the United States, 3300 recalls affected more than 135 million products, a high not seen since 2016. US employers reported 2.8 million injury and illness cases in 2022, a 7.5% increase over 2021. The United States saw 5486 fatal work injuries in 2022, the equivalent of one death every 96 minutes. Why aren't companies doing better on safety?"
David, peer reviewer, go. How many statements in there would you be as a peer reviewer, your eyes lighting up and thinking, nope, you can't say that?
David: We're conflating lots of things here. The first sentence, about companies mismanaging safety, needs a considerable number of citations to make the claim that most companies mismanage safety, and then it needs to lay out a very strong argument. It would take a whole paper to validate that one sentence. Then we start talking about product safety and product recalls, then about injuries, or minor injuries and illnesses, and then about time-bound fatalities.
Drew: In your experience, do companies manage consumer product quality using the same systems and processes as they use to prevent worker injuries, for the types of consumer products that get recalled?
David: Very unlikely. Product safety is often much more a quality function disconnected from the workplace health and safety function in an organization. Particularly what we'd be talking about here, which is manufacturing companies, there would be regulatory compliance and quality teams that would be managing those aspects of product safety and compliance. Why that's part of the workplace safety paper, I'm not sure.
Drew: Any particular reason you can think of that there might have been more injuries in 2022 than in 2021?
David: People might have still been at home with Covid, although most of the US was back at work at that point in time. We saw a whole bunch of things happening in industry: shifts in work patterns, turnover of workers, and different supply chain arrangements. And if we look at some of the other papers we've covered, like the work on the statistical invalidity of TRIR from Matt Hallowell and the CSRA at Boulder, we know that that 7.5% increase is meaningless. We know that it doesn't carry statistical significance.
Drew: We're being a little bit mean and unfair here, because this is a business review article. The purpose of this introductory paragraph is not, in fact, to present peer-reviewed evidence. It is to make a rhetorical point to introduce the article. But this is what we've got to be really careful of when we start sharing these sorts of things on LinkedIn and start using them as evidence that we're putting in front of managers: this is using data as rhetoric, not using data as data.
In the introduction, when you're setting the scene and you're just trying to say, hey, safety is a problem, we could do better, there's not a lot of harm in relying on rhetoric, cherry-picking your evidence, and making it all sound good. But the deeper you get into the paper, and the more you're building the argument and starting to influence the way people think about things, the bigger the problem with this use of data as rhetoric becomes. I just wanted to point it out here because the introduction is really egregious in terms of things that peer reviewers would not let you get away with.
David: Drew, let's talk a little bit about methodology so we know what we're talking about. There is research presented in this paper, as well as a bunch of other studies that are referenced as well. These authors wanted to understand how companies manage safety.
To do that, they interviewed 76 executives from about 30 companies across different industries, including distribution, education, healthcare, facilities management, oilfield services, finance, IT, and manufacturing.
We don't know what those 30 companies are, and we don't know the mix across industries. We don't know how many executives came from the same company, whether it was one each from 29 companies and then the rest from the 30th. We actually don't know anything more about it. I imagine that IT and finance companies are very different from oilfield services companies when we're talking about workplace safety.
Drew: I was a little bit struck by the "about 30." Again, no peer reviewer lets you get away with "about 30." What was the number? But that's nitpicking. One of the difficulties in reading this article, and this is not the fault of the authors at all, is that Harvard Business Review articles don't include a reference list. Guardian articles include references, but the Harvard Business Review does not.
The trouble is we don't know whether what they're referring to is published research that might be somewhere else that we can look for for the details, or work that they did specifically for this article, or other work that they've done that was just never published. In a few cases, I've tried to dig up where it is. In the case of these interviews, I cannot find anywhere else that's been published.
One of the authors recently published a book that is based on CEO interviews, but that wasn't about safety. If safety came up in those interviews, it was not the purpose of the interviews.
This is what you really need to know to understand data coming from an interview. How did they pick these people? How did they conduct the interviews? Who conducted the interviews? What were the topics and questions of those interviews? How was the data analyzed afterwards?
If you're going to make a claim based on interviews, 76 interviews is a lot. You could do a PhD thesis out of that if you did it properly and reported it properly. But we don't even know if these interviews were about the management of safety. They could have just been about other stuff, with one safety question thrown in, or maybe only a couple of the interviews touched on this.
This is why when we review papers, we look at the methods and we care about the methods. Without knowing how the interviews were conducted, you just have to assume that all of the data was prompted by the way the researchers conducted the interviews and does not reflect what the CEOs actually believe, let alone how safety is actually practiced.
Interviews can be good evidence, but as a critical reader, when someone says they've done a lot of interviews, unless they tell you the details, you cannot believe what they say about those interviews. The verification is not in the claims; the verification is in explaining how they did it and having peer review around those processes.
David: Drew, I think that's a nice caution. When we don't know much about the methods, or we have big question marks over the methods, then I think the starting position needs to be to discount any conclusion that has been drawn, because you can't validate how those conclusions were arrived at. You need to be really cautious about being confident in the conclusions that are being presented.
The starting point for this paper, in terms of the current situation based on these interviews, is the authors' suggestion that currently, at least for these executives interviewed, they frame safety as a compliance issue, not as a strategic business issue. They see safety as a cost rather than as an investment in improved business performance. They treat safety as an abstract value for the organization rather than a really deliberate strategic lever.
What tends to happen is that organizations and executives respond to safety crises and incidents with financially and managerially unsustainable measures that are aimed at managing public image more than resolving the underlying issue.
It's a pretty harsh assessment of the current state of play from these executive interviews. I'd be incredibly surprised if executives were talking like this, Drew, but that's just my own experience. Maybe the executives I talk to talk a bit differently about safety.
Drew: I was going to ask you about that, David. How they got to this point is questionable, but it doesn't strike me as immediately implausible. It's certainly not very nuanced. Framing something as a compliance issue could mean lots of different things. I'm sure there are times when executives do in fact look at safety through that compliance lens: are we meeting our responsibilities, are we governing well? That could range from a genuine concern about governance to nitpicking about compliance. It's hard to tell.
I wouldn't be surprised that most people see safety as both a cost and as an outcome. I'm pretty sure most organizations do in fact report injury rates up to board level and are concerned about those going up and down, which immediately is more than just safety as a cost. Certainly, I'd be willing to accept the underlying point that they're really trying to make. People don't necessarily see safety as a driver of other types of performance.
David: Drew, I'll just jump to some conclusions here and I'm keen to test this with you a bit. I'm going to make an assumption that these executives are all based in the US, although the paper doesn't specifically say that. We know that compliance is really important for US-based businesses and US-based executives. That makes sense. It does surprise me that an executive would say that we focus on managing our public issues rather than fixing safety problems in our business. I'm not sure that would be how executives would talk about this.
The last point I'll make is about another claim that comes later in the paper. If you were going out and trying to find executives to talk about safety, would you go to finance and IT and education consulting businesses? Because there's another stat later that says only 30% of these companies actually measure one or more safety performance indicators.
You've got two-thirds of these companies that don't even have any safety metric, not even an injury metric or anything that they monitor. It just doesn't seem like the people you'd want to talk to if you wanted to understand this issue. There are just a few assumptions that I'm making that just seems strange.
Drew: That one about safety metrics makes me very suspicious that the interviews that they're talking about are actually from an unrelated study.
David: I think this is a business strategy set of interviews. Again, if you ask the question about what are some of the most significant costs you have to manage as a business, or the most significant compliance issues, you could ask that of an executive. They might start talking about safety and then you go, oh, they're framing safety as a compliance issue. But I've asked them what their big compliance issues are, and they've said safety. This is where I think this data is coming from, in an indirect way.
Drew: Likewise, you ask them about what are the big PR issues that you're concerned about. What would be a big PR crisis for your company? They answer safety. Then you report back, oh, they're framing safety as a PR issue.
David: They don't even mention safety. They mention something like product recalls. The product recall becomes a public image issue, and product recalls become conflated with workplace safety by the authors. All of a sudden, all of safety in general is a PR issue. Yeah, I don't know where to go with the starting point. In any case, should we move on a bit?
Drew: Yeah. Just before we do, I want to reinforce the point that we don't like to pick up papers just to dunk on them. Really, the message that I would like to get across here is that this is what you should be doing when you read articles like this.
In your own head, think about what question would have produced this answer. Playing the game that David and I are playing now, trying to speculate about what they might have done, how they might have conducted these interviews, and who you could realistically have asked that would give you these answers and these numbers, gives you a healthy degree of skepticism about the claims you get in these articles. It's what a skeptical reader should do.
David: The quote in the paper is "To help companies get out of this rut, we present evidence that safety can be a key driver of performance." That becomes the main claim. Safety can be a key driver of performance. Drew, do you want to talk a little bit about the key claim of this paper?
Drew: Okay. The first thing to note here is that if you take it literally, this is an incredibly weak claim that no researcher would actually research. Safety can be a key thing. Okay, all you need there is just one example where safety once was a key driver of performance, and that statement would be literally true.
But that's not what the authors want you to take away from this article. What they want you to take away, and what they specifically lead you towards, is that they think they've got a recipe you can follow under which safety will be a key driver of performance, and moreover a positive driver of performance.
You've got to be careful. Do they support the claim that they're literally making? Is that really the claim they want you to believe, or are they actually trying to sell something much harder, something that deserves a much higher standard of evidence: the belief that a particular formula will lead you to better safety and therefore to better performance? That is a big claim that needs really solid evidence, directly testing that process across a range of companies.
David: Interestingly, Drew, the paper never once says what they mean by performance. It's in the title: they say performance. At times they talk about customer satisfaction performance, at times sales growth performance, at times profit margin performance, at times shareholder return performance, and other times company valuation performance. They talk about at least five different potential aspects of business performance, but they never once clarify which performance they're actually saying safety connects to. We assume business performance means financial performance in some way, but even that is never clarified.
Drew: Contrast that with the paper we reviewed in episode 121, where they really pinned it down and they said, okay, this is exactly what we mean by safety, this is how we're measuring safety, and this is exactly what we mean by performance. This is how we're measuring performance.
You might disagree with either of those. You might disagree with how they measured safety, which was purely by outcomes. You might disagree with how they measured performance, which was purely by short-term and long-term survivability of the company. But because they've made really concrete claims, they can really back up those claims, and you can ask those skeptical questions. These slippery, undefined claims that slide all around the place are really hard to test and really easy to support with cherry-picked evidence.
David: I mentioned just before the 30%-35% of companies that monitored safety metrics. They also conclude that of the people they interviewed, 94% claim that their company views safety as a core value, but only 17% or 18% of those people said that safety was explicitly part of their company strategy.
Again, that's only about one in five executives. If you're doing a safety interview, and only one in five people say safety is part of our company strategy, and only one in three say we actually monitor safety performance, it feels to me as a researcher, Drew, that if you started finding out these things, you'd be going, maybe I'm speaking to the wrong people if I want to understand the connection between safety and business performance.
Drew: Hey, David, what's 94% of 76 interviews?
David: It says companies, so it's 94% of the roughly 30 companies. But what happens if you're interviewing two executives from the one company, and one says it is a core value and one says it isn't? Is it 1 person out of 76 who doesn't?
Drew: It's not a whole number. You can't get those statistics out of the number of people they claimed that they interviewed.
David: Seventy-three out of 76 people or 74 out of 76 people? I don't know.
Drew: Something like 71-⅓ people said it.
David: Excellent. They're rounded somehow. Seventeen percent said that safety was part of their company strategy. Yeah, okay, and 35%. Okay.
Drew: Yeah. These are things that peer review would pick up and ask you to pin it down. Maybe there is a reasonable explanation for this. Maybe there are particular decisions that they've made, interviews that they've included or not included, people that they didn't count, maybe they deliberately didn't double count if they had two people from the same company. Or maybe they just made these statistics up, and there's no way to verify which it is when you don't have the level of detail.
David: There's a bunch of other research that gets pulled in here. Drew, one of the pieces they talked about was a media study. You know how these media companies do their own polling and surveys. They cited this media study that found that 97% of workers thought that physical safety at work was important, but only 54% of those workers believed that their employers shared the same opinion.
This is what makes me think that two of these authors are marketing professionals. They go to these media studies, marketing studies, and things like that, but we know nothing about this. We don't know if they were speaking to teenagers working at McDonald's or whether they just did a random survey of 50 people who answered their mobile phones. We don't know anything about this research.
Again, it's not citing peer-reviewed studies. I don't even know why they needed to bring this claim into the paper. I think they were maybe trying to reinforce their framing of the current situation.
Drew: I don't have an answer to that. I wasn't sure exactly which study they were referring to at this point, and it really didn't seem to support their claim that safety improves performance. Satisfied customers result in increased sales, and supposedly better safety results in more satisfied customers. How the heck does that show it?
And they're measuring safety in that case by product recalls. If your company doesn't produce defective products that it then has to recall, that results in more sales. Okay, I can believe that, but it says nothing about whether spending and investment on safety improves company performance.
David: They stack up a bunch of things here to make this performance claim. They talk about one study where safety was associated with a 9% lift in customer satisfaction. That becomes very dubious, because we talk a lot on this podcast about what the mechanism is. Of all the variables that could cause a 9% lift in customer satisfaction, how has that study isolated a change in safety as the variable that's associated, not even caused? That's a loose claim.
They go on to say that customer satisfaction creates a 13% increase in sales, that companies pick contractors with better safety training, and then they make this generic chain of claims: we know that customer satisfaction is strongly associated with sales growth, we know that sales growth is associated with bigger margins, and we know that bigger margins are associated with shareholder returns. Therefore, if you do safety, you just become a better business. I don't think there's anything that compelling in that whole part of the paper that would say clearly that safety is a driver of business performance.
Drew: I'll be honest, I reckon the paper we reviewed in episode 121, the one that claimed that safety was not a driver of business performance, actually outlined the argument that safety is a driver of performance better than this paper does, even though it was arguing in the opposite direction. Of course, there is an argument that safety is a driver of business performance, and there is an argument that it's not. Both arguments are plausible. That's why you need good studies to disentangle them. You can't just cherry-pick evidence at each step along the argument and then claim that you know the answer to that difficult problem. You've got to actually test the question that you're claiming to have an answer to.
David: I think at this point, personally, and I think I mentioned this in episode 121, I'd really like to believe that there are strong and clear ways that doing the things that improve safety are also going to be useful things to improve operational performance and business performance. I genuinely believe that.
My part of the critique of this paper is not necessarily about trying to defend a position from 121 that you shouldn't do safety because you'll go out of business. I just think that this isn't the research that makes the opposite case. I don't want to go on record as "David thinks safety isn't good for business." That's not necessarily what I think; it's just that this paper doesn't scientifically demonstrate that safety is good for business.
Drew: My intuition is fairly similar to yours and I think very aligned with the authors of this paper, actually. I think that it can go either way, that depending on how we do safety, sometimes it will be good for business and sometimes it won't be. It is important that we think about how we do safety, not just that we do safety.
In terms of that broad conclusion, my intuition goes absolutely this way and absolutely against episode 121. But if we're going to be evidence-led, then we can't use evidence as a rhetorical trick; we need to use evidence to actually securely build our arguments and our business cases.
David: Drew, if we go into the final bit of the paper, we might reinforce some of the practical takeaways. This article in the Harvard Business Review proposes a five-step process to turn safety from a constraint into a source of competitive advantage. I didn't like the lack of academic rigor of this paper, but I could generally agree with how the authors were laying out some of the current state issues and some of the ways that safety might be connected to business.
But when they present this five-step process that I'll talk about now as a fundamental rethink of safety, this is where we really start to see the gap in their understanding of the body of literature in safety.
They say there's a five-step process to make safety a strategic advantage. Number one is align on the meaning of safety. The example they use is being clear to your organization that safety means, for example, zero injuries.
The second is agree on metrics. They don't give you specific examples. They say don't have too many because it's hard to manage more than a few things.
The third is you need to anticipate and prevent problems. The examples they use for anticipating and preventing problems are about compliance programs: doing more inspections, defining behaviors and monitoring compliant behaviors before they become incidents, giving nurses torches so they can see bed sores on patients, and giving people some safety equipment. Very compliance-driven and behavioral.
The fourth is targeted training, giving people training. One example they use is giving people online compliance training that's relevant to the behaviors and tasks identified in step three.
And number five is to incentivize employees. One of the examples they give is a company that went around identifying people who were wearing compliant PPE and putting them in a draw for a monthly gift card reward.
Drew, that's the five-step process in the Harvard Business Review to make safety a source of competitive advantage and a fundamental rethink of workplace safety.
Drew: David, I almost feel like putting this out as a quiz with a prize for our regular listeners: which episodes line up with these five steps, and for each one, is it true, is it ambiguous, or is the evidence squarely against it? Because some of these are topics that we have covered, and in every case, at the very least, the literature is deep and complex around something that they are making out as having a simple answer that they know. In at least two of those cases, the deep and complex literature also points squarely in the opposite direction to what they are recommending.
David: Yeah, in one of the early episodes, I think it was episode 12, we talked about whether zero injuries as a vision is good or not good for safety, around the zero accident vision. We also did an episode on the meaning of safety culture and how divided we are around that. We've definitely done some metrics ones, like whether leading indicators are better than lagging indicators.
We've definitely done things about compliance, behaviors, and rules. We did the manual handling training one, the one that looks like it's going to be referenced in the Australian Institute of Health and Safety body of knowledge, about whether manual handling training is any good. We did one episode on rewards as well, about rewards actually driving performance down. We may have episodes that suggest that none of these five steps is a useful thing to do if you want to take an evidence-based approach to safety.
Drew: David, there is one that I wanted to pull out. I said I didn't really want to dunk on the authors. This one was actually peer-reviewed and published by these authors. We do know the detailed methods behind their claim that compliance training is effective. David, I don't think you've read this one, so I want your live reaction to how they did this study.
What they did is they noticed that when a company has a major accident, that major accident is followed by an uptick in compliance training. Therefore, they say, this is an opportunity for a natural experiment. We have an environment that is not affected, they claim, by anything else, except that naturally, we have some situations where training has gone up. What they found is that a major accident is followed by an uptick in training, which is followed by a decrease in hazard reporting; therefore, the logical explanation is that compliance training improves safety.
David: I saw those numbers in the paper about a 10% increase in training resulting in a 6%–10% decrease in hazards in the workplace or something. I didn't go dig it out, but Drew, that is a nonsensical line of reasoning. I'm sure any one of our listeners could pull that apart.
Drew: Yeah. The point I want to make here is that it is a fairly thorough paper. They have done their best as marketing researchers to disentangle the other explanations, to think through other things that might have affected this, and to rule out other ways in which an accident might influence hazard reporting after the accident.
They've done that in isolation from, and in ignorance of, everything we know in the literature about how organizations respond to accidents, the relationship between hazard reporting and actual safety, the relationship between hazard reporting and accidents, and the effect that training has on hazard reporting. All of these connections are discussed deeply and in depth in the safety literature, and they just don't know about them because it's not their field. They're dipping their toes into deep waters that they're unaware of, because they're experts in something different.
This is why peer review exists. This is why we have disciplines and why research gets siloed and isolated. There are problems with peer review, there are problems with research getting siloed. It's great to have interdisciplinary work.
Much as I like to dunk on safety science sometimes, safety science is a field of expertise, and it's filled with people who have spent years and years studying these problems. It's just not so simple that you can come in from the outside, unaware of all the work that's previously been done, and not understand why there isn't an easy answer.
David: Yeah, Drew, that's good. Is there anything else you want to say about the paper before we get to takeaways? I've got a couple of practical takeaways I wouldn't mind bouncing off you as well, but is there anything you want to talk about first?
Drew: No. Absolutely, let's get on to practical takeaways.
David: I think the first one has probably been pretty clear through this episode: as professionals and practitioners, we need to critically review any literature, even literature that we'd really like to believe. Sometimes we bring our own biases, and we may even need to be more critical when we read something that immediately aligns with something we'd like to believe. We still can't be lazy in our critical review, particularly of methods, results, and claims.
How was the research done? How were the results of that research analyzed? What conclusions have the researchers drawn as a result? That needs to be robust and rigorous. You can do that yourself in some ways, or you can lean more, like Drew said, on the peer-reviewed literature by authors who are experienced in the field they're researching.
Drew: I'd certainly agree with that. I would say be particularly careful of this gray space between popular press literature and academic literature. At first glance, this looks like academic work. Harvard Business Review, and lots of journals are called such-and-such review, but this is not one. This is a glossy magazine for industry professionals.
But you might be deceived, because the authors are all published researchers. They're all professors at reputable places. They're just not writing in the genre of an academic article. This article has got lots of references to literature and near-citations. It looks like it's based on research, but again, you don't have the references. You don't have the opportunity to check it out for yourself. It's a space where it's easy to look and sound academic, to look and sound evidence-based, but it just isn't.
David: Drew, the second one I was thinking about when I read this is being cautious of really big claims in the literature. This paper claims that safety is currently being mismanaged and that safety is a driver of business performance. These are really big and bold claims. In my experience, and I'd like to test this with you, our scientific understanding evolves a lot more piece by piece. Particularly in the social sciences and the organizational sciences, these big claims have very rarely been proven comprehensively.
Drew: I'll go further. It's not even piece by piece like building a wall, where each brick makes the wall higher. It's like we're chucking stones at the wall, and over time it gradually builds up. Any given claim might be contradicted, then remade, then contradicted again; there's evidence that indicates one thing, and then evidence that indicates the other. It's a gradual accumulation of evidence, not a bit-by-bit "this is true, this is true, this is true."
David: The third one here, Drew, is that we know that workplace safety, or managing safety, is a complex undertaking. If it were something that could be done with a five-step process, then for all of the effort and resources and attention that companies have put into safety, it would be largely solved by now. Beware of any three-step or five-step linear process for achieving anything complex; it should probably be immediately discarded. I don't think we can write a five-step process for turning safety into a strategic driver of business performance.
Drew: If all of these companies are supposedly getting safety wrong, I'm pretty sure a lot of them have got five-step processes. What is the chance that this latest new shiny one is just going to suddenly do the job? And this one isn't particularly new or shiny; these are ideas that have been kicked around for a long time. Turning them into a five-step process doesn't make them any more true.
David: I think I'm with you on this. Having a vision, having a metric, having a compliance and behavior program, giving people online training, and giving people gift card rewards every month, that's not a five-step process for safety.
Another one that I don't have written down here, Drew, is make sure you've got good relationships with the executives in your organization, because your executives are going to read this stuff, and they're going to read it even less informed than the authors who wrote it. They're going to think it's the next answer for your organization. They may even come to you and go, we need to do what's in this paper. You need to have the relationship, the confidence, and the ability to talk your leaders through where this thinking might be insufficient.
Drew: The ideal case is you've already got a good relationship with your executives. They're going to come to you and ask you, hey, I saw this article, what do you think? It may be that you agree with a lot of the suggestions, or you agree with the general message, but just be careful about when you use dubious evidence to support things that you want to say anyway, because it's going to come back and bite you next time you need to use evidence to try to change someone's mind.
David: Any other takeaways, Drew?
Drew: I know we've got a lot of researcher listeners and a lot of early career researcher listeners. I don't want to say, stay in your lane. Instead, I'd rather say, approach opportunities to share your work and to publicize your work with the appropriate level of confidence and humility based on knowing what your expertise is. Be proud of the work that you've done, and don't be afraid to put yourself out as an expert in a space where people knowing less than you are going to hold themselves out to be experts.
Equally, sometimes you just have to turn opportunities down, even if it's an exciting opportunity, even if it's a big publication, even if it's a big conference, and just say, I'm not the best person to do this. Maybe I can direct you to someone else who knows what they're talking about in this area and can give you what you need, but with more credibility.
David: Good point, Drew. So the question that we re-asked this week was, is safety a key value driver for business?
Drew: After one more paper, the short answer is we still don't know. We've had a good paper, and we've had a bad paper. We do know some of the things we do to improve safety. Actually, this is probably more of a strong intuition: some of the things that we do to improve safety probably do have strategic value for our company, and some of the things that we do to improve safety might not even improve safety, let alone have other strategic value for the company. But we can't make the strong claim that good safety is good for business, or that we know how to neatly align safety with business success, because we just don't know the answer to that.
David: That's it for this week. We hope you found this episode thought-provoking and ultimately useful in shaping the safety of work in your own organization. Send any comments, questions, or ideas for future episodes to feedback@safetyofwork.com.