The Safety of Work

Ep 116. Do audits improve the safety of work?

Episode Summary

Could what we perceive as the bedrock of workplace safety be merely an illusion? Safety audits are supposed to be our safeguard, but Ben Hutchinson, PhD student from the Safety Science Innovation Lab at Griffith University, is here to shatter some long-held beliefs. Ben is a National HSEQ Manager, Fatigue Specialist and Exercise Physiologist with a focus on adaptive and system principles - including system safety. He brings to light the potential gap between the comforting assurance of ‘safety protocols’ and the stark reality of their execution in the workplace. Our discussion traverses the terrain of his research, dissecting the effectiveness of safety management plans and audits, and revealing the unsettling prevalence of "fantasy documents" that promise more than they deliver.

Episode Notes

Ben's expertise guides us through an analysis of audit reports and accident investigations, laying bare the counterfactual reasoning that often skews post-incident narratives. It's an eye-opening examination that calls for a reimagined approach to audits, one that aligns with the genuine complexities of organizational culture and safety. Together, we confront the silent failure of safety audits and management systems, debating the need for a fundamental shift in how these are designed and conducted to truly protect workers. Join us for this critical dialogue that challenges preconceptions and seeks to reforge the link between safety audits and the real work of keeping people safe.

 


Quotes:

"Some audits were very poorly calibrated to actually exploring and eliciting day-to-day work and operational risk." - Ben

“you've got to pick and choose what to pay attention to. So unless something is really standing out as needing attention, then it can be hard to be curious and to notice these weak signals.” - David

“I'm proud to work in safety. I'm proud to call myself a sector professional. What really drove me to understand these systems was my love for safety, and I had just become so disillusioned with the amount of safety work I had to do. It wasn't helping.” - Ben

“Audits, like most activities, are very socio-political. There's a lot of vested power and conflicting interests.” - Ben

 

Resources:

Paper: Audit masquerade: How audits provide comfort rather than treatment for serious safety problems

Paper: How audits fail according to accident investigations: A counterfactual logic analysis
Ben Hutchinson LinkedIn

The Safety of Work Podcast

The Safety of Work on LinkedIn

Feedback@safetyofwork

Episode Transcription

David:  You're listening to the Safety of Work podcast episode 116. Today, we're asking the question, do audits improve the safety of work? Let's get started.

Hey, everybody. My name is David Provan, and I'm here with Ben Hutchinson. We're from the Safety Science Innovation Lab at Griffith University in Australia. Welcome to the Safety of Work podcast. In each episode, we ask an important question in relation to the safety of work or the work of safety, and we examine the evidence surrounding it.

If you caught in the intro that I mentioned I'm here with Ben Hutchinson, I am, and not with Drew Rae today. Ben's a PhD student in the Safety Science Innovation Lab and delivered a recent presentation at the Global Safety Innovation Summit in Wollongong in New South Wales, Australia on the topic of his research, and it received quite a lot of attention. Drew and I thought it'd be a great opportunity to have Ben come onto the podcast and talk directly about it.

Welcome, Ben. I thought by way of introduction, maybe share a little bit about your background in safety and how you came to be another person doing a mid-career PhD.

Ben: Yeah, thanks for having me. I love the show. My journey to safety, I think, is similar to a lot of others in that I fell into it. I didn't really set out to work in safety. I actually started in sports science, in exercise physiology. I was really fortunate that after uni, I was able to go into consulting. I really started to cut my teeth around the human factors, fatigue, and sleep science world. Through that, I developed a keen interest in, and a frustration around, safety management.

I think frustration was probably the operative word. I found a lot of things I liked and a lot of things that really annoyed me about what we did day to day. That really pushed me to want to work in safety and see, am I contributing to this? What can I do to help make it better? Then I worked in a lot of roles: oil and gas, chemical, pipeline, but almost always the construction arms. Right or wrong, whether I've tried to escape or not, it's always been construction related. That's where I am today.

David: Your PhD, it's been a fascinating one to sort of watch on the sidelines of, because you've been tackling a lot of core safety work activities in organizations like safety management plans and what we'll talk about today, which is auditing as a practice. How did you come across your topic, or how did you settle on your topic for your PhD?

Ben: It was really the work of a sociologist called Lee Clarke. He was a student of Charles Perrow, and Andrew Hopkins referred to Lee's work periodically. Lee coined the term fantasy documents. He looked at how the genesis of some major accidents was in part predicated on false assurance and symbolic safety planning.

For instance, at the Exxon Valdez disaster, where an oil tanker ran aground and spilled hundreds of thousands of liters of oil into the bay, there was an emergency management plan that spoke about the worst-case scenarios and how they would respond to this disaster. It was signed off by the environmental inspectorate. The community groups were happy with it. It had all the right sciencey words and what you'd expect in an emergency management plan.

The problem was it was completely disconnected from reality in the claims it made. People thought they were doing the right thing with it. They did their risk assessments, but it was just wildly optimistic, bordering on complete fantasy, about the amount of oil that could be recovered. They made estimations that they could recover so many thousands of liters of oil from the open ocean, but it had never been accomplished in the history of mankind.

I just wondered, how does an organization with that level of resources get to the point where they can convince themselves of the impossible? I wanted to understand, how does that apply day to day? An emergency plan for a huge tanker is pretty exceptional, but what does it look like day to day for a safety practitioner, for a manager, for a supervisor? That was the pivot for my work.

David: Right, Ben. You published what I think might have even been your first PhD paper. What was the title of that? Fantasy plans?

Ben: Yeah. We looked at fantasy planning: the gap between systems of safety and safety of systems.

David: Great. Let's talk about auditing. Sometimes a fairly dry topic, but we have tackled auditing on the podcast before. I guess it's one of those things where you say, I'm doing a PhD in safety, and a core part of that's been safety auditing. People must really, really gravitate around you at barbecues.

Ben: I'm the life of the party for sure.

David: Yeah, but it's a fascinating topic. I'm going to introduce two papers that were accepted last year and even this year. These are very recent papers. I'll ask Ben then to talk about the two methodologies. I think, Ben, you said we'll talk about the findings together because it's just like one big set of findings.

The first paper is titled How audits fail according to accident investigations: A counterfactual logic analysis, accepted in the journal Process Safety Progress in December last year and published in 2024. The authors are, of course, Ben Hutchinson and his two PhD supervisors, Professor Sidney Dekker and Associate Professor Drew Rae. Drew, obviously, from this very show.

The second paper is titled Audit masquerade: How audits provide comfort rather than treatment for serious safety problems, published in Safety Science, accepted in October last year, and also published in 2024. Two 2024 papers, so congratulations on the publications, Ben. Do you want to walk through the methods of the two studies, how they're combined, how they're different, and then we'll tackle the findings?

Ben: Yeah. I probably should outline first that I never actually set out to study auditing. I had a particularly souring experience with external auditors when I was working in an organization. It really put me on the spot. In fact, they were there for less than four hours and they invoked the C word. They said culture. For me, it was immediately triggering. My scientific brain went into overdrive. I'm like, culture is a property of groups. How on earth can you come to a conclusion about culture with small numbers?

Anyway, I wanted to really then progressively stick it to auditors. My journey is I've actually come to appreciate and love the work of auditing now as far as what they do. Hello, auditors, I love you guys.

The first paper was the audit masquerade paper. We had access to a large data set of audit reports from one organization, a large tier one engineering and construction services company in Australia. We took a random sample of a hundred audit reports that were on their internal database, and we had a number of inclusion criteria: the report needed to be a complete audit report, it needed to have findings and corrective actions, et cetera.

At the end, we had 71 audit reports that met inclusion for the study. We were really interested in what auditors focus on. What observations do they draw? If an audit question asks X, does the auditor answer directly? Do they infer? Do they read between the lines? What evidence do they draw on?

I should note, it was from one company, but the reports came from 16 separate, independent audit firms. Mostly those firms were engaged by clients to audit this company, and the company would upload the resulting audit reports. That study was looking very much at the products of auditors. It's limited, of course; it's not looking at the work of auditors per se, it's looking at an abstraction of the work. It's still very interesting around the wording they use and the things they look at.

In the next study, how audits fail, what we were particularly interested in was, how do accident investigators discuss audits in the context of the accidents? It's quite a curious thing because there are lots of examples of audits that maybe didn't achieve the goals that were expected of them. Were those goals ever clear? Did the stakeholders know what those goals were supposed to be? How would they know what those goals are? How would they know when those goals weren't being achieved?

We screened thousands of accident reports, probably 10,000, mostly in oil and gas, pipelines, and high hazard industries. We looked for how the investigators essentially talk about audits. Actually, only 44 major or fatal accident reports met the inclusion criteria. Very few accident reports actually talk about audits. When they do, it's normally very motherhood statements, like audits are important, but they provide almost no context around the why and the how.

David: Yeah, great. You looked at it from both ends. I like that you clarified there that the audit reports you looked at in the first study were from 16 different audit companies. If we see patterns and trends, that's actually quite a lot of different audit organizations and individual auditors.

I've always been a bit fascinated about the accident investigation side of it, because I know, and it's mentioned in your paper, the work of Andrew Hopkins and some of his more narrative-style descriptions of major accidents. People will know his books on Macondo, Longford, and Texas City. A lot of his narratives contained quite a lot of counterfactual reasoning around process hazard analysis, auditing practices, and things like that. I'm fascinated about the way it's specifically spoken about in the accident report itself.

Ben: Yeah. A really important point you highlight there is something we had to grapple with, in the sense that the intellectually simple or lazy way is just to accept the findings at face value. What we wanted to understand, though, is what you mentioned: the counterfactuals.

Counterfactuals really describe a reality that didn't happen. That's a really strange way to structure an investigation, talking about should haves, would haves, could haves. It describes what didn't happen rather than what did happen, and the logic around why those particular actions and decisions made sense to people.

What we decided was, instead of making it a binary choice between those very objective, positivistic perspectives on one end and constructivist perspectives on the other, we went to the midpoint and decided we could embrace both; it doesn't have to be one or the other. You can accept there is an objective reality, as far as there are real things, but also that people construct reality too. They have to decide what evidence is relevant. They have to construct what it means and piece it together in a narrative. For me personally, it was a very interesting activity to help clarify my own thoughts around what these findings tell me, how would they know that, those sorts of questions.

David: Let's talk a little bit about the findings then. We discussed before we started the episode that we'd talk about all these findings together. Maybe get us started with some of the top-level key outcomes or findings across one or both of the studies, and then we'll dive a bit deeper into what it might mean.

Ben: Yeah. The unpopular finding, I think, is audits seem to have almost a mechanistic or militant focus on paperwork. Paperwork itself isn't necessarily the problem, but as you've already explored with your safety work research, it's the excessive focus and the misplaced symbolism around what that paperwork actually tells us. That was the first one, audits, particularly general desktop audits, but also audits in the field focused excessively on paperwork.

The second one was that we didn't think the audits were very well coupled or calibrated to evaluating work. I found that one interesting. Some people will dispute whether audits should do that. But when I think about it, really the whole point of safety management systems is to help create the right safety structures around work, and efficiency too, and all those other issues. Audits were very poorly calibrated to actually exploring and eliciting day-to-day work and operational risk.

David: Yeah, I like those first two points a lot. I think they're really impactful, this idea of documents and surface issues and compliance. I was even thinking, while you were talking, about a permit to work audit, where a site could just go into the permit office, do a random sample of the permit documents themselves, check compliance with completing the permit process, and form a conclusion that the system's working effectively, without paying any attention to matching each permit to the work activity and the understanding and implementation of permit requirements.

Almost, you could run a permit to work audit without anyone doing the work actually knowing that the audit was even done in the first place. We intuitively know some of these things, but the way that you've laid out these types of findings really caused me to sit back and reflect on that.

Ben: Yeah, I think you're right. I don't think there will be many surprises for people who work in safety, whether auditors, operational leaders, or supervisors. But probably the extent of it will be, and some of the nuances around how those artifacts are used. Artifacts being the products of people: the permits, the documents, the plans.

Something really important to bring up here is why we think this militant focus on paperwork is problematic. Pentland and Feldman, researchers who work more on routines in the business world, have this statement that the frequent disconnect between goals and results, for instance the safety work versus the work of safety, really arises, at least in part, because people design artifacts when what they really wanted was to change patterns of action.

Essentially, what they're saying is we create a document when what we really wanted to do was change something tangible in the field. We wanted to change a practice. We wanted to change the way that someone interacts with a risk. We conflate the two as the same. We conflate changing the permit as changing the actual work and the interaction of the permit to work system. That's problematic if you conflate the two, and it's just very natural to conflate the two.

David: We do it. We do it a lot. Many, many, many accident reports talk about updating the standard operating procedure. I think that is what jumped into my mind when you said that. We think that is actually making a change to the work, but it's not, it's making a change to a document. It's not the same thing.

Ben: Absolutely, and we found a lot of examples of that in the first study with the internal audit reports. It was also really validating, in a sense, personally, that the accident investigators came to the same conclusion. That, of course, doesn't mean it's the real state of the world. We said these findings should be seen as persuasive rather than conclusive.

The accident investigators also observed something very similar, where audits would very clearly focus on changing documents. Actually, Piper Alpha had a great example around the permit to work system; it was one of the data sources. There were audits of the permit to work system, but they were largely based on the actual paperwork rather than the practice in the field. Because there wasn't any real bad news flowing through from the paperwork, leaders implied or assumed that it must be working: I haven't heard any bad news to the contrary.

David: Let's talk about that finding. You talked about audit masquerade and providing comfort, and you just mentioned audits not necessarily surfacing bad news. It's in one of the two papers; I don't know which one came first.

Ben: Audit masquerade is the first.

David: Yeah, okay. The idea of reinforcing a positive view of safety, reconceptualizing hazard conditions into less concerning conditions, and promoting an unwarranted confidence in risk management. These findings all, to me, speak to what you just said, that the audits don't seem to surface the issues that need to be surfaced and don't seem to convey that bad news in the reports.

Ben: The persuasive, simple, seductive explanation is that it's the auditors. We weren't convinced that's really the factor. If it happens across industry, and apparently it does happen quite a lot based on the data we evaluated, there must be more to it. Of course there is. If you look at it from a systems lens, there's a whole bunch of different constraints and influences: from society, from industry, from standards, from the regulator, et cetera.

That one in particular, around reconceptualizing hazards into less concerning conditions, came out heavily in the process safety themed audits. For instance, in insurance audits at process plants that produce sugar, buildups of sugar dust were seen as food safety issues rather than as combustible dust hazards. Because the audit reports were coming back fairly green, with a lot of good ticks, and things were seemingly managed pretty well, it's of course natural that the leadership team is going to assume that things are well managed.

They never had that bad news. It turns out the audits weren't calibrated to find those issues. I've got another example that wasn't in this paper, but it's on a similar theme: the Pike River mine disaster in New Zealand.

There were sensors for detecting methane in the underground mine. The sensors had saturated, meaning they'll show 5% methane and no higher; even if there's 10%, they stay pinned at 5%. The mine manager simply assumed that, because they had a calibration schedule, if there was something wrong with those sensors, if they weren't working or were saturated, the maintenance people would automatically know to tell the mine manager.

What we argue in this other paper, and it's actually quite relevant, is you would assume, yes, something critical like that would be communicated. But unless you've made it explicitly clear that that's something that concerns you, that's the bad news that needs to be surfaced, you're leaving it to chance that someone is going to automatically know, because they've got their own constraints, they've got their own job to do. We see the same with audits. We assume an audit is going to do X, Y, and Z, but we've never really articulated that that's what it should be doing.

David: Yeah, I think that's a great point, Ben. We know organizations are dealing with so many things, so you've got to pick and choose what to pay attention to. Unless something is really standing out as needing attention, then it can be hard to be curious and to notice these weak signals. We've talked about weak signals in safety for decades and decades, but we still don't have good models for what they are and how to look for them.

I want to talk a little bit about the actions from audit reports. One of the conclusions I drew when I did my ethnographic research into safety professional practice was that there was a lack of goal-directed risk reduction activity: a safety person saying, I am undertaking this activity, or I'm working on this, because it will make a change to the work activity, like you mentioned, and reduce the level of risk associated with the task.

In Drew's and my safety of work model, there was lots of administrative safety activity, demonstrated safety activity, and social safety activity, but not a lot of physical risk reduction safety activity. I know that you analyzed all of the corrective actions and categorized them. Do you want to talk a little bit about what audits actually recommend and require to be done?

Ben: Yeah. Spoiler alert: it's almost identical to what you've just outlined, a lack of goal-directed risk reduction activities. We categorized 327 findings or corrective actions. There's actually the finding from the auditor and then the fix on the other side, and we compared both, but I'll just talk about them as the same thing for now, for simplicity.

At least half of them were physical. Actually, I was a little bit surprised; that was higher than I expected. The other half had more of an administrative focus: either purely administrative, like changing documents, or administrative action to try and change a physical aspect, like changing a permit to work or an entry permit, and then there were a number of reviews or assessments of risk.

What I think was interesting from that was, even though nearly half of the sample was physical, it was almost exclusively on very superficial or trivial items. Auditors seem to love it. It's like their birthday every day when it comes to fire extinguishers and first aid kits.

What we found interesting is not just that they're looking at those things, but there was very rarely any discussion around why. Why does the first aid kit matter in this context? Is it a particularly hazardous activity that they do at the site that requires first aid kits? Fire extinguishers were very rarely discussed in the context of how, where, why those things would be needed.

The less surprising part was the administrative side. The vast majority of items or actions in this sample were changing documents: a template, a footer, version numbers. We even found items around changing spelling and things like that.

We actually assessed the strength of connection, and we tried to give the benefit of the doubt. If we had just used the hierarchy of control to assess the findings, it would have automatically been very unforgiving on administrative action. What we argued was that administration isn't itself bad; administrative items could be very goal-directed actions and fixes.

What we did was assess the strength of connection between the finding and a physical aspect or physical issue. For instance, changing something very specific around the blasting activities in a process could, we argue, actually be quite a strong action, as long as it's coupled to actual awareness and instruction. But we just didn't find many examples where the administrative actions were strongly coupled to changing physical risks or operational practices.

Sometimes we couldn't even determine what the point of an action was, like put a sign up, or change the wording in a plan. What we argued was that those seemed really weakly connected to any physical issue, if we think about safety management systems as trying to create safe and efficient conditions in the workplace. We couldn't work out, in some instances, why changing that aspect of the system would have any downstream influence.

David: Anything else you want to say about the findings? Anything we haven't discussed in relation to some of the key outcomes of the research?

Ben: Yeah. We already touched on the finding that the audits in this sample seemed poorly coupled to evaluating work. Something I found really interesting was that audit reports almost never talk about actual work itself and its hazardous interactions and operational risks.

For instance, when they talk about traffic management and traffic movements on site, it's almost always relayed via a document. Instead of talking about vehicle movements, they'll talk about the traffic management plan. Instead of talking about actual work at heights, they'll talk about the work at heights permit, and so on through all those examples.

What we argued was that they're conflating the two: conflating management of the issue with management of the abstract representation of the issue. That by itself may not be a problem, but when an audit is almost exclusively focused on abstractions rather than the actual work, we asked, is this what people think the audit is doing? Does anyone actually think the audit is supposed to be evaluating work?

If you ask a whole bunch of leaders, we think probably quite a sizable number would say, yes, we're checking compliance of documents, but others would say, I think you're checking the effectiveness of systems, checking how well those systems work in practice. If you think it's the latter, then that's what audits don't seem to be doing very well.

David: It sounds like an interesting interview based study of organizational leaders. Is that something that's on the cards for you to go off and research?

Ben: We've got an interview based study now that I'm working through for my final paper. That's probably going to be an element of it. Yup.

David: Yeah, great. That was one of the things that I took out of your recent presentation, which I loved, by the way. It doesn't matter what these things are doing; what matters is if they're not doing what we think they're doing. If we need to check the compliance of the administration and the paperwork, then great, go and check it. Make sure the spelling's right. Make sure the forms are getting filled out. You've used the word conflate a number of times; just don't think you're actually checking the effectiveness of your systems in terms of shaping physical work practices.

Ben: Yeah. I think one of the critical elements, looking back at the early work around fantasy planning and fantasy documents, with San Bruno and, as you mentioned, Esso Longford, is that people were managing the systems and the artifacts while inadvertently thinking they were managing the direct, functional, day-to-day risks.

In fact, the Royal Commission into Longford concluded exactly that; it's not just academics in ivory towers coming up with these really nice terms. It's really problematic because it feels like you are directly managing the issues. You convince yourselves: we have fancy risk matrices, we have really seductive paperwork that makes us feel like we're directly interacting with work, but we may not be.

There's something else probably worth covering. This is the thing I found most interesting from the research, and it was from the second paper: there seems to be a mechanism where audits fail silent. What we argued was that some audits aren't actually achieving the goals expected of them, but they're not providing any intel that they're not achieving those goals. Failing silent is a term from computer science that we commandeered.

Investigators, in contrast, expected audits to fail loudly. The counterfactual argument is that audits should fail loudly: if they're not achieving their goals, then they should give some intel. We spoke about the absence of bad news. In fact, sometimes, at a more pathological level, it appears that an audit has not only provided no intel that the goals aren't being achieved, it has actually provided counter-evidence of success, so lots of greens.

We found examples where safety management systems were given glowing praise, like the highest possible scores in audits, and then something figuratively blows up in the coming weeks or months, or, in some of those examples, literally blows up. What fascinates me is, how do we get to a state where systems can be praised on one hand as such fantastic representations of our safety practices and knowledge, and on the other hand, obviously with the benefit of hindsight, have been working against us? We explored that under the idea of failing silent.

David: Yeah, I like that term. I don't know that I have this anywhere near as well as you do, but I think there was a report just before BP Texas City, which gave the site quite a lot of safety praise in terms of developing a strong culture and things are going well here, which may not have been the case.

Ben: Yeah. We found a few examples of something very similar: that false assurance that results from the absence of news. Actually, there were a couple of coronial inquiries from mining, and the coroners put it beautifully. They said there was a misplaced sense of safety around these interlocking or interwoven layers of audits, inspections, permits, et cetera.

People felt safe because they had all these different artifacts and processes in place, but those weren't actually helping them achieve not killing or seriously injuring people, which is what matters in a high hazard industry. That's really what we need to be ensuring: having systems that support safe, efficient practices, and good intel that we've got control over those risks.

David: Great. Ben, is it okay if we move on to some practical takeaways?

 All right. This is the section we always do at the end of The Safety of Work podcast. I'll put you on the spot a little bit now. I think you've talked very descriptively and non judgmentally about auditing as a practice. What would be your practical takeaways for listeners? What should they probably now go and do as a result of some of the things you found in your research?

Ben: We're not trying to provide a normative checklist against which you can assess your audits, but certainly a very simple step you could take to provide some calibration is to actually assess your audits. Grab the last three, four, or five audits and assess them against some of the criteria we've got in the paper around failing silently.

I think that's one of the key reflections for me. How would you know if an audit is achieving the goals that you think it is? In fact, what do you think the audit is supposed to be achieving? There are different audits for different goals. If it wasn't achieving those goals, how would you know? What indicators, feedback, and intel do you have?

I think being really clear about what you think an audit is supposed to achieve matters. Again, if it's purely a desktop audit, it's only checking documentation and not the implementation. As long as people know that that's the goal and you're not relying on it for anything more, I don't actually think it's a problem. I'm not against auditing per se. I think auditing, just like any other activity, permit to work, inspections, can also fail silently. It can also lead to a false sense of safety, so be really clear.

Actually, what I found interesting is that I've had some nice debates where people say this is a problem with audits, but we use inspections instead for day-to-day verification of work. I said, that's fine, but inspections are still susceptible to largely the same issues that we've identified, just like any practice. How do you know inspections are achieving the goals you set out? How would you know when they're not? I think that's one of the critical things for me: just know what you expect of it, and how you would know when it's not achieving those goals.

David: Yeah, thanks, Ben, I like that. Understanding if there are structures in your organization which actually create that silent failure. I saw an organization the other day that had a KPI for the number of audits completed in the year, and a KPI for no more than one major non-conformance at each site. The whole system's designed around not finding anything major in order to achieve a KPI.

We know the politics and challenges that come with auditing in an organization, which can really create separation between audit teams and management when they have different objectives for the audit process. Really, as you said, they should have the same objective, which is to help us understand the effectiveness of our safe systems of work. That would be another suggestion I'd have for people: what can you do to bring your management teams and audit teams together to go much more deeply into the effectiveness of your systems?

Ben: I think that's a great point. Audits, like most activities, are very socio-political. There's a lot of vested power and conflicting interests. Definitely the way that you score and measure success and failure will help dictate the results you get for better or worse.

David: Yeah. I just want to ask you about criteria. This is maybe not necessarily a practical takeaway, but it may be. A definition of audit, which we didn't define, is something like the application of defined criteria to a situation and checking for evidence against those criteria. Assurance I've always seen as a somewhat broader word, about trying to provide insight into whether something is safe or not.

In some of those examples, I think Moura was an example, and the one you mentioned about the sugar facility, of seeing something as one type of issue and not another. I think it was Hopkins who said that maybe one of the better audits would be an experienced person digging deeply on a site with a blank piece of paper, just looking for the major issues. How do you feel about criteria in audits, versus white space just to go and find issues?

Ben: Yeah, and you make a good point. Hopkins also discussed this; I think he used the words "pulling the threads." If you're genuinely interested in, say, emergency evacuation in a mine, you need to actually test emergency evacuation in a very naturalistic setting. Make one of the sensors go off and then observe how people actually respond. Does the information get to the person it needs to? How long does it take? Do people know what to do and how to do it?

I think audits could be very well configured for this. In fact, I think the process industries, by and large, do this better than the health and safety field with scenario-based auditing. They go in and look for very specific things. If you're a plant where things can blow up, they'll look at the performance of valves or loss of containment; they'll spend a week evaluating how you manage loss of containment: the containment systems, the response, training, competencies.

Of course, you can still convince yourself that you're safe when you're not, but they go in there with a very defined purpose, specific things that they want to explore. I think audits can do the same. You can go in and understand work and how things don't go to plan.

The misplaced assumptions around systems and technology, we saw that with Buncefield and other disasters, where people thought that a system would operate in one way when, in fact, it could never function that way because an emergency stop was never installed, or whatever. I think audits can be very well placed for that, focusing on work and understanding how things actually function.

David: Yeah, it's a great point, Ben, because I know that in the Safety Science Innovation Lab we're doing quite a lot of research into safety work activities, under this broad theme led by Drew and Sidney around safety work versus the safety of work, and what's the connection or disconnection between the two.

Our most downloaded episode of The Safety of Work podcast was Drew's Take 5, last-minute risk assessment paper, which has attracted a fair bit of attention. I think even the common argument there was more around: what do you think this is actually doing? Is it doing that? How would you know if it is or isn't? Which feels a little bit like this conversation, and a little bit like some of my research into the safety team: what is it that you think the safety team are doing, and what are they actually doing?

I think what we don't want to come across as saying is stop doing these things, stop doing audits. What we're trying to say is, let's do a better job of designing these practices so that they do have a direct and hopefully material improvement on the safety of work.

Ben: I absolutely agree. One of my key motivations was that I'm proud to work in safety. I'm proud to call myself a safety professional. What really drove me to understand these systems was my love for safety, and I had just become so disillusioned with the amount of safety work I had to do. It wasn't helping, I think, the business better manage and create effective systems of work.

I think it's the same thread with audits, inspections, and all these other activities. It's not because we want to see the world burn. We genuinely care, and we want to make things better, to help people have effective, meaningful, safe work. I just think there are so many things working against us that we need to be critical at points. We need to stir the pot a little occasionally.

David: Yeah, absolutely. What's the time frame for your final study and finishing your PhD?

Ben: A year and a half, I think, is the fixed date. I have to finish within a year and a half. That's it, that's the constraint.

David: Hopefully when you get your next study published, we might get you back on. I think the practitioners really seem to like the research around the specific activities that organizations perform and how we think critically about those activities. Thanks so much for coming on. The question that we asked this week was, do audits improve the safety of work? To really put you on the spot, Ben, how would you answer that?

Ben: Possibly, but it depends.

David: It depends. A nice on-the-fence answer. I did give Ben the heads-up that I was going to ask him that question. My comment there would be: don't assume that audits are improving the safety of work unless you've gone and had a really hard look at what's going on.

That's it for this week. We hope you found this episode thought provoking and ultimately useful in shaping the safety of work in your own organization. Send any comments, questions, or ideas for future episodes to feedback@safetyofwork.com.