The Safety of Work

Ep.25 Why don't workers use reporting systems?

Episode Summary

On today’s episode, we wonder why workers don’t use reporting systems.

Episode Notes

Dave isn’t here today. Instead, Drew speaks directly with the author of the paper we intended to use to frame our discussion. Tune in to hear his discussion with Tanya Hewitt. 


Topics:


Quotes:

“If people were very seasoned users of the reporting system, we’d want to really understand, ‘how did they become seasoned users?’ ”

“A lot of professionals: That’s where they derive their professional pride, from being able to fix problems.”

“[Reporting systems] are complex sociotechnical constructs.”


Resources:

Hewitt, T. A., & Chreim, S. (2015). Fix and forget or fix and report: a qualitative study of tensions at the front line of incident reporting. BMJ Qual Saf, 24(5), 303-310.

Feedback@safetyofwork.com

Episode Transcription

Drew: You’re listening to the Safety of Work Podcast, Episode 25. Today, we’re asking the question: why don’t workers use reporting systems? Let’s get started.

Hey everybody, my name is Drew Rae and I’m from the Safety Science Innovation Lab at Griffith University. This is the Safety of Work Podcast, but David Provan isn’t with me today because I’m speaking directly with the author of the paper we’re talking about. Today’s question actually comes to us a little bit indirectly via Andrew [...] together at safetyontech.com. 

I connected on the forum with Dr. Tanya Hewitt, a long-time podcast listener and, it turns out, a researcher who focuses on the interface between the safety of work and the work of safety. The subject we’re looking at is voluntary hazard and incident reporting systems. We’re not talking about accidents here, but about the little things that go wrong: near misses or dangerous details that show that things are not as safe as they could or should be at the front line. 

Many organizations would like to be able to learn and improve from these little things, but systems to collect hazards and minor incidents seldom work as well as we’d like them to. This is a really good illustration of the disconnect between the lived experiences of frontline workers and the well-intentioned but awkward ways in which safety practitioners try to help.

The paper we’ve chosen is called Fix and Forget or Fix and Report: A Qualitative Study of Tensions at the Front Line of Incident Reporting. The authors of the paper are Tanya Hewitt and Samia Chreim. Hewitt was a PhD candidate at the time of publication and is now Dr. Hewitt. Professor Chreim researches organization studies and seems to specialize in studying professions and professional work.

The paper was published in 2015 in BMJ Quality and Safety. That’s a high-quality, very practitioner-focused journal specializing in patient safety and the safety of health care staff. Here’s the interview.

Tanya, thank you for talking to me about this paper, Fix and Forget or Fix and Report.

Tanya: It's my pleasure, Drew.

Drew: Can you tell me a little bit about your PhD? I think you wrote this paper fairly early on when you were officially enrolled. 

Tanya: It was actually the second paper of my PhD. Just a little bit of background on my PhD: I actually don't have a background in this from my bachelors. I have a bachelors in Physics and a masters in Physics as well. I changed direction once I started to go to some talks and learning experiences where I noticed people were uncomfortable when the discussion got onto topics that were not so technical.

When people were talking about leadership, communication, and things that might well have been of interest in looking at an event or a near miss, the speakers often wanted to get back to computer systems, laser systems, and things like this. I thought, isn't that fascinating? While I did have a background more on the technical side, I thought there was probably more to be gained by looking at the social side of things. That was kind of my impetus for starting to look at this type of area.

Drew: How did you go about finding a supervisor given your own background?

Tanya: I actually went supervisor shopping. I started by looking at what the supervisors had published and reading a lot of the type of materials that they had done. Then, frankly, cold-calling or cold-emailing professors to see if they were taking on graduate students.

Drew: One of the authors on this paper is your PhD supervisor?

Tanya: That's correct. Samia Chreim has a background in organizational management. Her expertise was actually in mergers and acquisitions, but she had an interest in health care. When I approached her about looking at incident reporting systems in health care, she was very, very keen.

Drew: How did you get the entry to talk to everyone at the hospital?

Tanya: That was fortuitous. There was a principal investigator at the hospital who had already secured a grant to look at patient safety. That principal investigator was actually starting to look for researchers who would be interested in partnering with him on this study. That's how that connection was made.

Drew: There's you and Samia, who are both interested in looking at incident reporting systems. There's the investigator at the hospital interested in patient safety. You decided to look at the systems not by hopping on the computer and looking at all of the accident reports or incident reports that were recorded, but by going out and finding all of the people who hadn't reported into the system. Can you tell me a little bit about how you decided that that was the research question?

Tanya: I don't know if we were driven by who is reporting and who isn't reporting. Health care is very difficult, especially these days. I sympathize with anybody who's doing this type of a study today, because with health care systems globally focused on fighting COVID-19, I'd be surprised if this kind of study could be done right now.

But health care systems, typically, are not running in a relaxed state. It's very difficult to try to talk to practitioners in order to do this type of a study. It was really availability that was driving who we were able to speak with. It just so happens that the people we spoke with were almost as random a sample as you could find, because we were just looking for who was available at a certain time. Some people had extensive reporting experience and others had zero reporting experience. It was just the way that this so-called availability sample was able to afford us this kind of information.

Drew: How does that work practically? Were you sitting in an office in the hospital, sending emails, trying to get meetings, or were you sitting in the break room just grabbing people as they came off shift?

Tanya: A bit of both, it depends. We started with a nursing manager, and spoke with the nursing manager who then would help us talk to somebody else. Snowball sampling is a technique where you would ask the person whom you just interviewed, "Is there anybody else that I should be speaking with?" That was used heavily in this study just to be able to get people because we didn't go to HR and have a list of everybody who was available or who was scheduled on that shift. Some of them were in break rooms, some of them were at the nursing station, some of them were in doctors' offices, some of them were in the cafeteria. It was a mix of all of the above.

Drew: Were you specifically asking people whether they reported things and why they didn't? Or was it some other question that just revealed that this was the interesting question that you wanted to answer?

Tanya: The focus of the study was incident reporting systems. We were interested in reporting behavior. We did ask questions about reporting. Did they know that there was a reporting system? Did they ever use the reporting system? When would they use the reporting system? A lot of probing was part of this study. The interviewee would reveal information which would then lead to the next question that we would ask.

If people were very seasoned users of the reporting system, we'd want to really understand how they became seasoned users. Was it because they were mentored by somebody, did they just see the value in it, or what were their motivations? If someone had reported maybe two things ever, we'd want to try to explore that. 'What was worthy of being reported?' was a very interesting question for us.

Drew: I'm interested in these seasoned users. They don't come up a lot in this particular paper. What sort of things are they reporting? Why are they finding the system useful, or part of their life?

Tanya: In this paper, this was restricted to the first part of the study. I had looked at another area of the hospital, obstetrics, for the second part of the study. The first part was in general internal medicine. Obstetrics has a much longer history with reporting than general internal medicine, and it was more of a discipline-driven history than a hospital-driven one. That alone accounted for a lot in terms of the use of the reporting system in obstetrics.

Drew: When you say discipline-driven, you mean they just considered it part of their work to use the reporting system?

Tanya: When we explored this in depth, we actually realized that delivering babies is a highly litigious field. The field of obstetrics and gynecology had recognized that if they didn't get a handle on this, they might not be able to survive as a practice, because the lawsuits were just going to bankrupt the entire discipline. That was a very interesting finding of why reporting systems were introduced so early there.

Drew: It's not that the reporting system helps them directly with patient safety, it's that they report because the record keeping is a defensive act that builds up a record that they can use in case of claim or litigation.

Tanya: Now, it depends perhaps on who you talk to. Another phenomenon we were looking at is the difference between doctors and nurses reporting. The doctors were far more on the litigation, protection, and professional liability side. The nurses were far more on being able to help my patient, being able to make things better. I can remember a number of nurses seeing much value in using the reporting system because they saw things happening once they reported them.

Drew: That leads into these people who don't report, and correct me if I'm mischaracterizing, but there are sort of two things going on here. One of them is things that are so infrequent, or that people at least consider to be one-offs, that there's no point in reporting: well, it happened once, I fixed it, it's never going to happen again. The other one is things that are so chronic that there's no point in reporting them: well, this is just a normal part of life, I work around them. What's the point in reporting every single time it happens?

Tanya: Yeah, that's a good characterization. Yes, that's right.

Drew: Could you tell me a little bit about each of those things? With the ones that just seem to happen once, it seems to be that people thought that if there was no harm arising and the situation had been successfully resolved, then there was just no utility or no point in reporting?

Tanya: That's right. A lot of the people who gave us that information believed the reporting system existed in order to get something fixed. Say there's a mechanical issue with a ventilator or an infusion pump; that's when you use a reporting system. That's the mechanism of communication to get somebody up here to fix it. That was the mental model they had of the reporting system, as opposed to some other people, who also were in that paper, who recognized that this can contribute to organizational learning. 

Therefore, even if the immediate problem was solved: A, this fix is only known to me, and maybe other people need to know what it is; and B, if there is a problem, maybe I'm not the only one encountering it. It really should go into the organizational repository so that it can be looked at from a wider perspective.

Drew: There's a strong hint in what you're saying, and I think this is a fair characterization of how a lot of people look at incident reporting systems, that the system is there for the organization to learn and the individuals using it are sort of contributing to that organizational learning. It's almost like optional extra work that they have to do; they're not seeing an immediate benefit themselves, but they're feeding into something that ultimately helps the whole organization. Is that how the patient safety people at the hospital saw the system, as something that is about organizational learning?

Tanya: I think it's fair to say that that was the messaging. But as you, I think, recognized, this was often seen as a duplication of effort by a lot of the people who were expected to report, because a lot of what they would be reporting would be put into the patient chart anyway. If something bad happens to the patient, it's going to be in the patient chart. If something didn't happen to the patient, maybe that wouldn't be put into the patient chart, but nothing happened so it doesn't really matter. There was a large severity bias going on with a lot of people with whom we spoke: how bad something was would determine whether or not it would get into the incident reporting system.

Drew: Your experience as a practitioner as well as a researcher extends beyond health care, as I understand it. Do you think this focus is something that's particular to health care, or do you think it's something that affects incident reporting systems more generally?

Tanya: I suspect this is a common phenomenon throughout any place that is using incident reporting systems. I don't think the findings that we had here, the professional pride and all of these influences that surround your reporting systems, hold only in health care. I think these are quite applicable to other areas.

Drew: You mentioned professional pride there. That's something that I missed on my initial reading of the paper. Could you talk a little bit more about that one?

Tanya: Especially in this particular subtopic of fixing problems: a lot of people, a lot of professionals, that's where they derive their professional pride, from being able to fix problems. If, then, the organization wants this problem to be communicated so that "we can fix the problem, not you," that poses a dilemma for the practitioner, who doesn't necessarily want it divulged that there was a problem: I was able to fix it, there's no reason for them to come over to us here, we have this under control.

Drew: I'm going to throw something at you where I suspect I already know the answer but I'm interested in your take on it.

A lot of organizations have tried to fix that one by changing it from an incident reporting system to a lessons-learned system, where people are encouraged to share the solutions that they found to problems as a way of taking away that sort of bias: you're not reporting a problem, you're telling me about your own expertise and how you fixed something. It doesn't seem to work any better. Do you have an idea for why that might be the case?

Tanya: A lot of these types of systems come with either explicit or implicit incentives that can drive behavior in certain ways. I fear that a lot of the efforts put into these types of systems, seeing them as be-all, end-all turnkey solutions, miss that they are deeply embedded in the sociotechnical environment in which they're being used. I think you'd run into similar things, like debates over what counts as a problem that you fixed, or "that wasn't really a solution." You could get into a lot of the same kinds of debates that currently go on: well, that wasn't really an incident, it shouldn't have gotten into the system in the first place.

Drew: Having spent quite some time looking at these systems, both in this paper and in your follow-on work, do you think overall that incident reporting systems are valuable, and are something that can be redeemed and made to work well?

Tanya: I think that incident reporting systems do have value but I don't know how many of them are designed to capitalize on the value that incident reporting systems could have. What I mean by that is a lot of incident reporting systems lack what I see as an absolutely fundamental property, and that is feedback loops. There is an incredible amount of literature talking about the black hole of reporting systems.

If people report into a system that they believe is allowing them to have a voice in making their organization better, and it turns out to just be some kind of histogram chart that they never know of, that doesn't actually make the organization better from their perspective. The introduction of a reporting system can actually be worse than not doing it at all, because the cynicism that you can introduce by not doing your homework before introducing an incident reporting system can be extraordinarily difficult to recover from.

Drew: That almost suggests that the model some people had in their heads, that you report things into the incident system to get them fixed, might actually be the way you want frontline workers to be thinking about the incident reporting system. As in: this system is valuable because I know if I put something into it, I'm going to get a response and I'm going to get something fixed.

Tanya: Yes, just as long as the expected timeframe isn't too short, because the fix might not be something that biomechanical engineering is going to come up with this afternoon. The fix might be more structural or organizational, and that's not going to be something that's done overnight. But that doesn't mean that the communication with the reporter doesn't have to take place. Just because there isn't going to be a fix tomorrow doesn't mean that the person who took their time and told you what they did shouldn't be communicated with.

Drew: I don't know if you remember this one, but it might be worth talking about this with a specific example. There's an incident that you mentioned in the paper where a nurse is looking for a swab, and she opens swab after swab that are too dry until she finds one that hasn't dried out that she can use. Because she's got to the point of finding one she can use, she doesn't report into the reporting system the fact that all of the previous ones were dry.

I think you sort of hint in the paper that that's the sort of thing where she doesn't need a fix now. She doesn't need to have someone come and immediately give her a swab, because she's managed to find one and she can keep going. But possibly there is something that the hospital can learn here, so that next time she doesn't experience this frustration.

Tanya: That's exactly right, because maybe this is a procurement issue with vendor loyalty that has to be looked at. The people who are having to "finish the design," in Dekker's words, often see problems that they might not know the solution for. They just know that they have to do something about it. This is an excellent example. While they might be able to fix it for their world, it was disruptive enough that something didn't work the way it should have. Maybe this is worth highlighting to the hospital so that they can get more eyes looking at this from different perspectives.

Drew: How do you think good feedback would work in that case to encourage people to use the system?

Tanya: Given that a lot of incident reporting systems don't offer any feedback, any feedback might be better than none. One of the ways that I thought could be of help is with the analysis of the incident reports. I had conceptualized that incident reporting systems are basically three big blocks: detection, analysis, and learning. If the analysis could take advantage of the person who detected the incident, perhaps a lot more could be learned overall. That would serve as a feedback mechanism in itself.

If the person who reported was involved in the analysis, they would have a very good understanding of what is being looked at, why it's being looked at, how much time is devoted to my incident kind of thing, and that would give them a very good understanding of the importance of their report in the first place.

Drew: The idea would be that the nurse would get an email or a message back saying, "We've heard your issue and we think it's part of a class of similar problems. We're going to invite you along to a session on a shift in two weeks time with a bunch of people who've experienced similar things so we can see if we can analyze and work out what's going on."

Tanya: That would work, and the other reporters would be in the same room with the facilitator and all those involved with the analysis of incidents. Yes, exactly, that kind of thing.

Drew: If you were giving just one or two pieces of advice to someone who didn't currently have an incident reporting system and was thinking of introducing one, what would your takeaway message be?

Tanya: My biggest message for organizations looking at incident reporting systems is to not see them as simple turnkey solutions. They are complex sociotechnical constructs, and organizations should do some homework before they start, because a little bit of homework might save them a lot of grief. If you are looking at building your own system, my suggestion is to build it backwards. I had said detection, analysis, and learning.

Instead of focusing all of the effort, as a lot of organizations currently do, on the computer interface, on the dropdown menus and the types of categories and all of that kind of thing, how about looking at the learning part first? What happens when you get information? Say the information that you're asking for is anything: tell us anything that you think we should know that is safety related in the organization. Say you get information back such as "we have incompetent senior management," "we are structured incorrectly," or "we are not meeting our mission or our organizational priorities." What are you going to do with that?

If you don't know how you're going to handle that information, perhaps a full, writ-large incident reporting system is not what you really need. Maybe you need a suggestion box, maybe you need something else. If you are able to do that, if you are able to use that information, be as transparent about the process of how you treat that information as you possibly can. Transparency buys you a lot with this type of system.

Drew: This is possibly overgeneralizing about Canadians, but someone from the Canadian Transport Agency said a very, very similar thing to me that changed the way I thought about risk assessment. He said, "Start with the decisions you're willing to make and look at what you think are realistic options on the table, and then go backwards from there to design the risk assessment process that will help you choose between those options on the table. If there's no change you're willing to make, then don't bother doing the risk assessment."

Tanya: Isn't that fascinating? But I do honestly believe that if you're looking at introducing an incident reporting system, A, do a little bit of homework first, and B, think about why you wanted it in the first place. Really have the purpose for the incident reporting system be central to how you're going about introducing it in your organization.

Drew: Here's a slightly harder one. If you're in an organization that already has an incident reporting system that's been around for a long time, and you feel that it's not working well, what would be your advice for the first step to try fixing it?

Tanya: I might start with what I had just said. Why do you have this? Perhaps the reason why you got it has long since expired and maybe this is just some kind of a routine thing that you're doing now and you're not even questioning its utility anymore. If you don't have a use for doing something, that's a good opportunity to question why you're doing it at all.

If, though, you are getting some value from it, but not as much as you might be, you could start looking at what instructions people are being given and what capacities they are being given. Are they being trained on the reporting system? How are they being trained? What kinds of incidents are you expecting to have in your system? What are you not expecting to have? There are all sorts of rules about who's allowed to report on whom, where, and what; I've seen all sorts of rules for a bunch of different reporting systems.

However complicated you need to make it, be transparent about everything that happens after pressing send on your incident reporting system. People should know how their data is being treated. Not necessarily in intimate detail, but they need to have a very good idea of who's allowed to see the data and whether there are expiration dates on things; these types of questions can be asked. How are you analyzing the data? What are you giving attention to? How are you prioritizing? These are other questions that could be very helpful in a stale incident reporting system.

Most importantly, if you do have reports coming in and you are analyzing them, what are you doing with that analysis? Are you actually using the information to drive organizational improvements? If you're not, you really should question why you have the incident reporting system in the first place.

Drew: There's one thing that you say right at the end of the paper, you say you've got these fix and forget people who are obviously very practically useful to the organization because they're finding problems, they're dealing with them, they're getting on with it. But you've also got these fix and report people who do the same thing when it comes to resolving the problem but they also use the systems and share the learnings. You've suggested that, somehow, there's something we can do with finding these people and getting them to informally train other people in the utility of using the system.

Tanya: I am drawing heavily on Hollnagel's group ETTO, the Efficiency-Thoroughness Trade-Off principle. When he describes group ETTO and talks about everybody being efficient and nobody being thorough, he suggests you don't need too many who prioritize thoroughness over efficiency in order to rebalance the system. I was thinking that perhaps, in the same spirit, if you can have a few of these people ensuring that you have coverage, perhaps you'll be able to rebalance the system, because you'll have people who are prioritizing reporting, not to the exclusion of patient care, but ensuring that the organization knows what it needs to know and that not everybody is overburdened with filling out incident reports.

Drew: You don't actually need every incident of fixing to be reported, you just need enough of them so that the organization is aware of the problems and can do something about it.

Tanya: One of the possibilities I had suggested in my study is to see if you can make learning potential a criterion for reporting. What are we actually going to learn from this? If, as routinely happens in hospitals, falls and medication errors are the lion's share of your reports, what are you going to learn from one more of them? If you have to have histograms for some reason, well, then have histograms, but don't have that interfering with the organizational learning. Report into the organizational learning system things that we can actually learn from.

Drew: Was there anything else that you were hoping we were going to talk about in this discussion that we haven't covered so far?

Tanya: In terms of incident reporting systems, probably not. I was prepared to talk about more general research. But I know that you do that quite well in a lot of the podcasts that I've already heard. I'm wondering if maybe that's not worthy of discussion here.

Drew: What if we throw in a question to help researchers who are listening to this? You've spent obviously a long time and several studies on this project. If you could go back in time and give advice to yourself when you first started off, what have you learned about the qualitative research process that you would love to have known from the very start?

Tanya: I think I might have appreciated understanding just how much work it is to go through transcripts of interviews. It is a lot more debilitating and trying than I had thought it was when I first started. "Oh yeah, I'll have that done next month," became four months later. It takes a long time to be able to go through these things and it can be draining. I think that would be one caveat I would have appreciated knowing earlier than having to experience that myself firsthand.

Drew: Do you have any suggestions, particularly when researchers have collected such rich field data, whether it's volumes of observations or transcripts of 40 interviews? How do you approach what is clearly a long-term task of analysis that's so daunting to start?

Tanya: There's a whole literature on how to do these things. Basically, I relied on coding, which has been around since the 70s. I used a computer program instead of sticky notes. But I read the transcripts, hopefully with enough mental capacity to actually absorb what was being said. If I had done that same interview, then I would be even closer to the data, so I might be able to remember some of the conversations. I would then tag the conversation with the subjects that I believed each paragraph represented. You could code a single paragraph multiple ways if they were talking about multiple things.

It is a long and drawn-out process, and you might not be as reproducible in it, because you might come back the next day and say, "Oh my gosh, I would call that this." The more that you go over your data, the more you're going to question, "Oh my gosh, did I get this right?" But there might not be a right that you're aiming for. It might just be ensuring that you can present your argument with enough data to substantiate where you're going. Being authentic to the people to whom you were talking is a better measure.

Drew: One thing that I love about this particular paper is that you've pulled out just one particular, I don't know whether you'd call it a theme or maybe a higher-level category, that's just a simple, easy-to-explain thing that you noticed was going on, where you had lots of examples of that thing and you could just discuss what was happening simply and clearly, separate from the rest of the body of the research. Was that something that sort of happened naturally?

Tanya: When you're talking with people, you can get a multitude of subjects that you could possibly pull from any one interview, let alone 40. I ended up doing 85. There were many, many different ways I could have gone, so why this one? I just found it fascinating. I just found the idea that people would not see the value in reporting things that they could fix into a reporting system absolutely, really, really interesting. I think that's all I can really say about that. I just found it something worth exploring.

Drew: You just find something that is fascinating within the data and build the paper around that?

Tanya: Qualitative data often allows you this luxury, if you are allowing the interviewee to really talk to you. If it's very structured and you are hand-holding the interviewee, you might not have that kind of luxury. There was a paper that I got out of this data that didn't even have to do with incident reporting; it had to do with double checking. That was because I had enough information from people sharing how they were able to ensure patient safety without writing an incident report, and double checking was one of the mechanisms that they gave. Qualitative data can be lovely this way, just as long as you allow the interviewee to help drive the interview.

Drew: We might leave it there. Thank you so much for the time you've given. I really enjoyed that.

Tanya: Well, thank you, Drew. This has been delightful for me. I really appreciate it.

Drew: That’s it for the interview. I’m interested to hear your thoughts, either directly about what Tanya had to say, or about your own experiences with operating or using incident reporting systems. That’s it for this week. We hope you found this episode thought-provoking and ultimately useful in shaping the safety of work in your own organization. Show notes can be found at safetyofwork.com. Send any comments, questions, or ideas for future episodes to feedback@safetyofwork.com.