On today’s episode we discuss whether presenting people with facts or stories is more effective in changing their attitudes about safety.
“They found that the one that has a story of someone whose child had measles, along with the photo of the measles, had a very strong effect on attitude change…”
“Typically, as safety professionals, we often want to influence a change in what people are doing in the organization, be it managers or workers.”
“I would ask what sort of workplace are you running that the difference between whether people are working at heights safely...is a tiny increment in how scared they are of working at heights?”
Horne, Z., Powell, D., Hummel, J. E., & Holyoak, K. J. (2015). Countering antivaccination attitudes. Proceedings of the National Academy of Sciences, 112(33), 10321-10324.
Drew: You are listening to the Safety of Work podcast. Today, we are asking the question, “Are facts or stories better for influencing attitudes?” Let's get started. Hey, everybody. My name is Drew Rae and I'm here with David Provan, and we're from the Safety Science Innovation Lab at Griffith University. Welcome to the Safety of Work podcast. If this is your first time listening, then thanks for coming. The podcast is produced every week and the show notes can be found at safetyofwork.com.
In each episode, we ask an important question in relation to the safety of work or the work of safety, and we examine the evidence surrounding it. We're recording this episode on the 19th of March and, David, very appropriately, you've picked a topic relating to diseases sweeping across the country. So, what's today's question?
David: Drew, today's question is: are facts or stories better for influencing attitudes? Like you mentioned, this episode will probably come out in about four and a half weeks, so we might be able to test our foresight and make some predictions about what the world might be like in four and a half weeks' time, because we're just entering quite drastic measures associated with the COVID-19 virus. But this is an episode that I've wanted to do for a little while, Drew, so I'm glad that we are recording it.
Safety has focused a lot over the last 20 or 30 years on understanding and changing individual attitudes, beliefs, and even values, in an attempt to either maintain or change the behavior of people within organizations: workers, managers, safety practitioners, and so on. Some of our listeners might think that discussing attitudes is somewhat of a departure for us, Drew, but understanding individual and social psychology is no less important for any of the safety theories that you might subscribe to.
We are going to go on a bit of a wander through some research outside of the safety field, more in the medical and social psychology fields, exploring this idea of whether facts or stories are more effective for changing attitudes. Then we're going to draw some parallels to what we do in safety.
The motivation for this episode, Drew, which I think I first brought up with you a couple of months ago, was a research finding I came across that got me thinking about strategies we've had in safety for a while. In particular, there are a lot of great speakers who can come into your workplace and talk about their life-changing accidents, and I'd always been somewhat skeptical of this type of intervention.
Bring someone along who's had a really life-altering accident. Let them tell their story, that it could happen to you, and hope that changes the way people feel about safety within your organization. But this research, and thinking deeply about it, has somewhat changed my mind about the complementary nature of that kind of intervention and what it might do in your organization.
But before we start, Drew, I've done the introduction that you'd normally do. Do you want to talk a little bit about storytelling and sense making in safety more broadly?
Drew: Sure. I have to admit, having read today's paper, that I've always been skeptical of the types of interventions that trot out someone who's been injured to talk about their accident. The paper has not changed my mind in the slightest about the ineffectiveness and potential harm of those sorts of interventions. But I do want to be clear that storytelling is a really important part of pretty much all aspects of our culture.
It's embedded in the ways that we put ideas together and make sense of the world around us. If it's true of kids growing up, if it's true of how we learn, if it's true of how we make sense of the world, then it's going to be true of how we understand what's going on at work as well.
There's a wide range of research, from ethnography, to experimental research, to safety climate studies, which says that the stories we tell, at the very least, reflect how we feel, think, and make sense of safety. There is also evidence, not quite as strong, that changing those stories can influence the way that we think, feel, and act about safety.
David: I think recently, Drew, you published a paper with Derek on the accident narratives within incident reports, and how just changing the way an accident is described changes the way people feel about its causes and recommendations. Is that what you are referring to now?
Drew: Yes, so what we were testing there was whether changing the language that you use in an incident report and then showing those different versions of the same incident to people causes them to come up with different recommendations. I think incident reports very clearly show the power that stories can have at very specific points of time.
Any report about an accident is, in a sense, storytelling and sense making. You are taking a set of events that happened and giving meaning to those events. You're selecting particular events, joining them through a meaning-making process, and that very directly results in attempts to change the organization.
That's a little bit artificial, because incident reports have very clear recommendations coming out of them, and most stories don't. Most stories don't say, “I've told you the story about Goldilocks and the Three Wolves. Now, here are my five steps to avoiding [...] related mishaps,” but still, there's very often an implied meaning. The power that stories have is that they link together events and ideas into patterns, and those patterns then suggest what we are going to do next.
David: I am moving to the first paper, Drew, and I like this. We are trying to be more spontaneous with the podcast. We are less and less sharing our preparation before we put each other on the spot so I like the way that we started there. I said that I've somewhat changed my mind about some of these safety interventions and you immediately opposed and said that you haven't. This is at least going to be a bit of fun.
The paper that I picked out has to do with anti-vaccination, and pro-vaccination I suppose, and it's quite topical and quite emotional. It's not our intention in this podcast to criticize anyone's individual beliefs or attitudes around vaccination. But full disclosure: I consider myself very pro-vaccination, Drew. What's your personal position on vaccination?
Drew: I am both pro-vaccination and someone who's spent a chunk of my life essentially as an activist in the skeptical movement, which specifically tries to combat incorrect beliefs about things like vaccination. I'm not just pro-vaccination, I guess I'm anti-anti-vaccination.
David: Okay, so there's a very strong bias on behalf of us as hosts, but let's get to this article. The headlines that grabbed my attention were a newsfeed item that said, “How to convince vaccine skeptics and how not to,” and another that said, “Scientists discover the one thing that can change anti-vaxxers' minds.” Those headlines caught my eye on one of my social media pages, and I went and dug out the original research, out of curiosity really.
This study was published in 2015 in the Proceedings of the National Academy of Sciences in the US. The authors were Zachary Horne, Derek Powell, John Hummel, and Keith Holyoak. I looked these guys up, Drew. They are split across two universities in the US, the University of Illinois and UCLA, and they've published extensively in psychology journals. They're all from schools of psychology, publishing on knowledge, beliefs, learning, reasoning, creativity, a whole raft of individual psychological constructs.
I might just go into the method now, Drew, just to give a sense of what the study involved, and I'll try to do this quite quickly. Like many psychology studies, the research design was quite involved.
They developed a five-item vaccine attitude scale, with questions like “I think the risk of side effects outweighs the potential benefits” and “I plan to vaccinate my children,” each answered on a scale of 1 to 5. They checked the reliability of the answers to this scale against past vaccine behaviors, including asking people whether they had a flu shot in the past 12 months and whether that lined up with their vaccine attitudes on the scale.
They managed to recruit about 800 participants on day one who completed this questionnaire, Drew, and they also gave them a whole list of additional questions on several other moral issues like abortion and euthanasia. These served as distractors, because they didn't want participants on the first day to know that they were specifically being asked about their vaccination beliefs.
They put these vaccination-related questions amongst all of these other moral-issue questions. And then they inserted, Drew, something I had never seen before: attention-check questions. At a certain point in the questionnaire, a question said, “We just want to make sure that you are paying attention; select ‘Strongly disagree’ as the answer to this question,” and something like 10% failed that question on the first pass.
Have you seen something like that before in a survey?
Drew: I've seen checks in a survey before. I've never actually seen that particular one, which is kind of cool; in a way it indicates how many people just tick consistently without paying attention.
David: Particularly if they are being recruited with a $20 gift voucher or something to turn up and do a questionnaire. I thought that was quite novel. What they did might not be novel in psychology, but I enjoyed it anyway.
Drew: You should keep that 10% result in mind if you see studies that report things like, “10% of people believe that man never landed on the moon,” or “10% of people think that the moon is made of cheese.” That 10% could genuinely believe that, or it could be the 10% who just don't tick ‘Strongly disagree’ on the question that says, “Tick ‘Strongly disagree’ here.”
David: Or ticked all threes all the way through. Basically, 315 participants turned up on Day Two. First of all, the researchers checked that those 315 were representative of the original 800, because the study required people to come on two separate days, so there was obviously a fair attrition rate after the first day. That's why the study only ended up with 315 people.
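(An aside for data-minded listeners: the quality steps David describes, dropping respondents who fail the attention check and then checking whether the returning sample is representative, can be sketched in a few lines of code. This is a purely hypothetical illustration; the field names, codings, and toy numbers below are invented and are not the authors' actual data or analysis.)

```python
# Hypothetical sketch only: invented field names and toy data,
# not the Horne et al. dataset or analysis code.

def clean_responses(responses):
    """Keep only respondents who answered the attention-check item
    as instructed ('Strongly disagree', coded here as 1)."""
    return [r for r in responses if r["attention_check"] == 1]

def mean_attitude(responses):
    """Average score on the (hypothetical) vaccine attitude scale."""
    return sum(r["attitude_score"] for r in responses) / len(responses)

def attrition_shift(day1, day2_ids):
    """Difference between the mean attitude of Day Two returners and
    the full Day One sample; a large gap would suggest that dropout
    biased the Day Two sample."""
    returned = [r for r in day1 if r["id"] in day2_ids]
    return mean_attitude(returned) - mean_attitude(day1)

day1 = [
    {"id": 1, "attention_check": 1, "attitude_score": 4.2},
    {"id": 2, "attention_check": 3, "attitude_score": 2.0},  # failed the check
    {"id": 3, "attention_check": 1, "attitude_score": 3.8},
]
kept = clean_responses(day1)        # respondent 2 is dropped
shift = attrition_shift(kept, {1})  # suppose only respondent 1 returned
```

A large value of `shift` would indicate that the people who returned on Day Two differed systematically in attitude from the original sample, which is exactly what the representativeness check is guarding against.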
Here's where the main part of the research happened. They split these 315 people randomly into three groups. The first group read a paragraph written from a mother's perspective about her child contracting measles. The mother was Megan Campbell, whose ten-month-old son suffered a life-threatening bout of measles.
The story included first-person quotes such as, “We spent three days in the hospital fearing that we might lose our baby boy.” She went on to write that he couldn't drink, couldn't eat, was on an IV, and seemed to be wasting away. It was a very emotional, very personal account of the impact on a mother of a small child with measles.
They also provided a picture of a child with measles to this group, and three very short warnings just reinforcing how important it is for people to vaccinate their children. That was group one.
Group two read a whole lot of information summarizing recent research that vaccination does not increase the risk of autism, basically debunking a lot of the arguments used within the anti-vaccination community. All of this information was compiled from the Centers for Disease Control website. It was a whole heap of synthesized research findings, facts, and information, essentially making a very clear, logical, and factual argument for why vaccination was the right choice.
Then the third group which served as the control group read a completely unrelated piece about feeding birds. Drew, three groups. What are your thoughts about the way the design was done?
Drew: The only thing I want to comment strongly on here is that this was not a preselected group of people who are anti-vaccine. Even though, when we come to comparing the results, they broke the sample up into the group who most supported vaccines through to the group who least supported vaccines, even that least-supportive group includes people who are generally somewhat positive towards vaccines.
When we come to interpret the results, remember that this is testing on a general population which is generally vaccine-supportive. It is not testing on an anti-vax population.
David: Yeah, that's a good limitation to understand, Drew. Like you said, they did check the reliability of these results by taking the third of the sample whose attitudes leaned most towards anti-vaccination, to make sure the effect was still there within that group. But you are right, that third was just the most hesitant third of the whole sample. It wasn't necessarily 30% of people who are strongly anti-vax.
Drew, you've probably read ahead. You've read the article, so although I had the findings here in the show notes and was going to ask you what you thought they found, do you want to talk about the findings?
Drew: Sure. Remember, we are basically asking which of these three interventions has the most effect. They found that the one with the story of someone whose child had measles, along with the photo of the measles, had a very strong effect on attitude change. It made people more pro-vaccine. The factual information actually had the same effect as the information about bird feeding; telling people how vaccines work and how effective they are didn't really have any effect at all.
David: I found that so fascinating. We talk about evidence-based practice, and this whole podcast is based on providing evidence, I suppose, as close to factual information as we can get around safety. This idea that all of this information from the Centers for Disease Control, all of these synthesized findings out of research papers, had no impact on changing people's attitudes in relation to vaccines. I was fascinated by that finding.
Drew: To be honest, I was less surprised by that because this is something that I've encountered a lot. I struggle a lot personally with trying to be compassionate towards people who reject scientific information and try to understand where it's coming from.
One of the key things to keep in mind is that a large part of disbelieving in vaccines already requires you to distrust the scientific community and distrust doctors as a source of information. You can't both trust your doctor who is telling you to have a vaccine and believe that they're trying to force something on you that's bad for your kids. So when we give official government information about vaccines, if you've already got an anti-vaccine belief, then you've already got a distrust of that sort of information, a distrust of that source. Whereas if you are pro-vaccine, then any information is just going to support that existing belief as well.
David: Yeah, though interestingly it didn't necessarily enhance the attitude of the people who were already pro-vaccine either, because it was more like reinforcing what they thought or assumed they already knew. The researchers concluded basically that what doesn't work in changing people's attitudes towards vaccination is telling them that their fears are uninformed or erroneous. Telling them they're wrong, and telling them that they don't understand the science, doesn't work.
What does work is reminding parents, and even people who aren't parents, that measles and some of these other illnesses are terrible diseases, and that they can protect their children by vaccinating them, and doing that in a very emotional, emotive, and compelling way.
Drew: This is where I need to jump in and say that the bit about telling them that they're wrong not working is definitely part of the scientific consensus around trying to change erroneous beliefs. There's a wide swathe of literature that says that myth busting in fact reinforces myths, and fact checking just reinforces the belief that the intermediary is biased. When people hold beliefs which are against the scientific consensus, trying to explain that there is a scientific consensus just has no effect at all.
When people hold a detailed conspiracy theory, explaining to them that it's a conspiracy theory doesn't work. When people are scientifically ignorant, telling them that they are scientifically ignorant tends to get you the metaphorical equivalent of a punch in the face, rather than someone listening to you.
David: Yeah, or maybe the literal equivalent.
Drew: The other part of it, what they found did work in this study, is not in fact in line with the consensus. I think that is a function of the fact that they weren't testing this on a very skeptical, very anti-vax group to start with. There are lots and lots of models of how people change beliefs, and all of the major analyses say that none of them work consistently. This is really, really hard.
David: When you say consistently there, Drew, do you mean it's almost individual by individual, that there's no common human condition about how to change attitudes?
Drew: No, what I actually mean is that they don't hold true across studies. That's part of the trouble. It's possible that changing people's minds does need to be a very personal, individual thing, and the way we do studies is by taking a large swath of people and trying to see what on average changes them.
One of the mistakes people often make is that they just take a general population and test against it. What works for people who are already undecided is different from what works for people who disagree, which is different again from what reinforces the beliefs of people who already agree. You've got to really target your population, and you've got to really target why they disagree.
For some people, hesitancy about vaccines can just be a lack of information. Some people don't vaccinate their kids because they never get around to it. Some people believe that the whole medical complex is out to get them. How you respond to each of those differs. If people just aren't getting around to it, then bombarding them with reminders works. But that doesn't work when you test that intervention on a genuinely hesitant population.
David: I think, Drew, it's important to talk about a practical takeaway when it's really obvious like that. I was also thinking about how safety practitioners approach influencing people within their organization.
Typically, as safety practitioners, we often want to influence a change in what people are doing in the organization, be it managers or workers, and going in telling them that what they are currently doing is wrong, and pulling out a fact to do with the regulation or something like that, is probably going to get you that universal punch in the face.
But like you said, finding out the specific logics for individuals, whether they are doing something because they don't know, because they can't, because they're resource constrained, or whatever the reason is, and having an influencing tactic that aligns to whatever position the person is starting from, is going to make you a more effective safety practitioner. Is that a reasonable conclusion?
Drew: Yes, and I think that sort of reasoning does take you down a path which says that there are times when telling people what the rule is, is very effective. You have some people who are very rule minded; they just thought the rule was one thing, you tell them that the rule is another thing, and it's, “Oh, thank you for telling me that. Now I know. I'll make sure to follow it.”
For some people, these personal stories of injuries are just really powerful. They think, “That guy is exactly like me. I felt it. I feel it, and suddenly I'm coming out of this feeling less sure, and I'm going to check next time.” There's a big difference between asking whether this works as a general strategy, where the answer is almost always no when it comes to trying to change beliefs, and what works for an individual, where any one of these tactics might work.
Maybe it's worth going through some of the general ideas of how to change beliefs that can work for individuals, but don't hold up in these big scale studies.
David: I'm very happy for you to do that, Drew. You’ve gone and looked at it for us so I'm going to learn along with our listeners right now.
Drew: The one they are testing in the particular study we are talking about today, they describe as the idea of an alternate belief. Rather than trying to fight directly against the belief, you just give people a stronger belief to latch on to. The idea here is that people might be afraid of vaccines, but if we can make them afraid of measles instead, then it's going to be hard for them to hold both of those beliefs at once. One of them is going to win. We never actually have to combat their belief about vaccines; all we need to do is give them a stronger fear of the measles. That's one way of going about it.
Myth busting is where we try directly to tell them: the reason you think vaccines cause autism is because of this particular study that was found to be fraudulent, the guy behind it got struck off and prosecuted for being fraudulent, and all other studies have contradicted it.
That's the approach that really appeals to scientifically minded and skeptical people, but they are not the people who held anti-vaccine beliefs in the first place. Then there's the information deficit approach: just explaining to people, gently and calmly, here are the facts. Most science communicators suggest that the information deficit model is not the one that works, but it's the one a lot of teachers default to, because that's how we are used to persuading people: assuming they start from nothing, and that giving them knowledge is a way of changing beliefs.
A couple of the more effective ones work quite well for some people, just not consistently in the broad studies. One is making people believe that everyone else does it. There's a fairly famous study, which turned out to be totally fraudulent, where they claimed to change people's voting behavior in optional-voting elections by getting them to believe that everyone else on their street had already voted. It turns out the researcher was actually fabricating the whole thing, which is a bit disappointing, but that model has been tested in some other studies and sometimes has some effect. If people believe that everyone vaccinates, and it's just a normal thing to do, then they might do it.
David: Like what we spoke about at the start: if people believe that everyone else is hoarding food and toilet paper from the supermarkets, maybe they feel they have to do it as well, even if they would never otherwise believe that they needed to.
Drew: Yes, exactly. The evidence is a little bit up and down on whether the next one works or not, but at the very least, it's much more friendly than the other tactics: actually engaging with and acknowledging people's concerns, and trying to find common ground. You don't try to argue against them. You say, "Look, I understand why you feel this way. You don't want to vaccinate because you care about your kids, and I want you to vaccinate because I care about your kids. Let's leave the vaccination aside for the moment. Let's just agree that we both care about the kids."
Let's agree that things are scary, that the kids have put up with so much, and that I understand you don't like giving your kids needles; you don't like needles yourself, so why would you want to do that to your kid? Acknowledging those concerns and fears, and connecting as a human being, is believed in some cases to help persuade by building up a relationship of trust. I personally like it because even if it doesn't work, at the very least you've made a friendly connection with another human being. You don't both go away hating and distrusting each other because you're on different sides of the issue.
The other one that has some evidence behind it, which I really don't like because it goes to the other extreme, is essentially a version of medical bullying: I know better than you, I'm going to vaccinate your kids, and of course I'm just going to do it. You basically dare the person to object loudly enough to count as an objection. That does work when people present to a doctor at a time when their child would normally be vaccinated; it does have some effect on the kid walking out of the consultation vaccinated, but with who knows what harm to that relationship going forward.
David: Like every week when we dig a little into some of these questions, there are whole disciplines of research, complicated frameworks, and different effects going on all the time. I think what you've outlined there are six different ways of thinking about how to influence attitudes, some of which are more or less supported by research findings, and some of which are a little more humane or a little more personable, if you like.
I suppose this one finding isn't going to be a generalizable thing, but I did start thinking about what this finding from this research might mean for some of the other things we do in safety. Is there anything else you wanted to say before I moved on?
Drew: Just one thing I wanted to point out is how hard it is to get good research on these things. The way we do a lot of these studies is over short periods of time. We measure people's belief, we give them the thing that's supposed to change their belief, and then we measure again. Obviously, that's not how people form opinions or change their minds. You need to have much longer periods of time to test these things. I think that definitely translates into safety.
If we want to know whether a speaker is effective, we can't measure the effectiveness just by surveying people beforehand and then straight afterwards on how much they liked the speaker. We would need to have a consistent type of speaker coming into the organization regularly and measure the effect of that messaging over time.
David: That was one of the two things that I immediately thought of, and I've mentioned them both in a roundabout way. When I read this study I thought, gee, I wonder whether we can form conclusions about these workplace safety accident speakers, people like James Wood, Alan Newey, Charlie Morecraft, et cetera, who come into an organization and tell the story of their own accident and how it changed their life: what their life was like before, what it was like after. They have people see them as someone just like them who thought it couldn't happen, and maybe reflect on the attitudes they hold towards work and safety.
I thought about that and I also thought about how we influence people in our organization as safety practitioners to try to change some attitudes and behavior, so that were the two practical things that I wanted to briefly talk about now.
For the first one, think about a scenario, Drew, based on this experiment, where I want to engage my workforce around working at heights. Let's put this in the context of physical safety, Drew. I'm not saying for a minute to ignore doing all of the heavy lifting on physical risk controls in the workplace. In addition to that, I'm curious about people's attitudes towards safety. So here's the scenario: if I wanted to engage the workforce around work at heights, would I be better to prepare a presentation with all the facts, stating for instance that 65% of people who fall from a height of greater than one meter suffer a serious injury, with all of the rational, logical, factual information about working at heights? Or should I get a speaker to come in who has fallen from a height, to tell their story about how it changed their life? That's scenario one. What are your thoughts?
Drew: My honest, immediate reaction is that I would ask what sort of workplace you are running where the difference between people working at heights safely or not safely is a tiny increment in how scared they are of working at heights, because that's effectively what both of those strategies are trying to tinker with: the level of inbuilt fear and caution. I'm struggling to think of good examples where that really is the difference.
If that were the difference, we'd be seeing much greater variation in individual safety on exactly the same work sites, because there are so many things other than what we tell people that feed into their risk perception. Think of the drastic difference between the guys who put in my pool, working happily at two meters with no protection, and guys on large construction sites who put yellow tape around any bump in the ground of less than half a foot. Those sorts of differences have to be driven by far more than what information we give people every six months.
But if we zoom right in on that question, put all that aside, and assume that the level of risk perception is what we want to tinker with, which one is going to make a difference: having a story you can recall, or having a fact you can recall? I would guarantee that it's the story.
David: I appreciate you playing along with me, Drew, even if with a fairly long disclaimer. Maybe my mind is a little simpler than yours, because those were the sorts of practitioner-type scenarios that were running around in my head. If we can put all that context you described aside, the point to reinforce out of this finding is that part of the human condition is to latch on to stories, particularly when there's some kind of emotional connection tied to the story, as opposed to recalling facts that we've heard.
I went looking for some papers. I tried to find whether anyone had done this research, because there's a very strong claim on Charlie Morecraft's webpage that following his speeches in organizations there is a “proven 40% reduction in accidents.” I went and tried to actually find the evidence.
I don't know if you are aware of any, but I searched extensively on accident storytelling, influence, attitudes, all of these search strings, and I couldn't find any safety research that tested this storytelling approach around accidents and incidents against safety attitudes. There is some research to do with the use of narratives, stories, and accident rates and things like that, but it was so poorly performed that I couldn't even bring myself to show you its claim that an increase in safety narrative storytelling reduced accident rates.
Drew: David, I don't think it's coming to any surprise to you that I think that the “proven 40% reduction” claim is nonsense, not just because I don't believe that it's possible, but because I don't believe it's possible to do research that's going to create that evidence.
You would need organizations that are regularly having dozens of accidents, and then this tiny change in belief that you get from the story would need to have a massive influence repeatedly, over and over and over, with these really, really dangerous organizations. That's almost self-contradictory. No, there's very little research on the effectiveness of storytelling directly.
There's some really interesting research on how stories are used to communicate hazard information in organizations. We might even actually do an episode on it because I personally find it really interesting the way that the lessons that come out of official investigations and get shared are often less effective than the lessons that come out of people in the smoking huts, having conversations, and basically warning each other using stories of near misses. Those spread through the organization much quicker than the official investigation reports do with quite different messages.
Famous is quite the word when it comes to accident studies, but there's one conducted in Denmark where the official investigation blamed a failure of communication protocols for some trackside workers nearly getting hit by a train. Whereas if you listen to the stories that the workers tell, which percolate through the organization for years, the lesson is: when you phone in your location to the controller, don't trust that they've recorded it properly. That's probably much better advice than the official investigation recommendation. The stories do perpetuate that knowledge.
David: Yeah. It's the difference between asking what do you do about communication failures as a cause, as opposed to what do you do about the recording accuracy of the phoned-in locations from track crews? One you can actually do something about, one you can't.
Drew: Yup. I think that there's an underlying message here, which is that the effectiveness of stories is going to depend on the type of lesson and result you're trying to get out of the story. I think some people who tell the story, they're basically just trying to say I was careless, I got hurt, be more careful, care more about safety, and that's a really useless message.
Whereas stories that embed particular hazards that would be invisible, except for the fact that people tell these stories, can actually be really powerful. Think about the type of story and the type of learning that's carried by the story, your Trojan horse inside the story into people's brains. If you think strategically about what sorts of messages you want, make those slightly more sophisticated: slightly less about being scared, slightly more about influencing people into understanding particular hazards, or understanding the dangers of work that might not be visibly dangerous. That, I think, can be really powerful.
David: Drew, it was a surprise to me that the first search result that came up when I started pushing all of these search strings was a paper that you had written called Tales of Disaster: The Role of Accident Storytelling in Safety Teaching, from Cognition, Technology & Work in 2016. I wasn't even aware that you had written that paper.
Drew: Just to be clear, this is not a direct research paper, this is much more of a reflection on teaching type paper. This is me trying to think about how I teach, and trying to dredge up evidence to justify the way I teach, and totally failing, but thinking that it's worth communicating anyway.
I realized that, particularly in teaching engineers about safety, or teaching safety engineering, we use lots and lots of stories. We use lots of disasters. I host a whole podcast which is telling people about disasters. I was wondering, why do we do that? Is it just to entertain people? Is it to scare people? Is it to teach something specific? Is that a good teaching strategy? I wanted to understand that.
The result of that paper was really the formulation of a theory that hasn't been tested, rather than an actual result. The theory, which is consistent with what we know about teaching, is that we should be using accidents, and we should be using them to try to convey an understanding of how accidents happen, how events connect together, and what accidents are like from the point of view of people before the accident.
Some of the victim-on-the-stage type stories fit that model. They're trying to say: what was life like before the accident? What was I thinking? Can you recognize yourself in this? I think the model of teaching is probably the same. It's just that I believe in trying to convey a slightly different message using that story, rather than just "don't be complacent."
David: Drew, I'll put a link to that paper in the show notes in case anyone wants to have a read. I think we're up to about two or three ideas for future episodes from today already. That's helpful for the future episode list. How about we move on to practical takeaways now?
If we cast our minds back to the paper that we reviewed on pro- and anti-vaccination attitudes, they basically concluded that it's more effective to accentuate the positive reasons to vaccinate and take a non-confrontational approach, which is what you described, Drew, when you were talking about the different models for changing beliefs.
If you challenge people directly, they tend to become more entrenched in what you're challenging them about. They concluded this study has broad implications for persuading skeptics on a wide range of issues. Fighting something head on is never going to be an effective way to change someone's mind.
We might do a future podcast on safety practitioner influence, because there are three or four really good papers about how safety practitioners have historically attempted, and currently attempt, to exert influence on others in their organization. A lot of that research suggests that the way the safety profession goes about trying to exercise influence is not consistent with the most effective ways of influencing. That's probably why we see such a large range of professional practice within the profession, and also a big gap between people who are effective in their role as safety practitioners and people who are less effective.
Think about this study in the context of how you go about using stories, and how you approach the people in the organization whom you're trying to influence or whose thinking about safety you're trying to change.
Drew: None of the evidence guarantees that what you try is going to work, except to say that if you don't try to understand where someone's belief starts from, and don't try coming at it indirectly rather than head on, you're guaranteed not to succeed. So your strategy should start with understanding the position of the person you're trying to influence.
Genuinely ask them about their thoughts and beliefs, find out where they're coming from, find out how strongly they believe, find out where they justify that belief from, and then come at it sideways rather than head on.
David: I could say something like, people don't care how much you know until they know how much you care. I was also thinking about the use of stories in organizations. One example from earlier in my career was in relation to lessons learned. We ran a program in the organization involving the whole team from a particular operational situation or incident: the workers, the management, and the safety practitioner.
That team would actually go around to the different sites and different businesses in the organization and tell the story of what happened and what they learned. It's a little bit like what you said about the conversations in the smoking huts. We did this as a replacement for our safety alert process, which had been sending an A4 page around saying this is what happened, this is what you should do.
Instead, we got the teams to discuss it with the other teams, in their own words, and answer the other teams' questions, to try to prevent some of that distancing through differencing, the gaps in information, and the censoring of the story as it moved through the organization. What would be your thoughts on that process, Drew?
Drew: I think it's a great idea. I think the trap to avoid is making people specifically list what they think the actions or recommendations are. In a good story, the lessons are buried within the story, and the problem comes when, having had people tell the story, you ask them: okay, so now tell us what the takeaways are, now tell us your top five recommendations.
That's often where a sophisticated understanding gets dumbed down, and that's insulting to both the person telling the story and the listener. If the story is well told, the lesson is embedded. You don't need to say, okay, based on my description of the accident, we should have done a risk assessment, we should have been more careful, you should be more careful. Just let the story speak for itself.
David: Yup. You can go back about four episodes if you want to hear our point about dumbing something down into five bullet points, and the information that gets lost with that type of approach. Drew, are there any other practical takeaways that you want to pull out of this wandering discussion?
Drew: I guess the final one is just that it is very much an open question what does work to change beliefs, and how desirable that even is. This is a part of practice where we shouldn't be reaching for ready-made solutions. Just because there are speakers out there with pre-packaged presentations that lots of other organizations have used, don't assume that the fact everyone else uses them means they are actually effective. Popularity of a speaker is not proof that their message is helpful for your organization.
David: Yeah, good advice. In terms of invitations to our listeners and questions we might prompt for discussion: I'd be really interested, Drew, to know how people are currently using stories, storytelling, or case studies within their organization to convey more implicit safety messages. Are they replacing any of their artifact-type processes with more narrative-based processes? I was also wondering if anyone is doing any measurement or evaluation of these processes and practices within their organization. Anything else you'd like to know, Drew?
Drew: No, those are good questions that I'd like to know the answers to. David, I'll throw it to you. Today, we asked the question: are facts or stories better for influencing beliefs?
David: Based on what we covered along the way, I think my answer would be that stories are definitely better for influencing beliefs. That disappointed me somewhat, and I'd probably put a rider on there: as long as it's a nonfiction story. I wouldn't encourage anyone to just make up stories and start sharing them, but clearly stories, whether people's own or ones that they hear, are more likely to be effective in influencing beliefs.
Drew: That's it for this week. We hope you found this episode thought provoking, and ultimately useful in shaping the safety of work in your own organization. As always, you can contact us on LinkedIn or send any comments, questions, or ideas for future episodes to firstname.lastname@example.org.