The Safety of Work

Ep. 106 Is it possible to teach critical thinking?

Episode Summary

In this episode, we’ll discuss a 1993 paper by Jonathan Baron, titled “Why Teach Thinking? An Essay”, published in Applied Psychology: An International Review. Jonathan Baron is an American psychologist and Professor Emeritus of Psychology at the University of Pennsylvania, specialising in the science of decision-making.

Episode Notes

Baron's work focuses primarily on judgment and decision-making, a multi-disciplinary area that applies psychology to problems of ethical decisions and resource allocation in economics, law, business, and public policy. 

 

The paper’s summary:

Recent efforts to teach thinking could be unproductive without a theory of what needs to be taught and why. Analysis of where thinking goes wrong suggests that emphasis is needed on 'actively open-minded thinking', including the effort to search for reasons why an initial conclusion might be wrong, and on reflection about rules of inference, such as heuristics used for making decisions and judgments. Such instruction has two functions. First, it helps students to think on their own. Second, it helps them to understand the nature of expert knowledge, and, more generally, the nature of academic disciplines. The second function, largely neglected in discussions of thinking instruction, can serve as the basis for thinking instruction in the disciplines. Students should learn how knowledge is obtained through actively open-minded thinking. Such learning will also teach students to recognize false claims to systematic knowledge.

 

Discussion Points:

 

Quotes:

“It’s a real stereotype that old high schools were all about rote learning. I don’t think that was ever the case. The best teachers have always tried to inspire their students to do more than just learn the material.” - Drew

“Part of the point he’s making is that not everyone who holds themself out to be an expert IS an expert… that’s when we have to have good thinking tools… who IS an expert and how do we know who to trust?” - Drew

“Baron also says that even good thinking processes won’t necessarily help much when specific knowledge is lacking…” - David

“The smarter students are, the better they are at using knowledge about cognitive biases to criticize other people’s beliefs, rather than to help themselves think more critically.” - Drew

“Different fields advance by different sorts of criticism… to understand expertise in a field, you need to understand how that field does its internal critique.” - Drew

 

Resources:

Link to the paper

The Safety of Work Podcast

The Safety of Work on LinkedIn

feedback@safetyofwork.com

Episode Transcription

David: You're listening to The Safety of Work Podcast, Episode 106. Today, we're asking the question, is it possible to teach critical thinking? Let's get started.

Hey, everybody. My name is David Provan. I'm here with Drew Rae. We're from the Safety Science Innovation Lab at Griffith University in Australia. Welcome to The Safety of Work Podcast.

In each episode, we ask an important question in relation to the safety of work, or the work of safety, and we examine the evidence surrounding it. Drew, what are we going to talk about today?

Drew: David, before we get into the main episode, I just wanted to mention something a little bit more personal. We're both podcasters. I listen to a lot of podcasts; we were talking about this before the episode, and I think you probably listen to a bit less these days. But in the past few days, it's come out that one of my favorite podcast co-hosts was engaged in unacceptable behavior, resulting in basically the breakup of the podcast and a lot of the community around it.

I've been pretty upset about it. It was a reminder to me of how important the parasocial relationships are that we form with people we watch or listen to on a regular basis. David and I do this podcast basically because it's a good excuse to hang out together. We like to talk about interesting stuff. We didn't get to do that after David stopped his PhD, so it was a good way to restart it. We know there's lots of people who regularly hang out virtually with us when we do these recordings.

We know a few of you personally. Some of you we've met once or twice, some of you we exchanged comments on LinkedIn but haven't actually met. Despite the fact that we don't know you well, you're willing to basically let us talk to you for up to an hour at a time every couple of weeks. That's pretty generous of you. There aren't many people I've got who will just listen to me talk about my special interest. We give you free content, but you give us your time and you give us your trust in regularly listening. I just wanted to let you know how much I personally appreciate that.

David: Yeah, thanks, Drew. I do as well. It's not a great situation, but a nice reminder. Drew, today's topic comes from a paper shared by a PhD student we both supervise, Russell McMullan. Russ has been on the podcast before, and he's doing his PhD on engineering decision making for safety. As part of that, he's spent the last year or so doing a very broad and very deep literature review on multidisciplinary approaches to decision making research.

We make decisions every day, but wait until that literature review comes out and you see just the range of ways that we can think about what a decision is and how we go about making one. This paper we're going to talk about today is one of those gems that he dug up in the process of doing a broad literature review.

Drew: It's one of those things when you're supervising a PhD student that you make them do a ton of reading. Very occasionally, they make you do some reading back by sending you a paper and saying, hey, what do you think about this one?

David: Critical thinking, Drew. Do you want to share some opening thoughts around that?

Drew: Yes. Critical thinking is one of the most overused terms I think we have, particularly overused in education and particularly overused as a complaint. People say high schools and universities should be doing more to teach and encourage people to think critically. Another way of saying it is that it's less important what people know or what they think, and more important that we teach people how to think. Why don't we do more of that?

Even though it's a very vague thing, and it's really hard to pin down exactly what we mean by critical thinking, sometimes we just mean, why do people not think more like us? Sometimes we genuinely mean something specific. I think it's making a bit of a resurgence with tools like ChatGPT.

David, you might know; I can't remember the name of Google's equivalent of ChatGPT for search that's come out recently. It's become very clear that just producing data and finding things on the internet are no longer things where humans have a unique selling point, so we need to focus more on things that humans are uniquely good at. That includes our ability to deeply understand things, to be reflective, and to critically evaluate knowledge rather than just regurgitating it.

Complaints have been coming up in the news again recently, but it's not a new thing. It's one of those things where every generation complains about the next generation's lack of critical thinking. This particular paper comes from 1993. For me, in 1993 I was just out of high school, in my first year of university. David, I think you were still in senior high school.

David: I wasn't quite there yet. I was somewhere halfway through high school.

Drew: Most of the message of the paper is still relevant. We'll point out where the paper is a little bit out of date. I think the ideas are actually fairly helpful in helping us think about thinking critically and learning. I thought it might be a bit fun to read the paper as a reflection on how we teach and develop safety professionals.

As we go through, how about thinking about your own education, your own high school, your university? Maybe on LinkedIn, you could tell us a bit about how we could be doing a better job teaching safety at university if we're supposed to be encouraging these critical thinking skills.

David: Drew, on that reflection about safety professionals, I was doing a bit of reflecting, because I get the opportunity to do a lot of, I guess, teaching with safety professionals these days. Where do we spend the most time when facilitating learning conversations with safety people? It touches on biases and heuristics, and on talking about the way that we think. Interestingly, Baron suggests that these aren't necessarily innate, hardwired thinking processes for people; they are learned and shared. I found that interesting.

A large part of the learning that we do as safety professionals is not what to do, but how to go about doing it. It's not necessarily what advice to provide, but how you go about engaging with other people, balancing your perspectives with theirs, humble inquiry, and constructive dialogue. It's not about how to do a risk assessment, but understanding how we think about and perceive risk, why we do that in the way that we do, and why some people do it differently than others.

It's not what's a good strategy, but what does it take for individuals and organizations to actually change, be influenced, and adjust beliefs? It's not even how to do an investigation. We've talked about investigation a few times on the podcast, but what's the mindset that an individual and organization needs to have? What are the conditions around an incident investigation that allow us to deeply understand and learn? My initial reflection, reflecting on how we normally educate, train, and develop safety professionals, is that the focus is almost always on the knowledge component as opposed to the thinking and understanding component.

Drew: I don't think people would disagree that our goal is to teach people how to think. We just often put that up as a headline, and then we don't necessarily do a lot in our methods of teaching that actually works towards that goal.

David: Drew, let me just introduce the paper to get us started into the content. It has a single author, Jonathan Baron, Professor of Psychology at the University of Pennsylvania. He got his PhD in 1970 and has had a long academic career, working specifically on topics concerning learning, rationality, and, more recently, political thinking and decision making by voters. The title of the paper is Why Teach Thinking? An Essay, published in Applied Psychology: An International Review.

Drew, you'd know the journal Applied Psychology. I remember reading a lot of Applied Psychology articles during my undergraduate psychology degree. It's a bit like Safety Science, the journal: a generalist, well-known flagship journal that contains a mixture of theory and empirical research results, and also papers that are more like commentary or essays rather than original research. I guess we've published a few of those in Safety Science as well.

This particular paper makes it very clear from its title that it is a work of critical reflection rather than research. However, Baron does reference a lot of his previous publications throughout the paper, some of which involved more empirical research activity. This is someone who has researched, thought, and read a lot about thinking, and has got to the point of saying, okay, now I want to lay my thoughts out on this topic.

Drew: I love that laying down: this is what I think, here's the evidence, I produced it, why haven't you already listened to it?

David: We'll break it up into sections. We'll talk about what the paper says in each section and then we'll put our own thoughts in as we go. Drew, do you want to get us started?

Drew: Okay. The first section of the paper is appropriately just called Introduction. It starts off with the groundbreaking idea that the last decade has seen a rebirth of the idea that schools should teach students how to think. I don't know about you, but immediately I'm thinking, you could write that in 1993, you could write that in 2000, you could write that in 2010, you could write that in 1950. It's a pretty timeless idea.

In fact, Baron then goes back later in the paper and talks about Aristotle and his theories of how to teach people how to think. It is a common thing that people worry about. How do we make sure that we're not just teaching people knowledge? How do we make sure that people are learning actual thinking skills?

Just out of interest, I looked up the Australian National Curriculum. I saw that they've got seven general capabilities, starting with literacy and numeracy. One of the seven is critical and creative thinking. I think that's pretty much what Baron is talking about when he says that it's incorporated into many statements of goals by educational authorities.

It's a grand-scale statement that we want people to be literate, we want people to be numerate, and we want them to think critically. I won't read it out, but the curriculum has a bit of a statement that explains what critical thinking is, what creative thinking is, and why it's a goal of education just as much as literacy or numeracy.

David: Drew, we're reflecting on our own time at school and university, and then on our kids' education. Even though that statement is quite timeless, I think we have seen quite significant changes in the education curriculum, and clearly in delivery in the classroom, over the last, well, let's say, 30 years since this paper was published.

I know that we did projects. Baron's quite critical in this paper of things like projects, where you're coloring in, cutting things out of magazines, and not really doing any thinking or learning. My kids at school do units of inquiry where they need to have inferences, hypotheses, research, and a whole bunch of things that I never did when I was at their stage of schooling.

Drew: Although interestingly, the point Baron is making is that when people introduced projects, it was to try to encourage critical thinking. Our ways of trying to do it have evolved, and hopefully have got better as we've learned more about teaching, but the aim of teaching critical thinking has always been there.

It's a real stereotype that old schools were all about rote learning, but I don't think that was ever the case. The best teachers have always tried to inspire their students and teach them to do more than just learn the material. We're getting better at different ways of doing it. We're learning more about how learning works. That's really what Baron is getting at in this article.

What do we currently know about how to teach critical thinking? Where are we maybe still just following bad habits that aren't really working? What can we do in the future? It's really interesting to look back at this point in 1993 and ask, what things did he suggest that we have actually embraced wholeheartedly since then? What things have we maybe just kept doing without thinking about it?

David: Drew, what else do you want to pull out of the introduction?

Drew: A couple of things. One of them is that this idea of critical thinking isn't just a school thing. That's why we thought it'd be relevant to talk on the podcast. You've got lots of writers who try to encourage people in business to think critically as well.

This paper cites Edward de Bono. I don't actually know what the age range of our listeners is. Your more modern reference might be something like Kahneman's Thinking, Fast and Slow. There have always been these pop culture writers who want to encourage business people, and just people out in the world, to think better and think more critically as well. You get consultants who make their careers out of trying to get corporations and institutions to think better, and who run critical thinking and creative thinking workshops.

Again, this is a timeless statement that the world's become more complex. It's become more difficult for the average person to understand. Being in a democracy is harder. We need to rely more on experts and think about the nature of expertise. Again, written in 1993, but things have continued to get harder on that front.

David: I think we'll talk about expertise throughout this paper a bit as well because there are some really interesting sections around there. It reminded me of, I guess, those little snippets that you might see on social media, those little memes or something that says, I don't care if you've got a PhD. I watched a three-minute YouTube video, so we can now have a healthy debate.

Drew: Part of the point he's making is that not everyone who holds themselves out to be an expert is an expert. That's when we have to genuinely have good thinking tools around expertise: who is an expert? How do we know who to trust? When is the right thing to do to just defer to the expert? And when should you actually use your own judgment and challenge the experts? There are times when you should do one, and times when you should do the other, and you want to get it right.

At one extreme, you just do what other people tell you to do, a servant of the bureaucracy. At the other extreme, you're an anti-intellectual, sovereign citizen conspiracy theorist. We'd all, I think, aspire to be somewhere in the middle, but how do you know where in the middle to be? Where does the middle lie in each situation?

David: After the introduction, we get into the next section titled The Current Rationale. It opens up with, what is thinking? Where do we start? What is thinking?

Drew: Okay. He gives us a definition. He says, thinking is a mental activity that's used to resolve doubt about what to do, what to believe, or what to desire or seek. Thinking about what to do is decision making. Thinking about what to believe is part of learning.

He slightly pushes aside the thinking about what to desire. It's important, but not particularly relevant to this particular discussion. He says thinking about what to believe falls into things like scientific thinking, hypothesis testing, and making inferences about connections between things, your causal relationships, and contingencies.

David: Drew, you've got a note that this section is a bit out of date. You mentioned Kahneman and Tversky and Thinking, Fast and Slow; there's been a lot more work on heuristics, the entire discipline of naturalistic decision making, NDM, which has some crossover with resilience engineering, and then a lot of the work on sense-making, Gary Klein's work, and others. There's actually quite a lot that's been thought about, researched, and written on thinking in the last 30 years.

Drew: Yes. I think we know a lot more now than we did then about how people actually think. But I think general principles about the need to be able to think and what we use thinking for are probably still pretty relevant. He's got a very detailed referenced summary here, and I would not use that as your foundation for what we currently know about how people think. That bit's certainly out of date, even if his central point isn't.

In particular, he says that decision making is the final common path of thinking. I think that's a bit that has been rejected now, this idea that everything points towards a decision. We now recognize that people can make decisions without even knowing that they've made decisions or justify decisions after they've made them.

David: And maybe think without arriving at a decision. It's probably about now in the podcast that people are going to pick up on a couple of things. One is that thinking is something that we all do every day, but maybe we haven't really stopped to think about how we think. The first thing I said to Drew when I jumped on this call, after spending an hour or so reading about thinking, was just, gee, my head hurts. Reading and thinking about thinking is actually quite hard.

Drew: It might be a little bit easier if we move on to his next subsection about good thinking, which is I think fairly straightforward. We're thinking about, what counts as good thinking? He says good thinking is basically just anything that's likely to achieve the goals of the thinker. He says, some ways of thinking are likely to be better at that than others.

If your way of thinking is to leave all of your decisions to a magic eight ball, you're probably not going to achieve your goals. Some ways of thinking are going to work better than that, and some are going to work worse. He says we've developed normative models through philosophy and psychology, but they're all limited.

We've got things like holding people to logic and using logic to evaluate thinking. But the trouble with logic is that traditional logic is really about propositions and the relationships between those propositions. If A is true and B is true, does that necessarily mean that C is true? He says, okay, if you don't like that, you can go to something like probability, which allows us to deal with uncertainty. But even probability requires us to subscribe to things like utility theory that says we're trying to maximize the utility that comes out of any particular decision.

There's no normative model that everyone agrees with that we can apply and just say, okay, here is the standard against which we're going to judge decisions. Although interestingly, and I haven't actually followed up on this, he's got another article in press that does make some suggestions about this. Maybe somewhere he's got the magic answer that no one else has found.

He's got a fun little bit where he says that even if you think of decision making as a search for options, and then a search for evidence to evaluate each of those options, you can still recognize that utility theory has to apply to the search itself, because spending time thinking has costs. We can't afford to make every decision perfectly by perfectly searching for perfect information. We've got to have tools that let us do our thinking without getting totally hung up on every decision. As he says more simply, we can think too much.

David: Good thinking is about achieving the goals of the thinker, and there are some normative models and ways of doing that. Baron also says that even good thinking processes won't necessarily help much when specific knowledge is lacking, so you can have a good thinking process, but if you don't have the background field or domain knowledge, it won't necessarily help you.

It can help you to acquire that knowledge effectively and apply it once it is acquired. Critical thinking, even without domain knowledge, is a good way of acquiring domain knowledge. I'd never heard of Myside Bias before I saw it as a section heading in this paper.

Drew: No, I hadn't either. I think he's just using that as a catch-all term for what we would today think of as cognitive biases. He's making the underlying point that almost all cognitive biases are, to a certain extent, types of self-confirming biases. They tend to lead us to maintain our current beliefs rather than to change our beliefs. It's helpful to think of them as 'my side' biases. He says it doesn't necessarily mean that the bias favors your side, because you can have people who chronically blame themselves for everything. That would still be maintaining your existing belief, so it's still myside bias.

He gives some examples, probably familiar to most of our listeners, but I'll just call out a couple. One of them is what he called the selective exposure effect, which now tends to be called selectivity bias, where people tend to search out information that would support their present views, and then treat it as if it were randomly or objectively gathered evidence. This is your classic person on YouTube who says, I've done a lot of research into this topic, by which they mean, I've watched lots and lots of videos that confirm my views. The bulk of the evidence that I've seen confirms my views, therefore my views are correct.

David: I've heard of that one as selection bias. We've already done an episode around influencing beliefs and vaccination, and I think whether you're pro-vaccination or anti-vaccination, you will select out the research and the information that supports you not having to change your view, which is not what critical thinking is about.

Drew: Exactly, and he goes through some similar things. Even when we have found things, we tend to evaluate evidence against our beliefs more skeptically than we evaluate evidence that supports our beliefs. We don't tend to look for counter evidence as much as we usually should. We're subject to groupthink, which is finding excuses to fall in with the views of others who are around us.

He says, if you look at all the different steps of making a decision, then cognitive biases fall into each stage so that when it comes to framing the question, when it comes to searching for evidence, when it comes to selecting evidence, when it comes to evaluating that evidence, when it comes to integrating it, at every step, we have cognitive biases that lead us towards confirming the beliefs that we already hold.

David: I guess when we think about critical thinking and good thinking processes, this section is really just a warning: you need to be quite deliberate with your thinking process to avoid the less critical aspects of thinking associated with all of these types of biases, which we all have.

Drew: David, before we go too far down the rabbit hole of just reciting all of these thinking errors, I think we should jump to his bottom line, which is that, by any reasonable description of thinking, students already know how to think. The problem is they don't do it as effectively as they might. That's a really insightful summary, because there's been lots of research since that has backed up his fundamental claim here, which is that teaching students about cognitive biases, teaching them about logical fallacies, and even directly teaching them logic, just gives students more sophisticated language to justify all of their own existing beliefs and to criticize beliefs that they don't agree with.

It's one of the most important lessons in teaching critical thinking that teaching people about mistakes in thinking doesn't actually make them think critically. It just gives them lots of tools that they can use alongside their existing biases. There's funny evidence that the smarter students are, the better they are at using knowledge about cognitive biases to criticize other people's beliefs rather than to actually help themselves think more critically.

David: That's interesting. I wasn't aware of that body of research. It makes a little bit of sense now that you've said it, but I'm just reflecting on what we do in safety, teaching people about the mistakes in the way they're currently thinking in the hope that they won't make them. In safety, a lot of the time, we teach people what not to do in the hope that they won't do it.

Drew: It doesn't work if that's not the problem in the first place. If the problem is that you've got an innate tendency towards certain things, being made aware of that tendency doesn't actually stop you having that tendency.

David: There are realities to the way that we think as individuals, which may not be as critical as we might like when we think about critical thinking and good thinking. What about some beliefs about thinking? The next section goes on to talk about some different beliefs about thinking.

Drew: He starts off this section with the claim that one of the big determinants of how people think is how they believe they ought to think. In other words, if you believe in certain thinking practices as good ideas, then that actually helps you use those practices. I know that some of my students listen to the podcast, and I don't want to accidentally call anyone out.

One of the things that I use a lot in my own education is reflective learning, and I do it really explicitly. I make everyone in all of my classes submit a learning log every week, which is like a reflection on their learnings for the week. There's reasonable evidence of efficacy that forced reflection does in fact help with learning. Not every student likes it. Some people think it's a waste of time.

Actually, he points here to evidence that students who believe that reflection is unnecessary are actually worse at understanding difficult reading passages, and students who think that reflection is a useful practice are actually quite good at understanding those passages. It's less about whether you're made to do it and more about whether you buy into the belief.

For that reason, it's a good idea to actually explain to students why you're teaching them in the way that you're teaching them, and explain to them the theory of thinking and the theory of learning because if they honestly buy into that theory, if they believe that that's a good way of thinking, that does in fact actually improve their thinking. I thought that was interesting. It explained something that I've noticed with my own kids. They're doing a lot more learning about learning in high school.

When my son did the International Baccalaureate, the IB, they had a unit called Theory of Knowledge, which is learning about learning. But all my kids' schools have got some elements of this: they're not just learning study skills, they're actually learning theory and philosophy about learning and thinking. It's obvious that lots of teachers have adopted this idea that if people believe in thinking a particular way, that actually helps them think.

David: I think that's nothing like what I did in high school. I'm sure that they're reading some really interesting theories, Foucault, Descartes, and these types of thinkers and philosophers.

Drew: They're not making high school kids read Foucault.

David: Really? That's disappointing. Okay.

Drew: Corporal punishment got outlawed 20 years ago.

David: Really? I've got most of those.

Drew: You can't cane kids. You can't make them read Foucault.

David: Okay, fair enough. All right, I'm a bit out of touch. That is really interesting. Again, it makes sense. If we believe that something is the way that something should be done, then we're going to find it useful, and we're probably going to get some efficacy out of the process. Drew, are we ready to move on to the next section?

Drew: We're talking about abuses.

David: Yeah, I thought that was an interesting title for a subheading.

Drew: Yeah. He's talked about what people think and know about thinking, and now, how do they misapply it? Some of this will be out of date because education has moved on, but this is his hit list of things that people do in schools and in business education that probably don't actually work, according to his research and his point of view. He starts off with what I think is, perhaps, an unfair accusation against high school teachers, though it would probably apply to a lot of university lecturers.

He says many teachers who try to improve thinking have little understanding of the theory he has just sketched, or of any competing theories. I think that's unfair. High school teacher education is really quite good, as is the amount of continuing self-education that a lot of teachers do. It's more that sometimes a lot of this stuff is forced on teachers. Just like with safety practitioners, we end up doing things we don't fully believe in ourselves because it's been mandated as a practice that we should be doing.

David: Yeah, absolutely. You've got a note here about this also being important in relation to organizational training and learning.

Drew: Yeah. The particular bit I was referring to is where he says that what's missing from the conventional wisdom is a common understanding of what the problem is, why it needs to be corrected, and how the various prescriptions will do the job. What he's particularly referring to there is the emphasis on classroom discussions. He's saying that just having students engaged in lively discussions might feel good, but does it actually promote what you're trying to promote?

What is the actual goal of discussion? Is it to teach kids how to discuss well? In that case, it's probably pretty good, but it doesn't necessarily promote critical thinking. He says something that I think most people can relate to: that discussion in a class of several students is filled with talk by people who don't have much to say that others can learn from. Many good students find discussions boring and time-wasting for this reason, and you could say that about a lot of meetings and a lot of vocational training as well.

David: By this stage in the podcast, you've offended your students and your colleagues in terms of meetings.

Drew: I've been nice to the high school teachers who look after my kids, to be safe. That's important.

David: Very good.

Drew: I think it's important if we think about things like learning teams. One of the questions I love to ask people when they talk about learning teams is, okay, who are you actually intending to learn? What are you intending them to learn? And how is this process supposed to provide that learning? Not because I doubt that learning teams are a good idea, but just because I think often we just aren't really clear on what is the problem we're trying to solve and how is this process the actual solution to that problem.

We'll be better at using things like learning teams if we can understand what we're actually trying to get out of them. Otherwise, they just become discussions. Some people like to speak and be heard, and other people find that their time is wasted just listening to other people talk.

David: If it's to promote critical thinking, which I guess a lot of these things are trying to promote, because learning, new insights, and new knowledge come from challenging existing insights and knowledge. You've got notes here about safety workshops, safety days, safety stand-downs, and a whole bunch of other things that we do for safety. I think that's your prompt there: do any of these processes, in the way they're designed and delivered, promote the critical thinking that is actually going to inspire new learning?

Drew: Yeah, exactly. Having someone who's inspiring to come and talk to you about critical thinking doesn't promote critical thinking. Just like having someone inspiring to talk to you about the importance of safety, it doesn't encourage safety. It's got to actually be something that promotes the need that you're trying to fill.

His conclusion is that educational methods must be justified by arguments about their ultimate consequences for what students do and how they think in the future. Although he acknowledges that sometimes that's really quite hard to measure, it should still be the goal we're trying to meet.

David: Drew, are there other abuses of, I guess, our knowledge, our understanding of good thinking?

Drew: David, you might be able to think of relevant examples better than I can for this one. The other thing he criticizes is the idea of breaking a skill into sub-skills and just teaching those sub-skills. I can see how that works in classrooms: you divide it up, giving people just exercises in matching, or just exercises in finding differences, as if that's going to create the meta-skill, the ability to compare.

The only one I can think of is actually something I'm guilty of myself: teaching people to evaluate a source by breaking it up into 'think about the timeliness of the source', 'think about the relevance of the source', and so on, as if these are individual skills instead of an overall package.

David: The ones that I can think of, and I'll test these with you, are around some of our safety practices: teaching a leader how to do an in-field safety leadership interaction and thinking that's going to make them a good leader of safety generally, or teaching someone how to do a Take 5 and thinking it's going to make them able to make good risk-adjusted decisions. I wondered if those are examples of breaking something down into one transactional activity and hoping that it's got a much bigger impact than it possibly can have.

Drew: I think that's a good example, David. He actually really criticizes something that basically everyone in universities today treats as a little bit of gospel, Bloom's taxonomy of learning. I think he's being a little bit unfair, because Bloom's idea is that you scaffold. You start with teaching the basic skills and then you integrate them up towards the more complex skills. I think that's totally reasonable.

Something like teaching someone how to identify hazards using a Take 5, as a scaffold towards the more complex skill of being able to assess a site and decide how to make it safe, I think is quite reasonable. What he's really criticizing is teaching this particular thing in isolation, as if it's got value in its own right, without doing the extra work of scaffolding and linking it all together.

David: The next section we go into, which is the main section of the paper, is just titled The Growth of Knowledge. Do you want to get it started?

Drew: I was about to say, where do we start with this one? He starts in the Middle Ages, so let's start there. He says that in the Middle Ages, one individual might have been able to learn everything that was worth knowing. You might have had a bit of a hard job knowing what that was, but in principle, you could write down all knowledge in one really, really good book.

The trouble is that the amount of knowledge just continues and continues to accumulate. People were saying in the 1930s that we were beyond the point at which any individual could even master their own field, let alone all of human knowledge. Just as an example of how it's got worse, in 1993 I was getting weekly emails telling me about all of the new webpages that had been created.

David: I'm amazed that you had an email address in 1993.

Drew: I was an early adopter in Australia.

David: I didn't get an email address until ‘97 when the university gave me one.

Drew: Literally, instead of Google, the search engines worked by indexing all of the webpages. Now we're way, way past the point of having engines that crawl the internet to find all the pages. Now we're at the point where we need AIs to tell us how to put that stuff together.

The idea of doing this, he called it scholarship, this idea of the meta-creation of knowledge. It's still helpful, but the accumulated knowledge is so big that the most we can do is teach people how to get information and give them enough basic information to make sense of it. He said even that's not enough, because just telling someone how to use a library and computer databases, or today he'd say telling them how to use the internet, lets people find knowledge, but it doesn't let them get hold of the right knowledge.

For that, the way most people do it is by relying on experts. That's true whether you're someone who trusts a particular news program to tell you what news matters, someone who goes to a particular website because you trust the people there to collate good information, or a conspiracy theorist on YouTube who just watches videos by certain other people. That's how all of us really do it. We have certain people that we've come to trust, and we just hope we're getting it right about who those people are. If we could get better at that, then we'd get better at thinking, and get better at knowledge.

David: Also, just remember, it's 1993. This section is titled The Growth of Knowledge. I don't think Baron could have foreseen the rise of what would eventually become social media and things like that. It's not just the growth in knowledge; it's broader than that in terms of critical thinking. It's the growth in information.

I guess that's been one of the main criticisms of ChatGPT and the AI you introduced earlier in this episode: it won't necessarily find the most truthful story, it may just find the most popular story. Critical thinking becomes, I guess, maybe even more important, not just with the growth of knowledge, but with the growth of information.

Drew: Particularly, it used to be that we had fairly strong filters for what got recorded in certain formats. It's not like every book that was published was reputable, but the amount of work you had to go through to get things published, and the number of gatekeepers, meant that your average citizen couldn't produce their own book; only certain people could. On average, those people were more likely to be real experts. Not perfectly reliable, but a good proxy.

Similarly for publishing papers, when there were only a few journals, you couldn't just send it off to another journal or start your own virtual journal if your paper didn't get accepted. But now, the experts are producing stuff on LinkedIn and anyone else can put their stuff on LinkedIn. You can't use the format or the place to distinguish. You need other tools. It looks the same.

David: The growth of knowledge and information creates, I guess, an increasing necessity, like that opening line says, to teach critical thinking. Then there's the section on expertise, which we briefly touched on, and I mentioned we'd talk about later in the episode. I guess this is the place to talk about it, because the next section is titled The Basis of Expertise. Do you want to pull out some important points here?

Drew: This is going to be the theme that carries on to the end of the paper. He's going to build it up into his answer as to how to teach thinking. Ultimately, it all comes down to knowing how the experts in a particular field think, and using that to evaluate information and to evaluate people who purport to be experts.

He's going to get to the point of telling us how you teach that, but for now he's just setting up the idea. There's lots of literature about what an expert is, and he goes through some of it. Experts have richer representations in their minds or brains. There are certain things that they do more automatically and more quickly. They use different systems of classifying knowledge. When solving problems, they tend to work forwards rather than backwards.

There are certain cognitive processes that experts use, certain personality traits, and evidence for different types of thinking, different amounts of openness and willingness to test your own tentative assumptions, all sorts of things that experts tend to do and that tend to make people into experts. He gives lots of citations. His details, of course, are wildly out of date, because expertise is something that fascinates researchers. Since this was written, there have been, what, 30 years of extra research about expertise? And some of that research is good stuff.

His overall picture of the landscape of expertise is actually still pretty accurate; it's just the details that are dated. He says that knowing all this stuff about experts doesn't tell you how to know who an expert is if you're not an expert yourself, because all the things that characterize an expert in astrophysics would also characterize an expert in astrology. How do you know to trust the expert in astrophysics rather than the expert in astrology?

I don't know how much of this we should go through. He goes back to Karl Popper and his ideas of falsifiability, and I don't think we need to go into that detail. What I think we should get is just the fundamental idea that expertise in a field comes from self-criticism. Good fields advance by criticizing themselves, but different fields advance by different sorts of criticism. To understand expertise in a field, you need to understand how that field does its internal critique.

You could say, in psychology, you need to know, okay, how do psychologists criticize a paper? How do they decide what's bunk and what's not? How do they decide what's a worthwhile theory? How do sociologists critique things? How do they decide what to work with? How do they know what counts as good sociology? He says, if you want to go into each field, you've got to know how people within that field create knowledge, maintain knowledge, and test knowledge.

David: I think, interestingly, in terms of understanding expertise, there's that idea that different fields have different ways of counting or prioritizing evidence. Like you said, mathematics or physics have certain deterministic evidence criteria. Then there's sociology, which sometimes has more consensus-based, descriptive criteria. But then I thought it was interesting when I thought about safety.

He said that in fields like safety, it tends to be more about what's practical rather than what works. I thought about that in relation to things that we do in safety. It's like, yeah, I know it's not working, but it's practical, it's able to be done. I think there's a bit of a challenge there for us in safety to ask, are we focused on the right things when it comes to what we do?

Drew: I was struck by that as well. That was one of the bits that made my brain hurt a little bit in this paper, because I do tend to basically criticize safety for not being more like other fields. I tend to say, in safety, people tend to accept ideas based on popularity and practicality, and I don't recognize that as legitimate. Whereas what he's saying is that each field gets to say for itself what its main tests are.

It's not in this paper, and it's probably a discussion for another time, but good fields are also reflective about their own foundations. We've seen over the last 10 years that psychology has actually had a self-critique about its methods of critique. That's what sparked not just the replication crisis, but the whole contemplation afterwards about how we got to the point of having a replication crisis. What are we counting as legitimate or not legitimate?

David: And p-values and all of the changes that have gone on in psychology. I guess we tried to do that in one of the earlier episodes, with the paper that you lead-authored, the manifesto for safety science, about trying to be critical of the safety science discipline and how we construct and perform our research.

Drew: Yes. What he's saying is that if we want critical thinking from students, and then from the students who become adults and practitioners, then it's got to be almost field-dependent critical thinking. If we want students to be critical thinkers in math, we actually probably do that really well. All through high school, we don't just teach math, we teach people how math works.

People forget most of the math that they learn, but they remember the basic ideas of solving equations, building up theories, and solving problems in mathematics. We don't do that so much for other fields. We do it a lot for English and literature; we teach people a lot about how to critique and analyze, and how literature works as a field. If you're lucky, you probably get some history, but things like economics are usually electives. Only people who choose to do economics get economics, and all the other social sciences get left out.

The other one we do pretty well is science; we teach people how science works. People sometimes get taught about the scientific method as if it's the only way of critiquing knowledge. We do it well in some areas, but perhaps less so in others, particularly when people then go on into a field. One of the common ones is going into engineering, where you learn how engineering works, but you're working right alongside management, economics, and sociology, and you never learn how those fields work. You find yourself in an organization having to interact with expertise when you've never learned how that expertise is formed and how it works.

David: I'm not going to start a discussion about engineering. I think our listeners will know what we're talking about, about different fields and different ways of thinking.

Drew: That wasn't pointed at engineers. That was more about just how we do engineering education.

David: Fair enough, it's the truth for engineers. It's just a point of critical self-reflection.

Drew: We do the same thing to people who come up through management and economics, find themselves in an engineering organization, and have never learned how to critically think about engineering, because they didn't get socialized into that way of thinking. It's hard to know then which engineering experts you should listen to and which you shouldn't.

David: The last section before we jump into the practical takeaways has a title I don't really like: What The Educated Person Should Know. I think that's a hugely judgmental section title, but it basically just goes on to say the things that we've already said. You need to understand the nature of true expertise, you need to understand the geography of expertise across different fields, and I guess you need to understand how to learn about the methods of inference and search in each discipline, and in other disciplines as well, like what we were just talking about. Is there anything else you want to say or add?

Drew: There are two particular things I'd like to quickly pull out from this section. One of them is his prescriptive solution to do active learning with the specific purpose of trying out how a real practitioner in that field works. He talks about, in high school, learning history by trying to do some of the stuff that a historian does: take a bunch of sources, try to assemble them into an argument, and evaluate them.

That's actually something I'm trying myself in my current teaching. I'm trying to design a new course on accident studies in safety, where we basically get the students each week to take a different accident and critically think about how we derive safety principles and safety theory from that accident, as a way of understanding how knowledge in safety science builds up, by doing it yourself: analyzing an accident, drawing conclusions, and trying to turn those conclusions into recommendations.

He also points out this idea of the geography of expertise, which I think is interesting. He says, basically, that people step on each other's turf too much because they don't even know that there are experts in that area. Psychologists do philosophy as if philosophy doesn't exist, or economists do psychology without knowing that psychology exists.

David: Maybe managers, HR people, and engineers, do safety without knowing that safety actually exists as a discipline on its own.

Drew: We could do the other way and say, how many times in safety, and I've been guilty of this myself, do we do marketing as if marketing didn't exist as a serious field of study? We just think, hey, I can design a poster, I can design a video, I can design an ad that will improve safety, without thinking, hey, there are experts in this, people who know what actually works and doesn't work.

David: Yeah, great point.

Drew: I think that was really all I wanted to pull out of that.

David: It's been a bit of a theoretical conversation, this discussion about thinking. What are your practical takeaways for our listeners?

Drew: I don't know about you. I wasn't even sure where to pitch the takeaways here. I think they’re probably almost at the personal level, so things to basically go away and think about for ourselves in our own learning.

The first takeaway, I'd say, is that part of good thinking is being able to recognize and respond appropriately to expertise. I think it's good for all of us to sometimes think about, who do we spend time listening to? Who do we trust as experts? Why do we trust them as experts? And are we going to the right people for the right information, or just going back to the same experts even though they're not actually experts in that particular thing?

David: I think that's a good point. Your second takeaway here is that there's a common failing in all learning of not being precise about what we actually want as an outcome. This leads us to doing things that we think are a good activity, like you mentioned earlier, learning teams or safety stand-downs, without being clear on who is supposed to get better at what.

Drew: Just because something is a popular or good way of learning, it doesn't mean you just dump it into your education program. You should think about what's the actual thing you're trying to do here. The third one is that self criticism is the key to advancement in any field, but it's got to be well informed self criticism. Otherwise, what happens is people just criticize all of the new ideas. It's just a reaction rather than criticism.

I think we do this a lot in safety. We fall into two camps. There are the people who aren't trying to critique the new ideas, they're just trying to reject them. Then there are the people who jump enthusiastically on board, to the point where the new ideas become dogma and don't get critically challenged and improved. Somehow, as a field, we've got to learn to do that reasoned critique together, where we criticize in order to move forward, not criticize to move backwards, or just replace the old with the new and still be dogmatic about it.

David: It sounds like in this episode you've softened up a little bit on your students by only giving them a weekly learning journal. I don't know if you remember, but in the first year of my PhD, you made me do a daily surprise journal, because I had been a safety professional for almost 20 years and I was researching the safety profession. I think your way of trying to encourage me to be more critical in my thinking was to have me record every day what had surprised me that day.

I think your suggestion to me was that those are the situations where the world didn't work the way that you thought the world worked. You actually want to capture those things, because that's a way of thinking more critically about the world and your role. I don't know if you remember that, but is that something that fits into this category of trying to adjust your thinking?

Drew: I do remember it, but as you've also pointed out, we never went back and checked whether that actually worked as a learning tool. I just made you do it and then never came back and said, did this actually help? Maybe in a future episode, we could look up surprise journals, see if we can find a paper on them, and see if there's any evidence that it's actually a good idea.

David: I think for me, it definitely did make me do, like you say, self-criticism and structured self reflection on a daily basis about, why did this person respond in the way that they did? It wasn't what I expected. Why did this happen like this? It wasn't what I expected. It really did, I guess, allow me to see the world not just through a single lens of the way that I always saw the world.

Drew: Yeah, cool. The final takeaway is just the point that what we think about thinking does have an impact on how we think, which is very circular. It does mean stopping and thinking about our own thinking strategies. Why do we think what we think? How are we evaluating evidence?

It can be interesting and self-improving. It can also hurt. I hate that feeling when I can feel my own mind getting slippery in response to a cognitive bias that I know is at work. It's still, I think, a useful tool, particularly when we find ourselves being dogmatic, being a bit reflective about why we feel so strongly about things can be helpful.

David: Excellent. The question that we asked this week was, can we teach critical thinking?

Drew: That was a big promise for the paper to make, and it never answered it, but it did give us lots to think about.

David: Excellent. That's it for this week. We hope you found this episode thought provoking and ultimately useful in shaping the safety of work in your own organization. Join us for a discussion on LinkedIn or send any comments, questions or ideas for future episodes to feedback@safetyofwork.com.