The other day we called up our friend Ron Gantt, the Safety Differently thinker, to get his take on some standard ideas in traditional safety.
In particular, we asked Ron for his take on (1) safety and compliance; (2) safety indicators–lagging and leading; (3) safety measurement; (4) Heinrich’s safety pyramid; (5) SOPs, work as planned, and work as performed; (6) job hazard analyses, or JHAs; (7) the hierarchy of controls; (8) incident investigations & root causes; (9) behavior-based safety; (10) risk & risk management; and (11) safety culture.
Ron was kind enough to answer all our questions in helpful and thought-provoking ways (which is typical for Ron).
You can watch a recording of our discussion below (watch for the appearances of Ron’s dogs!) or read the transcript. We’ve also included links to some earlier discussions about Safety Differently with Ron.
We’d like to thank Ron once again and we are sure you’ll find this discussion interesting.
Check out these other discussions with Ron, too:
Let’s get right into the transcript of our discussion with Ron Gantt about some “classic” or common ideas in safety and get his critical perspective on them.
Convergence Training: Hey there, everybody and welcome. This is Jeff Dalto of Convergence Training and Vector Solutions. And today we’ve got an exciting guest, I think a lot of you might be familiar with him. This is Ron Gantt. He’s the Director of Innovations and Operations with Reflect Consulting Group. Ron does a lot of work in safety; you might think of him as a guy who talks a lot about safety differently.
He’s one of the more interesting and provocative people, I would say, in safety, and shares a ton of his knowledge and time and insights. So I’m really grateful for Ron, and I think he’s actually my intro to things that might be called new safety. So with that, I’d like to say Hi, and welcome to Ron.
Ron Gantt: Thanks, Jeff. I’m glad to be here. You’re a superhero for doing this.
Convergence Training: No, no, no.
Ron Gantt: Not for talking to me, but…maybe a superhero for talking to me. I don’t know. We’ll see.
Convergence Training: Well, thank you very much, and we’re excited to have you.
Ron has agreed to come on today to give some quick, off-the-cuff comments on some kind of classics of safety, safety thought, and safety techniques and methods. And I think this will be interesting for everybody. So to jump right into it, we’ve got a list of 12 things here. Why don’t you go ahead and start by talking about item number one on our list, which is safety compliance.
Ron Gantt: All right, yeah. So compliance matters, right? I mean, it’s not like regulators are going to go away anytime soon. The question becomes, how do we manage compliance? And what role should it have in the broader safety management system? And those, I think, are questions that we need to answer because oftentimes, we just go right to compliance and many organizations stop there.
So I think we need to have a broader conversation in safety about seeing compliance as a potential tool,
amongst other tools that we can use, right? When are the times that we can and should have compliance be non-negotiable? And when are the times when we shouldn’t? And how would I know the difference between those two things? Because if we just treat everything as non-negotiable, well, everybody knows that’s obviously flawed, right? There’s no such thing as an environment where I can always do or not do something…it’s not black and white, there’s no such thing as that.
So, for me, it’s, it’s about situating that idea of compliance in our broader discussion about what does it take for this organization to be safe? What are the things that we’re trying to fix, if that makes sense?
Convergence Training: Yeah, yeah, it does. I think sometimes it’s easy in these kinds of discussions to completely dismiss the importance, or the necessity, of compliance, so I like that.
So I guess, following up on that, maybe you could talk to us a little bit about reporting on lagging indicators and use of incident rates and so on.
Ron Gantt: Yeah. So incident rates and things like that, again, are another thing that are not going to go away anytime soon. I would say that because they’re often used as a measure of safety performance, and as a measure of safety performance, they are notoriously poor. And even if we buy into the idea that whether or not you have an accident could indicate whether or not an organization was safe, we’ve run into a situation in safety where we are safe enough.
So from a statistical perspective, we’re not having enough accidents to actually know whether or not we get a reduction as a result of the things we’ve done, right? So the problem is our rates are low enough now that they’re not telling us anything new. We’re not learning anything. Especially in those organizations that go long periods without having any accidents. Does that mean they’re safe? I don’t know. And that’s why, if we measure safety by its absence, we run into these sorts of problems that can actually lead to logical conclusions that are flawed. If we have a long period without accidents, does that mean we’re safe? Everybody agrees that that’s just not true. But of course, that begs the question: why are we then measuring the number of accidents we have as our sole measure of safety performance? As any legitimate measure, really?
Convergence Training: Alright, great. So that leads into our next question, which is about safety measurement. I wonder if you can address two parts of this. First of all, why do we feel compelled to measure safety? And should we? And then secondly, if you don’t think that measuring safety by incident rates is a good answer or the whole answer, what would you suggest?
Ron Gantt: Yeah, these are good questions.
So first off, we measure something to identify how close or far away we are from some sort of goal, right? So I think we would all agree that for safety, whatever definition of safety we use, and that’s going to be important in a minute. But safety, whatever definition we use, implies some sort of goal. Some people would have it be the goal unto itself. But regardless, however you define it, there’s some sort of goal there. How do we know how close or far away we are? And so the only way we do that is to measure where we are or to measure some version of the gap.
Now, a lot of people will criticize the idea of measurement altogether, because we’re in a complex environment and safety is an abstract concept. How do you measure that gap? And that’s a very good point. However, I think the idea that we can’t measure it at all is just wrong. Humans are always sort of measuring and trying to understand in some ways. If measurement were only numbers, then I’d agree, but measurement is not only numbers; we use all sorts of our senses to measure, right? You know, even just having an idea of how close I should be to this computer screen, there’s a form of subconscious measurement that’s happening. There’s no number associated with it. It’s just a qualitative measure–am I close enough or not?
So I think measurement has a place. But again, it’s kind of like with compliance, we need to understand how is that situated in our broader conversation about safety? And so this leads to that point about the definition of safety, right?
What if safety is not the absence of accidents? You know, if that’s not good enough to define safety, then that begs the question, what is safety? For myself, I would argue that safety is a capacity to be successful in varying conditions. So, safety then is not a thing unto itself, it’s a set of potentials to allow us to get our job done as conditions change.
Read more about Ron’s definition of safety in our interview with him titled What Is Safety Differently?
And if that becomes our definition of safety, that actually leads to certain things that we can start to measure that would give us an indication of how likely we are to be successful in creating that capacity. So from a learning perspective, how would I measure a person’s capacity to be able to do certain jobs? So one of those things is knowledge and skills. Right? Those are things we can measure. The hard part is that we sometimes have to use proxy measures. And so the idea of measurement is complex. But if we start with the idea that safety is not just this absence of accidents, but it’s a thing that we’re trying to achieve, or that gets us to something we try to achieve, that actually breaks down the measurement discussion into something a bit more manageable. If it’s a capacity to be successful, I can start to measure that capacity or at least movement towards that capacity.
I know this is a bit abstract, but I think we need to deal with it at the abstract level. And then, with each organization have the conversation, “Okay, what would it take for me to be successful? You know, and how would I know that conditions are changing? And what would be my response pattern to those conditions as they change?” If that makes sense.
Read Ron writing more in depth on capacity at the Safety-Differently-dot-com website.
Convergence Training: It does. But I wonder if I can try to nail you down a little. So could you give us, one, some examples of things one might measure to gauge capacity or ability or something like that, and then, two, some examples of a non-data measurement metric, if possible?
Ron Gantt: Yeah. Yeah, absolutely.
So, in terms of measuring capacity, let’s say we were a company that makes pizza, right? We are a pizza restaurant. You know, what are the capacities that need to be in place for us to continue to make pizza into the future? Well, one of the things is we’re not making pizza for ourselves, we’re making it for customers. So what abilities do we need to have in place that would enable us to understand what the needs and desires of our customers are? Do we have those abilities in place? And you can think of knowledge, skills, space, things like that, that would allow us to do that.
If you take that and put it into what we would call a traditional safety space. You know, let’s say, okay, now we’re thinking in terms of risk, which, you know, I start to get a little bit uncomfortable with, because then I’m starting to wonder what we’re talking about with risk, but let’s make it easy and concrete for people. So let’s say we’re going into a confined space, for example, we’re going into a tank that had stored hazardous chemicals, you know, acid or whatever, and we need to go in there and fix a leak because the wall was becoming too thin.
Okay, well, so how would we ensure that we are able to complete that job? Even as conditions change? What capacities need to be there? Well, right off the bat, there are some capacities that are obvious, that are within the normal, traditional safety space, right? You know, we obviously want to measure the air, we want to make sure people have the right PPE. But then we also start to think about, well, if the job is not just to make sure these people don’t get hurt, the job is to fix the leak, then we need to make sure the PPE that they’re wearing and the other things that we’re doing don’t interfere with the overall job. So how can we manage those trade offs?
And some people would say, well, that’s not my job as a safety person to manage those trade offs, that’s their job. The problem I have with that mindset, and the reason why I think it’s really important for us to make that part of our job is that someone has to manage those trade offs. And as of right now, I don’t know anybody in the organization that is uniquely qualified to do it, right? So when the work gets done, the workers have to find a way to make whatever safety requirements we gave them fit with their job, right? So if we can start to identify what those trade offs are, we can start to measure our ability to manage those trade offs, which then puts focus on the trade offs and helps us manage them better.
Which then kind of gets us to the second answer to your question, which is a non-data source, right? So once you start to get stories about how people are managing those trade offs, you can start to pull data out of them specifically, and by data, I mean numbers and things of that nature. But the story unto itself is a data point too, right? It is a source of information about what’s going on in the organization.
And one of the things that I think is really important for us to understand is that stories are compelling in ways that numbers are not. There’s some interesting research on how people make decisions about risk and things of that nature: when we present information to people about a risk primarily in terms of statistics, on average, people will tend to take more risk. Whereas if we present information to them that is more qualitative in nature, like a story, people on average tend to take less risk. You know, it humanizes things. An easy example is when, you know, Sarah McLachlan wants you to give money to save poor little puppies. She doesn’t tell you that there are 30,000 puppies that need adoption. She shows you this one little puppy, right? Don’t you want to love this puppy? Why would you hurt this puppy, Jeff?
Convergence Training: I apologize for wanting to hurt that puppy.
Ron Gantt: But you know, those stories can give us indications about the functioning of our safety organization and how it is managing those capacities. So yeah, we still want some data. But presenting stories as an equal source of measurement information, I think, is something that we haven’t done in safety, but it is an untapped resource.
Convergence Training: Great. I’m glad you talked about stories, and we’ll leave it there. But maybe sometime next year, we can come back and talk about that, because I think that’s really important, that ability to not just capture and acquire information, but share it throughout the organization, which is maybe not something organizations excel at, whether it’s safety or not.
Ron Gantt: Yeah, I’ll just add to that real quick. You know, the reason why it’s important is, organizations may not accept it, but humans always excel at it. And so there are already stories going through the organization and being used as an unofficial measurement. It’s just not always getting to where it needs to get, right?
Convergence Training: That’s a good point as well. Are you familiar with that book Made to Stick by the Heath brothers and how they talk about, amongst other things, the power of storytelling as opposed to data?
Ron Gantt: Now I’m going to have to look that up.
Convergence Training: You bet.
Okay, so all of this kind of leads nicely into our next point, which is something called Heinrich’s Safety Pyramid. And before you tell us your thoughts, Ron, for those who aren’t familiar with Heinrich’s Safety Pyramid or don’t know about it by that name, can you just kind of explain what it is?
Ron Gantt: Yeah, so Heinrich was an insurance guy in the early 20th century. And he did a number of studies on accidents and causation and things of that nature.
Say what you will about Heinrich, but we have to give him kudos. He was one of the first people to really apply a sort of scientific or quasi-scientific approach to safety, industrial safety in particular. And that really kind of kicked off a whole sort of evolution. So it’s easy to sort of bag on him now. But he was a pioneer in a lot of ways. So we’ve got to give him kudos.
The Heinrich pyramid in particular was one of his very influential models. And the idea was that for every major event…and I’ll use the numbers he used, but other people have expanded on it…there are 29 minor events, minor injuries, and 300 near misses, or unsafe acts, which was the term he used. So sometimes it’s called the 300-29-1 theory, but most people know it as the pyramid or the triangle.
And a lot of people would point out that at the time, when he initially wrote up his theory, he was not necessarily saying that that ratio really matters, that we could count how many unsafe acts we have and then predict whether we’re going to have an injury tomorrow or not. But subsequent to that, a lot of people have sort of taken that idea and used it as a predictive ratio. And then others, you know, like Bird for example, another insurance guy, have done subsequent studies and come up with different numbers but a similar idea: that okay, we can look at the bottom of the pyramid and get rid of these unsafe acts, and that will stop us from having that one bad injury.
So, how it’s been used may be a little bit different than Heinrich intended, but I don’t know, I don’t want to get into the weeds on that and try to read his mind. The bottom line is that a lot of people have this belief system that if we get rid of all of the unsafe acts and we deal with the minor stuff that will deal with the major stuff.
And as to the validity of that, it really depends on what we’re talking about. So the general idea that there’s always going to be more minor stuff than major stuff, that’s just true. I mean, there are always going to be a lot more people stubbing their toes than people getting killed at your organization. I mean, that’s just obvious, right?
But the idea that the people stubbing their toes can predict whether or not we will have people getting killed, that’s more problematic. The thing I would have people think about is: is it predictive? If you know the number of ergonomic injuries that we have in our organization, is that going to be predictive of our propensity to blow up our factory?
No, I cannot see a causal connection there. Now, there are going to be some instances where, with a particular type of event, or accident, or risk, there might be some kinds of events that, if they happen, would tell us, “oh my god, that makes us much more likely to have that bad thing.”
So like, let’s say lockout tagout events: if we have a number of near misses where someone was working on a piece of equipment that turned out not to be properly isolated…well, certainly, we could argue that if that keeps happening, eventually something bad is going to happen on a long-enough timeline, right? Sort of makes sense. But to say that lockout tagout events are related to slips, trips, and falls? I don’t see how those two things are connected.
So the Heinrich pyramid itself, I don’t see much of value in it. You know, a lot of people will say it’s completely invalid. I’m not sure I would go that far. I mean, certainly he found numbers, and whether those numbers relate to one another, that’s a separate question, they may be correlated, but correlation doesn’t equal causation.
To me, the most important thing for organizations is that we are looking at each individual sort of whether it’s risk or whether it’s work or a task and identifying how can things go wrong? How can things go right? And how can we manage that so we get more of the good stuff and less of the bad stuff.
Convergence Training: All right, great. Thanks for talking about that. And I know you didn’t want to get into the weeds on the question of what was Heinrich’s original intention, which is fine. For those who are interested in that, I think Carsten Busch is an interesting guy to get into weeds with on that. So we’ll refer you to Carsten Busch for that.
Ron Gantt: I had him in mind when I was saying that.
Convergence Training: Yeah. Let’s just make the obligatory reference to Frank Zappa for Carsten and move on.
Here’s our recorded discussion with Carsten Busch discussing 10 Common Safety Myths.
Okay, maybe kind of summarizing this part of the discussion. Can you talk to us about safety bureaucracy?
Ron Gantt: Yeah, okay, so, safety bureaucracy is sort of…at least the way I perceive it, the term is used in a negative sense almost all the time, meaning unnecessary safety paperwork, or unnecessary safety procedures, or things of that nature. And even though it’s sometimes talked about in an unfair way, the phenomenon is real. You know, one of the easiest ways to describe how safety bureaucracy comes about is to ask any organization when the last time they removed a safety rule was. I mean, it never happens. You know, we just add more rules on top of rules. Best-case scenario is we replace one with another, and often the one we replaced it with is more complicated and has more stuff we’ve got to do. And so the problem with all these safety rules…
So one of the problems with bureaucracy, and where it often comes about, is we do not have a clear picture in safety about why we’re doing what we’re doing, all right? So there’s no underlying theory. Yes, we want to have no accidents. And I kind of generally think that this thing is going to stop accidents. But there’s no connection that says, this gets me to not having accidents, because by doing this, it will lead to this, this, and this. And I think it’s important to identify this, this, and this, the A, B, and C, that leads to the ultimate end goal, because those things are often testable. Right? They are theories that we have.
So posters on a wall are a classic sort of thing people eye-roll about in safety. You know, these motivational posters…what’s the underlying theory that would lead us to believe that’s going to stop an accident from happening? Okay, well, someone’s going to walk by, and they’re going to see it, and they’re going to think to themselves, “Man, I should be safer today.” And that’s going to lead to subsequent behavior change out in the field. That’s all testable, right? Do people actually walk by it, and when they do walk by it, do they notice it? We can measure that and see: does it actually do what we think it does?
But so the problem is, when we don’t do that, we often just keep things in place because they’re done in the name of safety. Right? You know, a safety rule is put in place, a procedure is put in place. And the only way we validate whether it actually works is “Do people follow it or not?” And then when they don’t follow it, we blame them. We don’t go back and think, wait a minute, the whole purpose of a procedure is to change behavior. And if people aren’t using it, then by definition, the procedure is not working. So that safety bureaucracy just gets piled on, you know, one thing after another, because we just believe that if it’s in the name of safety, it’s sacrosanct. It must be good, but that’s not always true.
Convergence Training: So if that’s one element of safety bureaucracy, and I think you spoke well to that, I guess maybe a second element is just simply reporting up. Do you have any thoughts on that?
Ron Gantt: Yeah, so a lot of times in bureaucracy, the underlying belief system that is kind of built in, in addition to what I just spoke about, is that we can’t trust people to do safety on their own, so we have to validate that they’re actually doing it. So we do that through audits or inspections or things like that, or we have to report up to people to prove that we’ve done something, because we can’t trust people to do it on their own. You know, the classic thing I hear a lot in organizations is, we’re going to have a kind of pre-job conversation, which is a good thing to do. Well, then we’ve got to prove that they did it. So we’ve got to have some piece of paper for everybody to sign, because God knows, if they’re willing to lie about doing it, they would surely never lie while signing their name, which makes zero sense, right?
And so you get this kind of idea that we have to validate everything they do, and the only way we can validate it is to write it down, like, you know, the whole kind of thing that if it wasn’t written down it didn’t happen. I mean, that’s nonsense. That’s going back to what I said earlier about proving our theories correct. That’s an easily disprovable theory, and I can do it for anyone, anytime. You just tell me what you want me to write down, and I’ll write it down whether it happened or not, right? So like, I can write down right now that we’re on Mars, you know, write it down on a piece of paper. Does that mean we’re on Mars? No. Right? So writing stuff down doesn’t prove anything happened. It just makes us feel better at the end of the day. And that’s another mechanism by which bureaucracy gets built.
Convergence Training: So there’s an element of nonsense to it. There’s obviously an element of inefficiency, in wasting time and resources. But I think, as you kind of pointed out, it’s corrosive to trust and team building as well.
Ron Gantt: Absolutely. Yeah. Because, I mean, the underlying logic is I can’t trust you. Right?
Convergence Training: Right.
Ron Gantt: It’s amazing to me how many people in safety and in management believe the idea that if we removed all safety rules and all of our processes, there would be anarchy. Like, where do you think these rules came from, you know? Is there only an enlightened few in the organization that want rules? And the rest of us just want to run around with scissors all the time?
It doesn’t make any sense. Where does culture come from if we need some sort of, you know, bureaucratic system to create safety? It’s nonsensical.
Convergence Training: Yeah, this isn’t a safety example, and I won’t name the company, but I know somebody who worked at a company. He created an additional workload for himself that essentially doubled the amount of stuff he was doing. And nobody even knew he was doing it. Then he told his boss about it, and that it was having good results, and he immediately became responsible for tracking it and reporting up on it.
Ron Gantt: Oh, God, love it.
Convergence Training: Okay, so we touched on this a little bit already, but maybe you could talk to us about SOPs and the idea of work as planned as opposed to work as performed?
Ron Gantt: Yeah. So first off, to the latter point about the difference between how we think work happens and how it actually happens: they’re always different. And that is really important for us to understand. Because that difference is not just always there; the reason why it’s there is not because we’re not trying hard enough. The reason why it’s there is because the world is fundamentally complex, and it is impossible to come up with a plan that will always be perfect.
You know, there’s the famous military line: no plan survives first contact with the enemy. As soon as you have a system that is complex, and pretty much any system where people are involved is complex, it’s going to have variation in it, and most of the time that variation is small. But sometimes it’s huge, right? And unfortunately, we don’t always get to know in advance which times are going to be small and which times are going to be huge. In retrospect, it’ll look easy: okay, how could you not know? And that’s one of the reasons why we often think that the difference between work as imagined and work as done is a product of people not choosing to try hard enough. But in reality, it is impossible for us to know, looking into the future, how things are going to go next. Right, we always base what we do on what has happened in the past. But we know that that’s not always what’s going to happen in the future, right? So that gap exists because the world is too complex for us to come up with a perfect plan.
Now, what does that mean for procedures? Does that mean procedures are useless? No, I don’t think so. I think procedures have value and kind of the best way I can think of to explain it is another quote by Dwight Eisenhower that says “In preparing for battle, I’ve found that plans are useless, but planning is indispensable.” And the way I kind of perceive his point is that, yeah, the ultimate plan you come out with may not have a lot of value when it comes time to actually doing the battle in his