Switch Statement

034: The Design of Everyday Things Ch 5: The Anatomy of Mistakes

May 12, 2023 · Jon Bedard · Season 3, Episode 9
Transcript
Matt:

Hello everyone, and welcome to the Switch Statement podcast. It's a podcast of investigations into miscellaneous tech topics.

Jon:

This is episode nine in our series on the Design of Everyday Things by Don Norman.

Matt:

Hey Jon. How are you doing?

Jon:

Hey, hey. I'm doing good. How are you?

Matt:

I'm all right. I'm all right.

Jon:

Nice. Wanna talk about mistakes?

Matt:

I want to talk about mistakes. Have you ever made a mistake? Because I haven't.

Jon:

Man, I make mistakes. It's like my life is just mistakes, the entire thing. It's one big mistake. By my parents.

Matt:

See, this way you're exonerated from any responsibility. Absolved.

Jon:

That was the original sin. So everything I'm doing is just, you know, just icing on the cake.

Matt:

Yeah. So we talked about errors last time: mistakes and slips.

Jon:

Yep.

Matt:

Now I think we're just gonna dive in even deeper and continue to go through it.

Jon:

I really like the section where he classified the types of slips.

Matt:

Okay. What does, what does he say about slips?

Jon:

Yeah. So I mean, firstly, a slip is the thing where you're trying to do something and you just straight up do the wrong thing.

Matt:

Yeah.

Jon:

You make the coffee, you get your milk out of the fridge, pour it into your coffee cup, and you accidentally put your coffee cup back in the fridge. That would be a slip. There are different types of slips. There's the capture slip, where instead of doing one thing, you do another, more recent thing, and I think the coffee cup example is kind of an example of a capture slip, because you're dealing with two objects: you're getting something out of the fridge and then you're just putting the wrong object back in the fridge. I don't know. Do you think that's an example of a capture slip, or am I reaching here?

Matt:

Um, I almost think that's more of a... something-similarity slip? Like the object, uh,

Jon:

Description-similarity slip.

Matt:

Description-similarity, right. Like in your visual field, given that you're distracted, those are similar enough. It's like, all right, this is roughly a cylinder-shaped thing on my counter, let me interact with it. And if you think back to affordances and things like that, the affordances are basically exactly the same. But here's one thing that actually just happened to me. I'm recording this on Descript, a piece of software that I work on. So when I go to log in to the app, you use Google auth, and I'm presented with my work account and my personal account. There was no possible way I was going to click my personal account, because probably ten times a day I log into my work account, and the starting parts of that flow are exactly the same, you just need to move your mouse to a different part of the screen. So unless I'm paying very close attention, I click the wrong one, and I mention it specifically because I'm embarrassed that I just clicked on the wrong account to log in. So does that seem like it aligns with a capture slip, or what kind of slip is that?

Jon:

A capture slip, yeah. I think that's a good example of a capture slip. So then there's this description-similarity slip that we just touched on, which is basically where something can be described similarly: instead of throwing your dirty clothes in the laundry, you accidentally throw your dirty clothes in the toilet, because it's a similar type of action. You're taking a bundle of clothes and throwing it into a hole, and you just throw it into the wrong hole. So again, I think the milk example is also a description-similarity slip. Then there's something called a memory-lapse slip, which I think is just where you forget something, hence the term memory lapse. And the example he gives is, let's say you're making copies of a document and you just leave the original in the copy machine.

Matt:

Right. I guess the way to think about this is that the plan almost certainly included a step to take the original off of the... well, actually, this is an interesting point, because it depends on the particulars. You could have never been thinking about the fact that you needed to take the original off of the glass, or you could have just forgotten to do that even though you were aware it was something you needed to do. So it feels like kind of a gray area, because it's all subconscious, right? When you're making a copy, you don't have the steps laid out. Maybe this is why you need a checklist.

Jon:

Yeah, man. We should have a checklist for making copies. That's the stage where I'm headed. That's the quality of my brain these days: I need a checklist for making copies of a document.

Matt:

But man, memory-lapse slips. This is my bread and butter. This is my slip, all the time. Uh, were you gonna say something?

Jon:

Yeah, I was just gonna mention, he gave another really interesting example, because I actually remember this from when I first got my ATM card: the way ATM machines used to work, you could forget your card in the machine. You'd put your card in, type your PIN, get your money, and just leave your card. A horrible situation. And I actually had left my card in an ATM before. These days, the way it works is, before it dispenses your cash,

Matt:

Yeah,

Jon:

it gives you back your card. And going back to what we were discussing in the last episode about guardrails, that is a very good, very well designed guardrail, preventing a very easy-to-make mistake, so I am supportive of that particular one.
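
What Jon is describing is what Norman calls a forcing function: the redesign makes the dangerous ordering impossible rather than relying on memory. Here's a rough sketch of that idea, using a made-up state machine rather than any real ATM's logic:

```python
from enum import Enum, auto


class AtmState(Enum):
    CARD_INSERTED = auto()
    CARD_RETURNED = auto()
    CASH_DISPENSED = auto()


class Atm:
    """Toy ATM flow that enforces 'take your card back before cash comes out'."""

    def __init__(self) -> None:
        self.state = AtmState.CARD_INSERTED

    def return_card(self) -> None:
        # The card comes back first, so the user can't walk away without it.
        self.state = AtmState.CARD_RETURNED

    def dispense_cash(self, amount: int) -> None:
        if self.state is not AtmState.CARD_RETURNED:
            # Forcing function: the error-prone ordering simply isn't allowed.
            raise RuntimeError("Card must be returned before cash is dispensed")
        print(f"Dispensing ${amount}")
        self.state = AtmState.CASH_DISPENSED
```

The old design let you complete your goal (getting cash) and leave the cleanup step (taking the card) for later; the redesign makes the cleanup step a prerequisite, so the memory-lapse slip can't happen.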

Matt:

We need some sort of brain finalizer or something, where before you do an action... Because I'm a big believer in doing the end thing first. This is kind of a dumb example, but whenever I type a brace or a parenthesis, I always like to type the matching closing one at the same time. It ties back to what I was just saying: there's gonna be a bunch of junk in between, and I don't wanna have to remember and figure out which syntactic layer this matched paren or curly brace is gonna have to correspond with.

Jon:

It's a pro move to just type both brackets and then cursor left into the contents. Although actually, these days I feel like a lot of IDEs will type the right side of something for you, and that...

Matt:

which I actually hate. Yeah.

Jon:

because, I don't know, my fingers have already buffered to type the thing, so I end up typing it

Matt:

With two. Yeah,

Jon:

and it's just a pain in the butt. So

Matt:

I do think some of them have detected that now, which is even more absurd, where it's like: oh wait, okay, we're gonna add a new one, but then if the user types the same character, we're gonna collapse them or something. It's just like, all right, just chill out. I'm fine, I'll type the other character. The characters are right next to each other; it's a really quick operation to type the other one, because they're matched.
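
For what it's worth, the editor behavior they're going back and forth about is usually a combination of auto-closing and "type-over." A minimal sketch of that logic, using a hypothetical buffer-and-cursor model rather than any real IDE's API:

```python
PAIRS = {"(": ")", "[": "]", "{": "}"}


def on_key(buffer: list[str], cursor: int, key: str) -> int:
    """Insert `key` at `cursor`, auto-closing pairs and typing over a closing
    character that was already auto-inserted. Returns the new cursor position."""
    if key in PAIRS:
        # Auto-close: insert both halves, leave the cursor between them.
        buffer[cursor:cursor] = [key, PAIRS[key]]
        return cursor + 1
    if key in PAIRS.values() and cursor < len(buffer) and buffer[cursor] == key:
        # Type-over: the closing character is already there, so step past it
        # instead of inserting a duplicate (the "collapse" Matt is describing).
        return cursor + 1
    buffer.insert(cursor, key)
    return cursor + 1


buf: list[str] = []
pos = on_key(buf, 0, "(")    # buf == ["(", ")"], cursor sits between them
pos = on_key(buf, pos, ")")  # type-over: buf is still ["(", ")"], no duplicate
```

Whether that counts as helpful or as the editor second-guessing your fingers is exactly the argument they're having.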

Jon:

how much engineering time is lost from people just trying to save keystrokes?

Matt:

yeah, yeah.

Jon:

Let's all just get better at typing and remove all of these bells and whistles. There'll be a lot of boilerplate and we'll have to type it, but that's okay.

Matt:

Well, I mean, AI is just gonna start writing everything, so, um

Jon:

We're gonna have a job for another two or three years, so, you know. But there's one more type of slip that I want to talk about, which is the mode-error slip. And this one I actually love. Basically what this one means is there are different states of the universe, let's say, where the same controls have a different effect.

Matt:

Yeah.

Jon:

So I think this goes all the way back to mapping, where you have controls that map to some output or some action. And imagine a state change where that mapping changes: those same controls do different things. He gives an example of this which I thought was tragic, this Airbus plane crash. There are a bunch of important indicators on a plane, obviously, and my interpretation of this tragic event is that the angle of the plane, basically the pitch I guess, and the velocity would be shown on the same display under different circumstances, with two digits. They would be shown with the same number of digits, which just sounds insane to me.

Matt:

Yeah.

Jon:

I'm sure it's one of those things that sounds completely insane in hindsight, and at the time they were developing this user interface it might've made sense, or maybe there's some reason why pilots would understand it that way. But basically the pilots thought that they were decreasing their velocity, and really they were just pointing the plane straight at the ground. I found it to be tragic, and just a crazy example of this type of slip.

Matt:

And so your point is the display didn't have the capability to indicate units or anything like that. It was just, like, 3.3, and in some circumstances that was degrees, and in other circumstances it was feet per second or something like that.

Jon:

Yeah. Like if you're cruising at 30,000 feet or something, it would just say 30.

Matt:

yeah. Yeah. Yeah.

Jon:

It might have had, I don't know, I don't remember if he mentioned whether it had a unit on it or something, but I remember him calling out that it was two digits. So you look down at the thing, you see a two-digit number, you look down again, you see a two-digit number, but it actually means two different things. It just feels like a recipe for disaster.
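
The design lesson Matt is drawing out translates directly to software: never render a mode-dependent number without its unit. A toy sketch of the fix (the mode names and values here are just illustrations, not the actual cockpit interface):

```python
from enum import Enum


class DescentMode(Enum):
    FLIGHT_PATH_ANGLE = "deg"     # descent angle
    VERTICAL_SPEED = "ft/min"     # descent rate


def render_setting(value: float, mode: DescentMode) -> str:
    """Always attach the unit, so the same digits can't mean two different things."""
    return f"{value:g} {mode.value}"


print(render_setting(3.3, DescentMode.FLIGHT_PATH_ANGLE))  # "3.3 deg"
print(render_setting(3300, DescentMode.VERTICAL_SPEED))    # "3300 ft/min"
```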

Matt:

Yeah.

Jon:

So that is the classification of slips. And he gave a separate classification of mistakes, but I didn't write them down in my notes, so I'm wondering if you did.

Matt:

Yes. Yeah, we can go through those. So first off, we have rule-based mistakes. The situation under which a rule applies is mistaken: you think a rule should be invoked, but...

Jon:

Hmm.

Matt:

you're actually wrong in that regard. And then the other case is that the rule itself is wrong: someone has laid down a rule, like if this, then that, but actually that's not necessarily always the right thing. And then...

Jon:

Did he give examples of these, by the way? Because the way my brain functions, I just need concrete examples of things.

Matt:

Um, yeah. So he goes into an example where a band accidentally set the ceiling acoustic tiles on fire by using flares in their pyrotechnics display. And that would've been bad enough by itself, but then there was a rule-based error layered on top of it: the bouncers, the guards at the club, were actually stopping people from leaving the venue, because there was a rule that said, oh, people try to leave without paying for their drinks, so don't let them do that. But the rule didn't accommodate emergency scenarios. I'm sure they figured it out eventually, like, we should let people leave, but it's the kind of thing where you have this underspecified rule, they start following it, and it causes excess deaths. So that's an example of a rule-based mistake.

Jon:

So depressing. I feel like there are other examples of that that I've read and heard about, dealing with crowds, you know, because a lot can go wrong with...

Matt:

a crush.

Jon:

Yeah, exactly, a crush. I remember reading about an example where there was a magic show and there was a ton of kids in the audience. This is a super tragic example. There was a door that was propped open, but it was open in such a way that you couldn't open it any further, like it...

Matt:

Oh God.

Jon:

was up against something. Basically what happened is all these kids were either trying to get into the venue or out of the venue, I can't remember, and they couldn't get through the door fast enough because of this propping. And so there was a crush, and a lot of kids were killed. It's just a very depressing event.

Matt:

Oh my God. That is just... And again, this probably goes back to guardrails, where maybe that door was propped open because, you know, it probably wasn't supposed to be propped open, but there was something they needed it open for and there wasn't a good, safe way for them to do that.

Jon:

It's similar to the bouncer thing, where they're like, oh, let's prop this door so that people can't leave too quickly or whatever, or can't enter too quickly maybe, and it just backfired, because people make decisions and they're not accounting for worst-case scenarios. And then the worst case happens, and everything goes wrong.

Matt:

So the other category he describes is knowledge-based mistakes. This is another hard one to pin down, because it's kind of saying there's no formal rule or skill specifying what to do here. Literally it's just your assessment of the situation and your decision in that circumstance, and you only understand in retrospect that it was a bad decision. He doesn't give any examples of this, but he does say that from a design perspective, the best thing you can do is try to provide as much of the relevant information about the situation as possible to the user, so it reduces the likelihood that they're missing something important.

Jon:

Right.

Matt:

And then, memory-lapse mistakes. This was also a slip category, but here there's an interruption in your formulation of the plan, and you actually forget an earlier part. You go to start doing something and then you forget the full context.

Jon:

Ah, so this would be a case where you're writing that checklist for yourself, but you forget a step in the checklist. Which is why there should be a master checklist that you just copy.

Matt:

Yes, exactly. So yeah, I think those are the high-level categorizations of mistakes, as compared to slips.

Jon:

Got it. He went on to talk a lot about why these things happen, and there were a bunch of sections late in the chapter that I thought were super interesting. He talks about social and institutional pressure,

Matt:

Yeah.

Jon:

which I do think exists in our industry, the software engineering industry. What I have found, and I'd be interested in your take on this, is that intelligence is sort of the currency in our industry. So an important thing to be doing is exhibiting knowledge and showing that you have intelligence, almost like a peacock with their feathers, and also preventing people from realizing what you don't know, sort of protecting yourself, basically not being vulnerable.

Matt:

Yeah.

Jon:

And I'm sort of describing this in a very extreme way; obviously it's a spectrum. But it's why I like cultures where vulnerability is normalized, because I do think it's very dangerous to have a culture where people are deathly afraid of saying the wrong thing or making mistakes, because they just won't ask questions and basic misconceptions can take hold. And I think those cultures also tend to very much favor the senior or more vocal people in the room, because maybe the junior people are afraid of saying the wrong thing and getting trampled. He talks about a KLM disaster, which is another flight disaster, where the pilot of the plane was doing the wrong thing, he was crashing the plane, and the junior co-pilot spoke up and said, hey, I think this is wrong, I think you're making a mistake here. And the pilot just totally overrode the guy and crashed the plane. I feel like this episode is very tragic, but that's just another very tragic example, and also an example of why that type of culture is so dangerous.

Matt:

Well, I think there was another case relatively recently, I don't know the specific details, but I think it was Korean pilots. Basically the same situation happened, but apparently, and I don't know if it's Korean culture or the culture of that airline, you do not question your superior. In their culture it was not okay to question what the senior pilot was doing, and the junior pilot kind of knew that something was wrong, but he didn't have the cultural ability to even propose it. And it led to some crash; I don't remember exactly what the outcome was. But exactly to your point, it needs to be okay for junior people to say that they think something is wrong, because either they're wrong about their assessment of the situation and they should learn from it, or the senior person really is wrong.

Jon:

I think there's a flip side to this too, where a senior person needs to be able to say, I don't know, I'm not sure. Because in those events, and I guess we'll never know, I wonder if there was an inkling of doubt in those pilots' minds: am I doing the right thing right now?

Matt:

Yeah.

Jon:

And I think that culture cuts both ways: junior people are too afraid to call out their senior counterparts, and senior people are almost too afraid to show weakness, so they're kind of like, I'm just going to follow this through to the bitter end, even though I have this doubt in my mind. I think both of those are very dangerous. Another really sad but good example of this is the Chernobyl incident. I'm kind of basing this off of the television show, and who knows how well that followed the actual truth of the incident, but it seemed like a large part of what made that problem worse was people being afraid to tell each other that they had screwed up and that there was a serious problem happening. There were all these delays in actually communicating the full scale of the disaster,

Matt:

Yeah.

Jon:

you know, to more senior people in the Soviet Union. We need to be able to normalize making mistakes and normalize not knowing things.

Matt:

There was one example where he talked about people being on, I forget, an oil rig or something. He did an assessment of this business and he was saying, I can give you these different rules that will reduce most of your mistakes, but you're gonna need to give up this Superman mentality that the people on the rig are infallible and can't ever make mistakes, because counterintuitively, that's actually leading you to these worse outcomes. So I think it is so important that people have the humility to say, oh, something's going wrong, or, I don't know, I fucked up, or something.

Jon:

Yeah, I also think it's powerful for a senior person to say, I don't know, in front of junior people. I mean, this sounds silly because it sounds really dramatic, but I just think it sets a good example for people who are newer to a team, so they realize, oh, this is the culture: it's okay to just not know something or to do the wrong thing. Another...

Matt:

I mean...

Jon:

oh, sorry. Go ahead.

Matt:

Well, yeah, just a quick point on that. I think there can be something weird that happens where you have a person who is willing to admit their ignorance about something, and that's actually really good for them: they learn a ton of stuff very quickly. They learn about everything, because they're willing to admit their ignorance and then other people who know what they're talking about teach them. So you wind up with a person who knows a ton of stuff. And then you have a junior person who comes to them with a question, and as far as their perception goes, this person knows everything: every question you come to them with, they give you a really thorough answer. And it's not because that person isn't willing to admit their lack of knowledge, it's just that their domain of knowledge completely encompasses the questions this junior engineer has lobbed in their direction. So I think you can get this sense of, oh, this senior engineer never says they don't know something, so I need to always have an answer. And I don't know if the resolution to that is to say you don't know, or just to try to convey, along with your knowledge, some inkling of self-doubt. I don't know, maybe that's silly.

Jon:

Yeah, I mean, you raise an interesting point, because I think it's really valuable when senior people can tell junior people, oh, the way that I learned so much in this organization was by being curious, admitting that I didn't know things, and being proactive about learning new things. And I guess that can be more difficult when you've sort of already learned the answers to all of the important questions, so you just look like this infallible, god-like creature.

Matt:

Yeah. Um, but you were gonna say something else,

Jon:

Oh, right. I just wanted to move on to another couple of sections in this chapter that I thought were interesting, which were basically about mistake prevention. And he mentioned a few...

Matt:

Never stop trying.

Jon:

Just be better. But he mentioned a few interesting techniques, and I'm pretty sure they were both from Japan,

Matt:

Mm-hmm.

Jon:

they were both from Toyota. And they're called jidoka and poka-yoke.

Matt:

Hmm. Good names.

Jon:

Jidoka, which I think means automation with a human touch, is basically, and I only wrote a couple of notes for this, so feel free to fill in the blanks here, that you get punished for not reporting errors. You basically form a culture where reporting errors is normalized, and you probably get a lot of false positives, error reports that aren't actually errors, but you probably also catch a lot of errors that are truly errors. So I thought that was very interesting. And then poka-yoke, which I found even more fascinating, is basically that you jerry-rig things to try to prevent mistakes. The example he gave was, if you have a switch that does something very serious, you put a cover over it, you know, one of those

Matt:

Mm-hmm.

Jon:

little cover flaps that you have to lift up before you can hit the switch. Or if there's a lever that you can accidentally push too far and do the wrong thing, you put a little wooden block there, so you have to remove the block in order to actually push it further. I thought those were both really cool things.
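
Software has its own version of the switch cover: make the dangerous path require one deliberate extra step. A small sketch of that idea, with invented function and parameter names:

```python
def delete_project(project_id: str, *, confirm_name: str | None = None) -> None:
    """Destructive operation behind a poka-yoke style cover flap: the caller
    must retype the project id to proceed."""
    if confirm_name != project_id:
        raise ValueError(
            f"Refusing to delete {project_id!r}: pass confirm_name={project_id!r} "
            "to confirm. This is the software equivalent of lifting the cover."
        )
    print(f"Deleting project {project_id}...")


# The careless call fails safe; the deliberate call goes through.
# delete_project("prod-db")                          -> raises ValueError
# delete_project("prod-db", confirm_name="prod-db")  -> proceeds
```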

Matt:

One question I have with jidoka is, how do you evaluate when a worker knew about a potential error and didn't report it? How do you do that?

Jon:

that is interesting.

Matt:

I guess when something bad happens, you try to find everyone involved. Maybe you just do interviews or something.

Jon:

Yeah. You'd almost need to know every point where someone could have seen that an error was gonna happen, and basically ask them, like, hey...

Matt:

Maybe it's just enough to say that you would be punished if you saw something wrong and didn't do anything about it, and you don't even actually need to be able to follow up on it. Then anytime someone thinks something could be wrong, it's like, oh, if someone found out that I knew this was happening and I didn't say anything, I'm gonna get in trouble. And yeah, I believe in the error-proofing. It's probably for the best. I keep thinking back to the fact that my Mac warns me every time I download a program from the internet, and I'm like, I'm okay, I got this, you probably don't need to warn me every time. In that case, the little block over the switch, where the switch is 'install a program from the internet', it's like, all right, Mac, I'm good. Can I rip this off? And no, it's not possible. But the other part of me thinks it's probably better to have that, to have a little bit of annoyance.

Jon:

These are questions I constantly ask myself. Just as an example, my organization has these tests that have to run. We have a fairly large system, so it takes a really long time to build it and a very long time to run the full test suite. And I am constantly asking myself, are these tests actually providing...

Matt:

value.

Jon:

Yeah, I mean, I'm sure they're adding value. I'm sure they're occasionally preventing issues or whatever, but are they adding enough value to justify their cost?

Matt:

Well, it feels like making those tests faster, for whoever's responsible for them, should be a really high priority: make them run as fast as humanly possible, because every engineer in the system is running them, I don't know if it's for every PR that gets merged or what, but...

Jon:

Yeah. They are run all the time, and they're part of a lot of normal workflows. So yeah, I agree, I think improving their speed is a super high priority, but it's tricky if your system is that big. Unless you're breaking the actual system down into separate parts that can function independently, it's just...

Matt:

Yeah, you need to fix the system. And I don't know, we can probably cut this, but I do feel like sometimes unit testing can wind up with a design that has units of functionality that are very small and maybe don't make a lot of sense, and you basically broke it down that way just so you could write a test for it. I think that's an anti-pattern too.

Jon:

Definitely. Yeah, and I've definitely seen cases where testing the code causes the code to be worse, and I really don't like that. So there was one final thing that I wanted to talk about with this chapter, which was the Swiss cheese model of how errors lead to accidents.

Matt:

Okay.

Jon:

He had this image of a block of Swiss

Matt:

It's a nice image.

Jon:

cheese, and basically a rod going through the block. The basic premise here is that major accidents usually happen because of a series of errors that all line up. You can imagine a block of Swiss cheese where you can shove a pencil through the whole block, because there's one hole that goes all the way through. And one of the reasons this model is useful is that one way of preventing major accidents is to move the slices of cheese around so that there's not a hole that goes all the way through. Because you're going to have accidents, or, sorry, you're going to have errors; there's no way to design a system that's gonna be error-free. But you can avoid major accidents by trying to prevent errors from stacking up on top of each other.

Matt:

Interesting. Yeah, this almost feels more useful in an investing sense, like exposure to risk, where you have these correlated risks: all of your asset classes could tank in value simultaneously from a particular kind of failure. And I don't know, does he have an example of what moving a piece of cheese means? Like...

Jon:

I don't know if

Matt:

how do you move the piece of cheese?

Jon:

So, I don't know if he gave this example or I just wrote it in my notes because I was trying to think of one, but he gave an example of the rod going all the way through where it was, yet again, a plane disaster. There was fog, there were delays on the runway, so pilots were making bad decisions because of the delays, and then there were pilot mistakes. All these things stacked up for this truly terrible accident to happen. And I wrote down that you could have a policy where, if there's fog, things need to be double-checked, you just can't do certain things. That might be a way of moving one of the pieces of cheese.

Matt:

So yeah, I have the steps here. He says: add more slices of cheese,

Jon:

Nice.

Matt:

reduce the number of holes or make the existing holes smaller, and alert the human operators when several holes have lined up.

Jon:

There you go.

Matt:

That last one is a very interesting idea. This is something that's making my brain light up a little bit. I work on this collaboration system, and there are so many moving pieces to it. When we have a case where collaboration really breaks down, it's never just one thing, it's this concert of effects. So it's always hard to take the melted cheese and pull it apart and figure out, okay, where were the holes here? All I have is them telling me, oh, collaboration was horrible, and it's your job to do a little bit of technological forensics and figure out where things were falling down.
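
That third step, alerting when several holes line up, has a fairly direct software analog: monitor each defensive layer separately and page someone when too many are degraded at once, before the end-to-end failure ever shows up. A rough sketch, with invented layer names and an arbitrary threshold:

```python
from dataclasses import dataclass


@dataclass
class Layer:
    """One slice of cheese: an independent safeguard and whether it's currently healthy."""
    name: str
    healthy: bool


def holes_lined_up(layers: list[Layer], max_degraded: int = 1) -> bool:
    """Raise an alert before a full line-up: more than `max_degraded` slices have holes."""
    degraded = [layer.name for layer in layers if not layer.healthy]
    if len(degraded) > max_degraded:
        print(f"ALERT: multiple safeguards degraded: {', '.join(degraded)}")
        return True
    return False


# Example: retries are failing AND the fallback cache is stale, so alert now,
# even though no user-visible breakdown has happened yet.
holes_lined_up([
    Layer("request retries", healthy=False),
    Layer("fallback cache", healthy=False),
    Layer("manual review queue", healthy=True),
])
```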

Jon:

Yeah, yeah. He even kind of alludes to this when he talks about the NTSB, which is the organization that does the forensics into why a plane accident happened, and how that process needs to be slow and very rigorous. One complaint people have about these NTSB investigations is that they take years, but there's a reason they take years: they need to be extremely methodical in working through how the accident took place. And that process yields great results, because you can figure out how to move these pieces of cheese around. Or maybe you put an American slice into the Swiss cheese block so it's just impenetrable.

Matt:

No holes. Yeah.

Jon:

Yeah. Or some Velveeta, just shove some Velveeta in the holes.

Matt:

All right. Well, with that image, maybe we call it. Was there anything else, any other nuggets of cheesy wisdom you have to slather all over our users?

Jon:

Oh man. Well, when you put it that way... no, there wasn't really much else that I wanted to talk about. This was a very good chapter; it covered a lot of material. There were definitely things we didn't talk about, but I think those were the main things I wanted to cover.

Matt:

You got, you gotta read the book. You gotta buy the book.

Jon:

Exactly. Buy the book, put money in Don Norman's pockets.

Matt:

This one was good. This was a really good chapter. I feel like we've leveled some criticism at him before, but I felt like this was peak Don. And we kind of knew this was coming; he was teasing the error chapter for a while.

Jon:

Yeah. Nah, he's in true form in this chapter.

Matt:

It was a payoff. All right, well, let's see, now for next chapter: chapter six, Design Thinking.

Jon:

Oh, okay. Awesome. Awesome.

Matt:

I'll see you there, Jon.

Jon:

See you there, Matt.