The extra mile vs skill: Risk management in our lives

Today I would like to talk a bit about the “extra mile mentality” and doing what is right. Recently, I was on a trip to Milos island, where there was a beach where people were swimming alongside boats. It was an accident waiting to happen. When I raised my concerns with our organisers, they brushed them off with a laugh: “Don’t worry, it will be fine.”

We do that: there are often things that have, let’s say, a 2% chance of happening. For example, a swimmer gets hit by the propeller of a boat. To prevent this, some precautions need to be taken: there needs to be a net protecting the swimmers, boats need to avoid entering the bay altogether, and they need to keep their distance from the beach.

An accident waiting to happen

There are certain actions, an extra 30 seconds of effort, that prevent rare bad outcomes from happening. These actions, however, usually require some organisation, a little bit of coordination, a certain amount of effort and, let’s face it, more often than not we try to duck away from them. We don’t want to go the extra mile to shield ourselves from something that has only a small chance of happening, so we leave it to luck.

Even worse, this ghastly lack of consideration often gets twisted into a testament to skill. “I don’t need to wear a helmet, I am a good driver and this kind of protection is for the newbies.” I was watching a video from some old tapes in Greece where people were not wearing seatbelts and were proud of it. If you wore a seatbelt, you were not deemed worthy to be called a driver. It was as if the moment the belt was heard clicking, your street cred as a driver was washed away. If you are taking precautions against something, it means you are afraid, and if you are afraid, it means you are not good enough.

This is a wider issue. It has to do with people’s ability to see beyond what materialises in front of their eyes. We don’t build our cities to be protected against flooding or protect our houses against earthquakes, because “weeeell… this happens very rarely, who cares”. It’s a very greedy, “let’s just survive till the next day” mentality, which is summarised very beautifully in a Greek saying loosely translated as “Don’t worry about tomorrow; until then, who knows who’s gonna be alive and who’s gonna be dead”.

In Greece’s defence, we have a very unpredictable and unaccountable government. When you don’t know whether you will make ends meet on an average day, you don’t really care about hypotheticals. You only worry about what’s in front of you. That doesn’t make it less wrong, though; it only makes it more difficult to convince people otherwise.

There is a pattern in this kind of decision making which is at the heart of risk management: usually, on one hand you have something solid, countable and concrete that costs money and/or time: wear a helmet, put on your seatbelt, spend money to make a building earthquake-proof, don’t allow houses to be built in places that may be dangerously flooded.

On the other hand, you have a hypothetical worst-case scenario that has a very low chance of happening: an earthquake, a boat accident, someone drowning at a beach without a lifeguard, a forest fire.

In the middle you have people who have a natural disdain for unfortunate hypotheticals. This is one of the core challenges in risk management: convincing people to do something countable to guard themselves against something uncountable. Do something real to prepare for something hypothetical. Spend real money and real time on nebulous conditionals. Change their everyday routine to prepare for the extraordinary.

There are a variety of factors that can amplify or weaken this logical fallacy.

  • Probability of the risk occurring.
  • Frequency of occurrence.
  • Whether the risk has been encountered before or not. 
  • Whether people have seen the risk management strategy working before. 

How often does that happen? 

A risk’s impact is probability × severity. This means that even if something doesn’t happen that often, it can still have a huge impact; an accident at a nuclear plant, for example.
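As a quick illustration (all numbers here are made up), this formula is exactly why rarity alone does not make a risk small:

```python
# Expected impact = probability x severity (numbers are illustrative only).
def expected_impact(probability: float, severity: float) -> float:
    return probability * severity

frequent_mild = expected_impact(0.20, 10)               # e.g. a sprint slipping
rare_catastrophic = expected_impact(0.0001, 1_000_000)  # e.g. a nuclear accident

print(f"{frequent_mild:.1f} vs {rare_catastrophic:.1f}")  # 2.0 vs 100.0
# The rare-but-catastrophic risk dominates by a factor of 50.
```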

Here comes the kicker: the slimmer the probability of a risk occurring, the more difficult it becomes to convince people to take action against it. Especially if they have never experienced it before.

Sometimes a risk occurs frequently enough to convince people to take action. For example, leave some time free in a sprint in order to have contingency plans. This is something everyone is happy to play along with; they see sprints get derailed every two weeks. Interestingly, though, once it moves up to the milestone level, people are less willing to accept the option of failure. Maybe it’s a combination of a larger time scale, which puts the consequences in a more nebulous future, and the greater severity of the risk (a milestone failing is a major setback), which pushes that risk into the convenient “let’s pretend this will not happen, and let’s name this mentality optimism and confidence” bucket.

It’s not gonna be me…

You tell people that they should wear seat belts, you show them pictures from accidents; however, the impact fades away against their own everyday experience of not crashing the car. This creates a fake aura of confidence and, more importantly, leads to the fallacy that something is entirely within your control. If I am driving carefully, there is no way something bad will happen, right?

“I have been driving for years and never crashed once. I don’t need the seatbelt anymore; it’s all based on my skill.”

People, in general, tend to have an innate dislike for taking precautions against their own failures. I remember, at Sony, the Technical Director trying to convince programmers to adopt testing in their coding routine. Coders just couldn’t adapt to it.

Arguably, it’s quite arduous for people to change their everyday routine, even in the simplest manner. However, I cannot help but think that, at some level, people didn’t really want to include in their everyday life a routine which was an admission that mistakes might be made.

Sometimes it takes time for results to show…

The biggest issue in convincing people to work out is that results take weeks to show, but the pain is right there from day one. Systems that lack immediate feedback are the toughest ones to convince people to take action on.

We like quid pro quo deals. Doing something with no immediate and obvious return, not so much. It will take some time for the code base to be significantly improved after testing has been introduced, but during all this time coders will have to spend extra time writing the tests.
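To make that extra time concrete, here is a minimal, hypothetical example of the kind of test a coder has to write up front, long before any payoff shows:

```python
# A tiny, hypothetical function and the test that guards it. Writing the test
# costs a few minutes today; the payoff (a regression caught early) may not
# show up for months.

def clamp(value, low, high):
    """Keep a value inside [low, high] -- e.g. a health bar in a game."""
    return max(low, min(value, high))

def test_clamp():
    assert clamp(5, 0, 10) == 5    # inside the range: unchanged
    assert clamp(-3, 0, 10) == 0   # below the range: pinned to low
    assert clamp(42, 0, 10) == 10  # above the range: pinned to high

if __name__ == "__main__":
    test_clamp()
    print("all tests pass")
```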

…and only if everyone cooperates.

Lack of immediate feedback is the first element of the unholy trinity of risk management strategy. The second is when everyone needs to cooperate in order for something to happen. This is one of the most difficult risks to manage. It’s the equivalent of a mass prisoner’s dilemma, where every single one of the prisoners must trust that everyone else will do their part.
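To make that structure concrete, here is a toy sketch of an n-person prisoner’s dilemma; the payoff numbers and the 90% threshold are made up for illustration:

```python
# A toy n-person prisoner's dilemma (all numbers are made up).
COST = 3         # personal cost of cooperating, e.g. the hassle of quarantine
BENEFIT = 10     # payoff everyone receives if the collective effort succeeds
THRESHOLD = 0.9  # fraction of cooperators needed for the effort to succeed

def payoffs(choices):
    """choices: list of booleans, True = cooperate."""
    success = sum(choices) / len(choices) >= THRESHOLD
    return [(BENEFIT if success else 0) - (COST if c else 0) for c in choices]

# With 100 players, one extra defector almost never changes the outcome,
# so every individual is tempted to free-ride -- until enough of them do
# and the whole thing collapses for everyone.
print(payoffs([True] * 100)[0])                # 7: cooperate, effort succeeds
print(payoffs([False] + [True] * 99)[0])       # 10: defect, effort still succeeds
print(payoffs([False] * 20 + [True] * 80)[0])  # 0: too many defectors, all lose
```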

This type of problem has applications in a surprisingly wide variety of issues beyond traditional project management, from paying your taxes to convincing someone to vote. However, there is a third element which can make some risks exceptionally tough to handle, and which completes the unholy trinity.

The problem is not related to your previous experiences.

Doctors tell you all the time: don’t drink a lot, you will damage your liver; don’t eat lots of fat, and do exercise, or else you will damage your heart. Yet 1 in every 4 deaths in the USA is related to heart disease.

We treat our bodies as if it’s easy to get a replacement part, but why? Well, it’s simple: Everything else in our lives can be replaced. Window broke? Buy a new one. Laptop not running efficiently? Replace it. Car not working? Pay a mechanic and fix it. We tend to make things replaceable and fixable. It’s practical to make them this way. The fallacy starts when we think that our bodies operate this way. We are surrounded by human constructs and, thus, we think everything operates like that.

Point me to one human construct around us that, once it breaks down, cannot be replaced but can only be fixed slowly over time. And I’ll add the cherry on top: you can’t live without it.

Everything in this image is replaceable. Yes, even the desk

This is the reality that hits most people when they are told they have cancer or heart disease, or that their excessive drinking did irreversible damage to their liver. Yep, you are not a FIAT.

To put it more generally: we tend to extrapolate from our own experiences, even though we shouldn’t, to systems that behave completely differently.

Coronavirus

And now let’s combine everything to understand the coronavirus crisis:

  • It’s something that doesn’t happen often. People can’t draw from their past experiences to understand it. Furthermore, the system was unprepared for it.
  • Quarantine works only if everyone cooperates in an n-person prisoner’s dilemma.
  • The logic behind this risk is not related to human constructs; it’s practically alien.

Right, so what do I mean by alien? Well, the obvious answer is that a virus is not a human construct. It is something that evolved quite on its own. However, the “alien” factor here is math-related. Let me introduce you to the exponential function.

It’s a positive feedback function, and that’s why it’s so scary: the more it advances, the more powerful it gets. My brother asked me yesterday: let’s say a water lily fills 1% of a lake, and every day its population doubles. If the entire lake gets filled in 6 days, how many days does it take to fill half the lake?

I know, right? Only 5! The final doubling takes it from half-full to completely full in a single day.

We avoid using it even in game design. Why? It’s just so difficult to control. Let me be frank: almost nothing in your everyday life that is constructed by humans performs according to the exponential function. The way the gas tank empties in your car is linear. The way the salary gets deposited in the bank every month? Linear. The way your mobile phone’s battery drains? Linear.
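To see just how differently the two behave, here is a small sketch comparing the lily with a linear process that also “finishes” on day 6 (the lily’s starting coverage is back-calculated so the lake fills exactly on day 6):

```python
# The lily (exponential doubling) vs. a gas tank (linear), both done on day 6.
lily = 1 / 2**6  # to fill the lake by day 6, the lily must start at ~1.6%
tank = 0.0       # a linear process that covers 1/6 of the way each day
for day in range(1, 7):
    lily *= 2
    tank += 1 / 6
    print(f"day {day}: lily {lily:.1%} | linear {tank:.1%}")

# The linear process is half done on day 3, right where intuition puts it.
# The lily sits at a harmless-looking 12.5% on day 3, reaches half only on
# day 5, and then swallows the whole lake in one final doubling.
```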

This is why it’s so hard to contemplate the disproportionate kind of damage a single person can cause, like patient 31 from Korea, a single person who had 1,160 contacts and blew everything out of proportion.

This is what we call Social Responsibility…

Social responsibility is an ethical theory in which individuals are accountable for fulfilling their civic duty; the actions of an individual must benefit the whole of society. At its heart, that is the essence of risk management:

  • Admitting that things can go wrong.
  • Taking precautions is not pessimism, nor does it mean we are incompetent.
  • Not everything works with the type of logic we are used to.

I have worked on quite a few multi-million dollar projects, and I can attest to the fact that true risk management is not an action-movie emergency plan that involves explosions and one guy magically saving the day. It’s more about little everyday actions that accumulate into something great.