For years, I’ve noticed an issue that arises when something comes along that can save many lives, but may cost a few in the process. Here are two examples.
When self-driving cars become the norm, we’ll likely save the roughly 30,000 people per year who would otherwise have died in car accidents. However, it’s also likely that self-driving cars won’t eliminate every accident, and a small number of lives will still be lost. The problem is that we’ll know those losses by name when they happen, but the 30,000 people who would have otherwise died will have no idea how fortunate they are.
We also saw this with COVID vaccines. It’s estimated that they saved roughly three million lives, but they also caused a number of deaths that may not have otherwise occurred (that number is widely disputed, but it’s at least the nine people who died from the J&J vaccine). We know specifically who died from it, but those who were saved have no idea. Perhaps you or I might have died of COVID if we hadn’t been vaccinated, but there is no way to know for sure.
In both cases we can put names on those who died, but we’ll never have any idea who was saved, even though the numbers show that we saved multitudes more than were killed.
Upstream
Dan Heath’s book “Upstream” tackles this problem a few times. He offers no real solution, as there isn’t one, but he summarizes it better than I can. First, he gives an example of improved road conditions that have prevented many accidents:
On sharp curves where accidents tend to happen, transportation departments have begun to install high friction surface treatments (HFSTs)—overlays of ultra-rough material superglued to existing roads. In Kentucky, where the treatments have been used widely, crashes have been reduced almost 80%. None of those drivers, who avoided crashes they would have suffered in an alternate world, will ever know that they may owe their lives to some construction workers who installed a super-gritty road.
The bulk of the book is about taking upstream action — preventing problems before they occur. That kind of work is generally less expensive than downstream work, but it’s very difficult to measure and incredibly hard to personify.
He gives another example, which is where I pulled the title of this post:
That’s one reason why we tend to favor reaction: Because it’s more tangible. Downstream work is easier to see. Easier to measure. There is a maddening ambiguity about upstream efforts. One day, there’s a family that does not get into a car accident because a police officer’s presence made them incrementally more cautious. That family has no idea what didn’t happen, and neither does the officer. How do you prove what did not happen? Your only hope, as a police chief, is to keep such good evidence of crashes that you can detect success when the numbers start falling. But even if you feel confident your efforts accomplished something, you’ll still never know who you helped.
This shows up in so many areas of our lives. Another big one that is often debated is healthcare: why doesn’t most insurance cover proactive health benefits instead of just fixing us when we’re broken? Studies show that prevention can save insurers a ton of money, but because it can’t be individually measured, it largely gets ignored.
The self-driving car aspect of this is what has me most curious about the future. While the full benefits are still years away, I’m confident that they’ll save many thousands of lives. However, as soon as one of those cars makes a mistake and we all mourn the loss of “John Smith, father of three,” it gets really tricky. The 30,000 lives that were saved remain blissfully unaware, but “that stupid self-driving car killed John, so we need to do something about it.”
It’ll be fascinating to see how our opinions and laws are shaped in the coming years.
unclebeezer says
That last paragraph hits on one of the big issues with upstream efforts. Even if we put in preventative measures (self-driving cars), because the EXCEPTION (John dying from a self-driving car) has a persona attached to it, people want to rush to fix it. We (as a people) tend to focus too much on the exceptions and try to build rules and regulations to address those. Really, we should be empathetic and sympathetic to those exceptions, but we should build our rules and regulations to address the 80-90% of situations.
Mickey Mellen says
Yep, that’s exactly it. Really, I have no idea how to fix it, because that’s how humans work.
If that exception happened to be Kelly, I don’t think I could be this rational about it, even though I really should be.
unclebeezer says
And I wouldn’t expect you to be. When it’s super personal and hits home, we react differently than watching it from a distance.
And I don’t know how to fix it either.