How we make decisions can depend on a wide variety of things – how we feel at the time, what else is going on in our lives and, the granddaddy of them all, looking for information to confirm what we already believe (sometimes called Confirmation Bias, but I prefer to call this the “Ah-hah I told you so” bias).
There is another common error we make when taking mental shortcuts. We don’t like to consider things too rationally (because it hurts too much). Our brains like to seize on “facts” which we can then use to make decisions quickly without having to do a lot of thinking. But often these facts are not actually facts at all. Marketing companies understand this very well.
Look at ads for beauty products, anti-aging creams in particular. Appear younger, reduce this, enhance that and you too can look like this fabulous model who has been in professional make-up for 3 hours. But look at the small print and these ads will say something like “72% of 97 agree” (this is from a current ad by a French cosmetics company). So they tested 97 people and 72% agreed, which works out to 69.84 people who think it works. Hardly scientifically robust statistics, but it’s the headline “facts” they want us to remember.
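That small-print figure is easy to sanity-check. Here is a one-line sketch of the arithmetic, using only the ad's own numbers (97 respondents, 72% agreeing):

```python
# "72% of 97 agree": how many actual people is that?
respondents = 97
agreed = 72 * respondents / 100  # work in whole percentages to keep the arithmetic exact
print(agreed)  # 69.84 people
```

Not even 70 whole people, and no word on how the sample was chosen or what “agree” actually meant.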
So politicians and marketers love to use these factoids, but it can also lead to poor decisions or perceptions in the real world as well. Recently I saw the following headline on the BBC website:-
“Google’s self-drive cars had to be stopped from crashing”
Wow, those robot cars are going to kill us all, I’d never get in one! This might be a “factual” shortcut to believing that self-drive cars are four-wheeled terminators, but is it a shortcut to a bad conclusion? Here are the figures for the number of times a human had to intervene in California in 2015:-
Delphi 405 (& no, I have never heard of them before either)
Bosch 625 (yes, I thought these guys made tools and home appliances)
Does that make it any clearer? Well, it seems to confirm that our first response was correct. I mean, look at how many times they almost had accidents: 2,460 times. Actually, it is how many times there was a human intervention, not a potential accident, a subtle but important difference. Once you add in the number of miles traveled in each test, things become a little clearer:-
Google 13 interventions in 424,000 miles
Nissan 106 interventions in 1,485 miles
Mercedes 1,051 interventions in 1,739 miles
Delphi 405 interventions in 16,662 miles
Volkswagen 260 interventions in 14,945 miles
Bosch 625 interventions in 935 miles (perhaps these guys should stick with tools & appliances)
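The raw counts only start to mean something once they are normalised by mileage. Here is a quick sketch of the per-million-mile intervention rates implied by the figures above (Python used purely for the arithmetic):

```python
# Intervention rates from the 2015 California figures above,
# normalised to interventions per million miles driven.
data = {
    "Google":     (13, 424_000),
    "Nissan":     (106, 1_485),
    "Mercedes":   (1_051, 1_739),
    "Delphi":     (405, 16_662),
    "Volkswagen": (260, 14_945),
    "Bosch":      (625, 935),
}

# Sort from best (fewest interventions per mile) to worst.
for maker, (interventions, miles) in sorted(
        data.items(), key=lambda kv: kv[1][0] / kv[1][1]):
    rate = interventions / miles * 1_000_000
    print(f"{maker:<10} {rate:>10,.0f} interventions per million miles")
```

Google comes out at roughly 31 interventions per million miles, while Bosch is closer to 670,000. The raw counts alone told us almost none of that.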
So Google’s cars have had to be stopped by human operators 13 times in 424,000 miles of driving. I wonder how often I had to slam on the brakes because I wasn’t paying attention, or someone cut me up, or pulled out without looking? Probably more often than once every 32,000 miles or so. I live on a small, crowded island, so I am guessing I have to do this every 32 miles or even more often. In fact, two of the incidents reported by Google were because of traffic cones, and the other 11… well, would there have been an accident? Who knows?
So perhaps Google’s self-driving cars aren’t so scary. In fact, it is estimated that humans have 4.1 crashes per million miles in the US (estimated because minor accidents are not always reported to the authorities). Google are running at a lot more interventions than that (about 31 per million miles), so they have a ways to go. But I am pretty sure humans are not going to be wiped out by terminator cars.
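Putting the two rates side by side makes the gap concrete. A small sketch, with the obvious caveat built in:

```python
# Compare Google's intervention rate with the estimated US human crash rate.
# Caveat: an intervention is not the same thing as a crash, so this is
# only a rough like-for-unlike comparison.
google_rate = 13 / 424_000 * 1_000_000   # interventions per million miles
human_rate = 4.1                          # estimated crashes per million miles (US)

print(f"Google: {google_rate:.1f} interventions per million miles")
print(f"Ratio:  {google_rate / human_rate:.1f}x the human crash rate")
```

About 30.7 versus 4.1, or roughly 7.5 times the human rate, which is why “they have a ways to go” rather than “they are trying to kill us”.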
This decision shortcut is unhelpfully called the Availability Heuristic, not exactly a catchy or intuitive name. So I prefer to call it the Schwarzenegger Shortcut.
Postscript: One of the case studies I use in training is Tesla Motors, the US all-electric premium car company. I test-drove their Tesla Model S recently, during which I used their Autopilot feature, where the Terminator… sorry, the car, did the driving. It is very unsettling to start with, but after a while you get used to it. I suspect that in 20 years’ time I will enjoy a drive where I can sit back, relax and have a glass of wine while the car does the driving.