The last year has been an interesting lens for uncovering patterns in our collective thinking.
One thing I noticed is that it’s difficult for societies to think probabilistically. “Masks don’t work” and “it’s OK, I’m standing 2 meters away” are both symptoms of that. It seems obvious that wearing a mask and keeping your distance are both ways to minimize risk, not remove it, but that’s sometimes hard to operationalize.
People don’t seem comfortable with new ways of risk mitigation unless they are absolute. (This is different from established risk mitigation strategies: wearing a seatbelt is understood not to make you absolutely safe in a crash, but it improves your chances, and somehow that’s accepted.) Wearing a mask and keeping your distance both lower your risk without guaranteeing anything, and it took a while for that to be accepted in a reasonable way.
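The “lower, not remove” point can be made concrete with a little arithmetic. A minimal sketch, assuming (purely for illustration) that each measure independently cuts transmission risk by some fraction; the baseline and effectiveness numbers below are made up, not epidemiology:

```python
def residual_risk(base_risk: float, reductions: list[float]) -> float:
    """Risk remaining after stacking independent mitigations.

    Each measure removes a fraction of the remaining risk --
    the result gets smaller with every layer, but never reaches zero.
    """
    risk = base_risk
    for r in reductions:
        risk *= (1 - r)
    return risk

# Hypothetical numbers: a 10% baseline risk per encounter, a mask that
# cuts it by 60%, and distancing that cuts it by a further 70%.
print(round(residual_risk(0.10, [0.6, 0.7]), 6))  # 0.012 -- lower, but not zero
```

The point of the multiplication is exactly the one above: each layer helps, no layer is a guarantee, and the layers combine.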
At least in my corner of the world, masks were at first frowned upon (I got some downright hostile looks when I was wearing a mask before it was an enforced norm). Distancing was the opposite: people seemed to believe that if they stayed 2 meters away from each other, there would be absolutely nothing to worry about.
Another connected symptom is how we evaluate our past decisions. It’s the difference between judging them by the outcome (tempting, but wrong) and judging them by the information we had at the time (unsatisfying, and it can feel like making excuses, but it’s the only reasonable way).
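A toy simulation (my own illustration, not from any study) makes the outcome-vs-information distinction vivid: a bet that is clearly good in expectation still loses a large fraction of the time, so any single outcome tells you little about whether the decision was sound.

```python
import random

def good_bet(rng: random.Random) -> float:
    """Win 2 with probability 0.6, lose 1 otherwise.

    Expected value: 0.6 * 2 - 0.4 * 1 = +0.8, so taking this bet
    is the right decision given the information available.
    """
    return 2.0 if rng.random() < 0.6 else -1.0

rng = random.Random(0)  # fixed seed so the sketch is reproducible
outcomes = [good_bet(rng) for _ in range(10_000)]

average = sum(outcomes) / len(outcomes)          # close to +0.8
losses = sum(1 for o in outcomes if o < 0)       # roughly 4,000 of 10,000

# The decision was sound (positive average), yet it "failed" thousands
# of times. Judging it by one bad outcome would condemn a good call.
print(average, losses)
```

This is the asymmetry in the text: outcome-based evaluation would flag every losing run as a mistake, while information-based evaluation correctly calls every one of those bets a good decision.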
I keep thinking this has to be trainable. My totally untested pet theory is that people who like playing games (video, board, or card games) probably find this easier. Can we use that to make probabilistic thinking more universal?