Tuesday, February 28th at 7:30 in the Parlor
These are tough times for The Truth.
In a saner and more reflective epoch, the late Senator Daniel Patrick Moynihan observed that “Everyone is entitled to their own opinions, but they are not entitled to their own facts.”
These days, on the scales of subjective thought, facts seem to be utterly outweighed and overwhelmed by predispositions, biases and beliefs.
Truth be told (if you would only believe it), there was never a time, in spite of our wistful musing, when facts held sway as beacons of truth and justice. As a thinking species, we have never been all that good at objectivity. But lest we be too hard on ourselves, this is an understandable predicament: Objective Thinking is Hard.
And this is the starting point for this month’s Big Question Forum, in which we want to look at the myriad ways we mishandle, misrepresent and misapprehend information – usually with the best of intentions – all because of the innate cognitive biases of our beloved human species.
A bias is really just a pre-conditioned mental shortcut. These shortcuts are so common and widespread that there’s a Wikipedia page listing around 200 cognitive biases. For example, did you know there is something called the “IKEA effect”?
“The tendency for people to place a disproportionately high value on objects that they partially assembled themselves, such as furniture from IKEA, regardless of the quality of the end result.”
And so, given the preponderance of predispositions we bring to our percipience, in this month’s BQF we’ll ask: Can you trust your judgment?
We’ll begin by trying to reveal some of our favorite biases, and then proceed to discuss how those biases might negatively affect our relationships, and how we might address or compensate for our various filters, lenses and slants. If we are so inclined.
We hope to see you Tuesday, February 28th at 7:30 in the Parlor at First Parish.
Other references
- Michael Lewis’ most recent book is called The Undoing Project and is described on Amazon.com as “How a Nobel Prize-winning theory of the mind altered our perception of reality”. Lewis tells the compelling story of two Israeli psychologists, Daniel Kahneman and Amos Tversky, who opened the world’s eyes to the inherent biases in our judgment and decision-making. Our minds, they showed, rely on shortcuts in our thinking, predetermined patterns they called heuristics, and those shortcuts give rise to our biases.
- From last week’s Bell Notes:
The Racial Justice Task Force challenges each of us to explore the implicit biases that we all have. We urge you to read Blind Spot: Hidden Biases of Good People by Mahzarin R. Banaji and Anthony G. Greenwald, or read/listen to a shorter introduction here: http://blindspot.fas.harvard.edu/Book. WARNING: Awareness may be the first step to enlightenment, but the action that follows may make you uncomfortable – and curious! Learn about social mindbugs and the hidden costs of stereotypes for all of us. Take the Implicit Association Test at https://implicit.harvard.edu/implicit/ (it takes about 15 minutes).
- Here are two websites that list common biases:
• 12 cognitive biases that prevent you from being rational
• 20 Cognitive Biases That Affect Your Decisions
- A recent article called Cognitive bias cheat sheet does an excellent job of distilling those 200 biases into categories, organized around four primary drivers of our need for cognitive shortcuts:
• Too much information, so we find ways to distill and package it
• Not enough meaning, so we interpolate and extrapolate into neat mental models
• The need to act fast, so we jump to conclusions and favor convenient interpretations
• Limited memory, so we retain only a small subset of the information we receive
The article includes a “cheat sheet” that breaks down the four drivers of bias and some of the resulting behaviors.