Saturday, 12 September 2015

How To Avoid Your Own Brain's Biases | Big Think

Thinking, like seeing, has built-in blind spots. An
old parable and Husserl’s matchbox can illuminate these geometric,
biological, and cognitive limits. We can't evade their unseen dangers
unaided.
In the parable, six blind men try to describe an elephant
they’re standing beside. Feeling what’s in front of him, each has
“direct” evidence that it’s a snake, spear, fan, tree, wall, or rope (details vary). But only combining perspectives gives the whole picture.
Perspective also constrains the sighted, as Edmund Husserl demonstrated using a matchbox:
geometry ensures only three sides are visible at a time. In Husserl’s
philosophy, all experience is embodied, and knowledge is prone to bodily
perspective limits. Assumptions also limit and frame thinking; they’re the mental version of line of sight, and can cause “theory-induced blindness.”
While reading this, you are actively ignoring another optical limitation: evolution gave our eyes blind spots
(there are no light receptors where the optic nerve connects to the
retina). These blind spots are invisible because our brains evolved to
concoct a continuous visual field.
The brain has its own blind spots, but “cognitive biases” are often misunderstood. Take Ezra Klein’s article on “identity-protective cognition” research, which he says “tells us we can’t trust our own reason.”
He seems surprised that our thinking is biased to protect our
self-image and peer-image. Isn’t the brain’s primary mission protecting
its owner? We all have that bias, and we’re all prone to others.
Here’s the needed logic: since all human brains have biases they’re unaware of, mine must too. However certain I feel about an issue, it’s irrational to ignore the potential influence of my own biases. Reason dictates using assisted thinking. This is old wisdom: the Bible asks, “Who can discern their own errors?” and Shakespeare laments, “O that you could turn your eyes toward the napes of your necks, and make but an interior survey.”
Barring rare geniuses, we often don’t think well
alone. But two heads are better only if, like two eyes, they have
different perspectives. Gerrymandering your mind by consulting only the like-minded doesn’t balance biases; it reinforces them. Constructive
co-thinking requires diversity.
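A toy simulation makes the point concrete (my own illustrative sketch, not from the article; the numbers and the helper names like_minded_panel and diverse_panel are invented): averaging many like-minded judgments preserves whatever bias the group shares, while averaging judgments whose biases pull in different directions tends to cancel them.

```python
# Hypothetical illustration of bias-balancing through diverse perspectives.
# All quantities are made up for the sake of the example.
import random

random.seed(0)
TRUTH = 100.0        # the quantity everyone is trying to judge
NOISE = 5.0          # each person's individual random error
SHARED_BIAS = 8.0    # a bias the like-minded group holds in common

def like_minded_panel(n):
    # Every member shares the same bias, so averaging keeps that bias.
    return sum(TRUTH + SHARED_BIAS + random.gauss(0, NOISE) for _ in range(n)) / n

def diverse_panel(n):
    # Each member's bias points a different way, so averaging largely cancels it.
    return sum(TRUTH + random.uniform(-SHARED_BIAS, SHARED_BIAS) + random.gauss(0, NOISE)
               for _ in range(n)) / n

print("like-minded panel of 50:", round(like_minded_panel(50), 1))  # tends to stay near 108
print("diverse panel of 50:    ", round(diverse_panel(50), 1))      # tends to land near 100
```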
An identity too many thinkers seek to protect is that
of the all-seeing, all-knowing intellectual gladiator. Debate is framed as combat, which distorts the objective into winning rather than improving ideas. Framing it as a conversation or a building project is better. Conversations
are enriched by differences. And good building projects require the best
materials from any source.
A basic cognitive geometry applies: Unless what
you’re pondering is small or well understood, multiple vantage points
are advantageous. Truth usually has multiple paths; it’s safely approachable from different assumptions. The wise seek bias-balancing
heterospective (other-view-ness). It’s the only cure for known
unseeables.


If No Brain Is Free Of Bias, What Can We Trust? | Big Think


If no brain is free of bias, what can we trust? Which field’s views can we rely on?
1. Redoing 100 psychology studies found two-thirds didn’t replicate, causing much concern.  
2. Meanwhile, Noah Smith matter-of-factly writes,
“Traditionally, economists ... put the facts in a subordinate role [to]
theory. ... Plausible-sounding theories are believed to be true unless
proven false, while empirical facts are often dismissed.” Isn’t that worse than failed replication? A recipe for data-decorated faith?
3. Smith calls economics “a rogue branch of applied math” that “evolved different scientific values.”
But can “scientific” rightly apply where empirical facts don’t rule?
Isn’t that utter non-science? Unless facts reign, what separates the
sciences from superstition?
4. Real sciences permit only shakable faiths (see Science’s Toughest Test).
Only bias-balancing processes are held sacred — not inputs or outputs,
not cherished assumptions or results. That isn’t the game Smith
describes. No one is immune to their beloved beliefs (or
“identity-protective cognition”), but the sciences organize themselves to counter such biases — they’re reality refereed.  
5. Perhaps these economic quibbles are minor? Apparently principles “are by no means universally agreed.”  And faith in free-market economics rests on incomplete logic and near-utopian assumptions — in no real case can free markets do what’s preached.
Maybe that’s OK... if in your game plausible theory-faith beats
empirical facts. But in more trustworthy games, new facts must oust old
certainties (e.g., redistribution ≠ less growth).
6. Smith hopes theory-free “Big Data” means that empirical economics will soon “dominate.” But economic data suffers from high “causal density.” And its “gold standard,” randomized controlled trials, has limits. Meanwhile, key metrics like GDP don’t capture crucial distinctions. Plus, without changes in professional values or theory-beats-data practices, will economics become more trustworthy? Maybe economics is safer used descriptively than prescriptively.
7. Economic historian Michael Lind says economics isn’t a natural science. However much physics-gas-like math it (ab)uses, economics can’t escape its history-like aspects.
8. How data works in history is different from how it works in physics. In history, innovation happens. Patterns change. Yesterday’s impossibilities become today’s driving forces. Unlike behavior in physics, human behavior isn’t as safely generalizable. Nothing in physics chooses, or changes how it chooses, so in the social sciences extrapolation is riskier (here’s a data journalism example). Perhaps market “laws” aren’t gravity-like.
9. Biases and flaws are like foreheads — it’s easier to see others’ than your own. Escaping our own biases requires tools.
Before trusting experts, ask if their field is organized to challenge
cherished assumptions. Is its game “reality refereed”? We should trust more in processes that rigorously balance biases, not in individuals. Confirmation bias haunts even geniuses.
10. Psychology’s ills are worrying, but economics’
beliefs are more dangerous. Markets enact our ethics, powerfully and
globally. Do we want markets to make musical toilets while some starve?
11. We’re betting the planet on economic faiths (e.g., profit before planetary health). If cherry-picked data-doting free-market dogma doesn’t pan out, what’s our fallback?
In a world full of biases and risks, the wise guard against “theory-induced blindness.” And they contingency plan.