My Brain Hurts

This isn’t about guns, but it is about everything, and since TTAG reviewers analyze and review everything related to guns, it seems fair to ask whether they’re doing useful work – or just confusing the issue. In “We Are All Talk Radio Hosts,” Jonah Lehrer challenges the usefulness of rational analysis. He starts off talking about strawberry jam – specifically, conflicting reviews of name-brand jams.

What happened? Wilson and Schooler argue that “thinking too much” about strawberry jam causes us to focus on all sorts of variables that don’t actually matter. Instead of just listening to our instinctive preferences, we start searching for reasons to prefer one jam over another. For example, we might notice that the Acme brand is particularly easy to spread, and so we’ll give it a high ranking, even if we don’t actually care about the spreadability of jam. Or we might notice that Knott’s Berry Farm has a chunky texture, which seems like a bad thing, even if we’ve never really thought about the texture of jam before. But having a chunky texture sounds like a plausible reason to dislike a jam, and so we revise our preferences to reflect this convoluted logic.

And it’s not just jam: Wilson and others have since demonstrated that the same effect can interfere with our choice of posters, jelly beans, cars, IKEA couches and apartments. We assume that more rational analysis leads to better choices but, in many instances, that assumption is exactly backwards.

Presumably, a reviewer could also get all caught up in the minutiae of a given gun design, disdaining the pattern of the grip while ignoring that he and other shooters instinctively find the gun easy to shoot.

The larger moral is that our metaphors for reasoning are all wrong. We like to believe that the gift of human reason lets us think like scientists, so that our conscious thoughts lead us closer to the truth. But here’s the paradox: all that reasoning and confabulation can often lead us astray, so that we end up knowing less about what jams/cars/jelly beans we actually prefer. So here’s my new metaphor for human reason: our rational faculty isn’t a scientist – it’s a talk radio host. That voice in your head spewing out eloquent reasons to do this or do that doesn’t actually know what’s going on, and it’s not particularly adept at getting you nearer to reality.

Instead, it only cares about finding reasons that sound good, even if the reasons are actually irrelevant or false. (Put another way, we’re not being rational – we’re rationalizing.)

While it’s easy to read these crazy blog comments and feel smug, secure in our own sober thinking, it’s also worth remembering that we’re all vulnerable to sloppy reasoning and the confirmation bias. Everybody has a blowhard inside them.

And this is why it’s so important to be aware of our cognitive limitations. Unless we take our innate flaws into account, the blessing of human reason can easily become a curse.

2 Responses to Thinking Too Much

  1. The authors make interesting points, but I think the truth lies somewhere in the middle. If you spend too long navel-gazing or admiring the problem, you end up with paralysis by analysis. On the other hand, if you rely entirely on “feelings,” “instinct” or “intuition,” you’re likely to make impulsive, wrong decisions.

     I like the concept of being aware of our own cognitive limitations, but it’s important to also be aware of our biases. For instance, if I acknowledge my fondness for 1911s, at what point does that translate into a bias against polymer pistols? It’s something I have to be aware of when I write, and make sure I’m not dissing some pistol just because it’s not a 1911. (In point of fact, I’m rather fond of XDs, Sigs, and Rugers, too.) But you get my point.

     We all filter information through our own biases and must deal with the limitations of our ability to obtain/process/analyze data. To ignore this is to invite faulty decisions.

    • As is so often the case, this suspicion of rationality is all over the internet, from Scott Adams to Newsweek:

      http://dilbert.com/blog/entry/first_impressions/

      But it made me reflect on how many times my own first impressions are accurate. Consider movies. I can tell you whether or not I will like an entire movie within the first two minutes, with perhaps a 95% success rate. In fact, that first two minutes is probably more predictive than the movie trailer.

      It’s the same with books. I can open a book to any page, read any half-dozen sentences, and come away with an accurate idea of how much I might enjoy the entire book.

      Cars, homes, pets – it’s the same thing. Whatever I like in the first minute, I usually like forever. Assuming most of you are the same way, to some degree, what does that say about people?

      One theory is that we’re good at predicting the quality of things from scant clues. But can you really tell if a movie will have a good plot, which presumably matters, from the first two minutes?

      A second theory is that we make up our minds about things based on the first few irrational cues, and everything that follows is rationalization. So if there’s something in the first two minutes of a movie that I like, for whatever subconscious reasons, I later think that the directing, acting, and plot were also good (enough), even if on some objective level they were not.

      As part of my training for hypnosis, years ago, I learned that human brains are rationalization machines, not logic machines. That’s hard to accept, especially in yourself. Your brain tells you otherwise. It insists it is completely rational.

      http://www.newsweek.com/2010/08/05/the-limits-of-reason.html

      That puts poor reasoning in a completely different light. Arguing, after all, is less about seeking truth than about overcoming opposing views. So while confirmation bias, for instance, may mislead us about what’s true and real, by letting examples that support our view monopolize our memory and perception, it maximizes the artillery we wield when trying to convince someone that, say, he really is “late all the time.” Confirmation bias “has a straightforward explanation,” argues Mercier. “It contributes to effective argumentation.”
