Thinking, Fast and Slow.

Daniel Kahneman won the 2002 Nobel Prize in Economic Sciences (yes, the ‘fake’ Nobel) for his groundbreaking work in behavioral economics, the branch of the dismal science that shows we are even bigger idiots than we previously believed. Kahneman’s work, and his best-selling book Thinking, Fast and Slow, identify and detail the various cognitive biases and illusions that affect our judgment and decision-making, often leading to suboptimal or undesirable outcomes that might be avoided if we stopped to think more critically and less intuitively. (It’s just $2.99 for the Kindle right now, through that link.)

Kahneman breaks the part of our brain that responds to questions, challenges, or other problems into two separate systems, which he calls System 1 and System 2. System 1 is the fast-reaction system: When you hear or read a question, or face a specific stimulus, your brain brings back an answer, an image, or a memory without you having to consciously search the hard drive and call up the file. System 2 does what we would normally think of as “thinking”: slow calculations, considering variables, weighing options, and so on. The problem, as Kahneman defines it, is that System 2 is lazy and often takes cues from System 1 without sufficiently questioning them. System 1 can be helpful, but it isn’t always your friend, and System 2 is passed out drunk half the time you need it.


[Image: Systems 1 and 2 in a rare moment of concordance.]

The good news here is that Kahneman’s work, much of it done with his late colleague Amos Tversky (who died before he could share the Nobel Prize with Kahneman), offers specific guidance on the breakdowns in our critical-thinking engines, many of which can be circumvented through different processes or detoured by slowing down our thinking. One of the biggest pitfalls is what Kahneman calls WYSIATI – What You See Is All There Is – the process by which the brain jumps to a conclusion on the basis of insufficient evidence, because that evidence is all the brain has, and the human brain has evolved to seek causes for events or patterns. This leads to a number of biases and errors, including overconfidence, framing effects, and base-rate neglect.

Although Kahneman has crafted enough of a flow to keep the book coherent from chapter to chapter, Thinking, Fast and Slow is primarily a list of significant biases or flawed heuristics our brains employ, with explanations of how they work and how to try to avoid them. This includes the availability heuristic, where we answer a question about probability or prevalence by substituting the easier question of how readily we can recall examples or instances of the topic. If I give you a few seconds to tell me how many countries there are in Africa, you might name a few in your head, and the faster those names come to you, the larger your guess will be for the total.

Thinking, Fast and Slow also offers an unsettling section for anyone whose career is built on obtaining and delivering knowledge, such as subject-matter experts paid for their opinions, a category that includes me: We aren’t that good at our jobs, and we probably can’t be. One major reason is the representativeness fallacy, which leads to the base-rate neglect I mentioned earlier. The representativeness fallacy leads the subject – let’s say an area scout here, watching a college position player – to overvalue the variables he sees that are specific to this one player, without adequately weighting variables common to the entire class of college position players. It may be that college position players from that particular conference don’t fare as well in pro ball as those from the SEC or ACC; it may be that college position players with or without a specific skill succeed at higher or lower rates. The area scout’s report, taken by itself, won’t consider those “base rates” enough, if at all, and to a large degree teams do not expect or ask the area scouts to do so. However, teams that don’t employ any kind of system to bring those base rates into their overall decision-making, from historical research on player archetypes to analysis of individual player statistics adjusted for context, will mistake a plethora of scouting opinions for a variety of viewpoints, and will end up making flawed or biased decisions as a result.
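To see how much a base rate should constrain even a strong individual evaluation, here’s a minimal Bayes’ rule sketch. Every number in it is invented for illustration – nothing here comes from Kahneman or from any actual team:

```python
# Hypothetical numbers only: how far a glowing scouting report should
# move you off the base rate for an entire class of players.

base_rate = 0.10           # assumed: 10% of this class become big-league regulars
p_report_given_hit = 0.60  # assumed: chance a future regular draws a glowing report
p_report_given_miss = 0.20 # assumed: chance a non-regular draws one anyway

# P(regular | glowing report), via Bayes' theorem
numerator = p_report_given_hit * base_rate
evidence = numerator + p_report_given_miss * (1 - base_rate)
posterior = numerator / evidence

print(f"P(regular | glowing report) = {posterior:.0%}")  # 25%
```

With these made-up numbers, the glowing report is real evidence – it moves the player from a 10% proposition to a 25% one – but a decision-maker who goes by the strength of the report alone, neglecting the base rate, will put the odds far higher than that.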

Kahneman’s explanation of regression to the mean, and how it should affect our forecasting, is the best and clearest I’ve come across yet – and it’s a topic of real interest to anyone who follows baseball, even if you’re not actually running your own projections software or building an internal decision-sciences system. Humans are especially bad at making predictions where randomness (“luck”) is a major variable, and we tend to overweight recent, usually small samples and ignore the base rates from larger histories. Kahneman lays out the failure to account for regression in a simple fashion: if results = skill + luck, then results in the next game = skill + different luck, so the change in results from one game to the next is essentially just a change in luck. At some point, skill does change, but it’s hard or impossible to pinpoint when that transpires. Many respected baseball analysts working online and for teams argue for the need to regress certain metrics back to the mean to try to account for the interference of randomness; one of my main concerns with this approach is that while it’s rational, it may make teams slower to recognize actual changes in skill level (or health, which affects skill) as a result. Then again, that’s where scouts can come in, noticing a decline in bat speed, a change in arm slot, or a new pitch that might explain why the noise has more signal than a regression algorithm would indicate.
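A quick simulation makes the arithmetic concrete. This is a minimal sketch with invented variances – nothing is tuned to real baseball data – showing both the fall-back effect and the standard shrinkage fix: weight an observation by the share of its variance that is skill, and pull the rest toward the league mean.

```python
import random

random.seed(1)

# results = skill + luck: fixed skill per player, luck redrawn every half
SKILL_SD, LUCK_SD = 1.0, 2.0  # assumed: in a small sample, luck swamps skill
skills = [random.gauss(0, SKILL_SD) for _ in range(10_000)]

half1 = [s + random.gauss(0, LUCK_SD) for s in skills]
half2 = [s + random.gauss(0, LUCK_SD) for s in skills]

# Take the 100 players atop the first-half leaderboard...
top = sorted(range(len(skills)), key=lambda i: half1[i], reverse=True)[:100]

# ...and watch them fall back, because their change in results is purely
# a change in luck.
print(sum(half1[i] for i in top) / 100)  # roughly +6: skill plus good luck
print(sum(half2[i] for i in top) / 100)  # roughly +1: same skill, fresh luck

# Regressing to the mean: shrink each observation toward the league mean
# (0 here) by the share of result variance that luck accounts for.
r = SKILL_SD**2 / (SKILL_SD**2 + LUCK_SD**2)  # skill's share of variance = 0.2
print(sum(r * half1[i] for i in top) / 100)  # tracks the second-half average
```

The first-half leaders were both good and lucky; regression doesn’t punish them, it just declines to credit the luck twice.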

One more chapter relevant to sports analytics covers the planning fallacy, or what Christina Kahrl always referred to as “wishcasting”: forecasting results too close to best-case scenarios without adequately considering the results of other, similar cases. The response, promulgated by Danish planning expert Bent Flyvbjerg (I just wanted to type that name), is called reference class forecasting, and is just what you’d expect the treatment for the planning fallacy to include. If you want to build a bridge, you find as many bridge construction projects as you can, and obtain all their statistics, such as cost, time to build, distance to be covered, and so on. You build your baseline predictions off the inputs and results of the reference class, and you adjust them for your specific case – but only slightly. If all 30 MLB teams did this, no free-agent reliever would ever get a four-year deal again.
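In code, reference class forecasting is almost insultingly simple, which is rather the point. A minimal sketch, with invented contract numbers standing in for the reference class:

```python
from statistics import median

# Assumed reference class: year-3 WAR from past multi-year reliever deals.
# Every number here is invented for illustration.
reference_class = [1.2, 0.4, -0.3, 0.8, 0.1, -0.6, 0.5, 0.0, 1.5, -0.2]

baseline = median(reference_class)  # the outside view

recent_performance = 2.5  # the inside view: what this specific reliever just did

# Adjust the baseline toward the specific case -- but only slightly.
INSIDE_VIEW_WEIGHT = 0.2  # assumed: how far the inside view may move you
forecast = baseline + INSIDE_VIEW_WEIGHT * (recent_performance - baseline)

print(f"outside view {baseline:+.2f} WAR, forecast {forecast:+.2f} WAR")
```

Wishcasting, in these terms, is cranking that weight up toward 1.0 and treating the reference class as a formality.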

Thinking explains many other biases and heuristics that lead to inferior decision-making, including loss aversion, the endowment effect, and the one Ned Colletti just screwed up, the sunk cost fallacy, where money that is already spent (whether you continue to employ the player or not) affects decisions on whether or not to continue spending on that investment (or to keep Brandon League on the 40-man roster). He doesn’t specifically name recency bias, but discusses its effects at length in the final section, where he points out that if you ask someone how happy s/he is with his/her life, the answer will depend on what’s happened most recently (or is happening right now) to the respondent. This also invokes the substitution effect: It’s hard for me to tell you exactly how happy or satisfied I am with my life as a whole, so my brain will substitute an easier question, namely how happy I feel at this specific moment.

That last third of the book shifts its focus more to the psychological side of behavioral economics, with subjects like what determines our happiness or satisfaction with life and the events within it, and the difficulty we have in making rational – that is, internally consistent – choices. (Kahneman uses the word “rational” in its economic and, I think, traditional sense, describing thinking that is reasonable, coherent, and not self-contradictory, rather than the current sense of “rational” as skeptical or atheist.) He presents these arguments with the same rigor he employs throughout the book, and the fact that he can be so rigorous without slowing down his prose is Thinking’s greatest strength. While Malcolm Gladwell can craft brilliant narratives, Kahneman builds his story up from scientific, controlled research, and lets the narrative be what it may. (Cf. “narrative fallacy,” pp. 199-200.) If there’s a weak spot in the book, in fact, it comes when Kahneman cites Moneyball as an example of a response (Oakland’s use of statistical analysis) to the representativeness fallacy of scouting – but never mentions the part about Tim Hudson, Mark Mulder, and Barry Zito helping lead to those “excellent results at low cost.” That aside – and hey, maybe he only saw the movie – Thinking, Fast and Slow is one of the most important books for my professional life that I have ever read, and if you don’t mind prose that can be a little dense when Kahneman details his experiments, it is an essential read.
