Everything is Obvious.

Duncan Watts’ book Everything is Obvious: *Once You Know the Answer: How Common Sense Fails Us fits in well with the recent string of books explaining or demonstrating how the way we think often leads us astray. As with Thinking, Fast and Slow, by Nobel Prize winner Daniel Kahneman, Watts’ book highlights how specific cognitive biases, notably our overreliance on what we consider “common sense,” lead us to false conclusions, especially in the social sciences. The ramifications are clear for the business and political worlds, and there are strong messages here for journalists, who always seek to graft narratives onto facts as if the latter were inevitable outcomes.

The argument from common sense is one of the most frequently seen logical fallacies out there – X must be true because common sense says it’s true. But common sense itself is, of course, inherently limited; our common sense is the result of our individual and collective experiences, not something innate given to us by God or contained in our genes. Given the human cognitive tendency to assign explanations to every event, even those that are the result of random chance, this is a recipe for bad results, whether it’s the fawning over a CEO who had little or nothing to do with his company’s strong results or top-down policy prescriptions that lead to billions in wasted foreign aid.

Watts runs through various cognitive biases and illusions that you may have encountered in other works, although a few of them were new to me, like the Matthew Effect, by which the rich get richer and the poor get poorer. The theory holds that success breeds success, because early winners get greater opportunities going forward. A band that has a hit album will get greater airplay for its next record, even if it isn’t as good as the first one, or is markedly inferior to an album released on the same day by an unknown artist. A good student born into privilege will have a better chance to attend a fancy-pants college, like, say, Harfurd, and thus benefits further from having the prestigious brand name on his resume. A writer who has nearly half a million Twitter followers might find it easier to land a deal for a major publisher to produce his book, Smart Baseball, available in stores now, and that major publisher then has the contacts and resources to ensure the book is reviewed in critical publications. It could be that the book sells well because it’s a good book, but I doubt it.

Watts similarly dispenses with the ‘great man theory of history’ – and with history in general, if we’re being honest. He points out that historical accounts will always include judgments or information that was not available to actors at the time of these events, citing the example of a soldier wandering around the battlefield in War and Peace, noticing that the realities of war look nothing like the genteel paintings of battle scenes hanging in Russian drawing rooms. He asks if the Mona Lisa, which wasn’t regarded as the world’s greatest painting or even its most famous until it was stolen from the Louvre by an Italian nationalist before World War I, ascended to that status because of innate qualities of the painting – or if circumstances pushed it to the top, and only after the fact do art experts argue for its supremacy based on the fact that it’s already become the Mona Lisa of legend. In other words, the Mona Lisa may be great simply because it’s the Mona Lisa, and perhaps had the disgruntled employee stolen another painting, da Vinci’s masterpiece would be seen as just another painting. (His description of seeing the painting for the first time mirrored my own: It’s kind of small, and because it’s behind shatterproof glass, you can’t really get close to it.)

Without directly referring to it, Watts also perfectly describes sportswriters’ inexcusable habit of assigning huge portions of the credit for team successes to head coaches or managers rather than distributing the credit across the entire team or even the organization. I’ve long used the example of the 2001 Arizona Diamondbacks as a team that won the World Series in spite of the best efforts of its manager, Bob Brenly, to give the series away – repeatedly playing small ball (like bunting) in front of Luis Gonzalez, who’d hit 57 homers that year, and using Byung-Hyun Kim in save situations when it was clear he wasn’t the optimal choice. Only the superhuman efforts of Randy Johnson and That Guy managed to save the day for Arizona, and even then, it took a rare misplay by Mariano Rivera and a weakly hit single to an open spot on the field for the Yanks to lose. Yet Brenly will forever be a “World Series-winning manager,” even though there’s no evidence he did anything to make the win possible. Being present when a big success happens can change a person’s reputation for a long time, and future successes may then be ascribed to that person even if he had nothing to do with them.

Another cognitive bias Watts discusses, the Halo Effect, seems particularly relevant to my work evaluating and ranking prospects. First named by psychologist Edward Thorndike, the Halo Effect refers to our tendency to extend positive impressions of a person, group, or company to their other properties or characteristics, so we might subconsciously consider a good-looking person to be better at his/her job. For example, do first-round draft picks get greater consideration from their organizations when it comes to promotions or even major-league opportunities? Will an org give such a player more time to work out of a period of non-performance than they’d give an eighth-rounder? Do some scouts rate players differently, even if it’s entirely subconscious, based on where they were drafted or how big their signing bonuses were? I don’t think I do this directly, but my rankings are based on feedback from scouts and team execs, so if their own information – including how teams internally rank their prospects – is affected by the Halo Effect, then my rankings will be too, unless I’m actively looking for it and trying to sift it out.

Where I wish Watts had spent even more time was in describing the implications of these ideas and research for government policies, especially foreign aid, most of which would be just as productive if we flushed it all down those overpriced Pentagon toilets. Foreign aid tends to go where the donors, whether private or government, think it should go, because the recipients are poor and the donors believe they know how to fix that. In reality, this money rarely spurs any sort of real change or economic growth, because the common-sense explanation – the way to fix poverty is to send money and goods to poor people – never bothers to examine the root causes of the problem the donors want to solve, to ask the recipients what they really need, or to identify and remove obstacles (e.g., lack of infrastructure) that might require more time and effort to fix but prevent the aid from doing any good. Sending a boat full of food to a country in the grip of a famine only makes sense if you have a way to get the food to the starving people, but if the roads are bad, dangerous, or simply don’t exist, then that food will sit in the harbor until it rots or some bureaucrat sells it.

Everything Is Obvious is aimed at a more general audience than Thinking, Fast and Slow, as its text is a little less dense and it contains fewer and shorter descriptions of research experiments. Watts refers to Kahneman and his late research partner Amos Tversky a few times, as well as other researchers in the field, so this book seems meant as another building block on the foundation of Kahneman’s work. I think it applies to all kinds of areas of our lives, even just as a way to think about your own thinking and to try to help yourself avoid pitfalls in your financial planning or other decisions, but it’s especially apt for folks like me who write for a living and should watch for our human tendency to ascribe causes post hoc to events that may have come about as much due to chance as to any deliberate factors.