Eating to Extinction.

Dan Saladino’s Eating to Extinction: The World’s Rarest Foods and Why We Need to Save Them makes its important point – that declining biodiversity will impact our food supply in multiple ways – in unusual fashion: Rather than arguing the point in a straight narrative, Saladino gives the reader a tour of many of the rare foods at risk of extinction from environmental degradation, globalization, even over-regulation in some cases, presenting the scientific case for preserving them but relying more on emotional appeals. We’ll miss these foods if they’re gone, or maybe we’ll want to try them more for knowing they exist and might disappear.

The strongest arguments here come in the various sections on plants, because of the evolutionary case Saladino offers. Take the banana, probably the best-known sustainability problem in our food supply: Most of the bananas sold in the world are Cavendish bananas, every plant of which is genetically identical, because the plants themselves are sterile and must be propagated via clones. This deprives the plants of the opportunity to evolve new defenses against pathogens or environmental changes; clonal propagation leaves no room for the genetic variation on which evolution depends. The Cavendish itself is now defenseless against a real threat to its existence: Panama disease, which previously wiped out Gros Michel banana plantations, has mutated and is in the process of wiping out Cavendish plantations as well. The banana you know and love is, to put it bluntly, fucked.

Saladino offers examples from the other side of the evolutionary equation, identifying rare fruits, vegetables, and other plants like wild coffee that offer both the genetic diversity these plants will need to survive – forever, even after our species is gone – and more immediate benefits to us, such as unique flavors or cultural legacies. Coffee is struggling in the face of climate change that is driving it to higher altitudes and pests like the fungus that causes coffee-leaf rust; the wild coffees of Ethiopia may provide genetic solutions, at least until the next crisis comes along. There’s a wild maize plant in Mexico that fixes its own nitrogen through a symbiotic relationship with a bacterium, a crop that could help address the world’s growing need for food. The wheat we’ve selected for easy harvesting and processing is close to a monoculture, and it wouldn’t take much to collapse the annual crop, even though there are hundreds of thousands of known varieties of wild wheat, like the wild emmer wheat of eastern Turkey known as kavilca.

He explores the impact that even so-called ‘sustainable’ solutions often have on wild populations, and how what works for our food supply in the short term leaves it even more vulnerable in the long term. We’ve nearly wiped out wild Atlantic salmon and are well on our way to doing the same in the Pacific, while farmed salmon fill our stores and plates, but when those farmed salmon get loose from their aquaculture pens, they interbreed with wild populations and can reduce genetic diversity, leaving those fish more vulnerable to diseases.

Some of these endangered foods are more closely tied to culture than to global food needs or biodiversity, such as the honey gathered by the native Hadza people in Tanzania, where local bee and bird populations are threatened both by habitat destruction and the loss of symbiotic relationships they’ve developed with humans. Certain birds would identify hives in baobab trees that contained honey, and humans would hear their calls and bring down the nests. The humans would eat the honey and parts of the honeycomb, while the birds would wait nearby to consume what the humans did not. This entire way of life is disappearing as native populations lose their land and become assimilated into urban life and dependent on processed foods.

Along the way, Saladino explains (several times) the presence of various seed banks around the world, including the critical one in the Svalbard archipelago in the Arctic Ocean, and the two great success stories of the Haber-Bosch process of fixing nitrogen in artificial fertilizer and the Green Revolution – the post-WWII adoption of high-yielding varieties of cereal and grain crops, notably dwarf wheat and rice, along with scientific methods of increasing yields through those artificial fertilizers and massive monocultures. (Not mentioned is how Haber’s research, which has helped accelerate climate change, also led to the development of Zyklon B.) There’s quite a bit of science in here, which does help move things along in what amounts to a series of mini-essays on dozens of foods.

Saladino’s reference-work approach isn’t entirely successful for that last reason; sometimes, it’s like reading an encyclopedia. It’s often an interesting one, and Saladino went to all of these places to try the endangered foods and eat them with the locals who grow or gather or develop them. But such a broad look at the subject guarantees that some essays will be duds, and by the time we get to the end, Saladino’s epilogue, “think like a Hadza,” is so far removed from the opening essay on those people and their honey-gathering that the throughline connecting all of these foods has started to fray a bit. It works best as a call to action – we need to find and value these products, to keep them alive and protect those habitats or those cultures, and to stop relying on these monocultures to feed ourselves. You can find other wheat flours even at Whole Foods and similar stores, while there might even be local mills or growers near you offering unconventional (and thus genetically distinct) flours and grains and beans. Our diets will be richer for it, and we’ll be taking a small step towards protecting the future of humanity before we scorch the planet growing the same five crops.

Next up: I just finished Jonathan Franzen’s Freedom.

This is Your Mind on Plants.

Michael Pollan made a name for himself, or perhaps a bigger name, with his book The Omnivore’s Dilemma, which came off like such an attack on our modern diets that he wrote a brief companion book called In Defense of Food. In defense of Pollan, however, his writing goes well beyond those two books or that subject; he can be a gifted writer on many matters of food and food science, and is not the scold that Omnivore’s Dilemma might lead you to believe he is. Cooked: A Natural History of Transformation is a history of food and food science, and an explanation of how we used fire and heat to change the way we ate, in turn changing the trajectory of our species. His most recent book, a collection of two previously published essays plus a third, is called This is Your Mind on Plants, and covers three psychoactive compounds or chemicals produced by the plant world: opium, caffeine, and mescaline.

By far, my favorite part of this book was the portion on caffeine, which was originally released as an Audible original and excerpted by The Guardian as part of its longread series a few months ago. Pollan was a caffeine addict, like the overwhelming majority of Americans, and as part of his research into the chemical’s effects on our brains and our lives, chose to give it up completely before gradually reintroducing it into his life. He spoke to Dr. Matthew Walker, author of Why We Sleep, who is a scold, at least on this topic, and among other things claims that caffeine’s half-life is around 6 hours, so a quarter of the caffeine you consumed in a cup of joe at 9 am is still in your system at 9 pm. (Estimates of its actual half-life vary, but it may be closer to 5 hours, which would push up that latter time to 7 pm.) Caffeine in the afternoon, which we often consume to combat our bodies’ evolved tendency towards biphasic sleep, is especially harmful; the iced coffee you have at 2 pm would still leave more than a quarter of its caffeine in your system at 11 pm, a typical bedtime for adults who have kids or at least have to work in the morning.
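Walker’s arithmetic is easy to check with a tiny exponential-decay sketch (a generic half-life formula, not anything from the book; the function name and the 100 mg dose are my own illustration):

```python
# Exponential decay: how much of a caffeine dose remains after a given time.
def remaining(dose_mg: float, half_life_h: float, hours: float) -> float:
    """Milligrams left after `hours`, given a half-life in hours."""
    return dose_mg * 0.5 ** (hours / half_life_h)

# A ~100 mg cup at 9 am, with Walker's 6-hour half-life, at 9 pm (12 hours later):
print(remaining(100, 6, 12))  # -> 25.0, i.e. a quarter of the dose
# With a 5-hour half-life instead, you hit a quarter two hours sooner, by 7 pm:
print(remaining(100, 5, 10))  # -> 25.0
```

Twelve hours is exactly two 6-hour half-lives, so the dose halves twice: 100 → 50 → 25 mg.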

Most people understand on some level that caffeine can harm your sleep quantity and quality, but Pollan also points out how much we depend on caffeine each day for simple alertness, to feel like we think clearly, to clear the fog of sleep – or, of course, the fog of caffeine withdrawal. There is even research showing that caffeine can help certain types of recall and improve our reaction times in certain physical tasks, although viewers of Good Eats know that caffeine may make you work faster, but it doesn’t make you work smarter. Pollan gives a breezy history of caffeine and its two major delivery systems (tea and coffee), including descriptions of their ties to colonialism, exploitation of native peoples, and slavery, before bringing us back to the narrative of his caffeine withdrawal and reintroduction.

The opium essay appeared in slightly redacted form in Harper’s in the late 1990s, and is less about what the drugs derived from opium do than about Pollan’s own misadventures in growing poppies in his own garden, only to discover that he may be violating federal law by doing so. Opium is a latex taken from the seed capsules of the Papaver somniferum plant, although Pollan claims that there are other poppies that can produce some of the same compounds, just in smaller quantities. The drugs we associate with poppies are opiates, alkaloids found within the latex, including morphine and codeine; or derivative products, such as heroin (made through acetylation of morphine) or oxycodone (synthesized from thebaine in the latex). You can consume the raw latex, which is supposed to be unspeakably bitter, and which will cause nervous system depression. Pollan didn’t end up doing that, although he certainly thought about it, and wrote about thinking about it, and expunged a few pages from the published version, restoring them for the full article here. He describes the conversations from the time around what it was safe to write, while his editor at the time, John R. MacArthur, has disputed Pollan’s version of events. Anyway, Pollan drank some opium tea, and said it tasted awful but felt nice.

Then there’s mescaline, which, of these three drugs, has the unusual characteristic of offering very little downside to the user. Its use is highly restricted, because Drugs Are Bad! even though there’s a small body of evidence that mescaline, derived from a cactus that grows in the American southwest, and psilocybin, produced by several hundred species of fungi mostly in the Psilocybe genus, may help people with severe depression or anxiety. The majority of Pollan’s essay here revolves around mescaline’s somewhat recent history of use in religious ceremonies among certain indigenous American tribes, the ridiculous laws around its use, and environmental and cultural concerns around it. He eventually tries some as well, and has what sounds like a very pleasant experience of heightened awareness with mild hallucinations, nothing like the stereotype of a trip. I have never tried either of these psychotropics, and Pollan’s narrative made me slightly more curious about them.

Pollan the anti-scold is an insightful, conversational writer who is unafraid to educate his readers but never loses sight of the need to entertain at the same time. There might be a bit too much of him in the opium section – the idea of DEA agents bashing down his door because he had two poppies in his garden might come across as paranoid – but despite his first-person writing in the remaining two sections, he takes care not to let his persona take over. His thoughtfulness in describing the mescaline ceremony he witnesses, for example, does him credit; he’s just trying to get high, so to speak, not to appropriate anyone’s culture. It’s a short book, compiling some pieces you may have read before, but an enjoyable diversion, and one more tiny brick in the wall for drug decriminalization.

Next up: Helen DeWitt’s The Last Samurai, because Mike Schur told me to read it.

Infinite Powers.

I’m a sucker for a good book about math, but a lot of books about math aren’t that good – either they’re dry, or they don’t do enough to explain why any of this matters. (Sometimes the subject doesn’t matter that much, as in Prime Obsession, but there the author did such a good job of explaining the problem, and benefited from the fact that it remains unsolved.) Steven Strogatz’s Infinite Powers: How Calculus Reveals the Secrets of the Universe manages to be entertaining, practical, and also educational, as the author builds up the reader through some essentials of pre-calculus before getting into the good stuff, to the point that I recommended that my daughter check it out before next year when she takes calculus in school.

Calculus underlies everything in the universe; it is the foundation upon which the universe, and everything in it, functions. It is also one of humanity’s most remarkable discoveries, one that required multiple leaps of mathematical faith to uncover hidden truths about the universe. Physicist Richard Feynman quipped that it is “the language that God talks,” although he meant it in a secular sense, while mathematician Felix Klein said that one could not understand “the basis on which the scientific explanation of nature rests” without at least some understanding of differential and integral calculus.

The story of how both Isaac Newton and Gottfried Wilhelm Leibniz independently discovered calculus in the late 1600s, doing so both with their own remarkable insights and by building on the discoveries of mathematicians before them, going back to the ancient Greeks, would by itself be enough for an entertaining history. Strogatz does start with that, and uses the history as scaffolding to bring the reader up from algebra through geometry and trigonometry to the mathematics of limits, which is the essential precursor to calculus, before getting to the main event.

Or I should say “events,” as differential and integral calculus, while two sides of the same analytical coin, were discovered at separate times, with separate methods, and Strogatz tells their stories separately before bringing them together towards the end of the book. Differential calculus is what we learn first in schools, at least in the United States. It’s the mathematics of rates of change; the rate at which a function changes is the derivative of that function. Acceleration is the derivative of velocity – that is, the rate at which velocity is changing. Velocity, in turn, is the derivative of position – the rate at which an object’s position changes. That also makes acceleration the second derivative of position, which is why you see a 2 in the units for the acceleration of an object falling due to Earth’s gravity (9.8 m/s²): a position might be measured in meters, so velocity is measured as the change in position (meters) by time (seconds), and acceleration is the change in velocity (meters per second) by time (seconds, again).
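That chain of derivatives can be written out for an object in free fall (a standard physics example, not taken from the book):

```latex
s(t) = \tfrac{1}{2}\,g\,t^{2} \qquad \text{(position, in m)}
v(t) = s'(t) = g\,t \qquad \text{(velocity, in m/s)}
a(t) = v'(t) = g \approx 9.8~\text{m/s}^{2} \qquad \text{(acceleration)}
```

Each derivative divides by one more unit of time, which is why the second ends up squared in m/s².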

Integral calculus goes the other way – given an object’s acceleration, what is its velocity at a given point in time? Given its velocity, what is its position? But Leibniz and Newton – I expect to hear from Newton’s lawyers for listing him second – conceived of integration as a way to solve an entirely different problem: How to determine the area under a curve. Those two didn’t think of it in terms of functions – that concept came somewhat later – but they understood the need to find out the area underneath a curve, and came up, independently, with the same solution, which broke the space apart into a series of rectangles of known heights and near-zero widths, giving rise to the infinitesimals familiar to any student who’s taken integral calculus. They aren’t real numbers, although they do appear in more arcane number systems like the hyperreals, yet the sum of the areas of these infinitesimally narrow rectangles turns out to be a real number, giving you the area under the curve in question. This insight, which Leibniz probably had first, opened the world up for integral calculus, which turns out to have no end of important applications in physics, biology, and beyond.
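The rectangle idea translates directly into a few lines of code; this midpoint Riemann sum (my own sketch, with finite rather than infinitesimal widths) approximates the area under f(x) = x² between 0 and 1, whose exact value is 1/3:

```python
# Approximate the area under a curve by summing narrow rectangles.
def riemann_sum(f, a, b, n):
    """Midpoint Riemann sum of f over [a, b] using n rectangles."""
    width = (b - a) / n
    # Evaluate f at the midpoint of each rectangle and add up the areas.
    return sum(f(a + (i + 0.5) * width) * width for i in range(n))

area = riemann_sum(lambda x: x * x, 0.0, 1.0, 100_000)
print(area)  # approaches 1/3 as n grows
```

As n grows and the rectangles narrow toward Leibniz’s infinitesimals, the sum converges on the exact area.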

Strogatz grounds the book in those applications, devoting the last quarter or so of Infinite Powers to discussing the modern ways in which we depend on calculus, even taking its existence for granted. GPS devices are the most obvious example, as the system wouldn’t function without the precision calculus offers in correcting errors in distance measurements – indeed, calculus is also used to help planes land accurately. Yet calculus appears in even less-expected places; biologists used it to model the shape of the double helix of strands of DNA, treating a discrete object (DNA is just a series of connected molecules) as a continuous one. If your high school student ever asks why they need to learn this stuff, Infinite Powers has the answers, but also gives the reader the background to understand the author’s explanations even if you haven’t taken math in a few decades.

Next up: David Mitchell’s The Thousand Autumns of Jacob de Zoet.

Mistakes Were Made (But Not by Me).

Mistakes Were Made (But Not by Me) is the story of cognitive dissonance, from its origins in the 1950s – one of the authors worked with Dr. Leon Festinger, the man who coined the term – to the modern day, when we routinely hear politicians, police officers, and sportsball figures employ it to avoid blame for their errors. What Dr. Carol Tavris and Dr. Elliot Aronson, the authors of the book, emphasize in Mistakes Were Made, however, is that this is not mere fecklessness, or sociopathy, or evil, but a natural defense mechanism in our brains that protects our sense of self.

Cognitive dissonance refers to the conflict that arises in our brains when an established belief runs into contradictory information. We have a choice: Admit our beliefs were mistaken, and conform our beliefs to the new information; or, explain away the new information, by dismissing it, or interpreting it more favorably (and less accurately), so that our preconceived notions remain intact. You can see this playing out right now on social media, where anti-vaxxers and COVID denialists will refuse to accept the copious amounts of evidence undermining their views, claiming that any contradictory research came from “Pharma shills,” or appeared in unreliable journals (like JAMA or BMJ, you know, sketchy ones), or offering specious objections, like the possible trollbot account claiming a sample size of 2300 was too small.

The term goes back to the 1950s, however, when a deranged Wisconsin housewife named Dorothy Martin claimed she’d been communicating with an alien race, and a bunch of other morons followed her, in some cases selling their worldly possessions, because the Earth was going to be destroyed and the aliens were coming to pick them up and bring them to … I don’t know where, the fifth dimension or something. Known as the Seekers, they were inevitably disappointed when the aliens didn’t show. The crazy woman at the head of the cult claimed that the aliens had changed their minds, and her followers had somehow saved the planet after all.

What interested Festinger and his colleagues was how the adherents responded to the obvious disconfirmation of their beliefs. The aliens didn’t come, because there were no aliens. Yet many of the believers still believed, despite the absolute failure of the prophecy – giving Festinger et al. the name of their publication on the aftermath, When Prophecy Fails. The ways in which these people would contort their thinking to avoid the reality that they’d just fallen for a giant scam, giving up their wealth, their jobs, sometimes even family connections to chase this illusion, opened up a new field of study for psychologists.

Tavris and Aronson take this concept and pull it forward into modern contexts so we can identify cognitive dissonance in ourselves and in others, and then figure out what to do about it when it rears its ugly head. They give many examples from politicians, such as the members of the Bush Administration who said it wasn’t torture if we did it – a line of argument that President Obama did not reject when he could have – even though we were torturing people at Guantanamo Bay, and Abu Ghraib, and other so-called “black sites.” They also show how cognitive dissonance works in more commonplace contexts, such as how it can affect married couples’ abilities to solve conflicts between them – how we respond to issues big and small in our marriages (or other long-term relationships) can determine whether these relationships endure, but we may be stymied by our minds’ need to preserve our senses of self. We aren’t bad people, we just made mistakes – or mistakes were made, by someone – and it’s easier to remain believers in our inherent goodness if we deny the mistakes, or ascribe them to an external cause. (You can take this to the extreme, where abusers say that their victims “made” them hit them.)

There are two chapters here that I found especially damning, and very frustrating to read because they underscore how insoluble these problems might be. One looks at wrongful convictions, and how prosecutors and police officers refuse to admit they got the wrong guy even when DNA evidence proves that they got the wrong guy. The forces who put the Central Park Five in prison still insisted those five innocent men were guilty even after someone else admitted he was the sole culprit. The other troubling chapter looks at the awful history of repressed memory therapy, which is bullshit – there are no “repressed memories,” so the whole idea is based on a lie. Memories can be altered by suggestion, however, and we have substantial experimental research showing how easily you can implant a memory into someone’s mind, and have them believe it was real. Yet therapists pushed this nonsense extensively in the 1980s, leading to the day care sex abuse scares (which put many innocent people in jail, sometimes for decades), and some still push it today. I just saw a tweet from someone I don’t know who said he was dealing with the trauma of learning he’d been sexually abused as a child, memories he had repressed and only learned about through therapy. It’s nonsense, and now his life – and probably that of at least one family member – will be destroyed by a possibly well-meaning but definitely wrong therapist. Tavris and Aronson provide numerous examples, often from cases well-covered in the media, of therapists insisting that their “discoveries” were correct, or displaying open hostility to evidence-based methods and even threatening scientists whose research showed that repressed memories aren’t real.

I see this stuff play out pretty much any time I say something negative about a team. I pointed out on a podcast last week that the Mets have overlooked numerous qualified candidates of color, in apparent violation of baseball’s “Selig rule,” while reaching well beyond normal circles and apparently targeting less qualified candidates. The response from some Met fans was bitter acknowledgement, but many Met fans responded by attacking me, claiming I couldn’t possibly know what I know (as if, say, I couldn’t just call or text a reported candidate to see if he’d been contacted), or otherwise defending the Mets’ bizarre behavior. Many pointed out that the Mets tried to interview the Yankees’ Jean Afterman, yet she has made it clear for years that she has no interest in a GM job, which makes this request – if it happened at all – eyewash, a way to appear to comply with the Selig rule’s letter rather than its intent. Allowing cognitive dissonance to drive an irrational defense of yourself, or your family, or maybe even your company is bad enough, but allowing it to make you an irrational defender of a sportsball team in which you have no stake other than your fandom? I might buy a thousand copies of Craig Calcaterra’s new book and just hand it out at random.

The authors updated Mistakes Were Made in 2016, in a third edition that includes a new prologue and updates many parts of the text, with references to more recent events, like the murders of Tamir Rice and Eric Garner, so that the text doesn’t feel as dated with its extensive look at the errors that led us into the Iraq War. I also appreciated the short section on Andrew Wakefield and how his paper has created gravitational waves of cognitive dissonance that we will probably face until our species drives itself extinct. I couldn’t help but wonder, however, how the authors might feel now about Michael Shermer, who appears in a story about people who believe they’ve been abducted by aliens (he had such an experience, but knew it was the result of a bout of sleep paralysis) and who provides a quote for the back of the book … but who was accused of sexual harassment and worse before this last edition was published. Did cognitive dissonance lead them to dismiss the allegations (from multiple women) and leave the story and quote in place? The authors are human, too, and certainly as prone to experiencing cognitive dissonance as anyone else is. Perhaps it only strengthens the arguments in this short and easy-to-read book. Mistakes Were Made should be handed to every high school student in the country, at least until we ban books from schools entirely.

Next up: David Mitchell’s The Thousand Autumns of Jacob de Zoet.

Bird by Bird.

When I asked readers for suggestions for books about writing, the second-most cited book, after Stephen King’s On Writing, was Anne Lamott’s Bird by Bird: Some Instructions on Writing and Life. It’s a wonderful, slim book of short but very potent essays on just about everything related to writing, with an emphasis on fiction (and, at that, I’d say the short form), but much of it is also applicable to other forms of writing or merely the act of writing itself. It inspired me, and I say that as someone who is infrequently inspired at this point, even when it comes to writing about things I enjoy.

The book is filled with advice, and I don’t want to reproduce much of it here, because you should go read the book itself, and also because the advice just sounds much better in Lamott’s voice, with her wry humor and copious examples. She draws extensively on her experience teaching writing classes as well as writing for herself, allowing her to speak about things like writer’s block, creating credible characters, publishing, not publishing, and more in both her own voice and those of her students. I found nearly all of this advice to either ring true to my own experiences – especially that on writer’s block, something I haven’t truly experienced, because I can always just write something else and get things moving again – or to answer questions I’ve always had, such as how to do things like create those credible characters or write dialogue that sounds true, both to how people talk (which isn’t as easy as it sounds) and to the characters speaking it.

There’s plenty in here on getting started, which is something I often hear from aspiring writers is a huge part of the problem – they want to write, but can’t figure out how to begin. (With the first word, of course.) Lamott has sage advice on reasons to write, and reasons not to do so – not if you think it’s a quick route to wealth, or financial freedom, or popularity; if you doubt her, she has plenty of failure stories from her own career, from books rejected by publishers to dealing with self-doubt and the voices in her head that love to tell her she’s not any good at writing. (She is, though. Very.) It’s always helpful to know that other writers, especially those who have had more success than I have or have had longer careers, deal with the same kind of doubts and impostor syndrome that I do, and to be reminded that writing is its own end. Writing should give you joy, to use the popular bromide of the day. If it doesn’t, don’t do it. If it does, then how much you make from it – if you make anything at all, if you even publish – doesn’t matter. 

Lamott is an irreverent writer who is perhaps best known for some of her writing on faith, including the best-selling Traveling Mercies, and while her beliefs do show up in the pages here, I thought it was always in service of her larger points, without proselytizing or excluding; on the contrary, she goes out of her way to include people of all faiths and no faiths in the book. I can’t say I was concerned – I try to read as diverse a set of authors as possible – but I include this for anyone who might have felt disinclined to read Bird by Bird for this reason.

The title of Bird by Bird comes from a wonderful anecdote within an early essay that, in short, is the writing equivalent of taking it one day at a time. One of the biggest obstacles I have always faced as a writer, regardless of my subject, has been the discouragement I feel when I think about the whole project – its size, yes, but my ability to complete it, and make it good, and in a timely fashion, and not to be distracted by that thing I’ve been meaning to bake or that game I’ve wanted to play. So much of Bird by Bird comprises gentle reminders that you can do this, and it’s okay to fail, or think you’re going to fail. Just keep going, bird by bird.

I also read another of your recommendations, Verlyn Klinkenborg’s Several Short Sentences About Writing. It’s a twee book with advice written to look like verse, in a voice that would make me think violent thoughts about any teacher who lectured in it. There’s some useful advice buried within it, but I encountered at least as much advice that I would say I violate every time I start to write, and while it’s written by a journalist largely for journalists, I’m not sure how much of the counsel here I’d truly endorse. I did enjoy the last 50 pages, with examples of bad writing from students he’s taught over the years, which range from the execrable to the unintentionally hilarious. It’s more than a matter of laughing at bad writing, but many of the examples illuminate problems with the language itself, ways in which English, or a lack of command of it, can lead us astray. There’s value in that. Perhaps he should have made three-fourths of the book out of that, and limited his advice to the remainder – without the pompous formatting.

Of Dice and Men.

When I interviewed Conor Murphy of Foxing on my podcast a few weeks ago, he recommended a book called Of Dice and Men: The Story of Dungeons & Dragons and the People Who Play It, by David Ewalt, which gives a light history of Dungeons & Dragons. It’s a fun read even if you’re just a casual player, one where the author leans into his self-avowed nerdiness, mixing the history of the game (and tabletop role-playing games in general) with his own experiences playing as a kid and again as an adult.

It may surprise some of you who know of my love for tabletop board games – or just think I’m a big nerd myself, which is probably accurate – that I have never been much of a PnP (pen and paper) D&D player. I did try it in middle school, and also played a little bit of the post-apocalyptic game Gamma World, which came from the same publisher, but never played either very much or for very long. I had some friends who really tried to get me into it, but I found the in-person experience kind of slow and often disorganized. My knowledge of D&D derives far more from playing computer games based on it, notably Pool of Radiance (the first “gold box” game) and the Baldur’s Gate trilogy, than from the paper version. I liked Pool of Radiance, which I played up until I faced Tyranthraxus, the big foozle at the end of the game, whom I could never defeat, but I loved the Baldur’s Gate games for their incredible story, strong writing, and rich production values, and played the whole thing through multiple times. I can still quote lines from the audio track, and have given up on several similar games I’ve tried since then because they either couldn’t offer the same kind of thoughtful, immersive environment (Temple of Elemental Evil), or because I’d face a poorly designed, difficult battle early in the game, and just bailed (Icewind Dale).

Ewalt’s book is about the pen and paper game, and starts back in the 1950s, well before the game we know now as D&D was even a gleam in the eyes of Gary Gygax and David Arneson. D&D was novel in several ways, especially its open-ended nature and the legacy aspect of one play session affecting the next, but it has its roots in multiple games that came well before it. War games predate role-playing games by a few decades, and several, including the 1960s title Braunstein, directly influenced Arneson (who used it as inspiration for his own fantasy campaign, Blackmoor, which later became an official D&D setting). Ewalt gives a brief history of gaming, going back to ancient Egypt, then fast-forwards to the 19th and 20th centuries, getting to wargaming and the advent of D&D in short order.

The history of Dungeons & Dragons could probably fill a longer book, although it might bog down in stories of internecine warfare, as Gygax especially seemed to have a habit of alienating colleagues, running Arneson out of the company and trying to erase the latter’s contributions entirely (spurring multiple lawsuits Gygax and his company, TSR, would lose). Gygax’s personality, including what Ewalt depicts as a belief that TSR was his own personal fiefdom, led to his ouster from the firm after a few years of financial mismanagement. Wizards of the Coast bought TSR in 1997, as the company was approaching insolvency, and Hasbro later bought Wizards of the Coast, so D&D now resides in the portfolio of the largest board game publisher in the world.

Ewalt intersperses stories from the main campaign he was playing as an adult while writing this book, which I found less interesting than the actual D&D history he provides, and which probably won’t make much sense if you’ve never played the game yourself. However, that narrative allows Ewalt to go into some of the specifics of D&D for the non-gamer – the basic framework of characters and parties, different mechanics, the changes in rules over the course of D&D’s history, even more arcane stuff like why there are clerics and bards and monks in the game. I was willing to hang with the details of his own campaign – which I found a bit ridiculous, as a non-PnP guy who’s pretty much stuck to CRPGs in fantasy settings – because it served that broader purpose.

If you’re not a D&D player at all, but would enjoy learning the superficial history of the game, you might enjoy Of Dice and Men anyway, since it’s very light and well-written, with some self-deprecating humor that keeps Ewalt from sounding too pretentious. If you’ve played the game anywhere, in any form, you’ll probably enjoy the trip down memory lane.

Next up: Anne Lamott’s Bird by Bird: Some Instructions on Writing and Life.

Noise.

Nobel laureate Daniel Kahneman’s Thinking, Fast and Slow has been hugely influential on the baseball industry and on my own career, inspiring me to write The Inside Game as a way to bring some of the same concepts to a broader audience. Kahneman is back with a sequel of sorts, Noise: A Flaw in Human Judgment, co-authored with Olivier Sibony and Cass Sunstein, which shifts the focus away from cognitive biases toward a different phenomenon, one the authors call “noise.”

Noise, in their definition, involves “variability in judgments that should be identical.” They break this down into three different types of noise, which together add up to “system noise.” (There’s a lot of jargon in the book, and that’s one of its major drawbacks.)

  • Level noise, where different individuals’ judgments differ in their overall level: some judges, the authors note, are generally more severe, and others more lenient.
  • Pattern noise, where different individuals make different judgments with the same data.
  • Occasion noise, where an individual makes different judgments depending on when they see the data (which can literally mean the time of day or day of the week). This is probably the hardest for people to accept, but there’s clear evidence that doctors prescribe more opioids near the end of a work day, and judges are more lenient when the local football team won on Sunday.

There’s a hierarchy of noise here: system noise comprises level noise and pattern noise, and pattern noise in turn comprises occasion noise (which they classify as transient pattern noise, as opposed to “stable” pattern noise, which would be, say, how I underrate hitting prospects with high contact rates but maybe Eric Longenhagen rates them consistently more highly). That’s the entire premise of Noise; the book devotes its time to exploring noise in different fields, notably the criminal justice system and medicine, where the stakes are so high and the benefit of a reduction in noise is likely to justify the costs, and to ways we can try to reduce noise in our own fields of work.

As with Thinking, Fast and Slow, Noise doesn’t make many accommodations for the lay reader. There’s an expectation here that you are comfortable with the vernacular of behavioral economics and with some basic statistical arguments. It’s an arduous read with a strong payoff if you can get through it, but I concede that it was probably the hardest I’ve worked to read (and understand) anything this year. It doesn’t help that noise is itself a more abstruse concept than bias; the authors make constant reference to the difference between the two.

Some of the examples here will be familiar if you’ve read any literature on behavioral economics before. One is the story of the sentencing guidelines that grew out of the work of Marvin Frankel, a well-known judge and human rights advocate, who pointed out the gross inequities produced by giving judges wide latitude in sentencing – sentences that might range from a few months to 20 years for two defendants convicted of the same crime. (The guidelines that resulted from Frankel’s work were later struck down by the Supreme Court, which not only reintroduced noise into the system, but restored old levels of racial bias in sentencing as well.) The authors also attempt to bring noise identification and noise reduction into the business world, with some examples where they brought evidence of noise to the attention of executives who sometimes didn’t believe them.

Nothing was more familiar to me than the discussion of the low value of performance evaluations in the workplace. For certain jobs, with measurable progress and objectives, they may make sense, but in my experience across a lot of jobs in several industries, they’re a big waste of time – and I do mean a big one, because if you add up the hours dedicated to filling out the forms required, writing them up, conducting the reviews, and so on, that’s a lot of lost productivity. One problem is that there’s a lack of consistency in ratings, because raters do not have a common frame of reference for their grades, making grades more noise than signal. Another is that raters tend not to think in relative terms, so you end up with oxymoronic results like 98% of employees grading out as above average. The authors estimate that 70-80% of the output from traditional performance evaluations is noise – meaning it’s useless for its intended purpose of allowing for objective evaluation of employee performance, and thus also useless for important decisions like pay raises, promotions, and other increases in responsibility. Two possible solutions: ditch performance evaluations altogether, keeping them solely for developmental purposes (particularly 360-degree systems, which are rather in vogue); or spend time and money to train raters and develop evaluation metrics that have objective measurements or “behaviorally anchored” rating scales.

It wouldn’t be a Daniel Kahneman product if Noise failed to take aim at one of his particular bêtes noires, the hiring interview. He explained why they’re next to worthless in Thinking, Fast and Slow, and here he does it again, saying explicitly, “if your goal is to determine which candidates will succeed in a job and which will fail, standard interviews … are not very informative. To put it more starkly, they are often useless.” There’s almost no correlation between interview success and job performance, and that’s not surprising, because the skills that make someone good at interviewing would only make them a better employee if the job in question also requires those same skills, which is … not most jobs. Unstructured interviews, the kind most of us know, are little more than conversations, and they serve as ideal growth media for noise. Two interviewers will have vastly differing opinions of the same candidate, even if they interview the candidate together as part of a panel. This pattern noise is amplified by the occasion noise prompted by how well the first few minutes of an interview go. (They don’t mention something I’ve suspected: You’ll fare better in an interview if the person interviewing you isn’t too tired or hungry, so you don’t want to be the last interview before lunch or the last one of the day.) They cite one psychology experiment where researchers assigned students to role-play interviews, splitting them between interviewer and candidate, and then told half of the candidates to answer questions randomly … and none of the interviewers caught on.

There’s plenty of good material in Noise, concepts and recommended solutions that would apply to a lot of industries and a lot of individuals, but you have to wade through a fair bit of jargon to get to it. It’s also less specific than Thinking, Fast and Slow, and I suspect that reducing noise in any environment is going to be a lot harder than reducing bias (or specific biases) would be. But the thesis that noise is at least as significant a problem in decision-making as bias should get wider attention, and it’s hard to read about the defenses of the “human element” in sentencing people convicted of crimes and not think of how equally specious defenses of the “human element” in sports can be.

Next up: Martha Wells’ Nebula & Locus Award-winning novel Network Effect, part of her Murderbot Diaries series.

A Promised Land.

I usually don’t read political autobiographies, because I feel reasonably sure that I’m going to get more self-serving renditions of history than true elucidation or, dare we expect so much, real candor from the authors. I’m just not that interested in hearing the stories from people who have much to gain or lose from the way in which those stories are told.

So when my daughter bought me Barack Obama’s A Promised Land, the first part of his memoirs from his time as President, I was more than a little skeptical that I’d enjoy or appreciate it. I admire President Obama, and believe his tenure was more successful than his critics on the right or the far left want you to believe, and that Republican obstructionism was the major reason why he didn’t accomplish more – but I also see many missteps and lost opportunities, as well as policies that just defy reason (the use and frequency of drone strikes in the Middle East, especially Yemen) or that took too long for him to embrace (marriage equality). I was unsure in 2016 and 2017 how much blame to lay at the Obama Administration’s feet for failing to anticipate the rise of Trump and white nationalism, going back to his handling of the birther hoax. And I didn’t want to read 700-plus pages of rationalization or revisionism.

That’s not what A Promised Land is, though. I’m sure there is some inexactness in the retelling of certain stories – I find it hard to believe he’d have all of those quotes written down or memorized, especially with some going back twenty-odd years – and it’s impossible to know what details he chose to omit from the book. But it feels thorough, in detail and in intent, as Obama does acknowledge multiple mistakes in policy and in his management of the executive branch, and if the book has a major flaw it’s that thoroughness – he recounts so many conversations and trips in so much detail that the book drags, and I can’t believe this is only half of the intended volume.

A Promised Land takes us from Obama’s youth through the military operation that led to the killing of Osama bin Laden, so it’s more than a memoir of his time in the White House, or even in politics, and if you’re curious about the development of his character – or, as I was, how someone from a rather unlikely background rose so quickly from a state legislative position to the White House – that is the book’s true throughline. We learn far more about Barack Obama the person here than about, say, how certain decisions came to pass. That may seem a strange comment on a book of this length (and small font), but there’s a distinction between giving us every detail of a meeting, such as every word spoken or gesture made, and giving context and nuance to the scene. This book is a depiction rather than an explanation. So many of the compromises of Obama’s first term, large or small, are attributed to political expediency, often to the argument that it was “do this or the deal doesn’t get done.” Yes, that is how our unwieldy system of government works, but A Promised Land doesn’t connect enough of the dots here.

So much of the part of the book that covers his first two years in office is really a lengthy indictment of the existence of the United States Senate, which gives so much power to legislators who represent wildly unequal numbers of constituents. The camera needs to pan back and show the whole scene, so that Obama could, at least, argue that the system prevents those within it from enacting real, progressive change, even if a majority of Americans support it. The section on the fight over the Affordable Care Act, arguably the most important event in the book and one that gets substantial coverage, shows how the sausage is made but never really concludes that the process means the sausage is hazardous to your health.

There is some self-serving messaging here, some rationalization that, as President, he had no choice but to do this or that, to leave troops in Iraq or Afghanistan longer than he’d promised, to check which way the wind was blowing before supporting marriage equality, and so on. A lot of the text around his first year in office amounts to “we inherited a colossal mess,” and that’s probably true, and more instructive now than it was a year ago, as President Biden appears to have inherited an even bigger mess. But doesn’t every President who replaces a predecessor of the other party feel, on some level, that he inherited a mess? Even though the transition of power from President George W. Bush to President Obama was smooth, and Bush deserves some plaudits for how open and cordial he and his staff were to their successors, in the end, you’re restaffing a giant monolith that moves at the pace of a glacier and trying to make quick course corrections that might run to 180 degrees. Did you succeed in spite of those limitations, and if not, what did you learn that you might tell the next guy (well, the guy after the next guy)?

Obama is witty, and he’s a gifted storyteller – his prose isn’t quick, but it’s evocative of image and place, and he captures personalities well enough to distinguish the many people in his orbit. He’s just wordy – his prose is, in fact, too prolix – although I imagine his editors might have been reluctant to ask him to cut back, because, hey, he’s Barack Obama. If there’s an abridged version, as much as I’m loath to recommend those, it might be better for readers who just want to know what happened and how. As for the why, and what we can learn from it, perhaps that’ll come in the second book.

Next up: I just finished Gilbert King’s Devil in the Grove: Thurgood Marshall, the Groveland Boys, and the Dawn of a New America, winner of the 2013 Pulitzer Prize for Non-Fiction.

Imbibe!

David Wondrich’s Imbibe! had been on my wishlist for several years, as it was recommended by several folks I follow on Twitter (including, I think, the great follow @creativedrunk), and Wondrich later appeared on the podcast Hugh Acheson Stirs the Pot. I finally picked it up a month or two ago when it was on sale for the Kindle, and while it’s a different book than I expected, it’s a great read if you’re a fan of cocktails, especially vintage ones, and how they took over the American drinking scene at least twice in history.

The inspiration for Imbibe! is “Professor” Jerry Thomas, a very successful if peripatetic bartender in the mid-1800s who mixed drinks at swanky bars and dives on both coasts and wrote what is believed to be the first book on drinks ever published in the United States, Bar-Tender’s Guide. He claimed that he invented the Tom and Jerry, an eggnog-like cocktail, and certainly did a lot to popularize the Tom Collins in the United States. He’s a towering figure in cocktail history … but he’s not really enough to support a whole book.

The real meat of the book is the drinks, and the way Wondrich presents the stories around each one. Many of the classic cocktails we associate with the Roaring Twenties and the period before Prohibition have their origins in the late 19th century, as far back as the 1850s in some cases, a time of great experimentation with alcoholic spirits, which may simply have been a reaction to the inconsistent or low quality of the spirits available at the time. Thomas spent time tending bar in northern California during the Gold Rush, mixing what I presume was god-knows-what sold as whiskey or brandy or whatever, and thus encouraged the introduction of various mixers and flavorings, notably sugar and other sweetening syrups, as well as peculiar combinations of liquors that would have produced cocktails so strong that you didn’t notice the taste.

I’m using the term cocktails loosely here to describe any sort of mixed drink, but Wondrich adheres to the strict historical definitions of cocktail, punch, sling, and more. A punch has four or five main ingredients – sour, sweet, strong (the booze), weak, and perhaps spice. A cocktail is a punch with the addition of some sort of bitters, potable or nonpotable. A sling is a punch without the sour element, and usually has nutmeg as its spice. There are also sours (with lemon juice and sugar), collinses (a long sour, meaning it adds soda), juleps (with mint), smashes (with chunks of fruit), flips (with egg), and more. Wondrich walks through each of these categories with historical notes, pinpointing drink origins where possible and debunking the occasional myth.

Many of these drinks are best lost to history, with bizarre combinations of ingredients that result in drinks that sound like they’d have served no other purpose beyond getting the drinker as drunk as possible as quickly as possible. There are champagne cocktails that you’d never make with actual champagne, given the wine’s cost and how most people at least appreciate its flavor. Many drinks in the 1800s were topped with port, a fortified, often sweet wine that would have added color and alcohol but would have run through the flavor of the cocktail beneath like a rhinoceros on amphetamines. And all the eggs … there are some exceptions, to be sure, like a proper egg nog at the holidays, but I cannot see the appeal of mixed drinks with whole eggs in them, warm or cold.

Imbibe! is definitely not a book for every tippler, as it is, pun intended, rather dry in parts. Many of these drinks are antiquated, often lost to history, or only recently seeing a resurgence in interest because of the spread of artisan cocktail bars (which are, unfortunately, likely among the businesses most hurt by our government’s failed response to the pandemic). Some of the ingredients Wondrich identifies in original recipes are no longer available, or extremely difficult to find, and he has to recommend modern substitutes, which is fine but also raises the question of whether we’re simply better off consuming cocktails and punches designed with those modern ingredients in mind. I’ve read enough about distilled spirits, especially rum, that I came to this book with some background in the subject – and perhaps a more specific interest in the makeup of some of the drinks. If you enjoy a good collins or sling, or are interested in the way flavors may or may not combine to create something novel in a glass, Imbibe! is as impeccably researched as you’ll find.

Next up: I’m playing catchup here on reviews but right now I’m reading the short story collection Addis Ababa Noir, edited by Booker Prize nominee Maaza Mengiste.

Breath: The New Science of a Lost Art.

Over the summer, I linked to an interesting longread in The Guardian, an excerpt from a new book by James Nestor called Breath: The New Science of a Lost Art. The excerpt and the title both promised an evidence-based approach to the rather fundamental act of respiration, one that comes up in areas from pulmonary and cardiovascular health to allergies to meditation and mindfulness. It was a huge disappointment: Breath is a lot of woo and anecdote, with a little bit of science hidden in the endnotes. It imparts very little useful information on how to improve your breathing, or address any problems with it.

Nestor starts Breath by explaining an experiment he and a fellow “pulmonaut” underwent, in which they agreed to block their nasal passages so they’d be forced to breathe through their mouths for about three weeks, so they could see how much their health would deteriorate in the meantime. From there, he points out that humans are the only species with our wide range of dental problems, a product of evolution and our changing diet, speculates that this has led to a constricted airway (which creates the conditions for sleep apnea), and says most of us are just breathing the wrong way.

One major way in which we do it wrong is breathing through our mouths, which bypasses the nose’s air-filtering, humidifying, and warming mechanisms – evolved features that let us take less particulate matter into our lungs while receiving warmer, more humid air. Nasal breathing helps filter out some airborne pathogens, while the mouth has no such filtration. There’s even some evidence that breathing through the nose while exercising can improve performance, because “breathing through the nose releases nitric oxide, which is necessary to increase carbon dioxide (CO2) in the blood, which, in turn, is what releases oxygen.”

There’s at least some scientific evidence to back up the claims he presents in those parts of the book, and there’s copious evidence that sleep apnea is associated with serious health problems over the long term. As the book progresses, however, he veers farther and farther into pseudoscientific territory, discussing prana (in Hinduism, the life force coursing through all living things) as if it were a scientific fact, which it’s not. He mentions how he breathes through his right nostril to improve his digestion, a belief from yoga that appears to have zero scientific evidence to support it. He also appears to advocate some extreme breathing hacks, such as the Tibetan Buddhist method known as g-tummo meditation, that have little to no controlled research showing their efficacy or safety. There are even some internal contradictions here around hypoventilation and its effects, especially since there’s at least some literature showing a connection between hypoventilation and obesity.

I have some very mild breathing issues, mostly connected to sleeping, and thought I might get some useful tips from Breath to help with that, but all I really got out of the book was the advice to breathe more slowly, and remind myself to breathe through my nose when exercising. The former is something you’d get from any resource on mindful meditation, all of which start out with awareness-of-breath exercises. The latter is something I tried on Monday during a run … without success. It turns out that when it’s 40 degrees outside, breathing through your nose is not all that effective in delivering warm, moist air to your lungs, which is counterproductive when you’re trying to run at peak capacity. Apparently this is something you can build up to doing through practice, which I will continue to try to do over the next few weeks, but this isn’t advice for the larger audience.

There’s probably a decent book to write on this topic, but Breath isn’t it. With too much reliance on anecdote and the eventual devolution into woo, it’s not the kind of evidence-based argument I’d want to see for anything related to health or wellness.

Next up: I’ve got a few other books to review, but at the moment I’m reading Jude the Obscure.