A Brief History of Infinity.

Infinity is a big topic, to put it mildly. The mere concept of a limitless quantity has vexed mathematicians, philosophers, and theologians for over two millennia. The Greeks developed some of the first infinite series, some divergent (growing without bound) and some convergent (approaching a finite limit), and Zeno made use of these concepts in some of his famous paradoxes. Galileo is better known for his observations in astronomy and his work in optics, but he devised an early paradox that, he argued, meant we couldn’t compare the sizes of infinite sets in any meaningful way: although we know intuitively that there are more integers in total than there are integers that are perfect squares, you can pair the integers with the perfect squares in a one-to-one correspondence that appears to show the two sets are the same size. Georg Cantor later resolved this paradox in his development of set theory, coining the aleph terminology for the sizes of infinite sets, and then went mad trying to extend his theories of infinity, a math-induced insanity that later afflicted Kurt Gödel in his work on incompleteness. There remain numerous – dare I say infinite? – unsolved problems in mathematics that revolve around infinity itself or ask whether there are infinitely many of some entity, such as primes or perfect numbers, within the infinite set of whole numbers or integers.
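To see Galileo’s pairing concretely – a standard formalization, not notation from Clegg’s book – match each positive integer with its square:

\[
f(n) = n^{2}: \qquad 1 \leftrightarrow 1, \quad 2 \leftrightarrow 4, \quad 3 \leftrightarrow 9, \quad 4 \leftrightarrow 16, \quad \dots
\]

Every integer is paired with exactly one square and every square with exactly one integer, nothing left over on either side – which is precisely what it means for two sets to have the same cardinality, even though the squares form a proper subset of the integers.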

Science writer Brian Clegg attempts to make these topics accessible to the lay reader in his book A Brief History of Infinity, part of the Brief History series from the publisher Constable & Robinson. Rather than delving too far into the mathematics of the infinite, which would require more than passing introductions to set theory, transfinite numbers, and integral calculus, Clegg focuses on the history of infinity as a concept in math and philosophy: going back to the ancient Greeks, walking through western scholars’ troubles with infinity (and the Church’s objections to it), telling the well-known story of Newton and Leibniz’s fight over “the” calculus, and bringing the reader up through the works of Cantor, Gödel, and other modern mathematicians in illuminating the infinite both large and small. (It’s $6 for the Kindle and $5 for the paperback as I write this.)

Infinity can be inconvenient, but we couldn’t have modern calculus without it, and it comes up repeatedly in other fields, including fractal mathematics and quantum physics. Sometimes it’s the infinitely small – the “ghosts of departed quantities,” as George Berkeley mocked the infinitesimals that Newton and Leibniz required for integration – and sometimes it’s the infinitely large, but despite several millennia of attempts to argue infinity out of mathematics, there’s no avoiding its existence or the necessity of using it. Clegg excels when recounting the great controversies over infinity in the history of math, such as the fight between Newton and Leibniz over who invented the calculus, or the battle between Cantor and his former teacher Leopold Kronecker, who disdained not just infinity but even the transcendental numbers (like π, e, or the Hilbert number) and actively worked to prevent Cantor from publishing his seminal papers on set theory.

Clegg’s book likely won’t satisfy the more math-inclined readers who want a crunchier treatment of this topic, especially the recent history of infinity from Cantor forward. Cantor developed modern set theory and published numerous proofs about infinity, showing that there are at least two distinct sizes of infinity: the integers have cardinality aleph-null, but the real numbers are strictly more numerous, and their cardinality – the so-called cardinality of the continuum – equals aleph-one only if the continuum hypothesis holds. (The aleph notation measures the cardinality of infinite sets, not the quantity of infinity itself.) I also found Clegg’s discussion of Gödel’s incompleteness theorems rather … um … incomplete, which is understandable given the theorems’ abstract nature, but it also meant Gödel earned very little screen time in the book other than the overemphasized parallel between his descent into insanity and Cantor’s. I was disappointed that Clegg didn’t get into Russell’s paradox*, which is a critical link between Cantor’s work (and Hilbert’s hope for a resolution in favor of completeness) and Gödel’s finding that completeness was impossible.

* Let R be the set of all sets that are not members of themselves. If R is not a member of itself, then by definition it must be a member of R; but if R is a member of itself, then by definition it must not be. Either way, the definition of R produces a contradiction.
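In symbols, the paradox is startlingly compact (again, a standard formalization rather than anything quoted from Clegg):

\[
R = \{\, x \mid x \notin x \,\} \quad \Longrightarrow \quad R \in R \iff R \notin R.
\]

Both horns are contradictions, which is why naive set theory’s rule that any property whatsoever defines a set had to be abandoned.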

Clegg does a much better job than David Foster Wallace did in his own book on infinity, Everything and More: A Compact History of Infinity, which tried to get into the mathier stuff but ultimately failed to make the material accessible enough to the reader (and perhaps exposed the limits of Wallace’s knowledge of the topic, too). This is a book that just about anyone who has taken one calculus class can follow, and it has enough personal intrigue to hold the reader’s attention. My personal taste in history-of-science/math books leans towards the more technical or granular, but I wouldn’t use that as an indictment of Clegg’s approach here.

Next up: I’m reading another Nero Wolfe mystery, after which I’ll tackle Michael Ondaatje’s Booker Prize-winning novel The English Patient.

Thank You for Being Late.

Thomas Friedman’s Thank You For Being Late: An Optimist’s Guide to Thriving in the Age of Accelerations is a solid book about the fast-moving present and immediate future, written by a man whose prose is firmly, almost embarrassingly, stuck in the past. Friedman has obviously thought deeply about the topics in this collection of connected essays and talked to many experts, and there are many insights here that would be useful to almost anyone in, or soon to enter, the American workforce, as well as to the people attempting to manage and regulate this fast-moving economy. It was just hard to get through the clunky writing and the jokes that don’t even rise to dad level.

Friedman’s main thesis here is that the world is accelerating, and many people – I think his main audience is American, although it’s not limited to Americans – are unprepared for it. Technology has substantially increased the pace of change since the Industrial Revolution, and a century-plus of acceleration now has the developed world changing so quickly that it doesn’t even take a full human generation to churn through more than one generation of technology. These technologies also collapse borders, threaten the sovereignty of states, and increase economic inequality. Everyone reading this likely knows about the debate over automation and machine learning (please stop calling it AI; the two are not the same thing), but Friedman argues that policy makers at all levels need to accept it as a given and respond with policies that produce a populace better equipped to cope with it – and that people themselves must accept that continuous learning is likely to be part of their entire working lives.

Friedman refers to the cloud – a term I’m not 100% sure he even understands – as “the supernova,” a pointless and confusing substitution of a fabricated term for a commonly accepted one, and then refers back to it frequently throughout the book as the source of much of this technological change. He’s certainly correct that the power of distributed computing has allowed us to solve more problems than we ever could previously, no matter how many chips you could cram into one box; he also gives the sense that he thinks P = NP – that this accelerating growth in computing firepower will eventually solve problems that, in nonmathematical terms, probably can’t be solved in any reasonable time frame. And Moore’s law, which he cites often, has changed in the last few years: the doubling time for the number of transistors Intel et al. can put on a chip has stretched from 18-24 months to more like 30, and with Intel expecting to reach its 10 nm process node this year, we’re probably butting up against the limits of physics at the atomic scale.
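A quick back-of-the-envelope illustration of that last point – my arithmetic, not Friedman’s: if a brute-force algorithm needs 2^n steps on an input of size n, then doubling a machine’s speed lets it handle an input of size n+1, and no more, in the same wall-clock time, since

\[
\frac{2^{\,n+1}}{2} = 2^{\,n}.
\]

Each Moore’s-law doubling buys one extra unit of input on such a problem, which is why exponential-time problems stay out of reach no matter how much firepower the supernova accumulates.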

The strongest aspects of Thank You For Being Late are Friedman’s exhortations to readers to accept that the old idea of learning one job and then doing it for 40 years is probably dead. Most jobs, even those we might once have dismissed as blue-collar or low-skilled, now require greater knowledge of and comfort with technology. (There’s an effective CG commercial out now for University of Phoenix, in which a mom works in a factory where the workers are slowly replaced by machines until one day the supervisor comes for her. She eventually pursues some sort of IT degree through the for-profit school, and the commercial ends with her walking through stacks of servers.) He lauds companies like AT&T that have already set up programs for employees to take new courses, and that then make it easier for those employees to identify new jobs within the company for which they qualify – or could qualify with further learning. He also discusses municipal and NGO efforts to build job sites that connect workers both with learning opportunities and with employers seeking their skills.

There is, however, a bit of a Pollyanna vibe about Friedman, who refers to himself repeatedly as an optimist, and who seems to think that most people in the American working class have the time to take classes after hours – or the background to go get, say, a certificate in data science. I looked up some of the programs he mentions in the book; the one related to data science expected students to come in with significant knowledge of programming or scripting languages. He supports government efforts to promote lifelong learning and to improve diversity in the workplace and in our communities, but doesn’t even acknowledge a potential government role in ensuring equal access to health care (essential to a functioning economy), or the mere idea of universal basic income, even if just to explain why he thinks it wouldn’t work.

And then there’s Friedman’s overuse of hackneyed quips that felt dated twenty years ago. “Attention K-Mart shoppers!” didn’t resonate with me in the 1980s, since there wasn’t a K-Mart anywhere near where I grew up; the chain has since been obliterated by competition from Wal-Mart and Target, and K-Mart operates 75% fewer stores today than it did at its peak, fewer than 500 nationwide. “This isn’t your grandpa’s X” is just lazy writing at this point; besides, if my daughter read that, she’d likely point out that her grandpa is a retired electrical engineer with two master’s degrees who already did a lot of the lifelong learning that Friedman describes.

Friedman’s writing is also dense, which I find surprising given his background as a newspaper columnist; perhaps he feels he’s finally free to prattle on as long as he wants, with no one to stop him. He lavishes detail on some parts of the story, such as his overlong descriptions of the halcyon days of the Minnesota town where he grew up – a town I’m sure was very nice, but probably not quite the Mayberry he describes.

There’s value in here, certainly, but I found it a grind to get through. This could easily have been a series of a dozen or so columns in the New York Times – which they wouldn’t run today, because they’re too busy running columns denying climate change or explaining how so-called ‘incels’ need sex robots – rather than a 500-page book. He’s right about his core premise, though: Expect to learn throughout your working life, and expect your job, whatever it is, to change regularly over the course of your career.

Next up: Roddy Doyle’s Man Booker Prize-winning novel Paddy Clarke Ha Ha Ha.

Locking Up Our Own.

James Forman, Jr., was a public defender in DC for six years, right after he clerked for Sandra Day O’Connor, and there he encountered the results of two decades of disastrous policies in the criminal justice system of the nation’s capital, many of which led to differential policing and the mass incarceration of the city’s black residents. He discusses the history and causes of this system in his 2017 book Locking Up Our Own: Crime and Punishment in Black America, which lays much of the blame for the high incarceration rates on policies embraced and advocated by black community leaders themselves. The book won the 2018 Pulitzer Prize for Non-Fiction this past April.

Forman’s parents met while working for the Student Nonviolent Coordinating Committee (known colloquially as “snick”) during the civil rights movement, a legacy he says spurred his decision to step off the prestige career track and into the public defender’s office, from which he eventually moved into teaching at Georgetown’s and now Yale’s law schools. Where Ava DuVernay’s 2016 documentary 13th laid all of the blame for the high rates of black incarceration in the United States on two-plus centuries of racism and white domination – a view that is largely justified – Forman’s book lays bare the role that leaders in black communities played in supporting those same policies. Foremost among them: fighting early progressive efforts to decriminalize possession and personal use of small amounts of marijuana.

Washington DC didn’t achieve some semblance of home rule until 1973, and Congress still holds the power to overturn some laws passed by the DC council; it could even, in theory, dismiss the city’s council at will. This gives the city’s residents a status not much greater than that of territories like Puerto Rico or the U.S. Virgin Islands, although I suppose if two hurricanes knocked out power to DC for several months, the federal government would be a little quicker to address the problem. DC’s population is nearly half African-American, and the high rates of incarceration and the differential policing strategies in its neighborhoods with larger black populations have had a severe effect on the city’s economy, including continuing high crime rates. Forman explains how DC got into this mess, going back to the end of the civil rights movement: it was actually a white progressive council member who tried to decriminalize marijuana possession, but he found himself opposed by black church leaders, Nation of Islam leaders, and even some black city council members, all of whom ended up working together to scotch the proposal (which may not have passed muster with Congress anyway). When a proposal arose a few years later to impose mandatory minimum sentences to fight rising crime rates in DC – themselves at least in part the result of the crack cocaine epidemic – black community leaders were all for the new law, responding to residents’ concerns about violent street crime and home invasions, but also enforcing a longstanding moral viewpoint that African-Americans could defeat stereotypes about them by, in essence, behaving better. If DC cracked down on even trivial crimes, even misdemeanors, the theory went, it would improve the quality of life for all DC residents while also undercutting white politicians and community leaders who worked to disenfranchise and/or limit the economic mobility of people of color.

None of this worked, as Forman writes; instead it helped fuel a new DC underclass – as similar policies did in other cities, including Detroit, the US city with the highest proportion of African-American residents – of black residents, mostly men, who were now de facto unemployable because they had criminal records. Such ex-convicts could also find themselves ineligible for certain government assistance programs, turned down for housing, and even unable to vote. Forman, as a public defender, worked with many such clients, but, in his own telling, he was swimming upstream against a system that simultaneously limited the advancement of African-Americans in its police force and judiciary and aggressively pursued policies that further hindered the black community. He touches on the greater arrest rates in DC’s black wards versus its white ones, the long-term harm of “stop and frisk” policies (formally known as Terry stops, and of dubious constitutionality, especially when opponents can show disparate impact by the race of police targets), and the formal and informal obstacles that community-improvement efforts can face from municipal police forces – even when the officers and administrators are themselves African-American.

Locking Up Our Own is a sobering look at how we got here, but perhaps short on prescriptions for undoing forty years of damage. Marijuana decriminalization is finally happening, although it’s driven by white stoners and libertarians rather than black citizens and provides no procedure for vacating past convictions for trivial possession cases. Stop and frisk was ruled unconstitutional in NYC in 2013, but our current President and Attorney General have both explicitly endorsed the practice. Mandatory minimums remain popular, in large part because they serve “tough on crime” candidates well – and who would dare to stand up and say that criminals deserve shorter sentences? A path to greater African-American enfranchisement and sovereignty in majority black neighborhoods would likely be impossible in any system where higher level, white-dominated government bodies can invalidate city or state policies. Any change that starts at the bottom will fail without a change at the top.

Next up: Claude M. Steele’s Whistling Vivaldi: How Stereotypes Affect Us and What We Can Do.

Why We Sleep.

Why do we sleep? If sleep doesn’t serve some essential function, then it is evolution’s biggest mistake, according to one evolutionary scientist quoted in Matthew Walker’s book Why We Sleep: Unlocking the Power of Sleep and Dreams, which explains what sleep seems to do for us, what sleep deprivation does to us, and why we should all be getting more sleep and encouraging our kids and our employees to do the same.

Walker, a sleep researcher and professor of neuroscience and psychology at Cal-Berkeley, begins by delving into what we know about the history of sleep in humans and how sleep itself is structured. Humans were, for most of our history as a species, biphasic sleepers – we slept twice in each 24-hour period. We retain vestiges of this practice, which ended in the developed world only in the 19th century with the Industrial Revolution, in our circadian rhythms, which still give us that post-prandial ‘slump’ that led to customs like the siesta. (It had never occurred to me that the word “circadian” itself comes from the Latin for “about a day,” because that rhythm in our bodies isn’t quite 24 hours long.)

Sleep itself comprises two different processes that occur in sequence, alternating through a night of full sleep. Most people are familiar with REM sleep, named for the rapid eye movements visible to an observer standing not at all creepily over you while you slumber. The remaining periods of sleep are, creatively, called non-REM (NREM) sleep, and themselves comprise three sub-stages. Both phases of sleep are important. REM sleep is when dreaming occurs, which seems to help the brain process various events and the associated emotions from the previous day(s), and also lets the brain form connections between seemingly unrelated memories or facts – connections that can feel like bursts of creativity the next day. Your body becomes mostly paralyzed during REM sleep; otherwise you’d move around while you dream, perhaps kicking, flailing, or even acting out events in your dreams – which can happen in people with certain rare sleep disorders. NREM sleep allows the body to repair itself, helps cement new information into the brain’s long-term storage, boosts the immune system, and contributes to feelings of wakefulness the next day. The portion of NREM sleep that accomplishes the most, called deep or N3 sleep, decreases as you age, which is why older people may find it hard to sleep longer during the night and then feel less refreshed the next morning.

The bulk of Why We Sleep, however, is a giant warning call to the world about the hazards of short- and long-term sleep deprivation, which Walker never clearly defines but seems to think of as sleeping for a period of less than six hours. (He calls bullshit on people, like our current President and I believe his predecessor too, who claim they can function well on just four or five hours of sleep a night.) Sleep deprivation affects cognition and memory, and long-term deprivation contributes to cancer, diabetes, mental illnesses, Alzheimer’s, and more. Rats deprived of sleep for several days eventually die of infections from bacteria that would normally live harmlessly in the rats’ intestinal tracts.

We don’t sleep enough anymore as a society, and there are real costs to this. Drowsy driving kills more people annually than drunk driving, and if you think you’ve never driven drowsy, you’re probably wrong: people suffering from insufficient sleep can fall into “micro-sleeps” long enough to cause a fatal accident if one hits while you’re at the wheel. Sleep deprivation in adolescents seems to increase the risk of the various mental illnesses that tend to first manifest at that age, while also contributing to behavioral problems and reducing the brain’s ability to retain new information. Walker even ends the book with arguments that corporations should encourage better sleep hygiene as a productivity tool and a way to reduce health care costs, and that high schools should push their start times later to accommodate teenagers, whose circadian rhythms run somewhat later than those of preteens or adults.

One major culprit in our national sleep deficit – which, by the way, isn’t a debt you can repay; you can’t ‘catch up’ on lost sleep – is artificial light, especially blue light, which is especially prevalent in LED light sources like the one in the iPad on which I’m typing and the phone on which you’re probably reading this post. Blue light sources are everywhere, including the LED bulbs the environmentally responsible among us now use to replace inefficient incandescent bulbs or mercury-laden CFLs. Blue light confuses the body’s natural melatonin cycle, which is distinct from the circadian rhythm, and delays the normal evening release of melatonin, further delaying the onset of sleep.

Sleep confers enormous benefits on those who choose to get enough of it, benefits that, if more people knew and understood them, might encourage better sleep hygiene among those who at least have the discretion to sleep more. Sleep helps cement new information in your memory; if you learn new information, such as vocabulary in a foreign language, and then nap, you’re significantly more likely to retain what you learned. Sleep also gives the body time to repair some types of cell damage and to recover from muscle fatigue – so, yes, ballplayers who get more sleep might be less prone to fatigue-related injuries, although sleep can’t repair a frayed labrum or a torn UCL.

Walker says he gives himself a non-negotiable eight-hour sleep window every night. I am not sure how he can reconcile that with, say, his trans-Atlantic travel, but he does point out that changing time zones can wreak havoc on our sleep cycles. He suggests avoiding alcohol or caffeine within eight hours of bedtime — so, yes, he even says if you want that pint of beer, have it with breakfast — and offers numerous suggestions for preparing the body for sleep as you approach bedtime, including turning off LED light sources, using blue light filters on devices if you just can’t put them down, and even using blackout shades for total darkness into the morning.

There are some chapters in the middle of Why We Sleep that would stand well on their own, even if they’re not as relevant to most readers as the rest. The chapter on sleep disorders, including narcolepsy and fatal familial insomnia (about as awful a way to die as I can imagine), is fascinating in its own right. Walker also delivers a damning rant on sleeping pills, which produce unconsciousness but not actual sleep, and thus don’t help the body perform sleep’s essential functions. He does say melatonin may help some people, although I think he believes its placebo effect is more reliable than its pharmacological one, and he questions whether over-the-counter melatonin supplements deliver as much of the hormone as they claim.

Why We Sleep was both illuminating and life-altering in the most literal sense: since reading it, I’ve set Night Shift mode on my devices, set alarms to remind me to get to bed eight hours before the morning alarm, stopped trying to make myself warmer at night (cooling prepares the body for sleep; Walker recommends a bedroom around 65 degrees), and so on. I had already been in the habit of pulling over to nap if I became drowsy on a long drive, but now I build more time into drives to accommodate that, and to give myself more time to wake up afterwards – Walker suggests 20 minutes are required to regain full cognitive function after even a brief nap. Hearing the health benefits of sleeping more and the risks of insufficient sleep, including higher rates of heart disease, cancer, and Alzheimer’s, was more than enough to scare me straight.

Next up: I’m halfway through Brian Clegg’s A Brief History of Infinity: The Quest to Think the Unthinkable.

Not Dead Yet.

I came of age as a music fan right around 1980, thanks in part to some of those old K-Tel pop hits collections (on vinyl!) that my parents bought me as gifts, one of which included Genesis’ hit “Abacab.” I loved the song right away, despite having no idea what it was about (still don’t), and it made me a quick fan of Genesis, and, by extension, Phil Collins’ solo material, which at that point already included “In the Air Tonight.” I’d say I continued as a fan of both until the early 1990s, when Genesis released their self-immolating We Can’t Dance (an atrocious, boring pop record) and Collins’ own solo work became similarly formulaic and dull. It was only well after the fact that I heard any of the first phase of Genesis, where Peter Gabriel was still in the band and their music was progressive art rock that featured adventurous writing and technical proficiency.

Collins’ memoir, Not Dead Yet, details the history of the band through his eyes, along with a look at his solo career and his tangled personal life, some of which made tabloid headlines, leading up to his inadvertent effort to drink himself to death just a few years ago. The book seems open about many aspects of Collins’ life, including his mistreatment of his three wives and his children (mostly by choosing work over his familial duties) and his refusal to accept that he had a substance-abuse problem, but there’s also a strain of self-justification running through much of it that I found off-putting.

In a narrative sense, the book’s high point comes too close to the beginning: when Collins was just starting out in the English music scene, his path crossed those of numerous musicians who’d later become superstars, some of whom would become his friends and/or writing partners later in life, including Eric Clapton, Robert Plant, and George Harrison. The Sing Street-ish feel of those chapters is so charming that I wondered how much was really accurate, but Collins does at least depict himself as a starstruck kid encountering some of his heroes while still learning his craft as a drummer. I also didn’t know that Collins was a child actor, even taking a few significant stage roles in London, before his voice broke and he switched to music as a full-time vocation.

The Genesis chapters feel a little Behind the Music, but they’re fairly cordial overall – Collins doesn’t dish on his ex-bandmates, and if anything seems at pains to depict Gabriel as a good bandmate and friend whose vision simply grew beyond what the band was willing or able to achieve. It’s the material on Collins’ personal life that really starts to grate: he talks about being a terrible husband and father, but there’s enough equivocation in his writing (often quite erudite, even though he didn’t finish high school) to suggest he isn’t taking full responsibility for his actions. He cheated on two wives, he ignored their wishes that he devote more time to his family, and he seems to have harassed the woman half his age (he was 44, she 22) who became his third wife and the mother of the last two of his five kids.

It’s also hard to reconcile Collins’ comments on his own songwriting, both on solo records and in later work for Disney films and Broadway shows, with the inferior quality of most of his lyrics. Collins’ strengths were his voice, his sense of melody, and of course his drumming. His lyrics often left a lot to be desired, and their quality, never high, only declined as he became more popular. Even his last #1 song in the U.S., “Another Day in Paradise,” is a mawkish take on the subject covered more sensitively in “The Way It Is” and a dozen other songs about visible poverty in a developed, wealthy economy.

Since that’s all I have to say on the book, I’ll tell one random Collins-related story. When I was in high school, MTV briefly had an afternoon show called the Heavy Metal Half-Hour, which they later retitled the Hard 30. It was hair metal, so not really very heavy by an objective standard, but harder rock than what they played the rest of the time. One day during the Hard 30 run, they played … Phil Collins’ cover of “You Can’t Hurry Love.” I’m convinced this wasn’t an accident, but a test to see if anyone was watching. The show was cancelled a few weeks later.

Next up: I’m about halfway through Peter Carey’s Booker Prize-winning novel Oscar and Lucinda, later turned into a movie with a very young Voldemort and Queen Elizabeth.

Killers of the Flower Moon.

David Grann’s Killers of the Flower Moon: The Osage Murders and the Birth of the FBI is a non-fiction ‘novel’ that manages to combine a real-world mystery with noir and organized crime elements while also elucidating historical racism against a population seldom considered in modern reevaluations of our own history of oppressing minorities. Drawing on what appears to be a wealth of notes from the initial investigation as well as private correspondence, Grann gives the reader a murder story with a proper resolution, but enough loose ends to set up a final section to the book where he continues exploring unsolved crimes, revealing even further how little the government did to protect the Osage against pitiless enemies. It’s among the leading candidates to win the Pulitzer Prize for Non-Fiction on Monday.

The Osage were one of the Native American tribes banished to present-day Oklahoma when that area was known as “Indian Territory,” marked as such on many maps of the late 19th century; Oklahoma as we know it didn’t exist until 1907, when it became the 46th state. (It always amused me to think of the ‘hole’ in the map of the U.S. as late as 1906, before Oklahoma, Arizona, and New Mexico attained statehood.) By a fortunate accident, the tract of apparently useless land to which the federal government exiled the Osage sat on top of one of the largest petroleum deposits in the continental U.S., which made the Osage mineral millionaires. The government couldn’t quite revoke their rights, but it ruled that the Osage, as ‘savages,’ were incompetent to run their own affairs, and that Osage adults required white ‘guardians’ to oversee their financial decisions. This, of course, led to much thievery and embezzlement and, in time, foul play, such as white citizens marrying Osage members and then poisoning their spouses to gain legal control of their headrights and the income they provided.

Two murders in particular attracted the attention of authorities outside the county, however, as both Osage victims were shot in the head at close range, so there was no question of claiming natural causes, as there often was when victims were poisoned (often via whiskey, so alcohol could be blamed). These murders were part of a spate of dozens of killings, many of which didn’t at first appear to be connected beyond the fact that the victims were either Osage themselves or were in some way investigating the crimes; the sheer scope of the violence, plus some media coverage, drew the attention of a young, ambitious bureaucrat named J. Edgar Hoover, who decided to put one of the top agents at the nascent Bureau of Investigation (no ‘federal’ in its title yet) on the case. The subsequent unraveling of the deceptions – and the revelation that the mastermind of the plot was someone closer to the Osage than anyone expected – involved both early forensic science and dogged investigative work, leading eventually to one confession that toppled the criminal enterprise, only to have the trial twist and turn more than once before the final verdict.

Grann couldn’t have picked a better subject for the book, because these characters often seem plucked from Twin Peaks, from the Osage woman Mollie, a survivor of a poisoning attempt whose sister was one of the victims killed by gunshot and who had several other family members die in suspicious circumstances, on up to the head of the scheme, a man whose greed and malice lay hidden behind a façade of benevolence toward his Osage neighbors. Killers of the Flower Moon would make an excellent dramatic film if told straight, but it would take just a little artistic license to turn it into the sort of crime tapestry in which HBO has excelled for years, by sharpening or exaggerating some of the individuals’ personalities.

The story of the murders and the federal agents’ work to convict the killers is, in itself, more than enough to stand alone as a compelling narrative work, but Grann explains how the federal, state, and county authorities regularly worked to strip the Osage of their rights, fueled by outright racism and by jealousy of the tribe’s good fortune (with, it appears, no consideration of how racism and avarice drove the tribe to Oklahoma in the first place). After the verdict and what might normally stand as an epilogue, Grann himself appears, writing in the first person about his experiences researching the book and how he found evidence that the Bureau didn’t solve all of the murders, or even most of them, but assumed that they’d gotten the Big Foozle and had thus closed the case. Grann may have solved one more murder himself, but as he interviews more surviving relatives of the victims – many of whom ask him to find out who killed their fathers or uncles or sisters – it becomes clear that the majority of these killings will remain unsolved, a sort of ultimate insult on top of the lifetime of indignities to which these Osage victims were subjected.

It’s hard to escape the conclusion, although Grann never makes it explicit, that none of this would have happened if any of the governing (white) authorities had viewed the Osage tribe members as actual people. Dozens of killings went unsolved and unaddressed for several years before Hoover’s men arrived, and some unknown but large percentage of them will never be solved. What white officials didn’t do for the Osage in the 1920s continues in what mostly (but not always) white officials don’t do today to address violence in urban, mostly African-American communities, including right near me in the majority-black city of Wilmington, nicknamed “Murder Town” for its disproportionately high rate of gun deaths. If the governments responsible for the safety of these citizens don’t see those citizens’ deaths as important, or as equal to the deaths of white citizens, then it is unlikely that anything of substance will be done to stop the violence.

I listened to the audio version of Grann’s book, which has three narrators, one of whom, actor Will Patton, does an unbelievable job of bringing the various characters, especially the conspirators, to life. The other narrators were fine, but Patton’s voice and intonations made this one of the most memorable audiobooks I’ve listened to.

Next up: I just finished George Saunders’ Lincoln in the Bardo, which won the Man Booker Prize in 2017 and is among the favorites to win the Pulitzer Prize for Fiction next week; and have begun Joan Silber’s Improvement, also from 2017.

The Origins of Totalitarianism.

I spent my first year in college as a Government major, with some vague idea of studying law and/or working in politics after graduation, but I abandoned the major completely by the middle of my sophomore year because the reading absolutely killed me. I like to read – I would hope that’s evident to regulars here – but the kind of writing we were assigned in those classes was just dreadful. One book by Samuel Huntington (The Clash of Civilizations and the Remaking of World Order) ended any interest I might have had in the subject because it was such an arduous, opaque read, and I eventually switched to a joint sociology/economics major, which put me back in my comfort zone, a blend of math and theory.

Hannah Arendt’s The Origins of Totalitarianism reminded me tremendously of Huntington, John Stuart Mill, and the other authors I was assigned in Gov 1040 but never actually finished, both in prose style and in tone. I understand that this book is considered extremely influential, an important work in our comprehension of how movements like the Nazi Party arise and even gain a modicum of popular support. The arguments herein, however, are almost exclusively assertions, supported by anecdotal evidence or no evidence at all, and the circumlocutory writing style meant that even though I usually retain a lot of what I read, here I wasn’t retaining what I read from one page to the next.

Arendt’s main thrust here is that totalitarian governments, which she distinguishes from mere autocracies, arise when their leaders follow a rough playbook: set up specific groups as enemies of the state, rally disaffected followers against those groups, and often turn their supporters into unwitting advocates of their own eventual oppression. Such governments then retain power by eliminating the possibility of what Arendt calls human spontaneity, through an Orwellian system of truth-denial and unpredictable favoritism that puts subjects on ever-shifting ground, preventing them from mounting any effective dissent or resistance.

At least, I think that’s what she was arguing, but she used a lot of extraneous words to get there – and some of what she describes in the early going, where she addresses the history of the so-called “Jewish question,” sounds a lot like victim-blaming. She certainly says the Jews of Europe did not adequately understand how they were being used by European elites, or how their connections to unpopular rulers like the Hapsburgs put them in the crosshairs of populist movements aimed at overthrowing the monarchical or despotic status quo. She also seems to credit those movements’ willingness to employ efficient methods of killing for its surprise value – no one expected anything like the Nazis’ system of mass murder, itself based on a process of dehumanizing entire classes of the population.

I may not have fully grasped the arguments Arendt makes in this book – I freely acknowledge that – but much of what she does assert seems apposite to our present-day political situation, including the way in which Trump supporters, including his sycophants in the media, have repeatedly handwaved away his distortions of fact or his apparent collusion with a hostile foreign power. I’ll close, therefore, with this selection of quotes from The Origins of Totalitarianism that could just as easily have been written today about our current environment.

In the United States, social antisemitism may one day become the very dangerous nucleus for a political movement.

Politically speaking, tribal nationalism always insists that its own people is surrounded by “a world of enemies,” “one against all,” that a fundamental difference exists between this people and all others. It claims its people to be unique, individual, incompatible with all others, and denies theoretically the very possibility of a common mankind long before it is used to destroy the humanity of man.

The rank and file is not disturbed in the least when it becomes obvious that their policy serves foreign-policy interests of another and even hostile power.

[The Nazis] impressed the population as being very different from the “idle talkers” of other parties.

The mob really believed that truth was whatever respectable society had hypocritically passed over, or covered up with corruption.

Hitler circulated millions of copies of his book in which he stated that to be successful, a lie must be enormous.

The ideal subject of totalitarian rule is not the convinced Nazi or the convinced Communist, but people for whom the distinction between fact and fiction (i.e., the reality of experience) and the distinction between true and false (i.e., the standards of thought) no longer exist.

The Beak of the Finch.

Winner of the 1995 Pulitzer Prize for Non-Fiction, Jonathan Weiner’s The Beak of the Finch: A Story of Evolution in Our Time should have ended most of the inane arguments still coming from creationists and other science deniers about the accuracy of the theory of evolution. Weiner tells the story of the Grants, a married couple, both biologists, who spent 20 years studying Galapagos finches – the same birds Darwin collected on his voyage aboard the Beagle, which helped him develop his theory of adaptation via natural selection – and who observed natural selection and evolution in action. This remarkable study, which also showed how species evolve in response to changes in their environment and to other species in their ecosystems, was a landmark effort both to verify Darwin’s original claims and to strengthen them in a way that, again, should have put an end to the utter stupidity that still infects so much of our society, even creeping into public science education in the South and Midwest.

The finches are actually a set of species spread across the islands of the Galapagos, with the Grants studying those on Daphne Major, an uninhabited island in the archipelago where multiple species of finch coexist because they occupy different ecological niches. Over the two decades they studied these species, massive changes in weather patterns (driven in part by El Niño and La Niña) brought years of total drought and years of historically high rainfall, and the various species on the island responded to these fluctuations in ways that affected both population size and physical characteristics. The beak of the book’s title refers to the Grants’ focus on beak dimensions, which showed that the finches’ beaks changed in response to those environmental shifts. In times of drought, for example, certain seeds that specific finch species relied on for sustenance might become scarcer, and within a few generations (or even one) the population would shift toward birds with longer or stronger beaks that gave them access to new supplies of food. Many Galapagos finches crack open seed cases to get to the edible portions within, so when those seeds are rarer in a given year, the birds with stronger beaks can crack open more cases and reach more food, giving them a tangible advantage in the rather ruthless world of natural selection.

Weiner focuses on the Grants’ project and discoveries throughout the book, but intersperses it with other anecdotes and with notes from Darwin’s travels and his two major works on the subject, On the Origin of Species and The Descent of Man. He incorporates the discovery of DNA and how it has accelerated our ability to study and understand evolutionary changes. He goes into the famous example of the peppered moth of England, which found itself at a severe disadvantage in the polluted world of the early Industrial Revolution, and how a single gene determining wing color led to a shift in the moth’s population from mostly white to mostly black (to match the soot covering trees near Manchester and London) – and back again after England finally took steps to clean up its air. This example is especially instructive for our ongoing experience of climate change, which Weiner refers to throughout as global warming (the preferred term at the time), and it opens up a discussion of “artificial selection”: how we’re screwing up the global ecosystem, the rise of antibiotic resistance, and the futility of pesticide-driven agriculture (with the targeted pests rapidly evolving resistance to each new chemical we dump on our crops).

Although Weiner doesn’t stake out a clear position on theism, the tone of the book, especially the final third, goes beyond mere anti-creationism into an outright rejection of any supernatural role in the processes of natural selection and evolution. While that may be appropriate for most of the book, as such processes as the development of the human eye (the argument about the hypothetical watchmaker) can be explained through Darwinian evolution, Weiner does overstep when he discusses the rise of human consciousness, handwaving it away as perhaps just a simple change in neurons or a single genetic mutation that led to the very thing that makes us us. (Which isn’t to say we’re that different from chimpanzees, with whom we still share 99% of our genes. Perhaps David Brin was on to something with his “neo-chimps” in the Uplift series after all.)

The most common rejoinder I encounter online when I mention that evolution is real is that we can’t actually see evolution and therefore it’s “only a theory.” The latter misunderstands the scientific definition of theory, and the former is just not true: we do see evolution, we have seen it, and we’ve seen dramatic shifts in species’ characteristics in ordinary time. Some speciation may occur on geological timescales, but the evolution of new strains of unicellular organisms can happen in days (again, if you don’t believe in evolution, keep taking penicillin for that staph infection), and natural selection in vertebrates can happen rapidly enough for us to watch. If The Beak of the Finch were required reading in every high school biology class, perhaps we’d have fewer people – the book cites a survey from the 1990s claiming half of Americans don’t accept evolution – still denying science here in 2018.

Next up: David Grann’s Killers of the Flower Moon: The Osage Murders and the Birth of the FBI, among the favorites to win the Pulitzer for Non-Fiction this year.

Nudge.

Richard Thaler won the 2017 Nobel Prize in Economics – or whatever the longer title is; it’s the one Nobel people don’t seem to take all that seriously – for his work in the burgeoning field of behavioral economics, especially on what is now called “choice architecture.” Thaler’s work focuses on how the decisions we make are affected by the way in which choices are presented to us. I mentioned one of his findings in my most recent Stick to Baseball roundup: the candidate listed first on a ballot receives an average boost of 3.5% in the voting, with the benefit higher in races where all candidates are equally unknown (e.g., where there’s no incumbent). You would probably like to think that voters are more rational than that, or at least not quite that irrational, but the data are clear that the order in which names appear on ballots affects the outcomes. (It came up in that post because Iowa Republicans are trying to rig election outcomes in that state, with one possible move to list Republican candidates first on nearly every ballot in the state.)

Thaler’s first big book, Nudge: Improving Decisions About Health, Wealth, and Happiness, co-authored with Harvard Law School professor Cass Sunstein, came out in 2008; it explains the effects of choice architecture while offering numerous policy prescriptions for real-world problems where giving consumers or voters different choices, or the same choices in a different order, or even just flipped wording on certain questions could dramatically alter outcomes. Thaler describes this approach as “libertarian paternalism”: the goal is not to mandate or restrict choices, but to use subtle ‘nudges’ to push consumers toward decisions that are better for them and for society as a whole. (The audiobook is just $4.49 as I write this.)

This approach probably mirrors my own beliefs on how governments should craft economic policies, although it doesn’t appear to be in favor with either major party right now. For example, trans fats are pretty clearly bad for your health, and if Americans consume too many trans fats, national expenditures on health care will likely rise as more Americans succumb to heart disease and possibly cancer as well. However, banning trans fats, as New York City has done, is paternalism without liberty – these jurisdictions have decided for consumers that they can’t be trusted to consume only small, safer amounts of trans fats. You can certainly have tiny amounts of trans fats without significantly altering your risk of heart disease, and you may decide for yourself that the small increase in health risk is justified by the improved flavor or texture of products containing trans fats. (For example, pie crusts made with traditional shortening have a better texture than those made with new, trans fat-free shortening. And don’t get me started on Oreos.) That’s your choice to make, even if it potentially harms your health in the long run.

Choice architecture theory says that you can deter people from consuming trans fats, or reduce such consumption, through how you present information to consumers at the point of purchase. Merely putting trans fat content on nutrition labels is one step – if consumers see it broken out as a separate line item, they may be less likely to purchase the product. Warning labels noting that trans fats are bad for your heart might also help. Some consumers will consume trans fats anyway, but that is their choice as free citizens; the policy goal is to reduce the public expenditure on health care related to such consumption without infringing on individual choice. There are many such debates in the food policy world, especially around importing food products from outside the U.S. – the FDA has been trying for years to ban or curtail imports of certain raw-milk cheeses because of the low risk that they’ll carry dangerous pathogens, even though the fermentation process discourages the growth of such bugs. (I’m not talking about raw milk itself, which has a different risk profile and has made a lot of people sick as it’s come back into vogue.) I’ve also run into trouble trying to get products imported from Italy, like bottarga and neonata, which are completely safe but for whatever reason run afoul of U.S. laws on bringing animal products into the country.

Thaler and Sunstein fry bigger fish than neonata in Nudge, examining how choice architecture might improve employees’ participation in and choices within their retirement accounts, increase participation in organ donation programs, or increase energy conservation. (The last one is almost funny: tell people their neighbors are better at conserving energy, and they’ll reduce their own energy use. South Africa has been using this and similar techniques to try to reduce water consumption in drought-stricken Cape Town, although publicizing “Day Zero” has also hurt the city’s tourism industry.) Thaler distinguishes between Econs, the theoretical, entirely rational actors of traditional economic theory, and Humans, the very real, often irrational people who live in this universe and make inefficient or even dumb choices all the time.

Nudge is enlightening, but unlike most books in this niche, like Thinking, Fast and Slow or The Invisible Gorilla, it probably won’t help you make better choices in your own life. You can become more aware of choice architecture, and maybe you’ll overrule your status quo bias, or look at the top or bottom shelves in the supermarket instead of what’s at eye level (hint: the retailer charges producers more to place their products at eye level), but the people Nudge is most likely to help seem like the ones least likely to read it: elected and appointed officials. I’ve mentioned many times how disgusted I was with Arizona’s lack of any kind of energy or water conservation policy. The state has more sun than almost anywhere in the country but has done little to nothing to encourage solar uptake, although its utility commission may have finally forced some change on the renewable energy front this week. Las Vegas actually pays residents to remove grass lawns and replace them with low-water landscaping; Arizona does nothing of the sort, and charges far too little for water given its scarcity and dwindling supply. Improving choice architecture in that state could improve its environmental policies quickly without infringing on Arizonans’ right to leave the lights on all night.

Speaking of Thinking, Fast and Slow, its author, Daniel Kahneman, was a guest last week on NPR’s Hidden Brain podcast, and it was both entertaining and illuminating.

Next up: Hannah Arendt’s The Origins of Totalitarianism. No reason.

The Hidden Brain.

I’ve become a huge fan of the NPR podcast Hidden Brain, hosted by Shankar Vedantam, a journalist whose 2010 book The Hidden Brain: How Our Unconscious Minds Elect Presidents, Control Markets, Wage Wars, and Save Our Lives spawned the podcast and a regular radio program on NPR. Covering how our subconscious minds influence our decisions in ways that traditional economists would call ‘irrational’ but that modern behavioral economists recognize as typical human behavior, Vedantam’s book is a great introduction to this increasingly important way of understanding how people act and think.

Vedantam walks the reader through these theories via concrete examples, much as he now does on the podcast – this week’s episode, “Why Now?,” about the #MeToo movement and our society’s sudden decision to pay attention to these women, is among its best. Some of the stories in the book are shocking and/or hard to believe, but they’re true, and they serve to emphasize these seemingly counterintuitive concepts. He discusses a rape victim who had focused on remembering details about her attacker and was 100% sure she’d correctly identified the man who raped her – but thirteen years after the man she identified was convicted of the crime, a DNA test showed she was wrong, and she then discovered a specific detail she’d overlooked at the time of the investigation because no one had asked her the ‘right’ question. This was a conscientious, intelligent woman who was certain of her memories, and she still made a mistake.

Another example that particularly stuck with me concerns how people react in the face of imminent danger or catastrophe. Just before the 2004 Indian Ocean tsunami struck, the sea receded from coastal areas, a typical precursor to a tsunami’s arrival. Vedantam cites reports from multiple areas where people living in those regions “gathered to discuss the phenomenon” and “asked one another what was happening,” instead of running like hell for high ground. Similar reports came from the World Trade Center on 9/11. People in those instances didn’t rely on their instincts to flee, but sought confirmation from others nearby – if you don’t run, maybe I don’t need to run either. Vedantam points to the evolutionary history of man, where staying with the group was typically the safe move in the face of danger; if running were the dominant, successful strategy for survival, that would still be our instinct today. It even explains why multiple bystanders did not help Deletha Word, a woman who was beaten nearly to death in a road-rage incident on the packed Belle Isle bridge in Detroit in 1995 – if no one else helped her, why should I?

Vedantam’s writing and speaking styles offer a perfect blend of colloquial storytelling and evidence-based argument. He interviews transgender people who describe how the attitudes they encountered changed once their outward appearances did. (One transgender man says, “I can even complete a whole sentence [post-transition] without being interrupted by a man.”) And he looks at data on racial disparities in sentencing convicted criminals to death – including data showing that darker-skinned blacks are more likely to receive a death sentence than lighter-skinned blacks.

The last chapter of The Hidden Brain came up last week on Twitter, where I retweeted a link to a story in the New York Times by the wife of a former NFL player, describing her husband’s apparent symptoms of serious brain trauma. One slightly bizarre response I received was that this was an “appeal to emotion” argument – I wasn’t arguing anything, just sharing a story I thought was well written and worth reading – because it was a single datum rather than an extensive study. Vedantam points out, with examples and some research, that the human brain does much better at understanding the suffering of one than the suffering of many. He tells how the story of a dog named Hokget, lost in the Pacific on an abandoned ship, spurred people to donate thousands of dollars, with money coming from 39 states and four countries. (An excerpt from this chapter is still online at The Week‘s site.) So why were people so quick to send money to save one dog when they’re so much less likely to send money when they hear of mass suffering, like genocide or disaster victims in Asia or Africa? Because, Vedantam argues, we process the suffering of an individual in a more “visceral” way than we do the more abstract suffering of many – and he cites experimental data from psychologist Paul Slovic to back it up.

The Hidden Brain could have been twice as long and I would still have devoured it; Vedantam’s writing is much like his podcast narration, breezy yet never dumbed down, thoroughly explanatory without becoming dense or patronizing. If you enjoy books in the Thinking, Fast and Slow or Everybody Lies vein, you’ll enjoy both this title and the podcast, which has become one of my go-to listens to power me through mindless chores around the house.