Seven Games.

The title of Oliver Roeder’s book Seven Games: A Human History is a misnomer in two ways: It’s not really a book about games, and it’s far more a history of computers than of humans. It is, instead, a history of attempts to use what is now unfortunately referred to as “AI” to tackle the myriad problems posed by seven popular board and card games from human history, from chess to bridge. Each of these games presents programmers with specific, novel issues, and while machine-learning techniques have succeeded in solving some games (like checkers), others (like bridge) have proven, and may forever prove, inscrutable.

Roeder is a journalist for the Financial Times and clearly a gamer – someone who loves the games for what they are beyond their competitive aspect (although it becomes clear he is a fierce competitor as well). He writes as an experienced player of all seven games in the book, even though he must have varying skill levels in each – I’d be shocked if he were much of a checkers player, because who on earth in the year of our lord 2024 is a great checkers player? His experience with the games helps infuse a book that could otherwise be a rather dry and grim affair with more than a touch of life, especially as he enters tournaments or otherwise competes against experts in games like poker, Scrabble, and backgammon.

What Roeder is really getting at here, however, is the symbiotic relationship between games and machine learning, which is what everyone now calls AI. (AI is itself a misnomer, and there are many philosophers who argue that there can be no intelligence, artificial or otherwise, without culture.) Games are perfect fodder for training AI models because they tend to present short sets of rules and clear goals, thus giving the code and its coder targets for whatever optimization algorithm(s) they choose. In the case of checkers, this proved simple once the computing power was available; checkers is considered “weakly solved,” with a draw inevitable if both players play perfectly. (Connect 4 is strongly solved; the first player can always win with perfect play.) In the case of bridge, on the other hand, the game may never be solved, both because of its computational complexity and because of the substantial human element involved in its play.

In one of those later chapters, Roeder mentions P=NP in a footnote, which put an entirely different spin on the book for me. P=NP, also called the P versus NP problem, is one of the six unsolved Millennium Prize Problems* in mathematics; it asks whether every problem whose correct solution can be verified in polynomial time can also be solved in polynomial time. The answer would have enormous ramifications for computational theory, and could indeed impact human life in substantial ways, but the odds seem to be that P does not equal NP – that the time required to solve these problems is orders of magnitude higher than the time required to verify their solutions. (For more on this subject, I recommend Lance Fortnow’s book The Golden Ticket, which I reviewed here in 2015.)

*A seventh, the Poincaré Conjecture, is the only one that has been solved to date.

You can see a thread through the seven chapters where the machine-learning techniques adjust and improve as the games become more complex. From there, it isn’t hard to see this as a narrow retelling of the ongoing history of machine learning itself. The early efforts to solve games like checkers employed brute-force methods – examining all possible outcomes and valuing them to guide optimal choices. More complex games that present larger decision trees and more possible outcomes would require more processing power and time than we have, often more time than remains in the expected life of the universe (and certainly more than remains in the expected life of our suicidal species), and thus required new approaches. Some of the attacks on games later in the book allow the algorithm to prune the tree itself and avoid less-promising branches to reduce processing time and power, thus leading to a less complete but more efficient search method.
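The brute-force and pruning approaches described above can be sketched in a few lines. This is a minimal illustration of my own, not code from the book: a minimax search over a tiny hand-built game tree, with optional alpha-beta pruning that skips branches which cannot change the outcome.

```python
# Minimax over a nested-list game tree: inner lists are decision nodes,
# numbers are leaf values. Alpha-beta pruning skips branches that cannot
# affect the final result, trading completeness of the search for speed.

def minimax(node, maximizing, alpha=float("-inf"), beta=float("inf"),
            prune=True, visited=None):
    if visited is not None:
        visited.append(node)            # track how many nodes we examine
    if isinstance(node, (int, float)):  # leaf: the value of a finished position
        return node
    best = float("-inf") if maximizing else float("inf")
    for child in node:
        val = minimax(child, not maximizing, alpha, beta, prune, visited)
        if maximizing:
            best = max(best, val)
            alpha = max(alpha, best)
        else:
            best = min(best, val)
            beta = min(beta, best)
        if prune and beta <= alpha:     # remaining siblings can't matter
            break
    return best

# A depth-2 tree: the maximizer picks a branch, then the minimizer picks a leaf.
tree = [[3, 5], [6, 9], [1, 2]]

full, pruned = [], []
print(minimax(tree, True, prune=False, visited=full))   # 6
print(minimax(tree, True, prune=True, visited=pruned))  # 6
print(len(full) > len(pruned))   # True: same answer, fewer nodes examined
```

Both searches return the same optimal value; the pruned version simply examines fewer positions, which is the whole point when the tree has more positions than the universe has time for.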

Roeder does acknowledge in brief that these endeavors also have a hidden cost in energy. His anecdotes include Deep Blue versus Kasparov and similar matches in poker and go, some of which gained wide press coverage for their results … but not for the energy consumed by the computers that competed in these contests. We’re overdue for a reckoning on the actual costs of ChatGPT and OpenAI and their myriad brethren in silicon, because as far as I can tell, they’re just the new crypto when it comes to accelerating climate change. It’s nice that you can get a machine to write your English 102 final paper for you, or lay off a bunch of actual humans to let AI do some things, but I’d like to see you pay the full cost of the electricity you’re using to do it.

I’ve focused primarily on one aspect of Seven Games because that’s what resonated with me, but I may have undersold the book a little in the process. It’s a fun read in many ways because Roeder tells good stories for just about all seven of the games in the book – I might have done without the checkers chapter, because that’s just a terrible game, but it is an important rung in the ladder he’s constructing – and puts himself in the action in several of them, notably in poker tournaments in Vegas. There’s also a warning within the book about the power of so-called AI, and I think inherent in that is a call for caution, although Roeder doesn’t make this explicit. It seemed a very timely read even though I picked it up on a friend’s recommendation because it’s about games. Games, as it turns out, explain quite a bit of life. We wouldn’t be human without them.

Next up: Dark Matter of the Mind: The Culturally Articulated Unconscious, a book by Daniel Everett, a former evangelical Christian missionary who became an atheist and turned to linguistics after his time trying to convert the Amazonian Pirahã tribe. He appeared at length in last year’s outstanding documentary The Mission.

Ada’s Algorithm.

My top 50 free agent rankings went up Friday for Insiders, followed by the “deleted scenes” post with capsules on four guys whom I wrote about before their employers picked up their club options. I’ve also got buyers’ guides to catchers and to corner infielders up, with middle infielders due on Tuesday.

Everything seems to be coming up Ada Lovelace lately; largely overlooked in her own time because she was a woman in the early Victorian era and was better known as the one legitimate offspring of the rake Lord Byron, she’s now widely recognized as the creator of the first machine algorithm, the primary ancestor of the modern computer program. The Department of Defense named a programming language (Ada) after her in the early 1980s, and she’s appeared in numerous works of fiction (such as William Gibson’s The Difference Engine) and non-fiction (including a brand-new short work aimed at schoolchildren called Ada Byron Lovelace and the Thinking Machine) over the last 25 years. Since my daughter was working on a short presentation on Lovelace – all the kids were asked to pick a scientist, and she was pissed off because there was only one woman (Marie Curie, of course) on the original list of assignments – I picked up James Essinger’s 2014 biography, Ada’s Algorithm: How Lord Byron’s Daughter Ada Lovelace Launched the Digital Age, which had most of the key details but is padded with a lot of less critical material.

Ada Lovelace’s place in history comes from her friendship with Charles Babbage, who designed (but never built) the first computers, one called the Difference Engine, of which he built one-seventh, and another called the Analytical Engine, which he never built at all due to the prohibitive cost and the lack of manufacturing facilities capable of producing all of the cogwheels the device required. Babbage was a bit of a mad scientist, prone to emotional outbursts and self-destructive arguments that cost him any shot at gaining the funds necessary to build even part of either Engine beyond what he did build. He also lacked Ada’s communication skills, and when the Italian mathematician (and later Prime Minister of Italy) Luigi Federico Menabrea wrote a paper describing Babbage’s Analytical Engine, Lovelace translated it into English and supplemented it with her own Notes, the latter of which ran more than twice as long as Menabrea’s original article and included the algorithm that earned her posthumous fame. She saw potential in Babbage’s machine that even Babbage did not – that programmers could use it to solve all kinds of mathematical problems beyond mere arithmetic, as long as the programmer could conceive the necessary series of steps for the calculations.

Lovelace died of uterine cancer at 36, and much of the detail of her life is lost both to time and, it’s believed, to her mother’s decision to destroy much of Ada’s correspondence after the latter’s death. Even many of the letters she exchanged with Babbage are gone, leaving any biographer with relatively meager material from which to construct a story of her life. Essinger barely makes it past 200 pages, and even to get to that point he has to fill with material that’s not all that relevant to the reader primarily interested in Ada’s Notes and the algorithm of the book’s title. For example, we don’t need two chapters on Lord Byron, and I was certainly glad I got the book away from my daughter (who found it boring anyway) before she got to the mentions of his incestuous relationship with his half-sister Augusta or the story of how his nanny would take him into her bed, masturbate him, and then later turn around and beat him, often doing both things in the presence of her friends. (That material would seem essential in any biography of Byron himself, though, since it probably explains his later promiscuity and other “immoral” behavior relative to the mores of the era.) Byron was out of Ada’s life for good while she was still an infant, and including such details on his life seems not just out of place but almost pandering.

Essinger gives us too much of the text of some of her less relevant letters, and inserts his own speculation on things like whether she might have met certain personages of the era, like Charles Darwin, or whether Babbage was in love with Ada, for which there’s no tangible evidence. The first hardcover edition also has numerous typos and minor errors in the text – for example, using “inconceivable” when he meant “conceivable,” which is kind of a weak word anyway – that further added to my impression that I was reading Essinger’s thoughts and opinions rather than a narrative rendering of her life. It seems that we don’t know enough about Ada Lovelace for a full biography, but that doesn’t quite justify surrounding what we do know with speculation or tangential details.

Next up: Speaking of Gibson, I’m reading Mona Lisa Overdrive, the third book in his Sprawl trilogy, which began with the Hugo-winning Neuromancer.

The Golden Ticket.

Lance Fortnow wrote a piece for Communications of the Association for Computing Machinery in 2009 on the state of the P vs NP problem, one of the most important unsolved problems in both mathematics and computer science. That article led to the short (~175-page) book The Golden Ticket: P, NP, and the Search for the Impossible, which I recently read in its Kindle edition (it’s also on iBooks); Fortnow does a solid job of making an abstruse problem accessible to a wider audience, even engaging in some flights of fancy describing a world in which P equals NP … which is almost certainly not true (but we haven’t proven that yet either!).

P vs NP, which was first posed by Kurt Gödel in a 1956 letter to John von Neumann, is one of the seven Millennium Problems posed by the Clay Mathematics Institute in 2000; solve one and you get a million bucks. One of them, the Poincaré Conjecture (which relates to the shape of the universe), has since been proven by Grigori Perelman, who was awarded the prize in 2010 and famously declined it. But if you solve P vs NP affirmatively, you can probably solve the remaining five and collect a cool $6 million for your problems. You’ll find a box of materials under your desk.

Of course, this is far from an easy question to solve. P and NP are two classes of problems in computer science, and while it seems probable that they are not equivalent, no one’s been able to prove that yet. P is the set of all problems that can be solved quickly (in deterministic polynomial time – so, like, before the heat death of the universe) by an efficient algorithm; NP is the set of all problems whose solutions, once found, can be quickly verified by an efficient algorithm. For example, factoring a huge composite number is in NP: There is no known efficient algorithm to factor a large number, but once we’ve found two factors, a computer can quickly verify that the solution is correct. The “traveling salesman problem” is also in NP; it’s considered NP-complete, meaning that it is both in NP and NP-hard, where NP-hard is the set of all problems that are at least as hard as the hardest problems in NP. We can find good solutions to many NP-hard problems using heuristics, but we do not have efficient algorithms to find the optimal solution to such a problem.
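The factoring example makes the asymmetry concrete, and it’s easy to see in a few lines of code. This is my own sketch, not Fortnow’s: finding the factors takes a long loop of trial divisions, while checking a proposed answer takes a single multiplication.

```python
# Solving vs. verifying, illustrated with integer factoring.

def factor(n):
    """Solve: find a nontrivial factorization by brute-force trial division.
    This loop is the slow part - it may need ~sqrt(n) iterations."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 1
    return None  # n is prime

def verify(n, p, q):
    """Verify: one multiplication confirms (or refutes) a proposed answer."""
    return 1 < p < n and 1 < q < n and p * q == n

n = 10007 * 10009        # a composite of two primes
p, q = factor(n)          # slow: thousands of trial divisions
print(p, q)               # 10007 10009
print(verify(n, p, q))    # fast: True
```

For numbers of cryptographic size (hundreds of digits), the gap between those two functions is the difference between milliseconds and longer than the age of the universe – which is exactly why factoring-based cryptography works at all.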

If P does in fact equal NP, then we can find efficient algorithms for all problems in NP, even those that are NP-complete, and Fortnow details all of these consequences, both positive and negative. One major negative consequence, and one on which Fortnow spends a significant amount of time, would be the effective death of most current systems of cryptography, including public-key cryptography and one-way hashing functions. (In fact, the existence of one-way functions as a mathematical truth is still an unsolved problem; if they exist, then P does not equal NP.) But the positive consequences are rather enormous; Fortnow gives numerous examples, the most striking of which is the potential for quickly developing individualized medicines to treat cancer and other diseases where protein structure prediction is an obstacle to quickly crafting effective treatments. He also works in a baseball story, in which the game has been dramatically changed across the board by the discovery that P=NP – from better scheduling to accurate ball/strike calls (but only in the minors) to the 2022 prohibition of the use of computers in the dugout. It’s Shangri-La territory, but it serves to underscore the value of an affirmative proof: If we can solve NP problems in deterministic polynomial time (as opposed to nondeterministic polynomial time, from which NP gets its name), our ability to tease relationships out of huge databases and find solutions to seemingly intractable logical and mathematical problems is far greater than we realized.

Of course, P probably doesn’t equal NP, because that would just be too easy. That doesn’t mean that NP-complete problems are lost causes, but those who work in those areas – operations research, medicine, cryptography, and so on – have to use other methods to find solutions that are merely good rather than optimal. Those methods include using heuristics that simplify the problem, approximating solutions, and solving a different but related problem that’s in P. If Fortnow falls short at all in this book, it’s in devoting so much more time to the Brigadoon where P=NP and less to the likely real-world quandary of solving NP-complete problems in a universe where P≠NP. He also gives over a chapter to the still-theoretical promise of quantum computing, including its applications to cryptography (significant) and teleportation (come on), but it seems like a digression from the core question in The Golden Ticket. We don’t know if P equals NP, but as Fortnow reiterates in the conclusion, even thinking about the question and possible approaches to proving it in either direction affects work in various fields that underpin most of our current technological infrastructure. If you’ve ever bought anything online, or even logged into web-based email, you’ve used a bit of technology that only works because, as of right now, we can’t prove that P=NP. For such a fundamental question, the P vs NP problem is scarcely known, and Fortnow does a strong job of presenting it in a way that many more readers can understand.
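The “merely good rather than optimal” approach is easy to demonstrate. Here’s a small sketch of my own (the city names and coordinates are invented for illustration): the nearest-neighbor heuristic for the traveling salesman problem, which runs in polynomial time but offers no guarantee of finding the shortest tour.

```python
# Nearest-neighbor heuristic for TSP: always hop to the closest
# unvisited city, then return home. Fast, but not guaranteed optimal.
import math

cities = {"A": (0, 0), "B": (1, 5), "C": (5, 2), "D": (6, 6), "E": (8, 3)}

def dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def nearest_neighbor_tour(start):
    tour, unvisited = [start], set(cities) - {start}
    while unvisited:
        here = cities[tour[-1]]
        nxt = min(unvisited, key=lambda c: dist(here, cities[c]))
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour + [start]   # close the loop back to the start

tour = nearest_neighbor_tour("A")
length = sum(dist(cities[a], cities[b]) for a, b in zip(tour, tour[1:]))
print(tour, round(length, 2))
```

An exact solver would have to consider every permutation of the cities – fine for five, hopeless for five hundred – while the greedy version scales comfortably and usually lands within shouting distance of optimal.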

If this sounds like it’s up your alley or you’ve already read it, I also suggest John Derbyshire’s Prime Obsession, about the Riemann Hypothesis, another of the Clay Mathematics Institute’s six as-yet-unsolved Millennium problems.