Verified by Psychology Today

Math: The Extra Sense

In How Not to Be Wrong, Jordan Ellenberg argues that math is not a foreign language but a way of idiot-proofing our native tongue.

There is nothing like reading correspondence between two geniuses to appreciate all the ways in which a person can be right—which is to say, can be brilliant—and still feel stymied. In lauding his cousin Francis Galton's study of individual differences in aptitude, which paved the way for the field of psychometrics, Darwin wrote to Galton: "I find it very hard work; but that is wholly the fault of my brain and not of your beautifully clear style." Darwin wasn't just being modest. In his memoirs he confessed that he "deeply regretted" never having attained mathematical competence because "men thus endowed seem to have an extra sense."

An empirically rigorous genius by any standard, Darwin had a healthy reverence for his cousin's more quantitative work. He knew enough to appreciate Galton's findings, but also enough to know what he himself didn't know. How Not to Be Wrong is probably denser per page with the work of bona fide geniuses than 99 percent of popular writing (not to put too fine a mathematical point on it), and it is at the same time marvelously accessible. It acknowledges the chasm between appreciating and formulating—between Darwin's grasp of sophisticated math and Galton's—and offers a way in.

Jordan Ellenberg, a professor of mathematics at the University of Wisconsin-Madison and himself a former child prodigy (he's so modest about this that a casual reader could miss the confession), explores inference, linearity, and probability with a poet's touch. You may not possess the language with which to talk about infinitesimal quantities, but you can surely recognize that 0.999999[...] and 1 are comparable to two different words that refer to the same object. And you may not know that contemporary scientific discourse lurches forward on the back of the slippery p-value 0.05, but you can appreciate that a way of evaluating scientific findings is to ask how surprised one should be by any particular result. Ellenberg distills ideas so gracefully that fields from philosophy to decision theory collapse in a tour de force of clear thinking. Here's his parallel between the biological revolution ushered in by Darwin and Galton's mathematical one: "Darwin showed that one could meaningfully talk about progress without any need to invoke purpose. Galton showed that one could meaningfully talk about association without any need to invoke underlying cause."

If the book is syncretic, it is not only because Ellenberg is an erudite and skilled guide. It is also because mathematics contains "many, many complicated objects but only a few simple ones." How early should you arrive at the airport, given that time spent waiting must be weighed against the risk of missing your plane? What is the ideal rate of taxation given that the extremes yield no revenue at all (at 0 percent the government amasses no revenue, and at 100 percent there's nothing to collect because nobody is incentivized to work)? Both questions can be modeled by a simple curve on which the extremes are untenable, so the optimal solution falls somewhere in between. Ellenberg surfs the curve, beginning with Zeno, the famous Greek philosopher whose obsession with infinitesimal quantities anticipated Newtonian calculus, specifically the Newtonian "fluxion" or derivative. The idea is as simple as zooming in on a curve until a small segment of it looks like a straight line. Ellenberg explains this on a number-free page subtitled "The Page Where I Teach You Calculus," and why not? Throughout, he argues that math's overarching power derives from the fact that it is "the extension of common sense by other means." Math, we are assured, is not a foreign language, but a way of idiot-proofing our native tongue.
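The shape of such a curve can be made concrete in a few lines of code. What follows is a minimal sketch using an invented quadratic revenue function (not anything from the book) in which revenue vanishes at both tax-rate extremes:

```python
# A toy Laffer-style curve: revenue is zero at both tax-rate extremes.
# The quadratic form r(t) = t * (1 - t) is an illustrative assumption,
# not a formula from Ellenberg's book.

def revenue(t):
    """Hypothetical revenue, as a fraction of the maximum, at tax rate t in [0, 1]."""
    return t * (1.0 - t)

# Scan a grid of tax rates and find the one that maximizes revenue.
rates = [i / 100 for i in range(101)]
best = max(rates, key=revenue)

print(revenue(0.0), revenue(1.0))  # both extremes yield zero revenue
print(best)                        # the optimum falls in the intermediate range
```

Any curve that is zero at both ends and positive in between makes the same point: the interesting answer lies strictly between "none" and "all."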

And yet, idiotic applications of math abound. John Allen Paulos took on Americans' mathematical incompetence beginning with his 1988 bestseller Innumeracy. Likewise, How Not to Be Wrong is full of cautionary tales. On the subject of proportion, Ellenberg warns us away from the mistakes that have felled many an op-ed contributor and member of Congress: When comparing proportions of populations killed across space and time, mind your sample sizes and your denominators. If you don't, you just might find yourself asserting that the massacre of the Herero and King Leopold II's genocidal policies were the biggest atrocities of the twentieth century because they killed such a huge proportion of the population in Namibia and the Congo respectively.

The genre of explanatory science writing has evolved as rapidly as has the field of compressed sensing (Google it) since Paulos's bestseller debuted. Today's publishing formula calls for the anecdote-driven ideas book. How Not to Be Wrong delivers with a high-low tour of Western Civ, in which Pascal's wager gets its due alongside a three-point plan for braving Powerball.

Ellenberg takes a strong stance on the (mis)use of statistical hypothesis testing, an issue that has roiled social science for decades and continues to gather steam. Say you want to design a study to demonstrate that eating kale cures insomnia. The rule of significance testing is that data cannot confirm your theory; it can only fail to disconfirm it. In other words, when testing, assume that kale has no effect on sleep (the null hypothesis) and calculate the probability of nonetheless obtaining the data you obtain. The smaller that probability, the better for your experiment: The generally—critics would say, "arbitrarily"—agreed-upon threshold for significance is a p-value of 0.05, meaning odds of about 1 in 20 of obtaining so striking a result if kale in fact had no effect on sleep at all.
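The logic of such a test can be sketched numerically. Below is a minimal, invented example (none of the numbers come from the book) in which 18 of 25 kale eaters report sleeping better, tested against the null hypothesis that kale does nothing, so that each participant improves with the probability of a coin flip:

```python
import math

# Toy kale study: 18 of 25 participants report better sleep.
# Under the null hypothesis, kale has no effect, so each participant
# improves with probability 0.5. These numbers are invented for
# illustration; they are not from Ellenberg's book.

n, k = 25, 18

# One-sided p-value: the probability of seeing k or more successes
# out of n fair coin flips, computed exactly from the binomial distribution.
p_value = sum(math.comb(n, i) for i in range(k, n + 1)) / 2**n

print(round(p_value, 4))   # roughly 0.02
print(p_value < 0.05)      # crosses the conventional significance threshold
```

Note what the number does and does not say: it is the chance of data this extreme if kale were useless, not the chance that kale is useless given the data.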

The null hypothesis taken literally is almost always false, because everything has some effect, however minuscule. Add to that the problem of "p-hacking," wherein researchers bludgeon their data until it crosses the 0.05 threshold. Then there's the file drawer problem: Studies that fail to find an effect rarely get published. We've ceded far too much power to the significance test, argues Ellenberg, making it the arbiter of our scientific corpus when really it is no more than a signal to look further.

Early on, Ellenberg cites John von Neumann's legendary 1947 essay, "The Mathematician," in a passage that calls for math to remain grounded in real-world inquiries. Von Neumann could just as easily be describing some strands of postmodern literary criticism or social science theorizing in the absence of peer-reviewed, replicated data, when he writes: "As a [mathematical] discipline travels from its empirical source, or still more, if it is a second and third generation only indirectly inspired by ideas coming from 'reality,' it is beset with very grave dangers. It becomes more and more purely aestheticizing, more and more purely l'art pour l'art."

There's no danger of that in How Not to Be Wrong, which is defiantly nontrivial. To truly comprehend a mathematical construct, says Ellenberg, is to "feel you've reached into the universe's guts and put your hand on the wire." This book is for those who will never touch the wire, but who appreciate the sparks.

Yes, Virginia, There Is a Way to Ace Calculus

Lessons from A Mind for Numbers: How to Excel at Math and Science (Even If You Flunked Algebra)

It is a dilemma for many a liberal arts major: Do you cling tight to your B.A. in Slavic Languages and Literature or strike out for more quantitative shores?

Barbara Oakley's life, and her most recent book, offer inspiration.

A gifted linguist, Oakley enlisted in the U.S. Army to become a translator of Russian, but as an officer she was assigned to the Signal Corps, where she found radio, cable, and telephone switching systems intimidating. Oakley made a conscious decision to embrace the STEM work she'd spent her young life avoiding. Using the G.I. Bill, she returned to school and wrestled her computational aversions into oblivion. Oakley ultimately earned a doctorate in systems engineering. Now a professor at Oakland University in Michigan, she argues that "we develop a passion for what we're good at. The mistake is thinking that if we aren't good at something, we do not have and can never develop a passion for it."

To that end, her book A Mind for Numbers amasses both cognitive and time-management techniques for efficiently retaining difficult material, distinguishing time-wasters from strategies that truly encode knowledge. Many a student will be chagrined to learn that passively rereading and highlighting are among the worst approaches you can take. Oakley pulls tips from the files of Nobel laureates, as well as from students and professors. Sometimes you need to go abstract: Generate a metaphor or mnemonic for a concept, or mentally inhabit the problem. Cytogeneticist Barbara McClintock brought a chromosome's-eye view to her Nobel-winning work on genetic transposition in maize plants, imagining herself so vividly embedded in her quarry that she later called the chromosomal structures of corn her "friends." Such creative methods are best applied alongside pragmatic processes. These include writing information out by hand, spaced repetition of new material, and timed work sessions (25 solid minutes is a good target). Oakley's grab bag of tactics reflects the paradoxical nature of learning itself, which requires both focused attention and diffuse integration to truly master a concept.