Charges of anti-intellectualism in American life are as old as the republic. They are the inevitable consequence of a bottom-up state and the high degree of pragmatism that comes with it. As Jules Verne wrote of the mid-nineteenth-century United States, “The Yankees are engineers the way Italians are musicians and Germans are metaphysicians: by birth.”
What makes our current moment unique is the fact that this time the fear of ideas isn’t coming from the prairies, the backwaters, or the hilltops. It’s coming from within the elite bastions themselves, those citadels of the urbane and the cosmopolitan. At stake in this revolt is nothing less than the place of quantification within everyday life. Never before has it been so fashionable to be against numerical thinking.
The dean of this new wave is Leon Wieseltier, literary editor of The New Republic, who has made something of a crusade of cowing science and quantification into submission:
“What [science] denies is that the differences between the various realms of human existence, and between the disciplines that investigate them, are final…. All data points are not equally instructive or equally valuable, intellectually and historically. Judgments will still have to be made; frameworks for evaluation will still have to be readied against the informational deluge.”
Wieseltier’s dream is that the world can be neatly partitioned into two kinds of thought, scientific and humanistic, quantitative and qualitative, remaking the history of ideas in the image of C.P. Snow’s two cultures. Quantity is OK as long as it doesn’t touch those quintessentially human practices of art, culture, value, and meaning.
Wieseltier’s goal is as unfortunate as it is myopic. It does a disservice, first, to the humanistic bases of scientific inquiry. Scientists aren’t robots—they draw their ideas from deep, critical reflection using numerical and linguistic forms of reasoning. The idea that you can separate these into two camps would make little sense to most practicing researchers. To be sure, we all know of laughable abuses of numbers, especially when applied to cultural or human phenomena (happiness studies, anyone?). But that is all the more reason not to sequester numbers from humanistic disciplines that pride themselves on conceptual complexity. Creating knowledge fences only worsens the problem.
But it also has the effect of hardening educational trends in which we think of students as either math and science kids or reading and verbal ones. Our curricula harden Wieseltier’s argument into oversimplified binaries, ones that come at the expense of our own human potential. Most important, in my view, is the way this line of thinking lacks precedent in the history of ideas. Some of the intellectual giants of the past, people like Leibniz, Descartes, or Ludwig Wittgenstein—presumably the folks Wieseltier would admire most—were trained as mathematicians. The history of literature, too, that territory most forbidden to numbers’ perennial overreach, is rife with quantitative significance. Why are there nine circles of hell in Dante’s Inferno? 100 stories in Boccaccio’s Decameron? 365 chapters in Hugo’s Les Misérables? 108 lines in Poe’s “The Raven”? Not to mention the entire field of prosody: why is so much French classical theatre composed in lines of 12 syllables, or English drama in 10? Why did there emerge a poetic genre of exactly 14 lines that has lasted for over half a millennium?
Such questions only begin to scratch the surface of the ways in which quantity, space, shape, and pattern are integral to the human understanding of beauty and ideas. Quantity is part of that drama of what Wieseltier calls the need to know “why.”
Wieseltier’s campaign is merely the loudest clarion call of subtler, ongoing assumptions one comes across all the time, whether in the op-eds of major newspapers, the blogs of cultural reviews, or the halls of academe. Nicholas Kristof’s charge that academic writing is irrelevant because it relies on quantification is one of the more high-profile cases. The recent reception of Franco Moretti’s National Book Critics Circle Award for Distant Reading is another good case in point. What’s so valuable about Moretti’s work on quantifying literary history, according to The New Yorker’s books blog, is that we can ignore it. “I feel grateful for Moretti,” writes Joshua Rothman. “As readers, we now find ourselves benefitting from a division of critical labor. We can continue to read the old-fashioned way. Moretti, from afar, will tell us what he learns.”
We can continue doing things the way we’ve always done them. We don’t have to change. The saddest part about this line of thought is that it is not just the voice of journalism. You hear the same thing inside academia all the time: it (meaning the computer, or sometimes just numbers) can’t tell you what I already know. Indeed, the “we already knew that” meme is one of the most powerful ways of dismissing any attempt at bringing together quantitative and qualitative approaches to thinking about the history of ideas.
As an inevitable backlash to its seeming ubiquity in everyday life, quantification today is tarred with a host of evils. It is seen as a source of intellectual isolation (when academics use numbers, they are alienating themselves from the public); a moral danger (when academics use numbers to understand things that shouldn’t be quantified, they threaten to undo what matters most); and, finally, simple irrelevance. We already know all there is to know about culture, so don’t even bother.
I hope one day this will all pass and we’ll see the benefits of not thinking about these issues in such either/or ways, like the visionary secretary of Jules Verne’s imaginary “Baltimore Gun Club,” who cries, “Would you like figures? I will give you some eloquent ones!” In the future, I hope there will be a new wave of intellectualism that insists on conjoining these two forms of thought, the numerical and the literal, figures and eloquence. It seems so much more human.