“The technology already exists. It’s only the will we’re lacking.” These sentences from Cathy O’Neil’s new book Weapons of Math Destruction have been haunting me since I read it. They come from the last chapter of a book in which she has illustrated again and again how, in the words of her subtitle, “big data increases inequality and threatens democracy.” With Facebook’s new trending topics algorithm and data-driven policing in the news, the book is certainly timely.
Weapons of math destruction, which O’Neil refers to throughout the book as WMDs, are mathematical models or algorithms that claim to quantify important traits (teacher quality, recidivism risk, creditworthiness) but have harmful outcomes and often reinforce inequality, keeping the poor poor and the rich rich. They have three things in common: opacity, scale, and damage. They are often proprietary or otherwise shielded from prying eyes, so they have the effect of being black boxes. They affect large numbers of people, increasing the chances that they get it wrong for some of them. And they have a negative effect on people, perhaps by encoding racism or other biases into an algorithm, by enabling predatory companies to advertise selectively to vulnerable people, or even by causing a global financial crisis.
O’Neil is an ideal person to write this book. She is an academic mathematician turned Wall Street quant turned data scientist who has been involved in Occupy Wall Street and recently started an algorithmic auditing company. She is one of the strongest voices speaking out for limiting the ways we allow algorithms to influence our lives, and against the notion that an algorithm, because it is implemented by an unemotional machine, cannot perpetuate bias or injustice.