Atlas of AI: Power, Politics, and the Planetary Costs of Artificial Intelligence • By Kate Crawford • Yale University Press • 336 pages • ISBN: 978-0-300-20957-0 • £20
“Ask forgiveness, not permission” has long been a guiding principle in Silicon Valley. In no technological field has it been more thoroughly practised than machine learning, the foundation of modern AI, which depends for its existence on giant databases, almost all of them scraped, copied, borrowed, begged, or stolen from the piles of data we all emit daily, knowingly or not. This data is hardly ever rigorously sourced with the subjects’ permission.
“Because we can,” two sociologists tell Kate Crawford in Atlas of AI: Power, Politics, and the Planetary Costs of Artificial Intelligence, by way of acknowledging that their academic institutions are no different from technology companies or government agencies in regarding any data they find as theirs for the taking to train and test algorithms. Images become infrastructure. This is how machine learning is made.
Everyone wants to talk about what AI is good for, or dangerous at: identifying facial images, interpreting speech commands, driving cars (not yet!). Many want to pour ethics over today’s AI, as if making rules could alter the military funding that has defined its fundamental nature. Few want to discuss AI’s true costs. Kate Crawford, a senior researcher at Microsoft and a research professor at the University of Southern California, is the exception.
In Atlas of AI, Crawford begins by deconstructing the famous contention that ‘data is the new oil’. That comparison usually prompts talk of data’s economic value, but Crawford focuses on the fact that both are extractive industries. Extraction means mining (as in ‘data mining’ or oil wells), and where mining goes, environmental damage, human exploitation, and profound society-wide consequences follow.
Crawford underlines this point by heading to Silver Peak, Nevada, to visit the only operating lithium mine in the US. Lithium is, of course, a crucial component in battery packs for everything from smartphones to Teslas. Crawford follows this up by considering the widening implications of extraction for labour, the sources of data, classification algorithms, and the nation-state behaviour it all underpins, finishing up with the power structures enabled by AI-as-we-know-it. This way lies Project Maven and ‘signature strikes’ in which, as former CIA and NSA director Michael Hayden admitted, metadata kills people.
Yet some of what is sold as AI is patently false. Crawford traces back the image datasets on which the latest disturbing snake oil, emotion recognition, is based, and finds they were built from posed pictures in which the subjects were told to provide exaggerated examples of emotional reactions. In this case, ‘AI’ is manufactured all the way down. Is there, as Tarleton Gillespie asked about Twitter trends,