Discriminating Data -- Book Review

Book review at boundary 2

I remember snickering when Chris Anderson announced "The End of Theory" in 2008. Writing in Wired magazine, Anderson claimed that the structure of knowledge had inverted. It wasn't that models and principles revealed the facts of the world, but the reverse, that the data of the world spoke their truth unassisted. Given that data were already correlated, Anderson argued, what mattered was to extract existing structures of meaning, not to pursue some deeper cause. Anderson's simple conclusion was that "correlation supersedes causation...correlation is enough."

This hypothesis -- that correlation is enough -- is the thorny little nexus at the heart of Wendy Chun's new book, Discriminating Data. Chun's topic is data analytics, a hard target that she tackles with technical sophistication and rhetorical flair. Focusing on data-driven tech like social media, search, consumer tracking, and AI, Chun sets out to exhume the prehistory of correlation, and to show that the new epistemology of correlation is not liberating at all, but instead a kind of curse recalling the worst ghosts of the modern age. As Chun concludes, even amid the precarious fluidity of hyper-capitalism, power operates through likeness, similarity, and correlated identity.

While interleaved with a number of divergent polemics throughout, the book focuses on four main themes: correlation, discrimination, authentication, and recognition. Chun deals with these four as general problems in society and culture, but also, interestingly, as specific scientific techniques. For instance, correlation has a particular mathematical meaning as well as a philosophical one. Discrimination is a social pathology, but it's also integral to discrete rationality. I appreciated Chun's attention to details large and small; she's writing about big ideas -- essence, identity, love and hate, what does it mean to live together? -- but she's also engaging directly with statistics, probability, clustering algorithms, and all the minutiae of data science.
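Since Chun takes the statistical machinery seriously, it's worth recalling what correlation means in the narrow mathematical sense. Here is a minimal sketch in Python -- my own illustration, not anything from the book, with invented data -- of the Pearson coefficient, which measures linear co-variation and nothing more:

```python
# Pearson correlation: covariance of x and y, normalized by their
# standard deviations. A value near +/-1 signals strong linear
# association -- it says nothing about why the variables move together.

import math

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Two invented series that happen to rise together:
ice_cream_sales = [20, 25, 32, 40, 47]
drownings = [3, 4, 5, 7, 8]
print(pearson(ice_cream_sales, drownings))  # ~0.996
```

The two series track each other almost perfectly, yet neither causes the other; both presumably follow summer heat. That gap between correlation and causation is precisely what Anderson waved away.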

Continue Reading at boundary 2

Golden Age of Analog

My essay on the "Golden Age of Analog" is now published in a special issue of Critical Inquiry on "Surplus Data" edited by Orit Halpern, Patrick Jagoda, Jeffrey West Kirkwood, and Leif Weatherby. Email me if you're paywalled.

Oh, for days long gone, when intellectuals sparred over symbolic economies and cultural logics. Gone are those heady chats about écriture and the pleasures of textuality. How quaint would it seem today for a critic to proclaim, defiant, that there is nothing outside of the text. Who speaks that way anymore? Who speaks of word, symbol, text, code, economy, social structures, or cultural logics? Of course, many of us still do; nevertheless, this language feels reminiscent of another time. Or, to be more precise, the language of language is reminiscent of another time.

The world is awash in data, yet these days it is much more common to encounter scholarly takes on a series of distinctly nondigital themes: books about affect or sensation; treatises on aesthetics as first philosophy; essays on the ethical turn (turning away from the political) or on real materiality (turning away from symbolic abstraction); manifestos proclaiming, defiant, that there is nothing outside of the real.

A generation ago, the theoretical humanities was fixated on codes, logics, the arrangement of texts, and the machinations of the symbolic order. Today the theoretical humanities is more likely to address topics such as perception, experience, indeterminacy, or contingency. Why in the digital age have some of our best thinkers turned toward characteristically analog themes?

Continue Reading

Discriminating Data

My publisher asked for an end-of-the-year book selection. I suggested Wendy Hui Kyong Chun's excellent new book Discriminating Data: Correlation, Neighborhoods, and the New Politics of Recognition and sent a few sentences for context. Read the full list of picks here.

I remember snickering when Chris Anderson announced "The End of Theory" in 2008. Writing in Wired magazine, Anderson claimed that the structure of knowledge had inverted. It wasn't that models and principles revealed the facts of the world, but the reverse, that the data of the world spoke their truth unassisted. Anderson's simple conclusion was that "correlation supersedes causation...correlation is enough." Wendy Chun's excellent new book shows the social and political shortcomings of a contemporary technical infrastructure built around correlation, including the algorithms driving social media, search, consumer tracking, AI, and many other things. As Chun argues, power today operates through likeness, similarity, and correlated identity ("homophily"). Tech bros hope that by ignoring difference they can overcome it. Yet for Chun the attempt to find an unmarked category of subjectivity will necessarily erase and exclude those structurally denied access to the universal. Correlation isn't enough. It ratifies the past rather than reimagining the present. Chun ends with an eloquent call to acknowledge "a world that resonates with and in difference."
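To see how homophily works as a mechanism rather than a metaphor, consider how a correlation-driven recommender operates. The sketch below is my own toy example (invented users, invented ratings), not anything drawn from Chun's book: it scores unseen items by the tastes of a user's most similar neighbors, so the system can only extend the clusters of likeness it has already recorded.

```python
# Toy user-based collaborative filtering: recommend what your
# most-similar neighbors already liked. The system reproduces
# existing patterns of likeness -- homophily as mechanism.

import math

ratings = {  # invented data: user -> {item: rating}
    "ana": {"a": 5, "b": 4, "c": 1},
    "ben": {"a": 4, "b": 5, "d": 2},
    "cam": {"c": 5, "d": 4, "e": 5},
}

def cosine(u, v):
    # Similarity of two users, based only on items they share.
    shared = set(u) & set(v)
    dot = sum(u[i] * v[i] for i in shared)
    nu = math.sqrt(sum(r * r for r in u.values()))
    nv = math.sqrt(sum(r * r for r in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

def recommend(user):
    # Weight every unseen item by the similarity of users who rated it.
    scores = {}
    for other, their in ratings.items():
        if other == user:
            continue
        sim = cosine(ratings[user], their)
        for item, r in their.items():
            if item not in ratings[user]:
                scores[item] = scores.get(item, 0.0) + sim * r
    return sorted(scores, key=scores.get, reverse=True)

print(recommend("ana"))  # items favored by ana's nearest "neighborhood"
```

Whatever "ana" sees next is determined by what people like ana already liked -- the past, ratified.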

https://www.versobooks.com/blogs/5227-verso-authors-pick-their-favorite-books-of-the-year

Premature Optimization Is the Root of All Evil

"Premature optimization is the root of all evil (or at least most of it) in programming" --Donald Knuth

"Craft is attention to details" --Zach Lieberman

Donna Haraway, "Informatics of Domination"

I recently drafted a short text for a friend responding to Donna Haraway's "Informatics of Domination" chart, specifically the pair of terms "Perfection -- Optimization." My assignment was to reflect on these terms, including how Haraway has aligned them to specific historic periods, while also suggesting a third term to add to the pair. (I'll save talking about the third for when the text comes out.)

Perfection and optimization are terms that originate in moral and metaphysical discourse. Perfection refers to something having been fully accomplished, to something in a state of completion. From a Latin root verb meaning "to make," perfection entails a process of production. To perfect something is to intervene positively in its development, to push it in a particular direction, to craft it and finish it and make it shine. Perfection connotes maturity, development, completion.

Continue reading

Book Launch -- With Beatrice Fazi, Seb Franklin, and Bernard Dionysius Geoghegan

Please join me at a book launch event for Uncomputable: Play and Politics in the Long Digital Age, hosted by the Aesthetics and History of Media working group at King’s College London. This event will take place on Zoom, at 12pm EST (5pm GMT) on Tuesday November 23.

Register Here

We will begin with a short presentation by me, followed by responses from Beatrice Fazi, Seb Franklin, and Bernard Dionysius Geoghegan.

+ + +

Narrating some lesser-known episodes from the deep history of digital machines, Alexander Galloway explains the technology that drives the world today and introduces the fascinating people who brought these machines to life. With an eye to both the computable and the uncomputable, Galloway shows how computation emerges or fails to emerge, how the digital thrives but also atrophies, how networks interconnect while also fraying and falling apart. By rebuilding obsolete technology using today’s software, Galloway brings the past to light in new ways, from intricate algebraic patterns woven on a hand loom, to striking artificial-life simulations, to war games and black boxes. A description of the past, this book is also an assessment of all that remains uncomputable as we continue to live in the aftermath of the long digital age.

M. Beatrice Fazi is Reader in Digital Humanities in the School of Media, Arts and Humanities (University of Sussex). Her work explores questions located at the intersection of philosophy, technoscience, and culture, and her research interests include media philosophy and theory, digital aesthetics, continental philosophy, computation and artificial intelligence, and critical and cultural theory. She is the author of Contingent Computation: Abstraction, Experience, and Indeterminacy in Computational Aesthetics.

Seb Franklin is Senior Lecturer in Contemporary Literature in the Department of English at King’s College London. He is the author of The Digitally Disposed: Racial Capitalism and the Informatics of Value and Control: Digitality as Cultural Logic.

Bernard Dionysius Geoghegan is Senior Lecturer in the History and Theory of Digital Media in the Department of Digital Humanities at King’s College London. He is a media theorist and historian of science researching how digital technologies shape science, culture, and the environment. His book, From Information Theory to French Theory, is forthcoming from Duke University Press.

Three new essays on the digital and the analog

I'm excited to announce three new essays that will be published in the next few months. I'll link to the texts once they become available.

"Golden Age of Analog," Critical Inquiry 48, no. 2 (Winter 2022)—Why in the digital age have some of our best thinkers turned toward characteristically analog themes?

"The Gender of Math," differences 32, no. 3 (2021)—Math has a gender issue...but how and why? On the problem of essential bias in mathematics, algorithms, and digital media.

"The Origin of Geometry," Grey Room 86 (Winter 2022)—What is a point? Here I address point as unity, puncture, and mark (or in the Greek tradition monas, stigme, and semeion).