Lecture title: "Read, Write, Execute: What are the Machines that Determine Thinking?"
I remember snickering when Chris Anderson announced "The End of Theory" in 2008. Writing in Wired magazine, Anderson claimed that the structure of knowledge had inverted. It wasn't that models and principles revealed the facts of the world, but the reverse, that the data of the world spoke their truth unassisted. Given that data were already correlated, Anderson argued, what mattered was to extract existing structures of meaning, not to pursue some deeper cause. Anderson's simple conclusion was that "correlation supersedes causation...correlation is enough."
This hypothesis -- that correlation is enough -- is the thorny little nexus at the heart of Wendy Chun's new book, Discriminating Data. Chun's topic is data analytics, a hard target that she tackles with technical sophistication and rhetorical flair. Focusing on data-driven tech like social media, search, consumer tracking, and AI, Chun sets out to exhume the prehistory of correlation, and to show that the new epistemology of correlation is not liberating at all, but instead a kind of curse, recalling the worst ghosts of the modern age. As Chun concludes, even amid the precarious fluidity of hyper-capitalism, power operates through likeness, similarity, and correlated identity.
Though interleaved with a number of divergent polemics throughout, the book focuses on four main themes: correlation, discrimination, authentication, and recognition. Chun deals with these four as general problems in society and culture, but also, interestingly, as specific scientific techniques. For instance, correlation has a particular mathematical meaning, as well as a philosophical one. Discrimination is a social pathology, but it is also integral to discrete rationality. I appreciated Chun's attention to details large and small; she's writing about big ideas -- essence, identity, love and hate, what does it mean to live together? -- but she's also engaging directly with statistics, probability, clustering algorithms, and all the minutiae of data science.
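For readers curious about that mathematical meaning: the correlation at issue is typically Pearson's coefficient, a statistic descending from the Galton-Pearson lineage that Chun historicizes. A minimal sketch (a generic illustration of the statistic, not code from the book):

```python
def pearson_r(xs, ys):
    # Pearson's correlation coefficient: the covariance of x and y
    # divided by the product of their standard deviations.
    # Ranges from -1 (perfect inverse) to +1 (perfect direct correlation).
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Two series that rise in lockstep correlate perfectly,
# whatever the causal story behind them.
print(pearson_r([1, 2, 3, 4], [2, 4, 6, 8]))  # 1.0
```

The statistic measures nothing but co-variation; it is agnostic about cause, which is precisely what makes Anderson's "correlation is enough" a substantive epistemological claim rather than a technical one.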
Math has a gender issue... but how and why?
I have a new essay on "The Gender of Math" just published in the current issue of differences: A Journal of Feminist Cultural Studies. This article is on the problem of essential bias in mathematics, algorithms, and digital media.
(Email me if you're paywalled, although I believe the above link is open access.)
My essay on the "Golden Age of Analog" is now published in a special issue of Critical Inquiry on "Surplus Data" edited by Orit Halpern, Patrick Jagoda, Jeffrey West Kirkwood, and Leif Weatherby. Email me if you're paywalled.
Oh, for days long gone, when intellectuals sparred over symbolic economies and cultural logics. Gone are those heady chats about écriture and the pleasures of textuality. How quaint would it seem today for a critic to proclaim, defiant, that there is nothing outside of the text. Who speaks that way anymore? Who speaks of word, symbol, text, code, economy, social structures, or cultural logics? Of course, many of us still do; nevertheless, this language feels reminiscent of another time. Or, to be more precise, the language of language is reminiscent of another time.
The world is awash in data, yet these days it is much more common to encounter scholarly takes on a series of distinctly nondigital themes: books about affect or sensation; treatises on aesthetics as first philosophy; essays on the ethical turn (turning away from the political) or on real materiality (turning away from symbolic abstraction); manifestos proclaiming, defiant, that there is nothing outside of the real.
A generation ago, the theoretical humanities were fixated on codes, logics, the arrangement of texts, and the machinations of the symbolic order. Today the theoretical humanities are more likely to address topics such as perception, experience, indeterminacy, or contingency. Why in the digital age have some of our best thinkers turned toward characteristically analog themes?
My publisher asked for an end-of-the-year book selection. I suggested Wendy Hui Kyong Chun's excellent new book Discriminating Data: Correlation, Neighborhoods, and the New Politics of Recognition and sent a few sentences for context. Read the full list of picks here.
I remember snickering when Chris Anderson announced "The End of Theory" in 2008. Writing in Wired magazine, Anderson claimed that the structure of knowledge had inverted. It wasn't that models and principles revealed the facts of the world, but the reverse, that the data of the world spoke their truth unassisted. Anderson's simple conclusion was that "correlation supersedes causation...correlation is enough." Wendy Chun's excellent new book shows the social and political shortcomings of a contemporary technical infrastructure built around correlation, including the algorithms driving social media, search, consumer tracking, AI, and many other things. As Chun argues, power today operates through likeness, similarity, and correlated identity ("homophily"). Tech bros hope that by ignoring difference they can overcome it. Yet for Chun the attempt to find an unmarked category of subjectivity will necessarily erase and exclude those structurally denied access to the universal. Correlation isn't enough. It ratifies the past rather than reimagining the present. Chun ends with an eloquent call to acknowledge "a world that resonates with and in difference."
Recorded an interview with Chris Crawford a few weeks ago, which he has edited into a nice little book trailer.
+ + +
I also recorded the "Preface" and "Introduction" as an audiobook. Download the MP3 here or listen below.
A reminder to please join me on Zoom for two book talks in the coming days.
I'll be doing a flip flop with Jacob Gaboury (he talks about my book and I talk about his) this Wednesday, Nov 17 at 12:30pm EST.
And next Tuesday, Nov 23 at 12pm EST Beatrice Fazi, Seb Franklin, and Bernard Dionysius Geoghegan will join to roast me (er, talk about the book).
Looking forward to these conversations.
"Premature optimization is the root of all evil (or at least most of it) in programming" --Donald Knuth
"Craft is attention to details" --Zach Lieberman
I recently drafted a short text for a friend responding to Donna Haraway's "Informatics of Domination" chart, specifically the pair of terms "Perfection -- Optimization." My assignment was to reflect on these terms, including how Haraway has aligned them to specific historic periods, while also suggesting a third term to add to the pair. (I'll save talking about the third for when the text comes out.)
Perfection and optimization are terms that originate in moral and metaphysical discourse. Perfection refers to something having been fully accomplished, to something in a state of completion. From a Latin root verb meaning "to make," perfection entails a process of production. To perfect something is to intervene positively in its development, to push it in a particular direction, to craft it and finish it and make it shine. Perfection connotes maturity, development, completion.
Please join me at a book launch event for Uncomputable: Play and Politics in the Long Digital Age, hosted by the Aesthetics and History of Media working group at King’s College London. This event will take place on Zoom, at 12pm EST (5pm GMT) on Tuesday November 23.
We will begin with a short presentation by me, followed by responses from Beatrice Fazi, Seb Franklin, and Bernard Dionysius Geoghegan.
+ + +
Narrating some lesser-known episodes from the deep history of digital machines, Alexander Galloway explains the technology that drives the world today, and the fascinating people who brought these machines to life. With an eye to both the computable and the uncomputable, Galloway shows how computation emerges or fails to emerge, how the digital thrives but also atrophies, how networks interconnect while also fraying and falling apart. By rebuilding obsolete technology using today's software, Galloway brings the past to light in new ways, from intricate algebraic patterns woven on a hand loom, to striking artificial-life simulations, to war games and black boxes. A description of the past, this book is also an assessment of all that remains uncomputable as we continue to live in the aftermath of the long digital age.
M. Beatrice Fazi is Reader in Digital Humanities in the School of Media, Arts and Humanities (University of Sussex). Her work explores questions located at the intersection of philosophy, technoscience and culture, and her research interests include media philosophy and theory, digital aesthetics, continental philosophy, computation and artificial intelligence, and critical and cultural theory. She is the author of Contingent Computation: Abstraction, Experience, and Indeterminacy in Computational Aesthetics.
Seb Franklin is Senior Lecturer in Contemporary Literature in the Department of English at King’s College London. He is the author of The Digitally Disposed: Racial Capitalism and the Informatics of Value and Control: Digitality as Cultural Logic.
Bernard Dionysius Geoghegan is Senior Lecturer in the History and Theory of Digital Media in the Department of Digital Humanities at King’s College London. He is a media theorist and historian of science researching how digital technologies shape science, culture, and the environment. His book, From Information Theory to French Theory, is forthcoming from Duke University Press.