Derrida's Macintosh

I've been finishing an essay for a friend about Friedrich Kittler and the "media a priori" of philosophy. It's an idea I've had on the back burner for several years. Ever since Wolfgang Ernst told me the story of Heidegger's radio--a burgundy Grundig 88, which he belatedly allowed into his otherwise techno-primitive hillside cabin during the Cuban Missile Crisis for fear of nuclear catastrophe--I've thought a lot about the technologies beneath and behind philosophy. I'm thinking of Freud's mystic writing-pad, or Nietzsche's typewriter (to which Kittler devoted a whole chapter in Discourse Networks). But that's just the start... Why not also Sarah Kofman's camera obscura, Du Bois's data vis, Derrida's Macintosh, Deleuze's VCR...a parade of philosophy's media a priori. Continue reading

Enjoying these Theory Podcasts...

What's Left of Philosophy -- absolute masterclass.

Acid Horizon -- love these guys.

Why Theory -- 95% wrong most of the time (why Hegel??) but very fun listening.

Aufhebunga Bunga -- high quality riffing.

Machinic Unconscious Happy Hour -- good vibes.

"Macintosh"

"...until the day when he himself closed his Macintosh."

Dear reader, I give you what might be the worst poem ever written by a theorist. Yes, I'm speaking of Michael Fried's poem titled "Macintosh," published by the venerable journal Critical Inquiry in 2007 in a special issue on the death of Jacques Derrida. Very difficult to top this...

+ + +

Macintosh

It all hangs by a hair:
one day everything is going well
and the next some test result comes back just slightly awry
and you are embarked.

When a visitor let drop the news
that Jacques was seriously ill
I interjected, "Vous êtes sûr?" What I meant was: How can that be--

as if the rate at which he produced his books
not to mention the avidity with which each was devoured
would keep him safe from harm
until the day when he himself closed his Macintosh.

When something ceases not to write itself

I'm returning to Kittler more and more, and in fact trying to write about him... I hit on a treasure trove of material that I'll say more about in future. But for the moment I wanted to post a few thoughts about identity and difference, and how identity and difference connect with the digital and the analog. The prevailing position today is that the analog real is defined via difference while the digital symbol is defined via identity. There's something to that. But the more I explore the theory of the analog and the digital, the more convinced I am that identity and difference exist somehow before or beyond the digital and the analog. We can talk about analog identity and analog difference, but we can also talk about digital identity and digital difference. Let's walk through a bit of this with reference to Kittler and Lacan. Continue reading
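To make the contrast concrete, here is a toy sketch (my own illustration, not Kittler's or Lacan's): digital symbols admit strict identity--two tokens of "A" either are or are not the same symbol--while analog magnitudes are only ever compared by degree of difference, within some threshold.

```python
# Digital identity vs. analog difference, in miniature (illustrative only).

def digitally_identical(a: str, b: str) -> bool:
    # Two tokens of a symbol are the same symbol or not -- no middle ground.
    return a == b

def analogically_close(x: float, y: float, tolerance: float = 1e-9) -> bool:
    # Analog magnitudes are not compared for bare identity,
    # only for a difference smaller than some threshold.
    return abs(x - y) < tolerance

print(digitally_identical("A", "A"))       # True: identity is exact
print(analogically_close(0.1 + 0.2, 0.3))  # True, but only within a tolerance
print((0.1 + 0.2) == 0.3)                  # False: the continuous resists identity
```

The last line is the usual floating-point surprise, which works nicely here as an emblem of the analog leaking into the digital: the machine's approximation of continuous quantity refuses exact identity.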

Software and/as Math

In the past I've made claims like "software is math"...and I usually get taken to task over it. I acknowledge that others, particularly humanists and social scientists, might be uncomfortable with such a reduction. Math and computation are not the same thing. And certainly the experience or usage of computers is a broad arena, spanning many fields including sociology and anthropology, and not merely reducible to boolean logic or symbol manipulation. Given time it would be necessary to define these terms more clearly and show how they are related. That said, I consider math, logic, and computation to be intimately connected, often so intimately connected as to justify reducing one to another. So for the time being I'm tempted not to soften the claim but in fact to push the point further. Continue reading
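One minimal sense in which the claim holds (my sketch, not an argument from the post): a mathematical definition can already be a program. The Peano axioms for addition--a + 0 = a, and a + S(n) = S(a + n)--transcribe directly into running code.

```python
# Peano addition: the mathematical definition *is* the computation
# (illustrative sketch; naturals only).

def add(a: int, b: int) -> int:
    # a + 0 = a
    if b == 0:
        return a
    # a + S(n) = S(a + n), with S(n) written as n + 1
    return add(a, b - 1) + 1

print(add(3, 4))  # 7
```

Whether this identity of definition and procedure licenses the full reduction of software to math is exactly what's at stake in the debate; the sketch only shows how close the two can get.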

Can You Forge the Letter "A"?

No, you can't. At least that's the conclusion of two key philosophers of digitality, Nelson Goodman and Gottlob Frege. It's impossible to forge the discrete elements of a symbolic system, elements like the letter "A," the number "7," or the truth values "true" and "false."

Symbolic expressions like words and sentences can be forged, of course, not to mention paintings, which are eminently forgeable. But symbols themselves are unforgeable. Or so claim both Goodman and Frege.

I described Frege previously in a post on "Digital Univocity." Today I'd like to explore Goodman. If Frege approached the question from the perspective of sets and truth values, Goodman's approach was essentially semiotic and graphical. And while they both come to the same conclusion, they do so for quite different reasons. It's worth walking through the argument to see why and how they claim the uniqueness of discrete atoms. This sense of the absolute incorruptibility of the symbolic order gets to the very heart of digital philosophy. Continue reading
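A quick gloss in code (my illustration, not Goodman's own formulation): a symbolic system identifies characters only by type, never by the history or physiognomy of a token. Any mark that counts as "A" simply is an "A," so there is nothing left over for a forgery to fail to reproduce.

```python
# Type vs. token: the symbolic order sees only the type (illustrative).

handwritten_token = "A"  # imagine a mark scrawled by a would-be forger
printed_token = "A"      # imagine the character in the original manuscript

# Once both marks are admitted as tokens of the letter "A", the system
# registers them as the selfsame symbol -- here, the same codepoint.
print(ord(handwritten_token) == ord(printed_token))  # True
```

Contrast a painting: its identity is bound to the singular history of one physical object, which is precisely what a forgery falsifies. A discrete symbol has no such history to falsify.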

Support the GSOC Strike at NYU

UPDATE -- GSOC has reached a tentative agreement with the administration. Very proud of all the graduate workers at NYU. Strikes work!

+ + +

I support the NYU Graduate Student Organizing Committee (GSOC) in their process of negotiating a fair contract with the university administration. NYU president Andrew Hamilton -- who makes two million dollars per year -- has announced he won't negotiate, thus forcing a strike by graduate students.

As an NYU faculty member I stand with the graduate students. I will be respecting the picket line both physically and virtually, including classes and public events.

If you work at NYU please consider signing this open letter calling on the university to settle a fair contract.

Please also consider donating to the mutual aid fund.

On Epigenesis

In March 2020 I participated in a round table discussion at New York University on the question of epigenesis in the work of Catherine Malabou. Along with Malabou herself, we were joined by Alexander Miller, Emily Apter, Peter Szendy, and Emanuela Bianchi. The talks from the event have been edited and published in the current issue of October magazine under the title "On Epigenesis."

Uncomputable

I'm happy to announce that my next book Uncomputable will be published in the fall by Verso. Ten years in the making, this book narrates a series of episodes from computer history, reanimated by hands-on experiments in coding and building things. I'm excited to have it see the light of day, and am planning a few special surprises to accompany the fall launch. Stay tuned...

+ + +

(Publisher's book blurb)

Narrating some lesser-known episodes from the deep history of digital machines, Alexander Galloway explains the technology that drives the world today, and the fascinating people who brought these machines to life. With an eye to both the computable and the uncomputable, Galloway shows how computation emerges or fails to emerge, how the digital thrives but also atrophies, how networks interconnect while also fraying and falling apart. By rebuilding obsolete technology using today's software, the past comes to light in new ways, from intricate algebraic patterns woven on a hand loom, to striking artificial-life simulations, to war games and black boxes. A description of the past, this book is also an assessment of all that remains uncomputable as we continue to live in the aftermath of the long digital age.

https://www.versobooks.com/books/3885-uncomputable

Questions. Answers.

People have a lot of questions about digital media. I have answers.

Facial recognition technology -- should be illegal.

NFTs & blockchain -- stupid wasteful tech. pull the plug.

Facebook and Twitter -- nationalize them. end data harvesting. block all advertising.

Javascript -- awful. i avoid it like the plague. alas, one of the most successful languages in the history of computing.

Jeff Bezos et al -- expropriate 100% of tech billionaire wealth. they all go to zero. maybe give them a ribbon like at the end of Wizard of Oz.

self-driving cars -- stupid wasteful tech. will increase carbon emissions and immiserate countless workers.

Uber and gig economy -- shut them all down. replace Uber with a living wage and full unionization.

HTML 5 and CSS -- psychotic. if you can code this schizo tech you are a God.

shaders -- also psychotic. but in a good way.

AI and machine learning -- just a fancy way to calculate an average. based on pilfered human labor and data. a total scam.

The Cloud -- a bad metaphor for a server farm.. which is a bad metaphor for a computer.. which is a bad metaphor for a female secretary.. which is (etc etc)

Swift -- apple did two smart things. switching to unix. and killing obj-c.

Hades -- pretty good game actually!