I ended a recent post on cheating with a reference to necessity. But necessity is such a miserable concept these days, mocked and condemned by almost everyone. It's worth exploring the concept a bit further, which I'll do here by way of a related term -- determination -- equally scorned and dismissed in contemporary theory.
In the physical sciences the “standard” model of determination goes something like this. First, (A) there are deterministic systems, Laplacian systems with known laws from which behavior can be calculated and predicted. Then there are (B) complex or non-linear systems -- think of a coin toss or the famous example of the double pendulum -- systems still subject to known laws, but which appear chaotic or unpredictable due to minute variations below the threshold of observation. (A coin toss does not randomize the laws of physics; nevertheless it produces a seemingly random outcome within deterministic laws.) Finally, there are (C) systems that are genuinely non-deterministic, and which truly deviate from the Laplacian model. Quantum mechanics is probably the best example of such indeterminate phenomena.
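To make case (B) concrete, here is a minimal sketch in Python. The logistic map is a swapped-in textbook example standing in for the coin toss or double pendulum: the rule is fully deterministic, with no randomness anywhere in it, yet two starting points differing below any plausible threshold of observation quickly become uncorrelated.

```python
def logistic(x: float, r: float = 4.0) -> float:
    """One step of the logistic map x -> r * x * (1 - x): a known, deterministic law."""
    return r * x * (1.0 - x)

# Two initial conditions differing by one part in ten billion --
# a variation far below any "threshold of observation."
a, b = 0.2, 0.2 + 1e-10

for step in range(60):
    a, b = logistic(a), logistic(b)

# After a few dozen iterations the trajectories have fully diverged:
# known laws, seemingly random outcome.
print(a, b, abs(a - b))
```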
Indeed the physical sciences have inspired certain currents in recent theory, with someone like Karen Barad using the “weird indeterminacy” of quantum mechanics as a way to locate queerness in the very atoms and particles of the physical world, or others in the materialist tradition, be they Deleuzian or otherwise, finding inspiration in so-called “aleatory” matter. (And, in fact, Laruelle's most recent writings from the last decade refer frequently to quantum theory.)
In political theory, the story of determination is told in a different way. I'm thinking of that old distinction in Marxism -- but not only Marxism -- between determinism and voluntarism, that is to say between (A) the mode of production as a determining instance, and (B) the practical unfolding of class struggle -- the latter a struggle that might forge a new society (in political terms), whereas the former is subsumed by a specific developmental process (in historical terms).
Here I've been greatly influenced by the work of Fredric Jameson, particularly the way in which determinism persists in Jameson's “ontology” -- although I'm using scare quotes to indicate that, properly speaking, Marxists do not have ontologies, and indeed that Marxism requires, among other things, the abstention from metaphysical speculation. One might say, correctly, that Jameson is “against method,” to the extent that he never promulgates any sort of clean and precise method suitable for all occasions. The only quasi-methodological category in Jameson is the dialectic, but that functions in a very different way. Instead Jameson returns to the notion of a “condition.” Things are subject to conditions; there exists a material condition; etc. Hence determinism within the Marxist tradition has to do with the determining nature of the material conditions of existence.
Meanwhile, Marxists are often criticized for taking such a stance, with determinism recast as a kind of totalitarianism, a doom-and-gloom pessimism regarding the potential for resistance, much less revolution. We don't need to rehearse the whole story here: Theodor Adorno as the Patron Saint of Marxist Pessimism, with others -- from Louis Althusser to Stuart Hall to Judith Butler -- amending the model in such a way as to allow for mutual determination, or, in what was all the rage twenty years ago, individual agency.
These days a number of interesting thinkers are pursuing various forms of indeterminacy, from Quentin Meillassoux's work on contingency and chaos to Luciana Parisi's recent book Contagious Architecture, which is focused on randomness and incomputability. I find this work fascinating, and acknowledge the important role that contingency plays in anti-essentialist struggles. Still, I can't get past the fact that radical contingency has, historically, been so closely associated with the rise of the bourgeoisie, and certainly with capitalism as a whole. Contingency, chaos, and indeterminacy have been used as weapons against the people for so long that I have a hard time seeing them in a neutral light. And in some cases chaos and contingency have been, de facto, the arch-conservative position. I'm thinking of the way in which popular power is viewed as chaotic and violent by any number of political theorists from Hobbes on down, producing that intractable form of political condescension documented so well by Rancière in his book Hatred of Democracy.
Anti-determinism is thus fraught with difficulty. On the one hand, a deviation from determination is at the very heart of what it means to be political: “things can be otherwise.” But on the other hand, the erosion of fixity is one of the main sources of precarity and proletarianization: “all that is solid melts into air.”
As an alternative, perhaps we might consider a slightly different way of conceiving determination, not the standard model of determination inherited from the physical sciences, but a “non-standard” model of determination discovered elsewhere entirely.
I recently gave a talk at UC-Irvine at a conference organized around the theme of “n-determination.” I didn't cook up the theme, but I was attracted to it because it gets at the heart of what critical theory actually means (and, in reverse, the unspoken aversion that many have to the critical tradition). At the conference I attempted to define the term, a gloss which I will reproduce here.
In simplest terms, I interpret the theme of n-determination as a theory of “weak foundationalism.” Or, to put it more properly, n-determination means a “foundationalism of weakness.” But what does this mean?
First and foremost, n-determination is a form of determinism -- this must be acknowledged. (And those who consider the goal of all theory and action to be the production of greater freedom in the world will not be excited by the theory of n-determination.) Think of algebra and the notation of variables. N-determination means determination, but only a determination by n. It's not a determination by the Father, or by the Sovereign. It's not a determination by the Transcendental Subject, or by Whiteness, or by the Big Other. Not by Nature, or by Essence. In other words, the determining factor is suspended within a virtual space. Synonyms for this “n-determination” might be terms like “the generic,” “the common,” or even “the undercommons.” In fact, I would also say that “materialism” is a synonym for n-determination.
To be clear, n does not mean “number in the abstract.” It does not mean infinity, or some infinite number space. If we're using “n” as a stand-in for universal identity, then we've gotten off track. N is not abstract, universal, infinite, or eternal; it is generic, finite, insufficient, and lived. N-determination does not occupy the role of the Prime Mover or absolute cause, but rather determines “in the last instance” as Althusser or Laruelle like to say.
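As a loose illustration of the algebra analogy -- my own gloss, not anything from the conference theme -- consider how a symbolic variable behaves in code. The sketch uses the sympy library for symbolic algebra; the particular expression is arbitrary. The relation is fully determined as a relation, yet the determining term stays generic, resolving only when a value is finally bound, as if “in the last instance.”

```python
import sympy as sp

n = sp.Symbol("n")         # the determining instance, left generic
outcome = 2 * n + 1        # determined as a relation, suspended as a value

print(outcome)             # prints: 2*n + 1 -- no Father, no Sovereign, just n
print(outcome.subs(n, 3))  # only "in the last instance" does it resolve: 7
```

The point of the analogy is only that determination and genericness are compatible: the expression is never free, yet what determines it is never named in advance.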
Likewise, as an insufficient generic, this kind of materialism is, if you like, a “theory of impoverishment” or, as I said, a “foundationalism of weakness.” I mean this not so much in the religious sense -- the meek, the lamb, the martyr -- but in the theoretical sense: a theory of the minimal; a theory of insufficiency; a theory of philosophical impotence.
This is a form of pessimism, then, but only in the precise etymological sense of the term: the worst, the least, the last, the miserable, the base, the bottom. But at the same time it's also a form of utopianism, a utopia discovered within the withdrawal of sufficiency.
This is something of a reverse Nietzscheanism. A muscular subject who willfully overcomes the bankruptcy of the world might have been a radical subject position at some point in the past. But today the most radical gesture is to withhold the sufficiency of power. To withhold sufficiency itself.
Consider the age-old difference between transcendental philosophy and generic theory. Transcendental philosophy is the thing that is transcendental vis-à-vis the real. Generic theory, by contrast, is the thing that is immanent vis-à-vis the real. From an existential point of view, philosophy asks the same question over and over: what is n? That is, what is the identity of anything whatsoever? And from an ontological point of view the inquiry is similar: all is n, or the All as the principle of unity that subtends anything whatsoever. (Hence philosophy's chief maladies: essentialism, logocentrism, ontotheology.)
Generic theory does it differently. Contra philosophy's approach (what is n?), the existential question of generic theory is n is what? That is, how is anything whatsoever an instance of identity? Likewise the generic's ontological inquiry is n is all, or the “anything whatsoever” as the principle of unity.
Philosophy is always inflationary and maximalist. Even the most hard-nosed skeptics are philosophical because they remain sufficient unto themselves -- skepticism as “adequate” for thinking. By contrast theory creates a minimalism in thought. Theory is a rigorous science of the inadequacy of material life.
This is why I'm a vulgar determinist. For me, it's not so much a question of fleeing the determinism of Newtonian laws, to be saved by the weird indeterminism of quantum phenomena. And, while I'm still engrossed by political discussions of determinism and voluntarism, I doubt the secret will be revealed there either. Instead we might return to Jameson -- or to Laruelle, or any number of other thinkers -- and the notion of an “absolute horizon” or generic base.
In the end, the emphasis should fall not so much on determinist but on vulgar. A vulgar determinist; a determinism of the vulgar. To which I'll return in a future post....