I wrote before about the anti-computer. Let me continue some of those themes, using instead an adjacent label, the uncomputer.
In an initial sense, the uncomputer comes out of whatever is subordinated or excluded as a result of the standard model of the digital. The excluded term might be the flesh, or it might be affect. It might be intuition, or aesthetic experience. The excluded term might evoke a certain poetry, mysticism, or romanticism. Or it might simply be life, mundane and unexceptional. The uncomputer means all of these things, and more. The gist is that there exists a mode of being in which discrete symbols do not take hold, or at least do not hold sway. And in the absence of such rational symbols, modern digital computation becomes difficult or impossible. Sometimes this is called the realm of "life" or "experience." Sometimes it is called the "analog" realm--indeed analog computers are some of the oldest computers.
Meanwhile, the term uncomputable developed a series of other meanings, particularly in the Twentieth Century. Two of them are about limits: rational limit and practical limit. To be sure, the resistance to rationality is as old as rationality itself. Nevertheless a rash of rational paradoxes in the early Twentieth Century culminated in a series of limits to rationality from within that very reason. Bertrand Russell's paradox from 1901--"the set of all sets that are not members of themselves"--seemed to impose limits on what was possible within the theory of mathematical sets. Kurt Gödel's incompleteness theorems of 1931 showed that any consistent formal axiomatic system capable of expressing basic arithmetic will contain statements that are true but not provable within the system. Then in 1936 Alan Turing demonstrated that no general procedure can decide, for an arbitrary machine and input, whether that machine will eventually halt.
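Turing's argument can be made concrete with a minimal sketch of the diagonal construction. The oracle `halts` below is hypothetical, assumed only for the sake of contradiction; Turing's point is precisely that no such function can be written.

```python
def halts(program, data):
    """Hypothetical oracle: returns True if program(data) eventually halts.
    Assumed for the sake of contradiction; no such function can exist."""
    ...

def paradox(program):
    # Do the opposite of whatever the oracle predicts.
    if halts(program, program):
        while True:   # oracle says "halts" -- so loop forever
            pass
    else:
        return        # oracle says "loops" -- so halt immediately

# Now ask: does paradox(paradox) halt?
# If halts(paradox, paradox) returns True, paradox loops forever.
# If it returns False, paradox halts immediately.
# Either way the oracle is wrong, so no general halts() can exist.
```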
In other words, a key definition of "uncomputer" comes not from the outside of math and computer science but from within them. If before there were intuitive or poetic or biological limits to rationality, after Gödel, Turing, et al., there were also demonstrable limits to rationality from within rationality. Indeed in a certain sense computation in the Twentieth Century is defined more by the limits to computation, more by the uncomputable, than by a positive set of capacities. Or as Beatrice Fazi wryly put it, "the founding paradox of computer science is that it is a field that is defined by what it cannot do, rather than by what it can do."
At the same time there emerged the question of practical limit. Ask cryptographers about the "uncomputable" and they will respond: how much computing power do you have at your disposal? Can you afford to crunch the numbers until the sun burns out? Computability is thus also a strictly pragmatic question. Fields like cryptography excel at generating knots and obstacles within pure rationality, obstacles that greatly impede computation even if they are not strictly "uncomputable." Is it possible to reverse a hash function, to find an input that produces a given digest? In the abstract, yes, by exhaustive search; practically speaking, no. At least so goes the promise of cryptographic security.
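The gap between abstract and practical reversibility is easy to demonstrate. Here is an illustrative sketch (the function name `find_preimage` is mine) that brute-forces a SHA-256 preimage over a toy search space; run against the full space of possible inputs, the same loop would need on the order of 2^256 guesses.

```python
import hashlib
from itertools import product

def find_preimage(target_hex, alphabet=b"abcdefghijklmnopqrstuvwxyz", max_len=4):
    """Exhaustively search for an input whose SHA-256 digest equals target_hex.
    Feasible here only because the search space is tiny; a genuine preimage
    attack on SHA-256 would require on the order of 2**256 guesses."""
    for length in range(1, max_len + 1):
        for combo in product(alphabet, repeat=length):
            guess = bytes(combo)
            if hashlib.sha256(guess).hexdigest() == target_hex:
                return guess
    return None

target = hashlib.sha256(b"cat").hexdigest()
print(find_preimage(target))  # b'cat' -- "reversed," but only in a toy universe
```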
If these are three types of uncomputer--analog life, rational paradox, and practical limit--a fourth meaning of the term comes from the indiscernible and the indeterminate. In her book Contagious Architecture, Luciana Parisi has written about entropy, chaos, contingency, indefiniteness, change, and what she calls patternless data. For her, the uncomputer already exists; it exists in the cracks and excesses of data, in interferences and contingencies, and in how these things overwhelm seemingly impervious rational systems. (Parisi was inspired, in part, by the mathematician Gregory Chaitin, who has shown that there are numbers of infinite algorithmic complexity--numbers whose digits cannot be generated by any finite program--meaning they are, by definition, uncomputable.) Digital computation has traditionally relied on discrete symbols, relied on them being discrete and remaining that way. When symbols start to dissolve, they pose a threat to computability. When a symbol cannot be fixed, it becomes more difficult to calculate.
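One can glimpse "patternless data" through a rough, computable proxy: compression. True algorithmic (Kolmogorov-Chaitin) complexity is itself uncomputable, but a compressor makes the intuition tangible; patterned data shrinks, patternless data does not. A small sketch, with illustrative inputs:

```python
import os
import zlib

# Compressed length as a crude proxy for algorithmic complexity.
# (The true Kolmogorov-Chaitin complexity of a string is itself uncomputable.)
patterned   = b"01" * 500       # 1000 bytes generated by a very short rule
patternless = os.urandom(1000)  # 1000 bytes with no discernible pattern

print(len(zlib.compress(patterned)))    # small: the regularity compresses away
print(len(zlib.compress(patternless)))  # roughly 1000: incompressible, "patternless"
```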
Or does it? Part of the story involves incorporating the indiscernible and the indeterminate into the very heart of computation. According to Parisi, "error, indeterminacy, randomness, and unknowns in general have become part of technoscientific knowledge and the reasoning of machines." Indeed part of the history of computation is the history of the uncomputable being colonized by the computable. Claude Shannon in 1948, for instance, defined information explicitly in terms of entropy. Since then, randomness and contingency have been incorporated into the body of computation, not excluded from it. 3D models are frequently textured using procedural randomness. Images are "improved" by anti-aliasing, essentially a form of aesthetic noise. And the empirical turn in AI has shown how contingent data is ultimately much more valuable than predictable, determinate symbols. In a sense, randomness and contingency have become fully industrial. Today the computer is closely intertwined with the uncomputer.
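Shannon's move is easy to state in code. A minimal sketch of his entropy measure, H = -Σ p·log2(p), computed over symbol frequencies: information peaks precisely where the message is least predictable.

```python
import math
from collections import Counter

def shannon_entropy(message):
    """Bits per symbol: H = -sum(p * log2(p)) over observed symbol frequencies."""
    total = len(message)
    return sum(-(n / total) * math.log2(n / total)
               for n in Counter(message).values())

print(shannon_entropy("aaaaaaaa"))  # 0.0 -- perfectly predictable, zero information
print(shannon_entropy("abcdefgh"))  # 3.0 -- maximal uncertainty over eight symbols
```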