[Image: Melania Trump reading from Error-Correction]
Error-Correction: an introduction to future diagrams

Error-Correction: an introduction to future diagrams takes its title from the theory of least squares, developed by Adrien-Marie Legendre and Carl Friedrich Gauss around 1805, and from its subsequent application in celestial mechanics, ‘formulating true statements in advance of experience’ (Bender and Marrinan 2010, 161), in the work of John Couch Adams and Urbain Le Verrier. The mathematician Pierre-Simon Laplace, at the École Polytechnique, championed the possibility ‘to determine the probability of causes lying behind events by working backwards from observations’ (Bender and Marrinan 2010, 160) in his 1812 Théorie analytique des probabilités. Adolphe Quetelet went on to apply statistical analysis to social data, producing an estimation of the ‘average man’ in Sur l’homme (1835). The Scottish physicist James Clerk Maxwell continued this work in his study of gas molecules, writing:


This method of dealing with groups of atoms, which I may call the statistical method, and which in the present state of our knowledge is the only available method of studying the properties of real bodies, involves an abandonment of strict dynamical principles, and an adoption of the mathematical methods belonging to the theory of probability. (Bender and Marrinan 2010, 179)

Subsequent work by Ludwig Boltzmann in thermodynamics proved Maxwell’s theorem on the ‘distribution of velocities among particles’ (Bender and Marrinan 2010, 181), which led to it becoming known as the Maxwell–Boltzmann distribution. Boltzmann went on to develop the H-theorem, leading to Boltzmann’s law, which abandoned a mechanistic theory of particles in motion ‘to embrace a concept of energy states in time’ (Bender and Marrinan 2010, 183). Across several different branches of science, visual perception was overtaken by what Maxwell termed ‘the true logic of this world ... the Calculus of Probabilities, which takes account of the magnitude of the probability’. It is significant, he writes, that ‘this branch of math ... is generally thought to favour gambling, dicing, and wagering, and therefore highly immoral’ (Bender and Marrinan 2010, 178, footnote 79).
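By way of reference, and as a minimal sketch only (standard textbook forms, not drawn from Bender and Marrinan), the two results invoked above can be written out. The method of least squares chooses the model parameters \beta that minimise the sum of squared errors between the observations y_i and the model’s predictions f(x_i; \beta):

\hat{\beta} = \arg\min_{\beta} \sum_{i=1}^{n} \bigl( y_i - f(x_i; \beta) \bigr)^{2}

and the Maxwell–Boltzmann distribution gives the probability density of a molecular speed v, for particles of mass m at temperature T, where k_B is Boltzmann’s constant:

f(v) = 4\pi \left( \frac{m}{2\pi k_B T} \right)^{3/2} v^{2} \exp\!\left( -\frac{m v^{2}}{2 k_B T} \right)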


Error-Correction: an introduction to future diagrams is a script reflecting on the influence of calculus, in which each articulation is just one of many takes, constantly re-edited. It references and includes openly appropriated texts, contemporary commentary, news items, and anecdotal evidence, culminating in a convergence of many interwoven threads, whereby the voice (through language) is constituted between someone else's thoughts and the page. It attempts to acknowledge the multitude of influences on how thought might be arrived at, with an emphasis on any subject to speak of (questions of authorship also arise) emerging in synthesis with its environment.


I was interested in the complex ways that the body receives and processes multiple sense data, and took a cue from the physician and physicist Hermann von Helmholtz, whose research into the mathematics of the eye led him towards probability theory, and who suggested that:


‘human perceptions, so prone to error, are, at best, an approximation, an estimation even, that “operate(s) within the protocols of instruments”’. His premise was that human eyes have ‘a hard-wired, involuntary drive to minimize perceptual errors’; he ‘discovered error-correction in the very nerve endings of our bodies’.


Here, error is no mistake, but the driver of probable outcomes.


I was interested in the way that the body was in question in this assemblage, acting at several different scales and temporalities simultaneously, whilst the script denied any primacy of the voice, or of an identifiable subject, all the while being constituted through other people's words.

The title speaks of the development from perspectival drawing to projective geometry and further, beyond the human sensorium, via mathematics to calculus, leading to probability theory and the ability to speculate on the future. It suggests that, just as Euclidean geometry was surpassed by calculus, another strategy might be necessary within the current diagram of power: oscillating somewhere between remaining visible, whilst also hidden, within the potential of being multiple. This became pressing as it grew clear that the protocols of platform capitalism relied entirely on a ‘freedom’ of expression drawn out by highly addictive and sophisticated scripting mechanisms, as you uploaded exactly what was on your mind, 24/7.


Over the past few years these ideas have, of course, become less conspiracy theory and more mainstream media, via the revelations regarding Cambridge Analytica and a more widespread understanding of the protocols of platform capitalism, whereby, as a rule of thumb, if it’s ‘free’, you are the product, and the currency of data becomes clear. When ‘freedom of expression’ is so caught up in protocols that grant free access to platforms relying inherently on ‘you’ as the product, data emerges as a currency, and its analysis buys participation. The important drive to become ‘visible’ in terms of political representation becomes distorted when data extracted, willingly or otherwise, leads to wrongful or reductive analysis and categorisation, adding further to unprecedented levels of surveillance and contributing to often systemic practices of discrimination. This is a form of power that Deleuze characterises as modulatory, which, Hayles writes, works through the manipulation of the flows that move bodies and the thresholds across which they must pass, and in which, as Zuboff points out, data becomes a strategic asset and a behavioural surplus, underwriting, in turn, a monetary surplus.


A nonlinear text that doesn’t sit adjacent to what came before, as proximity no longer presupposes contiguity, and causality is no longer possible in the ordinary sense of the word. Reliant on syncopation and semantic rhythm to drive the score, the axiom, or recipe (Cramer), that dictates the form is betrayed by a pornography of language, as earworms seep until they crystallise in thought. Any subject to speak of emerges from this assemblage via an untrustworthy body with faulty equipment, produced in synthesis with its environment, that feels through prosthesis, with a body that matters, without mattering.
