This is the text of a talk given as part of the lecture series “Unstitching Datafication”, organised by the Media for Cooperation research centre at the University of Siegen. It's very much a work-in-progress, and I'm interested in any comments or feedback: feel free to email me at resistingai@gmail.com or find me on social media.
intro
I would like to thank Christoph and the Media of Cooperation research centre for inviting me to be part of the Unstitching Datafication series. I'll try to honour the series by using terms like 'seams' and 'unstitching' in my talk.
This talk is titled 'Decomputing as Resistance'. In it, I will argue that contemporary AI reveals the seams of our current system in ways that can't be ignored but have to be countered. Anyone who's used a large language model for any amount of time will be familiar with how glitch-prone they are. I'm proposing that AI is itself a glitch; a stuttering misstep of the neoliberal order that hints at nothing less than internal disarray.
What I'll be proposing as a countermeasure is decomputing; an approach that takes direct aim at AI, but is concerned with unstitching more than just the digital. Decomputing is a way of reconfiguring our broader social and economic relations.
ai considered harmful
When I talk about AI I'm referring to the specific technologies that legitimate so much social and environmental damage right now, which are neural networks and transformer models.
It's been fascinating to watch the rise of these mechanisms of AI, because they're always faking it. Predictive AI is as unreliable as generative AI; they're both giant operations in pattern-matching whose only hold on reality is correlation. It's QAnon as computer science.
The fact that they're also foundationally opaque makes it impossible to meaningfully disentangle the internal process that led to a particular output. Pro-tip: reasoning models can't really tell you how they got to an answer. And yet, AI is presented as the generalisable solution to society's trickiest problems, and an agent that's so intelligent it will definitely be able to replace you.
While actual AI isn't fit to replace anything or anyone, it does work as an engine of precaritisation and marginalisation, of Uber and algorithmic welfare cuts. It's an apparatus that crudely extracts as much encoded human knowledge as it can in order to provide a shoddy substitute for key social functions like education and healthcare, while further concentrating wealth and power.
infrastructural violence
This structural violence is now complemented by equally egregious amounts of environmental violence.
As we now realise, the alleged benefits of a chatbot in your pocket come with a shockingly high price in terms of energy use and water consumption, and a GPU supply chain that depends on colonial extractivism and conflict minerals.
It's not so much that AI's energy demands are the leviathan that tips us over into irreversible climate change; the fossil fuel industry and industrial farming don't need any help on that front.
It's more that the unchallenged inevitability of AI as the only future for economic development and the key to geopolitical power means that AI companies can abandon their performative pretence at sustainability, and become instead a way to re-legitimise burning as much energy as possible.
It means that when decaying national infrastructures that already stagger under the burden of privatised profiteering are brought to moments of collapse by the additional demands of datacentres, the AI will get priority over mere human needs.
scale
I want to focus for a moment on scale as the concept that binds together the technical and social apparatus of AI.
The connectionist model of AI that underpins it all has existed for decades, but was largely ignored up to 2012 because it took too much data and computing power to grind out plausible answers. The unfortunate convergence of social media, crowdsourced datasets and GPUs changed all that.
The AI industry has since generated a mini-universe of self-reinforcing and eugenics-flavoured metrics that claim to measure progress while failing to tackle the things that really need to change. The basic driver of success on these metrics is the scaling of datasets and computing power. If we count computing as floating-point operations or FLOP we've gone from early AlexNet at 10^17 FLOP to recent models at 10^25 FLOP or more. To put that into perspective, this scaling outpaces any other so-called tech revolution from mobile phone adoption to genome sequencing.
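To make that scaling concrete, here is a back-of-the-envelope comparison (a rough sketch; the FLOP figures are order-of-magnitude estimates, not precise measurements):

```python
# Rough comparison of training-compute scaling.
# Figures are order-of-magnitude estimates, not precise measurements.
alexnet_flop = 1e17   # approximate training compute, AlexNet (2012)
frontier_flop = 1e25  # approximate training compute, recent frontier models

growth_factor = frontier_flop / alexnet_flop
print(f"Scaling factor: roughly {growth_factor:.0e}x")
```

That is roughly a hundred-million-fold increase in training compute in about a decade, which is the sense in which this scaling outpaces earlier tech revolutions.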
The overarching logic of all industry metrics is exactly the same as GDP; the only thing that matters is growth, growth, growth, no matter the collateral damage along the way. Unrestricted scaling is a vision of infinite growth based on identifying previously unrealised forms of enclosure, and also a claim to forms of knowledge that will reach beyond human understanding.
It's scale that attracts the impatient flows of venture capital, and scale that underpins claims about emergent superintelligence. In the eyes of AI's advocates there are no social, environmental or planetary limits. Indeed, they argue for going further and faster because only AI can save us from the climate crisis and cure all human disease.
total mobilisation
I suggest that the convergence of forces around AI infrastructure can be understood as a form of ‘total mobilisation’, a term coined in the 1930s by ultra-nationalist writer Ernst Jünger to characterise the channelling of the entire material and energy resources of a nation into a new technological order.
His claim was that we have entered a new era, one that requires "the conversion of life itself into energy" as nations are "driven relentlessly to seize matter, movement and force through the formalism of technoscience".
Total mobilisation legitimates a new form of political order based on the vitalism of conflict. In the present moment, we're seeing the big AI companies abandon their stated commitments to the common good and rally round renewed visions of national dominance through military and economic might.
The outcome of mobilising all available energies isn't simply capital accumulation but the severing of society from its previous moorings, and an alignment with Jünger's vision of a breakthrough to a new epoch through violent technological transformation.
Reading the incessant push for more and more AI through the lens of total mobilisation makes sense of its apparent nihilism as a Nietzschean will to power.
technofascism
I'm not saying that our tech and political leaders are keen students of Jünger's ideas. It's more that total mobilisation captures the cult-like levels of commitment to AI from corporate bosses and national governments alike.
The neoliberal order is breaking down under the accumulated weight of its own contradictions and the resulting system shocks, like austerity and climate change. Those who wish to maintain a massive asymmetry of power and wealth seem to have no answers aside from claims about sci-fi technology and increasing authoritarianism.
Total mobilisation resonates strongly with the accelerationism and politics of neoreaction that deeply pervade the tech industry and form the bridge to political movements like MAGA and the far right. Understanding these developments alongside the proliferation of AI infrastructure as a form of total mobilisation suggests that we're dealing with a technopolitical phase shift, one that won't be dissuaded or held back by rationality or regulation and will treat those lives outside the tiny elite destined to lead this change as essentially disposable.
degrowth
Decomputing is an attempt to respond to the ever-growing social and environmental damage resulting from our current direction of travel. Decomputing identifies a rejection of scale as a way to mitigate the worst effects and a heuristic for alternative ways forward.
AI's scaling has appeal and power because it arises within a broader system based on the unifying principle of unconstrained growth. Decomputing is a turn towards degrowth, as a direct challenge to AI's extractivism and to the systemic logics underpinning it.
Most importantly, degrowth isn't merely a refusal to depend on expansionism but a switch of focus to an alternative metabolism based on sustainability and social justice. Decomputing seeks to interrupt AI's extractivism in the here and now in ways that align with a transformation of the wider political economy.
deautomatisation
Decomputing also opposes scale because it induces automatisation; that is, a state where autonomy and the capacity for critical thought are undermined by immersion in a system of machinic relations.
In this sense, AI is an intensification of the institutional, bureaucratic and market structures which already strip agency from workers and communities and place it instead in opaque and abstract mechanisms. Decomputing, by contrast, is the disentangling of thought and relations from AI's reductive influences.
As a practical example, decomputing would challenge the way the threat of algorithmic welfare cuts is justified by having a human-in-the-loop, as if this guaranteed due process and as if no-one had ever heard of automation bias or choice architecture. Decomputing is the process of developing alternative forms of organisation and decision-making that draw instead on reflective judgement and situated responsibility.
Decomputing is therefore as much about deprogramming society from its technogenic certainties as it is about decarbonising its computational infrastructures.
Datafication and an ideology of efficiency play into AI's careless and dehumanising optimisations. Decomputing attempts to wrest social praxis away from the utilitarian cruelty that is openly celebrated by the adherents of reactionary technopolitics. It is a deliberate turn away from the alienating frameworks of efficiency and optimisation and a return to context and ‘matters of care’ where our mutual vulnerabilities and dependencies are central to social reproduction.
convivial tools
Decomputing asserts that the development and deployment of any advanced technologies should be subject to social sanction. While uncommon now, it was widely argued in the 1970s and 1980s that the adoption of technology likely to have a widespread impact on society should depend on critical interrogation and collective approval.
We can lift directly from those tendencies in the form of Illich's work on tools for conviviality. He defined tools as both technologies and institutions, and convivial tools as those that enabled the exercise of autonomy and creativity, rather than the conditioned responses demanded by manipulative systems.
The Matrix of Convivial Technologies extends Illich’s ideas by specifying questions through which conviviality can be assessed, including accessibility (who can build or use it?), relatedness (how does it affect relations between people?) and bio-interaction (how does the tech interact with living organisms and ecologies?).
people's councils
However, it's unlikely that we're going to get very far by simply asking reasonable questions about the point of all this mobilisation. The power of big tech has extended beyond regulatory capture to a level of state capture, or at least, to a situation where there's an increasing merger with political structures.
Our current systems are incredibly fragile. DOGE did a good job of showing the seams, when it demonstrated that the centralisation and digitisation of institutions renders them vulnerable to what was essentially a form of far right cyberattack.
Decomputing takes instead a prefigurative approach to technopolitics, in that its practice enacts the forms of empowerment and sustainable relations that it seeks to bring about. The basic form of decomputing is the kind of assembly that I've described elsewhere as workers' and people's councils.
This kind of collectivity, self-constituting and rooted in local context and lived experience, can be applied at any level and in any setting, from parent-teacher associations who object to young minds being made dependent on chatbots to communities threatened by the construction of a hyperscale datacentre.
Wherever AI is proposed as 'the answer', there is already a seam that needs unstitching, a structural problem where those who are directly involved should be at the forefront of determining what needs to be changed instead.
technopolitical resistance
Resistance to datacentres, which we can already see happening from the Netherlands to Chile, shows the potential of intersecting seams, or what I would call infrastructural intersectionality. These intersections occur because the same communities who experience power cuts due to local grid overload or breathe polluted air from gas turbines, like the black communities living around Musk's xAI datacentre in Memphis, are also likely to work in exploitative jobs where their immediate boss is basically an algorithm.
It's not hard to imagine a situation where resistance to a new datacentre is in solidarity with the wildcat strikes of workers in the local Amazon fulfilment centre through a joint assembly of workers and the community.
I think we can see a different dimension of the potential for decomputing in the disability movement. They are resisting savage welfare cuts justified by stigmatising algorithms of suspicion and by being labelled as unproductive members of society. They also have a deep understanding of the way disability itself is socially constructed by the technologies that society chooses to use or not to use.
The concept of crip technoscience is a critique of this role of tech, combined with approaches to hacking and adapting it to make people's lives more livable; hence creating convivial technologies that are sustainable and enabling.
decomputing
Decomputing is the development of a counter-power to the technopolitical apparatus of AI and its totalising transformations.
What decomputing proposes is one pathway towards societies built on relations of care, whose attributes aren't abstraction and manipulation but mutual aid and solidarity. It asserts that autonomy, agency and collective self-determination are in inverse proportion to the degree to which human relations are skewed by algorithmic ordering.
Decomputing is the prefigurative decoupling of advanced computation from social goals. It's the reassertion of the need for convivial tools and the construction of forms of collective social power that can bring them into being.
There are examples of contemporary struggles that don't start directly from resisting AI, but nevertheless combine the practice of self-organised resistance with the goal of constructing alternative futures. One such example is the GKN factory collective, where a factory in Italy making vehicle axles was bought by a hedge fund that tried to close it down and cash out. The workers refused, occupied their workplace and formed a collective with the local community to repurpose their tools for a just transition; that is, for worker justice and environmental sustainability. They now produce cargo bikes and recycle solar panels and are continuing their struggle under the partisan slogan 'Insorgiamo!' or 'We Rise Up!'.
a world to win
Demanding the social determination of technology is a way to unstitch the loss of collective agency which has resulted from decades of neoliberalism.
It's this collective agency that we're going to need to resist the rising wave of fascist political movements that want to roll back every kind of social equality, and project their nihilistic vision through technologies that are already coded as anti-worker and anti-democratic.
And this is my final point about decomputing, that it's not a vision for any return to a pre-AI status quo but a deliberate claim on a better world for all. Effective resistance has never been founded on a defence of an already unjust state of affairs. It only makes sense as the precursor to something better, by having the goal of a fairer and more solidaristic society.
Decomputing is the combination of degrowth and critical technopolitics that says other worlds are still possible, and that we intend to bring them into being.