Conference Agenda

Overview and details of the sessions of this conference.

Session Overview
Session
(Symposium) Design as a contested space: technological innovations, critical investigations, military interests
Time:
Wednesday, 25 June 2025:
3:00pm - 4:30pm

Location: Auditorium 2


Presentations

Design as a contested space: technological innovations, critical investigations, military interests

Chair(s): Jordi Viader Guerrero (TU Delft, The Netherlands), Eke Rebergen (University of Amsterdam, Amsterdam School for Cultural Analysis (ASCA)), Dmitry Muravyov (TU Delft, The Netherlands)

The design of intimate digital technologies is irrevocably intertwined with ethical and social considerations. Various methods and guidelines have been developed to help designers navigate these, often valuing personal (human) decision making and ethical reflection. For some designers and researchers, this means questioning, or beginning to struggle with, the basic propositions and ideologies on which these technologies are built. Current developments in AI, for example, are predicated on an epistemological assumption of knowledge as identification and prediction; on the material and economic underpinnings of ‘publicly available’ large-scale data sets, hyperscale data centers, and bottlenecked GPU supply chains; and on a political rationale that belittles non-algorithmic decision making and concentrates power in a few corporate actors (Alkhatib 2024). Political ideologies around technology are shifting from an optimism that equates technology with progress to the wielding of technology as a blunt manifestation of power (Merchant 2023; McQuillan 2022), and the history and deployment of AI have been recurrently linked to military research (Pickering 2010; Halpern 2015). These technologies, and their associated funding schemes, are therefore understood to be, and to have been, part of military innovation and the preparation for war.

Questioning and struggling against the assumptions of technological design, and against responsible or ethical design practices that build upon or merely correct the course of existing technological developments and ideologies, can lead to more contrarian critical design/technical practices and research (Agre 1997; Harwood 2019; Ratto & Hertz 2019; Mazé & Keshavarz 2013). Rather than applying ethical or philosophical theories to improve existing technologies, whose terms are frequently defined by corporate actors, a critical technical practice looks for further interruptions, deconstructions, breakdowns, or minor tech investigations (Andersen & Cox 2023) in order to occupy and politicize design practice as a locus for questioning the entanglements between technology and society (Soon & Velasco 2024). As such, a critical technical practice questions the implicit epistemological and political goals of design and engineering while repurposing them as a form of materially-bounded critical reflection.

This symposium/panel develops the complexities and challenges of such contrarian practices, specifically in the context of the classroom and against a backdrop of AI technosolutionist imaginaries (Morozov 2013) and the increasing ties of design practice and research to militarization. We propose these practices as a promising field of creative exploration and research that goes against the grain of the usual design and research programs, can challenge institutional ties, and can amount to a more activist or even antimilitarist stance at a time when national military spending is increasing rapidly in Europe. How do designers critically navigate this complex field of creative possibilities, military interests, and rapid innovations, as well as the possibly violent implications of the weaponization of everything and a personal longing for peace and doing good? Is there enough attention to the historical and political constructions reinforcing the often unrecognized networks of the military industry and institutions of corporate power in technological design practice? What are the considerations for questioning the inevitability and perceived necessity of the current (infrastructures of) war? How can design practice and education become a critically fueled and politicized space that empowers us to imagine alternative technological futures?

Alkhatib, A. (2024). Defining AI. https://ali-alkhatib.com/blog/defining-ai

Agre, P. E. (1997). Toward a critical technical practice: Lessons learned in trying to reform AI. In Social science, technical systems, and cooperative work: Beyond the great divide (pp. 131-157). Lawrence Erlbaum Associates.

Andersen, C. U., & Cox, G. (2023). Toward a Minor Tech. A Peer-Reviewed Journal About, 12(1), 5-9.

Halpern, O. (2015). Beautiful data: A history of vision and reason since 1945. Duke University Press.

Keshavarz, M., & Mazé, R. (2013). Design and dissensus: Framing and staging participation in design research. Design Philosophy Papers, 11(1), 7-29.

McQuillan, D. (2022). Resisting AI: an anti-fascist approach to artificial intelligence. Policy Press.

Merchant, B. (2023). Blood in the machine: The origins of the rebellion against big tech. Hachette UK.

Morozov, E. (2013). To save everything, click here: The folly of technological solutionism. PublicAffairs.

Ratto, M., & Hertz, G. (2019). Critical making and interdisciplinary learning: Making as a bridge between art, science, engineering and social interventions. In The Critical Makers Reader: (Un)learning Technology (pp. 17-28).

Soon, W., & Velasco, P. R. (2024). (De)constructing machines as critical technical practice. Convergence, 30(1), 116-141.

 

Presentations of the Symposium

 

Historicising voice biometrics: the colonial continuity of listening, from the sound archive to the acoustic database

Daniel Leix Palumbo
University of Groningen, The Netherlands

Since 2017, German border authorities have used voice biometrics as an innovative assistance tool to analyse the language and accents of undocumented asylum seekers, in order to determine their country of origin and assess their eligibility for asylum. The attempt to ‘scientifically’ identify links between voice, accent, and country of origin through technology is not a recent development, however; it stands in historical continuity with colonial practices of listening and sound archiving dating from the beginning of the last century. European sound archives contain early voice recordings of colonial subjects, made through large-scale research projects during colonial rule and the world wars to reinforce the racial and nationalist ideologies of European states. Although these recordings were aimed not at controlling borders but at defining ‘pure’ characteristics in the voices of the world’s populations, thus creating otherness, they shared the purpose of building an archive that could ground the determination of origin through voice analysis. Today, the acoustic databases used to train voice biometrics are created under very different conditions, delegated to various public and private actors, including research consortia and crowdsourcing platforms, and involving linguistic researchers and many data workers who provide their voice data as cheap labour. Drawing on digital autoethnography, critical discourse analysis, and in-depth interviews, this project explores these processes of outsourced (audio) data work while situating them within the longer colonial history of sound archiving and listening. It investigates disruptions and continuities in the shift from the sound archive to the acoustic database, and what these imply about the operations of state power.

 

Antimilitarism & algorithms: design interventions and investigative data practices

Eke Rebergen
University of Amsterdam, Amsterdam School for Cultural Analysis (ASCA)

The military industry has been heavily involved in the development of technological innovations and the design of interactive systems that have become part of everyday life. Yet designers rarely recognise their profession's links to military investments and technological developments, whether in the application of technologies that can easily be weaponized, the use of systems created through military research efforts, or the normalisation of war and the military in advertising, games, and films.

As critical thinking within the design field grows around colonial histories, contributions to social injustice, and, for example, the inherent violence or discrimination in design, the recent surge in military investment also seems worthy of closer scrutiny.

In a similar way to proposals for reorienting design towards justice (Costanza-Chock) or decolonizing design (Tejada), we chart here a more specific history of explicitly anti-militaristic design research and creative interventions against militarisation.

By examining cases such as Anne-Marie Schleiner's project Velvet-Strike or the subversion and covert interventions in the artistic work of Claude Cahun and Marcel Moore, it is possible to extrapolate such forms of playful subversion (Flanagan, 2013; Pederson, 2021; Did, 2024) to current developments in AI or war propaganda through social media. Antimilitaristic design efforts, moreover, cannot do without investigation and withdrawal: uncovering and severing all relations with war-related economies, complicit research activities, and involvement with military industries (Berardi, 2024). Here we look at recent design efforts of the kind D'Ignazio and Klein have called data feminism, as well as the work of Bureau d'études, which organised cooperation between militant groups, university students, and artists, as examples of how to further investigate the entrenched networks of power behind military technological developments. The Dutch antimilitarists of Onkruit serve as a final historical example: under the title "Een wilde wegwijzer" (a wild signpost), they created a package of informative zines, maps, and diagrams of all kinds of military divisions, along with playful sticker packs and explicit calls to action, explicitly rejecting military logic and exposing the often hidden places and infrastructures of the military.

Building on these cases, we assess the relevance of similar creative endeavours in these times of renewed spending on military technologies: the development and testing of all kinds of AI systems by, for example, the Israeli army (Loewenstein); more generally, the weaponization of everything (Galeotti); the complicity of companies like Google or Nvidia; and the development of what has been called the kill cloud (Westmoreland & Ling).

This contribution will end with some personal experiences as a design teacher working with design students on such investigations and interventions.

 

Teaching machines, managed learning and remote examination

Alex Zakkas
University of Amsterdam, Amsterdam School for Cultural Analysis (ASCA)

Covid brought digital proctoring software to universities so that students could undergo remote examination: universities could continue producing degrees without having to rethink their ways of examining (i.e., while still complying with accreditation policies). Beyond the pandemic, in another "state of exception", we also observed in 2024 how these same technologies were used during the student and teacher strikes in Greece so that universities could proceed with examinations despite the occupied buildings: another case of opting for techno-solutionism instead of addressing the underlying structural issues of education. These technologies have received much critique from both students and educators for their intrusion into intimate spaces (video surveillance of students' private rooms), their discriminatory malfunctioning (failing to identify darker skin tones, requiring spatial conditions that few students can afford), and their disciplinary pedagogical models (catching cheaters, the panopticon effect). Like all technologies, proctoring software has its own history of socio-technical entanglements, with links to surveillance technologies developed in military and carceral industries, merged with technocratic epistemologies of knowledge transfer (Skinner's teaching machines) and with a hyper-capitalist model of data extractivism and the quantified self. Understanding how these histories have influenced the design of teaching machines helps us place current innovations in edu-tech (such as "personal learning AI coaches" and "AI cheating detection") within the wider socio-political narratives that produce them, and to anticipate probable futures (Crawford and Joler). As a counter-narrative, we are experimenting with the possibilities of Speculative & Critical Design methodology (Johannessen) for cultivating a "critical imagination" and involving students in reimagining the future of education against a backdrop of AI-accelerated capitalism and warmongering.
Beyond the critique of technology, the real question concerns the kind of knowledge we are passing on to the next generations at a time when our options seem limited.

 

Exemplary situations of technological breakdown in the philosophy of technology: who and what is at stake in learning from failure?

Dmitry Muravyov
TU Delft, The Netherlands

Breakdowns constitute the backbone of life with technologies, so it is unsurprising that thinkers have sought to understand the meaning and place of this obverse side of technology. I use the concept of exemplary situations to explicate and problematize some of the features of how technological breakdown is understood in the philosophy of technology. In analyzing the theme of technological breakdown in the canon of the philosophy of technology, I rely on Mol's ideas about empirical philosophy, particularly her concept of exemplary situations (Mol, 2021). For Mol, all philosophy is empirical insofar as it is explicitly or implicitly informed by the situations in the world that shape the philosophical inquiry. By reading some canonical texts in the philosophy of technology, I show how, through exemplary situations, i.e., explicit examples or implicit context, the technological breakdown in these texts is rendered as a frictionless and individualized moment of learning, perceived from the position of a user or an observer.

The philosophy of technology is a philosophical subfield with multiple approaches, each with its distinct philosophical lineages; it has also developed over time, facing changes such as the "empirical turn." Notwithstanding such diversity and changes over time, I suggest that it is possible to elucidate a particular tradition of thinking about technological breakdowns by starting with a postphenomenological approach, one of the most influential paradigms in the contemporary philosophy of technology. While my analysis predominantly relies on examining this tradition, I also show its resonance with other approaches.

Traversing these texts, one encounters exemplary situations of technological breakdown and artifacts that no longer operate as envisioned: hammers, computer freezes, slowly loading webpages, overhead projectors, or rifles. I seek to problematize the shared underlying features of these philosophical accounts by showing how technological breakdown can instead be collective, political, imbued with friction, and perceived from a position that complicates learning. Using an alternative exemplary situation, the CrowdStrike blue-screen outage of July 2024, I show, first, that while technological breakdowns are canonically seen as moments of learning, it is worth reflecting on who is learning here and at whose expense. Second, the subject of theorizing may be not an individual using the technology but a collective with few options but to accept the technology's working upon them. Third, defining something as a technological breakdown can itself be imbued with friction. In doing so, I seek to politicize the technological breakdown in the philosophy of technology and take the notion beyond its predominantly emphasized epistemological dimension.

Through such reading, breakdown becomes a collective experience that prompts questioning who is obtaining knowledge while highlighting that nothing is self-evident about defining something as "broken" in the first place. Collective vulnerability rather than individual knowledge-seeking can be something that a breakdown engenders.



 
Conference: SPT 2025