Hostile technologies: types and concepts
Sally Wyatt, Maastricht University
Abstract
Technologies of war and incarceration maim and kill. Artificial intelligence (AI) is used to profile and identify individuals so that they can be deported or denied welfare benefits.
These extreme examples are increasingly familiar from contemporary news reports. There are many other, less dramatic examples of technologies that are not deliberately designed to cause harm but may nonetheless become hostile. For example, sewage infrastructure, designed to protect public health, may cause flooding if it is inadequately maintained.
In this presentation, I will explore what is meant by hostile and biased technologies and introduce the concept of ‘technological innocence’.
Biography
Sally Wyatt is professor of digital cultures in the Maastricht University Science, Technology and Society Studies research group. She has long worked on social aspects of digital technologies, including the ‘digital divide’, social exclusion and inequality. Wyatt has worked with colleagues in Canada, the UK and the Netherlands on the ways in which digital technologies are incorporated in healthcare.
Location
STS Seminar Room, Universitätsstraße 7 (staircase II, 6th floor), 1010 Vienna
