In his famous analysis of the panopticon, Michel Foucault showed to what extent power can be exercised through visuality and produce specific subjects. However, this model of power seems unable to grasp the dynamics of networked digital media technologies. In the paradigm of the control society (Deleuze 1993), not only are media increasingly ambient, networked and unobtrusive, but so are the techniques of surveillance and control. Many of their contemporary forms do not rely on the visible demonstration and internalization of the gaze, but on automated, data-based and algorithmic forms of control that are often economically motivated. They are not “salient”, but “silent” (Introna/Wood 2004) and even “calm” technologies (Weiser/Brown 1996) that proliferate in everyday life and diffuse into environments. Nevertheless, it is important to ask what role images play in these post-panoptical, ambient forms of power and how visuality is currently being transformed and modulated.

Capturing personal data in exchange for free services is by now ubiquitous in networked media and has recently led to diagnoses of surveillance and platform capitalism (Zuboff 2019; Srnicek 2017). Dataveillance and data mining in social media have long been criticized as new forms of digital labour and capitalist exploitation (cf. Andrejevic 2011, 2012, 2013; Fuchs 2010, 2013, 2015). Moreover, “silent” data-based and algorithmic controls also target the practices of visual culture. Whether we are sharing photos or watching videos, films or series, access to images is structured by platforms: their economic imperatives, data acquisition techniques and algorithmic processing. The logics of surveillance capitalism cannot be understood without analysing its incentive and nudging strategies, which raises the question of how images support its maintenance and reproduction. Aesthetic strategies and media principles of user-generated, professional and popular images such as humour, compactness, nudity, spectacularity, cinematicity, seriality, interactivity or emotionality can contribute to drawing users to a platform, capturing attention, prolonging browsing times and generating the “network effects” (Srnicek 2017) necessary for the functioning of surveillance capitalism. At stake is the reintroduction of the logic of the “gaze” into media environments of the “glance” (Bryson 1983).

On social media platforms, forms of social control based on the visibility of the personal can hardly be separated from algorithmic sorting and recommendation. These modulate visibility and invisibility, as well as the associated social fears, and thus algorithmically reconfigure scopic forms of power (Bucher 2016, 2018). It can be assumed that algorithmic control not only complicates or prevents the possibility of subjectivation (Rouvroy 2013; Rouvroy/Berns 2013), but also enforces both new and old modes of subjectivation. With this, categories such as gender, age, class and race, which remain underexposed in Surveillance Studies (Conrad 2009), gain particular relevance. For example, not all bodies are subjected to exposure, the economization of attention or automated censorship in the same way on popular image-sharing platforms. On streaming platforms, too, the rhetoric of algorithmic personalization obscures collaborative filtering and often stereotypical clustering, which manifest gender and age biases, among others, and modulate specific viewer subjects.

Against this background, new technological endeavours such as the internet of things, ubiquitous computing and ambient intelligence appear as attempts to expand the opportunities for data extraction and monetization. Everyday objects become sentient things capable of multimodal monitoring of environments and living beings, and of recording, storing and circulating the captured information. Optical data acquisition in the form of sensors, webcams or computer vision operates without drawing attention to itself. Often, not only the technologies are invisible, but also the images, which are no longer destined for human vision (Paglen 2016; Rothöhler 2018). Such optical data sensing and “invisible images” share their unobtrusiveness with algorithmic security systems such as facial recognition, which exploits the publicness of the face, requires no consent and produces “calm images” (Veel 2012). In applications such as Snapchat, the use of biometric face recognition is so commonplace that it no longer even needs to be recognized as a form of control.

The conference examines the role of images and visuality in surveillance capitalism. In particular, it focuses on the following questions: To what extent and by means of which aesthetic strategies do images create incentives for and stabilize surveillance capitalism? How do they contribute to its aestheticization? How is subjectivation produced in apparatuses of dataveillance and algorithmic control, and how are regimes of the gaze transformed within them? How is pictoriality reconfigured in post-panoptical, ambient media environments and subjected to forms of anesthetization?

Topics can include, but are not limited to:

  • The role of images in the generation of “behavioural surplus” (Zuboff 2019) and data extraction
  • Images as decoys and nudges; medial and aesthetic incentive strategies
  • Audience labour and modulation of viewing
  • (In-)visibility as social control and its relation to data monitoring and algorithmic sorting 
  • Relationship between scopic and non-scopic forms of power and control 
  • New forms of subjectivation, desubjectivation or prevention of subjectivation in visual surveillance capitalism
  • Economization of attention
  • Platform politics and automated censorship of images 
  • AI training on user-generated images and platform capitalism
  • Surveillance capitalism in popular visual media and media arts
  • Gender, race, class and algorithmic control on platforms for (moving) images
  • Calm images and invisible images
  • Optical data acquisition in the internet of things and ubiquitous computing
  • (Resistance) discourses of transparency, black boxing and obfuscation and their implicit epistemologies of the visible 
  • Tensions between the aestheticization of surveillance capitalism and the anesthetization of images