Mike Duggan

Zoom Obscura



Introducing a new project that uses artistic intervention to challenge the data collection practices of videoconferencing.



Summary

The COVID-19 pandemic has gifted video conferencing companies such as Zoom a vast amount of personal and biometric data – faces, voices, and chat transcripts – to be rendered knowable, translatable and ready for economic exchange. It is as yet unclear what the explosion in video conferencing software means for the exploitation and monetisation of this potentially valuable data for natural language processing, facial recognition, machine learning and artificial intelligence training, and other drivers within the digital economy – questions which require urgent critical scrutiny. We are frequently offered ‘solutions’ to issues around data privacy and security that rest on a loose form of trust, with debate often reduced to preventing external malicious actors from gaining access. This has led to a focus on ‘end-to-end’ encryption – but encryption still leaves individuals with limited access, control, and verifiability with which to challenge the terms and conditions they operate under. COVID-19 has, however, left us with little choice but to increase the volume of interactions we have in online spaces such as the video-call ‘room’, and with limited agency over how our personal data might be stored and exploited. This leaves us with significant ethical, privacy, and political concerns.


Zoom Obscura is a research and arts project that aims to give agency back to the users of newly ubiquitous video conferencing technologies such as Zoom, while still allowing them to participate in online spaces and debates and to negotiate their own presence and value in these new spaces. We aim to do so by bringing together artists, academics, hackers, designers and creative technologists to develop critical interventions that make the problematic workings of these technologies legible to wider audiences, while empowering users to experiment with, and control, how their personal data (visual, audible, textual) manifest in online spaces. Playing on metaphorical (Kofman 1999) and material concepts of the Camera Obscura, with its inverted images and use of light and shadow, Zoom Obscura addresses these issues by harnessing the critical power of art, design and technology, blending and bringing into tension skills and genres to produce a range of interventions that allow users to take back some agential power from platforms such as Zoom. Encryption might be a technical fix, and a politically popular (although controversial) narrative, but it doesn’t solve the ethical problems that sit beneath, through, and around its implementation. Zoom Obscura seeks to explore a data ethics beyond encryption and technological solutionism.


Looking to the future, Zoom Obscura asks:


- How can we contest or resist the inevitability of a future structured around video calls, conferences and seminars?

- Can we push back against the normalisation of the practices we have so quickly and readily adopted in the COVID-19 state of exception?

- How can we regain control of how our images and words manifest in these spaces?

- How can we move beyond encryption as a solution to privacy and security problems?


The project aims to deconstruct and unpack the Zoom assemblage, bringing together a group of artists, hackers and creative technologists with diverse practices and skills in a three-part workshop series to brainstorm and prototype methods of obfuscation, subversion and other ways in which users of these technologies might regain agency over how their data is represented, and participate in online spaces on their own terms. Interventions might include cheap analogue hacks, such as placing watermarked stickers over webcams, or digital hacks such as adding copyrighted brands to video streams, pixelating faces or otherwise augmenting the stream – or indeed inventing alternatives to Zoom. Interventions might lie in the hands of the users, or in the camera, the browser, the Wi-Fi, or even in filtering the data and permissions granted to the Zoom app itself. Framed by three workshops covering i) ideation, ii) prototyping and iii) presentation of the interventions, the project brings together cutting-edge artistic and design practitioners with scholars from Digital Art, Cybersecurity, HCI and Digital Humanities, as well as experts in commercial and cultural data ethics and a local digital arts and activism collective.
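To give a flavour of what a user-side digital intervention of this kind might look like – purely as an illustrative sketch, not a project output – the snippet below pixelates any detected face in a webcam feed before it ever reaches a video-call client, by routing the altered frames through a virtual camera. It assumes the off-the-shelf Python packages opencv-python and pyvirtualcam and a standard webcam; the detector, block size and frame rate are arbitrary choices for illustration.

```python
# Illustrative sketch only: pixelate faces in a webcam feed and expose the result
# as a virtual camera that a video-call client can select instead of the real one.
# Assumes the third-party packages opencv-python and pyvirtualcam are installed.
import cv2
import pyvirtualcam

# Haar cascade face detector bundled with OpenCV (a deliberately simple choice).
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def pixelate(region, blocks=12):
    """Downscale then upscale a face region so it becomes a coarse mosaic."""
    h, w = region.shape[:2]
    small = cv2.resize(region, (blocks, blocks), interpolation=cv2.INTER_LINEAR)
    return cv2.resize(small, (w, h), interpolation=cv2.INTER_NEAREST)

capture = cv2.VideoCapture(0)  # the physical webcam
width = int(capture.get(cv2.CAP_PROP_FRAME_WIDTH))
height = int(capture.get(cv2.CAP_PROP_FRAME_HEIGHT))

with pyvirtualcam.Camera(width=width, height=height, fps=20) as cam:
    while True:
        ok, frame = capture.read()  # BGR frame from the webcam
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        for (x, y, w, h) in face_detector.detectMultiScale(gray, 1.1, 5):
            frame[y:y + h, x:x + w] = pixelate(frame[y:y + h, x:x + w])
        # pyvirtualcam expects RGB; the call client only ever sees the obscured feed.
        cam.send(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        cam.sleep_until_next_frame()
```

Running something like this requires a virtual-camera backend (for example OBS's virtual camera or v4l2loopback) to be available on the machine. The point is simply that the intervention sits between the camera and the platform, in the user's hands, rather than inside Zoom itself.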


Follow our progress on Twitter at @zoomobscura, #zoomobscura


This project is funded by the Human Data Interaction Network and is being developed by a multidisciplinary team led by Dr. Pip Thornton.


The Team:


Dr. Pip Thornton (Research Associate, Creative Informatics, University of Edinburgh)

Dr. Mike Duggan (Department of Digital Humanities, King’s College London)

Dr. Chris Elsden (Research Associate, Creative Informatics, University of Edinburgh)

Dr. Andrew Dwyer (Addison Wheeler Research Fellow, University of Durham)

Professor Chris Speed (Chair of Design Informatics, University of Edinburgh)

David Chatting (PhD Candidate, Department of Design, Goldsmiths)

Hannah Redler Hawes (Independent curator and co-founder of Data as Culture at the ODI)

Jack Nissan (Tinderbox)

Luci Holland (Tinderbox)



