
Reconfigured Vision – workshop

SPACE Studios Art+Technology residency, London 

Taking the question ‘what do objects sound like?’ as a starting point, participants joined artist in residence Ilona Sagar in a co-inquiry investigating our relationship to assistive technologies, asking: how can we safeguard agency and subjective experience? How do the devices we use dictate our shared environment?

Working in collaboration with OxSight and Torr Vision Lab, organisations developing devices to assist users who are registered blind, alongside Alex Taylor from Microsoft Research Cambridge’s Human Experiences & Design Group, artist Arron McPeake, and Design for Disability, the group was invited to imagine new ways of rendering the physical environment and translating its objects into sonic semantics.

Testing wearable vision-enhancement tools acted as a catalyst for a much wider exploration of the politics and language of assistive technologies. The idea that bodies are either enhanced or normalised is an uncomfortable perspective on wearable technologies, and it raises difficult questions: what is a good body? To what extent are we a product of our ‘body capital’, our labour and efficiency?
