Reconfigured Vision – workshop
SPACE Studios Art+Technology residency, London
Taking the question ‘what do objects sound like?’ as a starting point, participants joined artist in residence Ilona Sagar in a co-inquiry that sought to investigate our relationship to assistive technologies, asking: how can we safeguard agency and subjective experience? How do the devices we use dictate our shared environment?
Working in collaboration with OxSight and Torr Vision Lab, organisations engineering devices to assist users who are registered legally blind, and with Alex Taylor from the Human Experiences & Design Group at Microsoft Research Cambridge, Ilona Sagar invited the group to imagine new ways of rendering our physical environment and translating the objects it contains into sonic semantics.
Testing wearable vision-enhancement tools acted as a catalyst for a much wider exploration of the politics and language of assistive technologies.
The idea that bodies are either enhanced or normalised is an uncomfortable aspect of wearable technologies, and it raises the questions: what is a good body? To what extent are we a product of our ‘body capital’, our labour and efficiency?
Read an interview with Ilona Sagar here