Would you trust a robot trained on your behaviors?
How will we know when a machine becomes sentient?
What does it mean to be moral?
As machines get smarter, more complex, and able to operate autonomously in the world, we’ll need to program them with certain “values.”
Yet we do not agree on what we value: across cultures, across individuals, even within ourselves. We often do not act in accordance with what we say we value, so should these systems learn from what we say or what we do? What are the implications of how our current belief systems manifest in the swiftly approaching technological future? As we anticipate such change, can we use this technological moment to become more honest, humble, and compassionate?
Moral Labyrinth is an interactive art installation that takes shape as a physical walking labyrinth composed of philosophical questions, paired with an individualized "digital" labyrinth on an accompanying laptop. The work is a meditation on perennial—and now particularly pressing—aspects of being human. Engaging with the difficult task of aligning values, it gently reveals the gravity of the problem and creates an open space to reflect on the questions. Ideally, it also allows us to see our own values more honestly and critically, as a first step toward any solution.
Visit the Moral Labyrinth website to submit your own question.
Rainbow Unicorn, Berlin, part of Transmediale Vorspiel
January–February 2018, paneled wall mural
Ars Electronica Festival, Linz, Austria
September 2018, walking labyrinth and interactive digital experience
Mozfest, Ravensbourne University, London
October 2018, walking labyrinth made entirely of baking soda
Community Bike Path, Somerville, MA
WeRobot, University of Miami, April 2019, with Jessica Fjeld.
Moral Labyrinth Workshop. RightsCon, Tunis, June 2019, with Mindy Seu and Jie Qi.
Moral Labyrinth, Northeastern School of Law, Boston, forthcoming, April 2020.
Special thanks to Black Cat Labs for the wonderful laser cutting work.