01.05.2019 - 13.07.2019
Corinna Hirt,
Matthew Jörke,
Kalle Reiter
Interaction Designer
Concept Ideation
Interaction Design
Visual Design
Prototyping
Project Management
From single-celled bacteria to mammals, nearly every organism on our planet is capable of changing and adapting to its environment. The ability to gather useful information and react appropriately is essential to any living being's survival, so much so that some might consider it a definition of life itself.
This project was completed in collaboration with Corinna Hirt and Matthew Jörke at the Hochschule für Gestaltung, Schwäbisch Gmünd. The aim of the course (Invention Design, Prof. Jörg Beck) is to research future-focused technologies and design for interactions that may arise when these technologies mature. An emphasis was placed on anticipating the societal impact of these technologies and accounting for potential pitfalls in our designs. For this project, Corinna, Matthew, and I decided to research artificial intelligence and machine learning technologies. Our project converged around the concept of adaptive user interfaces.
The project has two components:
• a website outlining our research and design guidelines
• an interactive smart-mirror installation
Theoretical Insights
The human sensory system, molded and shaped by millions of years of evolution, is fine-tuned to perceive rich information about our environment. When humans communicate, our ability to interpret emotion, facial expressions, tone of voice, and social context seems effortless. In the background, our brain processes a barrage of sensory information within milliseconds – our conscious perceptions are the mere tip of the iceberg. Our ability to interpret each other, make inferences about our environment, and craft intelligent responses is core to being human.
When humans communicate with computers, the majority of this rich information is lost. As it stands, classical computers are simply incapable of interpreting humans as well as we can interpret each other. Seen in this light, a computer is a cold and uncompassionate partner. Over time, we have grown used to dull interactions with computers, and generally don't expect much more than functional obedience from computational agents.
With advances in sensor technologies and machine learning algorithms, computers are becoming more and more capable of interpreting human emotions and behaviors.
An Adaptive User Interface (AUI) is a user interface that dynamically adjusts its layout, elements, functionality, and/or content to a given user's needs, capabilities, and context of use.
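As a toy illustration of this definition, the sketch below adapts two hypothetical UI parameters (text size and color theme) to a user's estimated age and the ambient light level. The `Context` fields, thresholds, and parameter names are all illustrative assumptions, not part of our actual system.

```python
from dataclasses import dataclass

@dataclass
class Context:
    estimated_age: int    # e.g. from face analysis
    ambient_lux: float    # e.g. from a light sensor

def adapt_layout(ctx: Context) -> dict:
    """Pick UI parameters from user and environment context (illustrative heuristics)."""
    # Larger type for older users: a common accessibility heuristic.
    font_px = 16 if ctx.estimated_age < 50 else 20
    # Dark theme in dim rooms, light theme otherwise.
    theme = "dark" if ctx.ambient_lux < 50 else "light"
    return {"font_px": font_px, "theme": theme}

print(adapt_layout(Context(estimated_age=67, ambient_lux=30)))
# {'font_px': 20, 'theme': 'dark'}
```

A real AUI would of course draw on far richer context than two scalars, but the shape is the same: sensed context in, interface parameters out.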
The smart mirror was displayed at the Hochschule für Gestaltung Schwäbisch Gmünd semester exhibition. Our mirror was designed to playfully and intuitively communicate the present capabilities of machine perception to the general public and provoke critical discussion thereof. Equipped with facial recognition, keypoint tracking, and emotion recognition, the mirror could recognize past visitors, estimate a user’s age and gender, and display facial features and emotional cues in real-time.
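The mirror's real-time emotion display boils down to mapping per-frame emotion scores (as an emotion recognition model might output) to a single dominant cue. The sketch below shows that last step only; the function name, labels, and confidence threshold are illustrative assumptions, not the mirror's actual code.

```python
def dominant_emotion(scores: dict[str, float], threshold: float = 0.4) -> str:
    """Return the strongest emotion label, or 'neutral' if no score is confident enough."""
    label, score = max(scores.items(), key=lambda kv: kv[1])
    return label if score >= threshold else "neutral"

# Example per-frame scores from a hypothetical recognition model:
frame_scores = {"happy": 0.72, "surprised": 0.18, "angry": 0.10}
print(dominant_emotion(frame_scores))
# happy
```

Thresholding like this keeps the display calm: rather than flickering between low-confidence labels every frame, the mirror falls back to a neutral state.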
Source Code
Every user is different, whether in size, age, or health. How can we build systems uniquely adaptable to every single user?
Despite our ever-changing environments, our digital devices always remain the same. How might we incorporate situational and contextual awareness into our devices?
Emotion recognition technology is surprisingly accurate and will likely improve further. However, emotion recognition alone is not enough: how can we design computers to respond naturally to our emotions?
Source Directory
The Two UX Gulfs – User Centered System Design, Don Norman, 1986
(https://www.nngroup.com/articles/two-ux-gulfs-evaluation-execution/)
The SUPPLE system
(http://www.eecs.harvard.edu/~kgajos/research/)
(http://iis.seas.harvard.edu/projects/supple/)