Exploration of a non-screen-based interface
The goal of this project was to create a non-screen interface for navigation. It was created in the Tangible Interfaces class taught by Professor Hiroshi Ishii at the MIT Media Lab. Looking at our own lives, our group observed that while riding a bicycle through the city, a screen may not be the safest or most efficient way to receive navigation instructions. Many bike-related accidents occur when riders look down at a phone screen, so our goal was to find a safer way to deliver information to cyclists while they are en route.
To address this problem, we looked at the items that cyclists carry as they travel through the city. The helmet was the first that came to mind as an opportunity to give the user feedback in a non-visual way. We modified an existing helmet by adding six vibrating discs: five in the front and one at the back. A wireless microcontroller was also added to allow communication with the user's cell phone. The microcontroller received navigation data from the phone and routed the appropriate signals to the vibrating discs in the helmet. The concept was that turn-by-turn directions from Google Maps would be transformed into haptic patterns that the user could easily identify and use to guide their route.
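The direction-to-pattern mapping described above can be sketched roughly as follows. The motor indexing (0-4 for the five front discs, left to right, and 5 for the back disc) and the specific patterns are assumptions for illustration only, not the project's actual encoding:

```python
# Hypothetical sketch: translating a turn-by-turn instruction into the set of
# vibrating discs to pulse. Indices 0-4 are the front discs (left to right);
# index 5 is the single back disc. Patterns here are illustrative guesses.

def pattern_for(instruction: str) -> list[int]:
    """Return the motor indices to pulse for a given navigation instruction."""
    patterns = {
        "turn-left":  [0, 1],  # pulse the two leftmost front discs
        "turn-right": [3, 4],  # pulse the two rightmost front discs
        "straight":   [2],     # pulse the single center disc
        "u-turn":     [5],     # the back disc signals a reversal
    }
    return patterns.get(instruction, [])  # unknown instruction: no vibration
```

On the helmet itself, the microcontroller would drive the corresponding motor pins for each index returned by such a lookup.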
As a proof of concept, our group developed a joystick-style app that sent directions to the wearer in real time. The video below shows a wearer of the helmet navigating by following the haptic feedback patterns sent through the app.
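The joystick app's core job is to turn a stick deflection into one of the direction commands the helmet understands. A minimal sketch of that mapping, assuming normalized axes in [-1, 1] and a deadzone threshold (both assumptions, not documented details of the actual app):

```python
# Hypothetical sketch of the joystick-to-instruction mapping in the demo app.
# x, y are assumed normalized to [-1, 1]; the deadzone value is a guess.

def instruction_from_joystick(x: float, y: float, deadzone: float = 0.3):
    """Translate a joystick deflection into a direction command, or None."""
    if abs(x) < deadzone and abs(y) < deadzone:
        return None  # stick near center: send no instruction
    if abs(x) >= abs(y):
        return "turn-right" if x > 0 else "turn-left"
    return "straight" if y > 0 else "u-turn"
```

The phone would send each non-`None` command over the wireless link to the helmet's microcontroller.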
App: Guillermo Bernal
Electronics: Guillermo Bernal