
What if we could feel screens instead of looking at them?

Writer: Gonzalo Morales

During last weekend's Imperial College Design Society Makeathon, my teammates — Ruta Czaplinska, Dunni Fadeyi, Vishal Mehta, Diego Muñoz — and I tackled the challenge: "How will we interact with machines of the future?". The aim of this Huawei-sponsored competition was to conceive and prototype solutions that would respond to the brief within the 2-day duration of the event.


 

Interactions now

We began by taking a look at our current relationship with tech, and found that we almost exclusively rely on audiovisual outputs to understand what the machines we interact with are trying to tell us.


Beyond the obvious drawbacks this presents for the visually and hearing impaired, this form of communication poses particularly serious risks when we are on the go: focusing on screens as we walk down the street can lead to accidents and theft. In London, a phone is stolen every 6 minutes, most often snatched from the hands of users caught off guard as they stare at their screens. We may feel safer turning off the screen and using a pair of headphones instead when moving around the city, but this can further isolate us from our surroundings, posing additional risks.



 

Future interactions


When we think about the future, we imagine a world in which our interactions with machines feel natural and seamless, thus mimicking the real-world interactions we experience as three-dimensional, physical beings. Because of this, we started to look into the topic of touch-based interactions.


Our senses of touch and proprioception (i.e., awareness of our body's position) are at the core of the human experience. Together, they give us a sense of "reality" that lets our limbs make each move confidently, without us having to think about it actively. They allow us to know the temperature, geometry, texture, weight, and hardness of an object without having to look at it. So we began to wonder: what if, much like the old physical interfaces, we could feel and locate the contents of a screen as if they were physical controls?


There is nothing new under the sun: particularly with the rise of VR and Spatial Computing, many companies and research groups are exploring devices that can reproduce three-dimensional shapes or textures. While great progress is being made, the current results do not yet seem to justify the cost of such technologies...


...but can we still convey a sense of a screen's UI controls with currently available hardware?


Well, that's precisely what we did for this Makeathon! After all, we only had 48 hours to demonstrate our idea in action. To achieve this sense of "awareness" of where something is on our screens, we turned to haptics:


Our smartphones already have quite advanced haptic feedback actuators that allow us to tune the frequency, intensity, and sharpness of their motion, thus producing very distinct vibrations and pulses. What if our screens responded with haptics to whether the pixel under our finger was lighter or darker? What if they responded differently depending on whether we are resting our finger on an interactive control?
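To give a flavor of what that mapping could look like in code, here is a minimal sketch using Apple's Core Haptics API. This is purely illustrative and not the code we wrote at the Makeathon: the class name, the 0.05 s pulse, and the idea that brightness is sampled elsewhere and passed in as a 0–1 value are all assumptions for the example. It plays a short pulse whose intensity and sharpness follow the brightness of the pixel under the finger:

```swift
import CoreHaptics

/// Illustrative sketch (not our Makeathon code): plays a short pulse whose
/// intensity and sharpness follow the brightness of the pixel under the
/// finger. `brightness` is assumed to be in 0...1 and sampled elsewhere.
final class BrightnessHaptics {
    private var engine: CHHapticEngine?

    init() {
        // Only devices with a supported actuator can play these patterns.
        guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else { return }
        engine = try? CHHapticEngine()
        try? engine?.start()
    }

    func play(forBrightness brightness: Float) {
        guard let engine = engine else { return }
        // Darker pixels -> stronger, duller pulse; lighter pixels -> fainter, sharper tap.
        let intensity = CHHapticEventParameter(parameterID: .hapticIntensity,
                                               value: 1.0 - brightness)
        let sharpness = CHHapticEventParameter(parameterID: .hapticSharpness,
                                               value: brightness)
        let event = CHHapticEvent(eventType: .hapticContinuous,
                                  parameters: [intensity, sharpness],
                                  relativeTime: 0,
                                  duration: 0.05)
        if let pattern = try? CHHapticPattern(events: [event], parameters: []),
           let player = try? engine.makePlayer(with: pattern) {
            try? player.start(atTime: CHHapticTimeImmediate)
        }
    }
}
```

Hooked up to a touch handler that samples the rendered frame under the finger, something along these lines would make light and dark regions of the UI feel different as you slide over them.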


 

Our prototype


In response to this, we built Touch Nav: a concept for a navigation app that lets us feel the next direction to take, and the map around us, without having to look at the screen!


With this, you can keep the phone in your pocket as you navigate the city, freeing your visual and auditory senses so you can enjoy your surroundings 😀
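As an illustration of how such cues might be encoded, here is another sketch, again assuming Core Haptics; the cue names, tap counts, and timings are made up for the example rather than taken from our prototype. It maps navigation events to patterns a user could tell apart by feel alone: two quick taps for a left turn, three for a right turn, and a longer buzz on arrival.

```swift
import CoreHaptics

/// Hypothetical cue-to-pattern mapping (illustrative only): two quick taps
/// for a left turn, three for a right turn, a longer buzz on arrival.
enum NavCue {
    case turnLeft, turnRight, arrived
}

func hapticPattern(for cue: NavCue) throws -> CHHapticPattern {
    // A short, sharp transient "tap" at a given offset within the pattern.
    func tap(at time: TimeInterval) -> CHHapticEvent {
        CHHapticEvent(eventType: .hapticTransient,
                      parameters: [
                          CHHapticEventParameter(parameterID: .hapticIntensity, value: 1.0),
                          CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.8)
                      ],
                      relativeTime: time)
    }

    switch cue {
    case .turnLeft:
        return try CHHapticPattern(events: [tap(at: 0), tap(at: 0.15)], parameters: [])
    case .turnRight:
        return try CHHapticPattern(events: [tap(at: 0), tap(at: 0.15), tap(at: 0.3)], parameters: [])
    case .arrived:
        // A softer, sustained buzz to signal that the destination has been reached.
        let buzz = CHHapticEvent(eventType: .hapticContinuous,
                                 parameters: [
                                     CHHapticEventParameter(parameterID: .hapticIntensity, value: 0.8),
                                     CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.3)
                                 ],
                                 relativeTime: 0,
                                 duration: 0.6)
        return try CHHapticPattern(events: [buzz], parameters: [])
    }
}
```

Playing the resulting pattern through a haptic engine as the next turn approaches would be the only "display" the user needs while the phone stays in their pocket.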


Here's a video of our prototype. It's a bit difficult to illustrate a haptic solution through visual means, so I would recommend turning on the sound so you get a better idea of our approach.





 

Challenges?


Certainly! This quick project was a proof of concept that the technologies currently embedded in our smartphones can already give us a sense of what our screens are displaying without having to peek at them. However, if we are to replicate more nuanced sensations through our screens, these haptic actuators need to keep getting more advanced (and, potentially, they should be able to produce more localized vibrations, so the user feels the texture response comes from right under their finger).


From a usability point of view, another challenge remains: the current behavior by which phones automatically lock the screen while in a pocket would need to change to accommodate these touch interactions, without opening the door to accidental touches. This limitation needs solving before the interaction method can become widespread, but we are optimistic: the starting point is already looking good! Wouldn't it be great not to have your eyesight hijacked by these devices as you navigate the real world? We definitely think so ⚡️



 


Imperial College DesSoc Makeathon Team F:  (From left to right) Dunni Fadeyi, Ruta Czaplinska, Vishal Mehta, Gonzalo Morales, Diego Muñoz.



