One of the main attractions driving the current crop of wearable devices is their ability to deliver notifications in new, more elegant ways. Recent haptic advances like the “Taptic” and “Force Touch” features get us ever closer to fulfilling the promise of “silent” notifications, which today are usually delivered by vibration motors that, while subtle, are still distinctly audible to others. But notifications, no matter how subtle or invisible to others, are still interruptions to us.
At the Lab, we’re looking at technologies and interaction strategies that afford a different behavior - rather than having notifications “pushed” at you in various forms, interrupting you and taking you out of the moment, we’re interested in improving the experience you have when you consciously decide to “check the stack” of unread messages, unhandled notifications, etc. The big difference between this gesture and, say, a taptic tug on your wrist is who controls the timing - the tug happens whenever it happens, but you are in charge of when you check your stack.
And you already do this sort of checking, right? You pull out your phone, unlock the screen, and swipe down to see what is waiting in your notification tray. What if there were a non-visual way to do this? What if you didn’t even have to take your phone out of your pocket?
As a way of prototyping how this gesture might feel, I looked around for non-visual technologies that might be embeddable in a smartphone (and implementable with our Lab’s capabilities – for example, no nano-machining!). One phenomenon that stood out in terms of novelty, efficacy, and near-term availability is electrovibration – that is, the use of signal conditioning to dynamically change the apparent texture of a smooth surface. This means that we could have a simple, flat surface like the back of a phone and dynamically change how it feels when you run your finger over it – scratchy for a voicemail, ridged like corduroy for a bunch of text messages, or rubbery for “interesting stuff is nearby.” With more sophisticated signal control, you could even impart a sense of “heft” or magnitude to these textures - subtle grooving for low-priority emails, or large, comb-tooth ridges for a pile of urgent updates.
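As a rough illustration of that mapping idea, here is a minimal sketch in Python. The notification categories, frequencies, and amplitudes are all hypothetical placeholders, not values measured on any real board - the point is just that each “stack” gets its own drive parameters, with urgency scaling the amplitude to impart that sense of heft.

```python
# Hypothetical mapping from notification type to electrovibration drive
# parameters. The frequencies and amplitudes below are illustrative
# placeholders, not measured values.

TEXTURES = {
    "voicemail": {"freq_hz": 80,  "amp_vpp": 100},  # lower frequency reads as "scratchy"
    "texts":     {"freq_hz": 120, "amp_vpp": 80},   # mid frequency, ridged like corduroy
    "nearby":    {"freq_hz": 400, "amp_vpp": 60},   # higher frequency feels smoother, "rubbery"
}

def drive_params(kind, urgency=0.5):
    """Return (freq_hz, amp_vpp) for a notification kind, scaling the
    amplitude by urgency (0.0 = subtle grooving, 1.0 = comb-tooth ridges)."""
    base = TEXTURES[kind]
    amp = base["amp_vpp"] * (0.5 + 0.5 * urgency)
    return base["freq_hz"], amp
```

A real implementation would need perceptually tuned values, but even this toy version shows how little state the mapping itself requires.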
So I made LEVER, an experience-design prototype that implements the work described in two electrovibration research papers published by Disney a couple of years ago. (If you are wondering, the Disney work is called “revel,” so I named my derivative board “lever” because it’s “revel” backwards.) The LEVER board works by capacitively coupling with your thumb while your other fingers are grounded at the sides of the board. As your thumb moves across the center strip, it doesn’t feel like smooth copper - it can range from waxy or rubbery all the way to sandpapery or grooved like a vinyl record.
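For testing, it helps to be able to synthesize the drive waveform in software before committing it to hardware. The sketch below generates one period of a sine wave as a normalized sample table, the kind of buffer a firmware loop might stream to a DAC feeding an amplifier stage. This is an assumption about the signal chain for illustration only - LEVER’s actual firmware and output stage may work differently.

```python
import math

def waveform_table(freq_hz, sample_rate=8000, cycles=1):
    """Generate 'cycles' periods of a sine drive waveform as a list of
    samples normalized to [-1.0, 1.0]. A firmware loop could stream
    these to a DAC (hypothetical signal chain, for illustration)."""
    n = int(sample_rate * cycles / freq_hz)
    return [math.sin(2 * math.pi * freq_hz * i / sample_rate)
            for i in range(n)]
```

Swapping the sine for a square or sawtooth shape, or modulating its amplitude over time, is how you would explore the waxy-to-sandpapery range described above.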
LEVER is designed to be a testing board - a place to try out different waveforms, different amplitudes, and different grips and postures. It’s shaped roughly like a mobile phone both because phone-based interactions seemed like a main use case, and because the mobile phone grip allows for easy grounding of the signal (the ground pads are on either side of the board, where the hand naturally makes contact). In the photos above, you can see how the form took shape as it went from paper to foamcore to real board. In its final or embedded form, the circuit could be made significantly smaller, possibly small enough to integrate seamlessly into a phone case.
I tested the LEVER board on more than a dozen people, asking first if they could feel the new texture, and then asking if they could notice more subtle differences between textures. The reactions were generally positive - but not entirely so. It’s weird to feel the texture of a surface change under your thumb, and the magnitude of the change is often dependent on external factors, like the person’s sweatiness or the humidity in the room.
As a next step, the LEVER board could take this perceptual variability into account and adjust its output accordingly (the Disney researchers ran into similar variability). But the design of that feedback circuit and software is slightly out of scope for the Lab, given our focus on building experience prototypes.
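If such a feedback loop existed, the control logic could be quite simple: scale the drive amplitude inversely with some measured coupling quality, clamped to a safe range. Everything here is hypothetical - LEVER has no coupling sensor, and the numbers are placeholders - but it sketches what “making changes accordingly” might look like.

```python
def compensate_amplitude(target_amp, measured_coupling,
                         nominal_coupling=1.0,
                         amp_min=20.0, amp_max=120.0):
    """Scale drive amplitude inversely with measured finger coupling so
    perceived intensity stays near target_amp despite sweat or humidity.
    Hypothetical sketch: LEVER has no such sensing circuit, and the
    clamp limits are illustrative, not hardware ratings."""
    if measured_coupling <= 0:
        return amp_max  # no reading; fall back to maximum drive
    amp = target_amp * (nominal_coupling / measured_coupling)
    return max(amp_min, min(amp_max, amp))
```

The interesting design work would be in the sensing, not this arithmetic - which is part of why it sits outside an experience-prototyping scope.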
If the variability of the sensation could be resolved, the next step in the research would be to implement LEVER in a mobile phone case and map different types of “stacks” (places nearby I might like, number of unread work emails) to different types of haptic signal. But if we can’t guarantee a baseline of common sensation across people, the mapping work becomes difficult, both cognitively for the user and electrically for the fabricator. For now, we’re happy to have learned more about the challenges and possibilities presented by electrovibration and to share our results. Hopefully this work will inspire others to experiment with this novel and useful form of touch interaction. What would you make with it?
You can find board files, schematics, and firmware at our repo here.