"Understanding How People Use Skin as an Input Surface for Mobile Computing" surveys how people might use skin as a gesture-based input surface. No actual sensors are involved; instead, the research focuses on the gestures themselves, which the authors hope will inform future input sensors, rather than having the gestures be restricted by today's sensors.
From the abstract: "This paper contributes results from an empirical study of on-skin input, an emerging technique for controlling mobile devices. Skin is fundamentally different from off-body touch surfaces, opening up a new and largely unexplored interaction space. We investigate characteristics of the various skin-specific input modalities, analyze what kinds of gestures are performed on skin, and study what are preferred input locations."
What, this is super cool (I know, very insightful comment here). And I'm having a great time imagining a phone made of skin right now (':
As long as you can do the pinky-fist-thumb to your ear thing and have it mean “make a phone call” I’m on board!
Related papers I stumbled upon recently:
"Tactum: A Skin-Centric Approach to Digital Design and Fabrication", Gannon et al., CHI 2015
"ExoSkin: On-Body Fabrication", Gannon et al., CHI 2016
Whaaaaa, these are so cool! I am in the "still creeped out by having computers reside on my body" phase, but that probably makes me an old person. I do wear other technology (clothing, glasses, etc.), after all...