Tactum is an augmented modeling tool that lets you design 3D printed wearables directly on your body. It uses depth sensing and projection mapping to detect and display touch gestures on the skin. A person can simply touch, poke, rub, or pinch the geometry projected onto their arm to customize ready-to-print, ready-to-wear forms.
In its current iteration, Tactum uses computer vision and projection mapping to detect interactions with the body. Tracking and gesture recognition are handled by a Leap Motion Controller, and visual feedback is projected onto the forearm with a Casio XJ-A251 projector. We use a person's natural gestures – ones that require no special training – to drive a body-based 3D modeling environment: a person can touch, poke, rub, pinch, grab, and twist the digital geometry projected onto their body.
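The core of this interaction loop is deciding, each frame, whether a tracked fingertip is in contact with the skin. Below is a minimal illustrative sketch of that idea (not the Tactum source): the forearm is approximated as a cylinder, and a touch is registered when the fingertip comes within a small threshold of its surface. All names, radii, and thresholds here are hypothetical.

```python
import math

ARM_RADIUS_MM = 40.0      # assumed forearm radius for the cylinder model
TOUCH_THRESHOLD_MM = 8.0  # fingertip-to-skin distance that counts as contact

def distance_to_arm_surface(fingertip, axis_y=0.0, axis_z=0.0):
    """Signed distance from a fingertip (x, y, z) in mm to the surface of a
    cylinder whose axis runs along x at (axis_y, axis_z)."""
    _, y, z = fingertip
    radial = math.hypot(y - axis_y, z - axis_z)
    return radial - ARM_RADIUS_MM

def is_touching(fingertip):
    """A tracking frame counts as a touch when the fingertip is within
    the contact threshold of the modeled skin surface."""
    return distance_to_arm_surface(fingertip) <= TOUCH_THRESHOLD_MM

# A fingertip 45 mm from the arm axis sits 5 mm above the skin: a touch.
print(is_touching((100.0, 45.0, 0.0)))   # True
print(is_touching((100.0, 80.0, 0.0)))   # False
```

In practice the arm surface would come from the depth sensor's 3D scan rather than an idealized cylinder, but the per-frame proximity test is the same shape of computation.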
Fabrication & Ergonomic Fit:
Since the base geometry is generated from 3D scan data of the user's arm, any design created in Tactum is inherently built to fit that individual's body. Technical 3D printing constraints are also embedded in the geometry, so no matter how much the digital geometry is manipulated, every design Tactum produces is guaranteed to be 3D printable.
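One simple way to read "constraints embedded in the geometry" is that every gesture-driven edit is clamped to printable bounds before it is applied. The sketch below illustrates the idea with a hypothetical minimum wall thickness (it is not the Tactum implementation; the constraint value and function name are assumptions for illustration).

```python
MIN_WALL_MM = 1.2  # assumed minimum printable wall thickness for a target printer

def apply_pinch(thickness_mm, pinch_amount_mm):
    """Thin a wall in response to a pinch gesture, but never let it drop
    below the printable minimum, so the result always remains printable."""
    return max(MIN_WALL_MM, thickness_mm - pinch_amount_mm)

print(apply_pinch(3.0, 1.0))  # 2.0 -> normal thinning
print(apply_pinch(3.0, 5.0))  # 1.2 -> clamped at the printable floor
```

Because the clamp runs on every edit, the user can manipulate the form freely while the output stays within the printer's limits.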
For more information, see the full project page at http://madlab.cc/tactum
Tactum was developed in collaboration with Autodesk Research (http://www.autodeskresearch.com/), and with support from the Frank-Ratchye STUDIO for Creative Inquiry at Carnegie Mellon University (http://studioforcreativeinquiry.org/).
Music by Broke For Free (https://soundcloud.com/broke-for-free/bonobo-recurring-remix)