SciTech

Skinput transforms arm into touchpad

Current development allows an interface to be projected on a Skinput user’s arm. Tapping certain places on the arm creates different vibrational frequencies, which are detected by sensors that accurately determine the location of the tapping. Skinput also works by tapping on the wrist, fingers, and palm. (credit: Courtesy of Chris Harrison)

As electronic devices get smaller and smaller, the buttons, touch screens, and knobs we use to control them must do the same. The need for usability constrains how small devices can get, and it is often the only thing keeping them from becoming smaller and more convenient to carry. Chris Harrison, a third-year Ph.D. student at the Human-Computer Interaction Institute, has found a way to move the method of input outside the device, removing this inconvenient constraint.

His product, “Skinput,” developed in conjunction with Microsoft researchers Dan Morris and Desney Tan, seeks to allow people to interact with their electronics simply by touching different parts of their forearms. It effectively transforms an arm into a touchscreen or touchpad.

According to a Carnegie Mellon press release, “bone and soft tissue variations” in the arm mean that tapping one part of the arm produces different sounds and vibrations than tapping another. These vibrations travel up the arm, where sensors can determine their frequency. Since each location has a distinctive frequency, the sensors can determine which part of the arm was tapped. Harrison said that the sensors are devices known as piezoelectric crystals, which produce an electric current in response to vibration; an array of 10 such sensors placed in an armband is used to detect the vibrations. Each crystal is tuned to detect a particular characteristic frequency.
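The idea of matching a tap to a location by its characteristic frequency can be illustrated in a few lines of code. The sketch below is not Harrison's actual system: it uses the standard Goertzel algorithm in place of physical tuned crystals, and the sample rate, locations, and frequencies are all invented for illustration.

```python
import math

def goertzel_power(samples, sample_rate, target_freq):
    # Goertzel algorithm: signal power at one target frequency,
    # playing the role of a crystal "tuned" to that frequency.
    n = len(samples)
    k = int(0.5 + n * target_freq / sample_rate)
    omega = 2 * math.pi * k / n
    coeff = 2 * math.cos(omega)
    s_prev, s_prev2 = 0.0, 0.0
    for x in samples:
        s_prev2, s_prev = s_prev, x + coeff * s_prev - s_prev2
    return s_prev2 ** 2 + s_prev ** 2 - coeff * s_prev * s_prev2

SAMPLE_RATE = 1000  # Hz (made up for this example)
# Hypothetical characteristic frequencies for three arm locations
LOCATION_FREQS = {"wrist": 25.0, "forearm": 50.0, "elbow": 100.0}

def classify_tap(samples):
    # Attribute the tap to whichever location's frequency carries
    # the most power in the recorded vibration.
    powers = {loc: goertzel_power(samples, SAMPLE_RATE, freq)
              for loc, freq in LOCATION_FREQS.items()}
    return max(powers, key=powers.get)

# A synthetic 50 Hz vibration should be attributed to the forearm
tap = [math.sin(2 * math.pi * 50 * t / SAMPLE_RATE) for t in range(200)]
print(classify_tap(tap))  # -> forearm
```

In the real device each of the 10 sensors responds to a different band, so the feature is a vector of per-sensor responses rather than a single power reading.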

The signals generated by the sensors are then passed through a set of machine learning algorithms that improve the accuracy of the sensors and identify the location of the tap with greater certainty. Although human arms differ considerably from person to person, Harrison said that his system is extremely accurate as long as the appropriate positions are chosen and the number of tap locations is kept small. His paper on the technology notes that up to 95 percent accuracy can be achieved using certain easily accessible locations on the arm.
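To give a flavor of the classification step, here is a minimal nearest-centroid sketch: each tap is reduced to a feature vector (say, per-sensor band powers), and a new tap is assigned to the location whose average training vector is closest. The real system uses a more sophisticated classifier, and every number below is fabricated for illustration.

```python
# Nearest-centroid classification sketch. Feature vectors stand in for
# per-sensor measurements; the training data here is entirely made up.
def centroid(vectors):
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def distance(a, b):
    # Euclidean distance between two feature vectors
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

training = {
    "wrist":   [[0.9, 0.1, 0.2], [0.8, 0.2, 0.1]],
    "forearm": [[0.1, 0.9, 0.3], [0.2, 0.8, 0.2]],
}
centroids = {loc: centroid(vs) for loc, vs in training.items()}

def classify(features):
    # Pick the location whose training centroid is nearest
    return min(centroids, key=lambda loc: distance(features, centroids[loc]))

print(classify([0.85, 0.15, 0.15]))  # -> wrist
```

Keeping the number of locations small, as the article notes, keeps the centroids well separated, which is why accuracy stays high despite person-to-person variation.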

Skinput forms one part of Harrison’s grander plans for allowing people to “interact with small things in big ways.” The idea is to be able to “appropriate” other objects to use as input devices. Harrison feels that one way to reduce the size of input devices is to make use of something that we already carry around with us: our arm. His previous methods used similar technologies to use tables as input devices, but as Harrison said in an interview, “you can’t take a table everywhere you go.”

Harrison sees Skinput as filling a niche for highly portable, large-scale input devices, possibly complementing projectors. “Projectors already provide us with small devices that produce a large output, but input devices have not yet caught up,” he said. One variation of Harrison’s technology has a small projector (called a pico projector) attached to the Skinput arm band, which allows menu items and such to be projected onto the hand.

One application of Skinput that Harrison envisions is a way to control music players, which are already controlled by a limited number of buttons. He envisions a jogger tapping her thumb and forefinger together to skip tracks or her thumb and ring finger to increase the volume. This also takes advantage of the fact that we are aware of the position of our arms and fingers without looking at them (a phenomenon called proprioception, as explained on Harrison’s website, http://www.chrisharrison.net/projects/skinput/). This enables using devices without having to divert our attention to them. This could act as an extension of devices such as the iPod Shuffle, which already functions well without a screen.
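Once taps can be recognized, wiring them to player commands is just a dispatch table. The sketch below follows the jogging example above; the gesture names and command strings are invented, not part of any real Skinput API.

```python
# Hypothetical mapping from recognized finger-tap gestures to
# music-player commands, in the spirit of the jogging example.
GESTURE_ACTIONS = {
    ("thumb", "forefinger"): "next_track",
    ("thumb", "ring"): "volume_up",
    ("thumb", "middle"): "play_pause",
}

def handle_tap(finger_a, finger_b):
    # Unrecognized gestures are ignored rather than misfiring
    return GESTURE_ACTIONS.get((finger_a, finger_b), "ignored")

print(handle_tap("thumb", "forefinger"))  # -> next_track
```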
Whatever the case, Harrison’s projects are on the path to bringing humans and computers even closer together.