Intel Perceptual Computing SDK – Demo 001: Hands and Fingers!

This past September, Intel's perceptual computing group announced the beta version of its SDK together with the Interactive Gesture Camera, a small depth-sensing camera produced by Creative Labs, not much bigger than a webcam and focused on close-range interaction.

The SDK is pretty ambitious and interesting in its approach to multi-modal interfaces: it combines several computer vision techniques (including hand and finger tracking, face recognition and analysis, 2D and 3D object recognition and, apparently, eye tracking) with Nuance's speech recognition engine.
All this multimodal sensing, when purposefully combined, will no doubt augment and amplify users' expressiveness on computers.

We got our Interactive Gesture Camera and SDK kit right before the holidays, and as soon as the year started we couldn't help but play with it and with some of the features available.
Here is our first demo: we played with hand and finger tracking, and it's pretty sweet how easy it is to get the data and play with it right out of the box. Intel was careful and attentive enough to release examples for almost every creative computing framework available (Processing, OpenFrameworks, Cinder and Unity!).
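
For reference, getting at the raw tracking data through the SDK's C++ utility layer looks roughly like the sketch below. The names follow the beta's UtilPipeline interface as we understand it, so treat the labels and fields as illustrative and double-check the SDK headers before copying anything:

```cpp
#include "util_pipeline.h"  // Intel Perceptual Computing SDK utility layer

int main() {
    UtilPipeline pipeline;
    pipeline.EnableGesture();              // turn on the hand/finger tracking module
    if (!pipeline.Init()) return 1;

    while (pipeline.AcquireFrame(true)) {  // block until a new frame arrives
        PXCGesture *gesture = pipeline.QueryGesture();

        PXCGesture::GeoNode hand;
        if (gesture->QueryNodeData(0,
                PXCGesture::GeoNode::LABEL_BODY_HAND_PRIMARY,
                &hand) >= PXC_STATUS_NO_ERROR) {
            // hand.positionWorld is the 3D hand position (meters);
            // hand.openness (0..100) says how open the hand is, which
            // is what we use below to switch between AM and FM.
        }

        PXCGesture::GeoNode finger;
        if (gesture->QueryNodeData(0,
                PXCGesture::GeoNode::LABEL_BODY_HAND_PRIMARY |
                PXCGesture::GeoNode::LABEL_FINGER_INDEX,
                &finger) >= PXC_STATUS_NO_ERROR) {
            // finger.positionWorld is the index fingertip position;
            // the other four fingers have analogous labels.
        }

        pipeline.ReleaseFrame();
    }
    pipeline.Close();
    return 0;
}
```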
To demonstrate some of the possibilities of individual finger tracking, hand tracking, and using the openness of the hand as an interface, we created a very simple gestural synthesizer:

The user can generate up to 5 (carrier) oscillators (one per finger) with inter-dependent frequencies that are added together (additive synthesis). These are then modulated by another oscillator (the modulator), either in AM (Amplitude Modulation) when the hand is closed or in FM (Frequency Modulation) when it is open. The finger and hand positions determine the frequencies of their respective oscillators, allowing for some interesting expressiveness (but not always an amazing sound... remember, this is a very basic synthesis engine!).
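
In code, the per-sample logic boils down to something like this minimal sketch. It is not the demo's actual engine, just an illustration of the additive-plus-modulation scheme described above; the modulation depth and scaling constants are arbitrary choices of ours:

```cpp
#include <cmath>

// Five carriers summed (additive synthesis), then one modulator applied
// in AM (hand closed) or FM (hand open). Constants are illustrative.
struct GestureSynth {
    static constexpr double kSampleRate = 44100.0;
    static constexpr double kTwoPi = 6.28318530717958647692;
    double carrierPhase[5] = {};  // phase in cycles, wrapped to [0, 1)
    double modPhase = 0.0;

    // fingerFreqs: one frequency per tracked finger (Hz), mapped from the
    // finger positions; modFreq: mapped from the overall hand position;
    // handOpen: an open hand selects FM, a closed hand selects AM.
    float nextSample(const double fingerFreqs[5], double modFreq, bool handOpen) {
        const double mod = std::sin(kTwoPi * modPhase);
        modPhase = std::fmod(modPhase + modFreq / kSampleRate, 1.0);

        double sum = 0.0;
        for (int i = 0; i < 5; ++i) {
            double freq = fingerFreqs[i];
            if (handOpen)
                freq += 50.0 * mod;  // FM: deviate each carrier by up to 50 Hz
            sum += std::sin(kTwoPi * carrierPhase[i]);
            carrierPhase[i] = std::fmod(carrierPhase[i] + freq / kSampleRate, 1.0);
        }
        sum /= 5.0;  // normalize the additive mix

        if (!handOpen)
            sum *= 0.5 * (1.0 + mod);  // AM: modulator shapes the amplitude
        return static_cast<float>(sum);
    }
};
```

Call nextSample() once per audio sample from your audio callback, feeding it frequencies mapped from the tracking data above.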

Right now the system doesn't yet feel entirely stable, with some of the finger tracking behaving a bit inconsistently, but bear in mind this is still a very early beta version of the SDK and not everything is exposed and/or definitive.
We're looking forward to playing more with it and building more cool examples of possible interactions!

Feel free to grab the source code from our GitHub repo,
or just download the executable, plug in your Interactive Gesture Camera and have fun!
