MadSci LAB partnered with Intel Perceptual Computing Group to host NYC’s first Perceptual Hackathon.
Over two days, a group of creative minds and interaction explorers ‘hacked’ Intel’s Perceptual Computing SDK, looking for innovative uses and novel interactions.
Intel’s Perceptual Computing SDK features an interactive gesture camera and a full framework with tools for hand and finger tracking with a skeletal model, face tracking, and speech recognition, and it pushes interactions that use several sensing modes simultaneously. (Read more about our experience with it here!)
On the first day, participants installed and familiarized themselves with the SDK, formed groups, and began exploring the initial ideas that later became the final projects developed on the second day of the hackathon.
Below is a brief description of some of the resulting projects:
Valerie Chen and Youjin Shin created an original Twitter feed visualization paired with a gestural approach that lets users ‘pinch’ a tweet and dive into its geolocation on a map:
Paul Cheng explored bi-manual input and hand tracking for a fly-through game in 3D space:
Rui Pereira combined hand and finger tracking for pointing with speech-recognized voice commands to pull the trigger in his bang!bang! ‘point and shout’ game:
Brett Renfer (w/ Adam Lassy) combined finger tracking and SpaceBrew for a networked ‘poking’ duel game:
Jo HanByul and Sanniti Pimpley used finger tracking and gestures for a super-cute puppet-controlled game:
In the end, Quinn Kennedy won the hackathon by combining face tracking, speech recognition, and physical computing to bring to life an expressive robot that follows users around a space and politely replies to their conversation.
We’re so happy that such a great group of creative people joined us in this adventure with the Intel Perceptual Computing Group team. A huge THANK YOU! to everyone who participated, and another to the Intel team, who supported us with such attention and dedication!