Daniel Tsui demonstrating the work he and Jack Griffiths have done this week on HMM gesture tracking.
Danielle showing how it’s really done, controlling audio samples and light patterns
Rod Sakakibara adding sensors to a clarinet while still allowing it to be played in the normal fashion. To enable that, the sensors must be mounted in the keypads, and the keypads must still seal properly.
The irritating tone is just to show that the data is changing — this isn’t what the results of controlling audio will sound like!
Danielle and gesture/pose tracking triggering audio samples and light presets
Danielle pose tracking triggering audio samples
Over the course of this week (5/3), Jack and Daniel went up the hill…. erm… wrong story… worked on gesture tracking with the KiCASS system. Thanks to the ml.lib library and some data smoothing, they were able to create a patch that tracks gestures reliably. In theory there is no limit on how many gestures can be recorded… but some say that by gesture number 8 you will have forgotten which gesture belongs to which ID number/sound/data… More to follow — proof-of-concept demos will be developed this coming week.
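For anyone curious what "record a gesture, give it an ID, and match live input against it" looks like in code, here is a minimal sketch of that pipeline. To be clear, this is an illustrative assumption, not the actual KiCASS/ml.lib patch (which uses HMM classification inside Max); the smoothing and nearest-template matching here just show the general idea, and all names (`smooth`, `match_gesture`) are hypothetical.

```python
def smooth(samples, alpha=0.3):
    """Exponential moving average to damp sensor jitter (illustrative
    stand-in for the data smoothing mentioned above)."""
    out = []
    prev = samples[0]
    for s in samples:
        prev = alpha * s + (1 - alpha) * prev
        out.append(prev)
    return out

def resample(seq, n=16):
    """Resample a gesture to a fixed length so sequences are comparable."""
    if len(seq) == 1:
        return [seq[0]] * n
    return [seq[round(i * (len(seq) - 1) / (n - 1))] for i in range(n)]

def match_gesture(live, templates):
    """Return the ID of the recorded gesture template closest to the
    live input (nearest-template matching, standing in for the HMM)."""
    live_r = resample(smooth(live))
    best_id, best_dist = None, float("inf")
    for gid, tmpl in templates.items():
        tmpl_r = resample(smooth(tmpl))
        dist = sum((a - b) ** 2 for a, b in zip(live_r, tmpl_r))
        if dist < best_dist:
            best_id, best_dist = gid, dist
    return best_id

# Two recorded gestures: ID 1 is a rising sweep, ID 2 a falling sweep.
templates = {1: [0, 1, 2, 3, 4, 5], 2: [5, 4, 3, 2, 1, 0]}
# A noisy live performance of the rising sweep maps back to ID 1.
print(match_gesture([0.1, 0.9, 2.2, 2.8, 4.1, 5.0], templates))  # → 1
```

Each matched ID can then trigger the corresponding audio sample or light preset, which is essentially what the demos show above.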