TaSTE is the Tracking and Smart Textiles Environments project. It combines two previous projects: the Kinect-Controlled Artistic Sensing System (KiCASS) and the Responsive User Body Suit (RUBS). KiCASS uses Kinect-for-Windows hardware and custom-written software to track dancers in performance, using the resulting data to control audio, video, and costume and theatre lighting. RUBS consists of costuming with e-textile sensors and/or on-body lighting, with the sensor data used to control audio and video triggering and processing. Combining the various media results in a new, more sophisticated performance environment.
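At its core, this kind of control flow reduces to range-mapping: a tracked coordinate or sensor reading is scaled into a parameter value for audio, video, or lighting. A minimal sketch of that idea, with the `map_range` helper and all numeric ranges assumed purely for illustration (the actual KiCASS/RUBS mappings are not described here):

```python
def map_range(value, in_min, in_max, out_min, out_max):
    """Linearly map value from [in_min, in_max] to [out_min, out_max], clamping to the input range."""
    value = max(in_min, min(in_max, value))
    t = (value - in_min) / (in_max - in_min)
    return out_min + t * (out_max - out_min)

# Hypothetical example: a tracked hand height of 1.2 m within a 0-2 m range,
# scaled to a 0-127 lighting-dimmer value.
dimmer = map_range(1.2, 0.0, 2.0, 0, 127)
```

The clamping step matters in performance settings, where tracking glitches can briefly report values outside the expected range.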
TaSTE is led by Prof. Bob Pritchard (University of British Columbia), with input from Profs. Keith Hamel (Music) and Sid Fels (Engineering). Prof. Nancy Paris (British Columbia Institute of Technology) carried out conductive stitching pattern analysis as part of the project. Dancers/choreographers include Ziyian Kwan and Emmalena Fredriksson, and dancers Sarah Wasik, Danielle Lee, Samantha Krystal, Gemma Tomasky, Hanna Van Inwegen, and Kayla Price. Musicians include Kiran Bhumber, Paulo Bortolussi, Margaret Lancaster, Danielle Lee, and David Owen. Textile design and costume development are done by Alaia Hamer, and code refactoring is carried out by Ian Lavery.
Student researchers have included musicians Chantelle Ko, Emily Richardson, and Carlos Savall Guardiola, dancers Gemma Tomasky and Emily Daily, and doctoral composer/programmers Brian Topp and Michael Ducharme. The first wireless unit for RUBS was created by engineers Jin Han, Carol Fu, Lily Shao, and Esther Mutinda, and the KiCASS and KiCASS II systems were created by Isaac Cheng, Russil Glover, Kelsey Hawley, Kevin Hui, Michael Sargent, Dominique Low, Theresa Mammarella, Geoff Shaw, Emmett Tan, and Justin Wang. The wireless RUBS ESP32 augmented boards were designed and created by Daniel Tsui, who also wrote the code for sensor input and lighting control.