NIME2020 Poster

Inexpensive Colour Tracking to Overcome Performer ID Loss

Bob Pritchard
UBC School of Music
6361 Memorial Road
Vancouver, Canada
bob@mail.ubc.ca
Ian Lavery
Algorhythmic Software Design
308 – 988 16th Ave W.
Vancouver, Canada
ian_lavery@hotmail.com

OVERVIEW

Inexpensive single-camera motion tracking systems using active infrared (e.g. Kinect, Orbbec) are often used to generate data for artistic performance. The Kinect Controlled Artistic Sensing System (KiCASS) is one such environment, developed in 2016 at the University of British Columbia and upgraded in subsequent years. KiCASS began as a project between undergraduate Engineering students and the School of Music digital performance ensemble SUBCLASS. It uses a Kinect-for-Windows© camera, a high-end Windows laptop, a router, and custom-written NuiTrack-based software, and now allows users to select up to 22 tracking points as well as four hand positions on up to six performers. While it can track six performers, it is generally used to track one or two dancers or musicians. It has been used in dozens of studio and public performances and gallery installations, with the generated data controlling audio, video, processing, and/or lighting.

THE PROBLEMS

1. Tracking Loss

  • A performer can be occluded by another performer or object, or may exit and re-enter the tracking area
  • When this occurs, the system loses track of the performer
  • The system creates a new tracking ID when it reacquires the performer
  • The original tracking points on the performer are lost, so no performance data is generated

2. Tracking Swapping

  • The system will occasionally swap the IDs of onscreen performers
  • Performers lose their assigned target points and are thus unable to control the media

THE SOLUTION

To overcome these issues, we developed colour tracking to override system-assigned performer IDs. We generate consistent performer IDs based on costume colour.

Colour Registration of Performers

  • We take a 6-frame analysis of each dancer’s dominant hue, using the quadrilateral created by the joint-tracked shoulders and hips, together with NuiTrack’s user mask
  • The image is converted from RGB to HSV (Hue, Saturation, Value) using Emgu CV, the .NET wrapper for the OpenCV library
  • A histogram of all hue values in the torso region is created; the hue that occurs most frequently is selected as the dominant hue of the torso (see the sketch after this list)
  • During registration we also select the target points to be tracked on each dancer.
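The registration step can be sketched roughly as follows. This is a minimal illustration in Python/OpenCV rather than the Emgu CV (.NET) code the system actually uses; the function name, the 180-bin hue histogram, and the way the six frames are combined are our assumptions.

```python
import cv2
import numpy as np

def register_dominant_hue(frames_bgr, torso_quads, user_masks):
    """Estimate a performer's dominant torso hue over a 6-frame registration window.

    frames_bgr  : list of BGR camera frames
    torso_quads : per-frame 4x2 arrays of the tracked shoulder and hip joint positions
    user_masks  : per-frame 8-bit masks from the tracker isolating this performer
    """
    total = np.zeros((180, 1), dtype=np.float32)
    for frame, quad, mask in zip(frames_bgr, torso_quads, user_masks):
        # Restrict analysis to the quadrilateral formed by the shoulders and hips...
        region = np.zeros(frame.shape[:2], dtype=np.uint8)
        cv2.fillConvexPoly(region, quad.astype(np.int32), 255)
        # ...and to the pixels the tracker attributes to this performer.
        region = cv2.bitwise_and(region, mask)

        # Convert to HSV and histogram the hue channel inside the masked region.
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        total += cv2.calcHist([hsv], [0], region, [180], [0, 180])

    # The most frequent hue across the six frames becomes the registered colour.
    return int(np.argmax(total))
```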

Data Check

The system checks the incoming HSV values from each performer every 500 msec.

  • A Confidence Value (CV) is generated by comparing the incoming colour with the registered colours
  • CV = 100 – distance-from-match; a value of 0 means the performer has been lost
  • On reacquisition, the software determines the dominant incoming colour and assigns it to the registered colour with the closest distance-from-match (see the sketch after this list)
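The check and reacquisition logic might look roughly like the sketch below, computing hue distance on OpenCV’s circular 0–179 hue scale. The mapping of that distance onto the 0–100 confidence range, and the handling of an undetected performer, are assumptions, since the distance metric is not specified above.

```python
def hue_distance(h1, h2):
    """Circular distance between two OpenCV hues (0-179); the maximum is 90."""
    d = abs(h1 - h2) % 180
    return min(d, 180 - d)

def confidence(current_hue, registered_hue):
    """CV = 100 - distance-from-match; a CV of 0 means the performer has been lost."""
    if current_hue is None:            # performer not detected at all
        return 0.0
    # Scaling the 0-90 circular hue distance onto 0-100 is our assumption.
    return max(0.0, 100.0 - hue_distance(current_hue, registered_hue) * (100.0 / 90.0))

def reacquire(current_hue, registered_hues):
    """Assign the incoming dominant colour to the registered colour (performer ID)
    with the closest distance-from-match. registered_hues maps performer ID -> hue."""
    return min(registered_hues, key=lambda pid: hue_distance(current_hue, registered_hues[pid]))
```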

Optimization

Every 500 msec each performer’s colour is reanalyzed

  • After three consecutive confirmed colour matches for all performers, the system locks the matches until the CV for any performer falls below a set threshold
  • If a CV falls below that threshold, the system runs its optimization routines (see the sketch after this list)
  • If a performer’s colour now matches a registered colour at a shorter distance than its current match, the colour IDs are swapped between the two users identified with those colours
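A rough sketch of this lock-and-swap pass is given below, reusing the hue helpers from the previous sketch (repeated so the block stands alone). The threshold value, the per-performer bookkeeping, and the ColourTracker class itself are illustrative assumptions, not the system’s actual code.

```python
CONFIDENCE_THRESHOLD = 70              # illustrative; no specific value is given above

def hue_distance(h1, h2):              # repeated from the previous sketch
    d = abs(h1 - h2) % 180
    return min(d, 180 - d)

def confidence(hue, registered_hue):
    return max(0.0, 100.0 - hue_distance(hue, registered_hue) * (100.0 / 90.0))

class ColourTracker:
    def __init__(self, registered_hues):
        self.registered = registered_hues   # colour ID -> registered hue
        self.assignment = {}                # tracker user ID -> colour ID
        self.confirmed_passes = 0
        self.locked = False

    def register(self, user_id, colour_id):
        """Populated during colour registration."""
        self.assignment[user_id] = colour_id

    def update(self, current_hues):
        """Run once per 500 ms pass. current_hues maps user ID -> dominant hue."""
        def cvs():
            return {uid: confidence(h, self.registered[self.assignment[uid]])
                    for uid, h in current_hues.items() if uid in self.assignment}

        scores = cvs()
        if self.locked:
            if scores and min(scores.values()) >= CONFIDENCE_THRESHOLD:
                return scores               # all matches still good: stay locked
            self.locked = False             # a CV fell below threshold: re-optimize

        # If a user's hue is now closer to another registered colour than to its
        # current one, swap the colour IDs of the two users involved.
        for uid, hue in current_hues.items():
            best = min(self.registered,
                       key=lambda cid: hue_distance(hue, self.registered[cid]))
            if uid in self.assignment and best != self.assignment[uid]:
                other = next((u for u, c in self.assignment.items() if c == best), None)
                if other is not None:
                    self.assignment[uid], self.assignment[other] = best, self.assignment[uid]

        # Lock after three consecutive passes in which every performer matches.
        scores = cvs()
        if scores and min(scores.values()) >= CONFIDENCE_THRESHOLD:
            self.confirmed_passes += 1
            self.locked = self.confirmed_passes >= 3
        else:
            self.confirmed_passes = 0
        return scores
```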

Performance

  • Data (X, Y, Z coordinates of the tracked points plus the CV) are sent over Wi-Fi in OSC format (see the sketch after this list)
  • Clients receive the data and use Max or Pd to control audio/video, processing, and lighting
  • The trapezoidal tracking area has a depth of 8 metres and a far width of 6 metres
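The data path can be sketched with any OSC client; the example below uses the python-osc package. The /kicass address namespace, the port, the broadcast address, and the argument order are purely illustrative, as the actual KiCASS message format is not documented here.

```python
from pythonosc.udp_client import SimpleUDPClient

# Destination address and port are assumptions for this sketch.
client = SimpleUDPClient("192.168.1.255", 9000)

def send_point(performer_id, joint_name, x, y, z, cv):
    # e.g. /kicass/1/torso -> [x, y, z, confidence value]
    client.send_message(f"/kicass/{performer_id}/{joint_name}", [x, y, z, cv])

# In Max, such messages can be received with [udpreceive 9000] and routed by address;
# in Pd, with [netreceive -u -b 9000] followed by [oscparse].
```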

Outcomes

  • Successfully tracked four dancers to control the amplitude of individual audio tracks
  • Amplitude was based on the distance of the torso from the camera
  • Occlusion or exit from the tracking area caused the associated audio track to fade out
  • Re-entry/reacquisition caused the associated audio track to fade in
  • Issues with shine from theatrical lighting on spandex leotards
  • Problems with shadows being cast on performers
  • The performers and choreographer enjoyed being able to control audio levels through basic positioning, without interfering with other performance movements

Future work

  • Colour registration will include having performers move through the space, to account for lighting differences
  • Addition of limb tracking and target points to increase control and artistic possibilities
  • Addition of the P.I.’s e-textile sensors to increase control opportunities

Acknowledgements

We acknowledge that the University of British Columbia is located on the traditional, ancestral, and unceded territory of the hən̓q̓əmin̓əm̓-speaking xʷməθkʷəy̓əm (Musqueam) people.