To create our cool 3D launching interaction, we are using computer vision to capture finger positions. The easiest way to do this is to track the fingertips with colored markers.
The only downside to this method is that the user will now need to wear some sort of colored marker in order to interact with our system (or paint their nails / fingers). For our application we will probably use the ends of colored latex gloves to provide the tracked color without the mess.
The main pro of this approach is more robust finger tracking, which seems pertinent for this high-stakes game situation (we don’t want the Insane Llamas to win, do we?). Although there are compromises that could be made (open-hand and closed-hand gesture recognition would be easier than marker-less finger tracking), we think this approach will provide the best game experience.
To track the fingers we will be using OpenCV, which has a port for iOS devices. With some sample code from AI Shack I was able to get a prototype working that tracks two colors, in this case red and yellow. In the image below you can see the paths of each object’s center of mass and the thresholded images used to detect the objects.
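At its core, that kind of color tracker is just a per-pixel threshold plus a center-of-mass calculation on the resulting mask. Here's a minimal sketch of the idea in Python/NumPy; the actual prototype uses OpenCV's equivalents (`inRange` and `moments`), and the HSV bounds below are made-up placeholders, not tuned values:

```python
import numpy as np

def track_color(frame_hsv, lower, upper):
    """Threshold an HSV frame to a binary mask and return the blob's
    center of mass (cx, cy), or None if no pixels matched.
    This mirrors what OpenCV's inRange + moments would compute."""
    lower = np.asarray(lower)
    upper = np.asarray(upper)
    # Keep pixels whose H, S, and V all fall inside the bounds.
    mask = np.all((frame_hsv >= lower) & (frame_hsv <= upper), axis=-1)
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None
    # Center of mass of the mask = mean pixel coordinate.
    return (float(xs.mean()), float(ys.mean()))

# Toy 4x4 "frame": a saturated red patch (hue ~0) in the top-left corner.
frame = np.zeros((4, 4, 3), dtype=np.uint8)
frame[0:2, 0:2] = (0, 200, 200)  # H, S, V
print(track_color(frame, (0, 100, 100), (10, 255, 255)))  # → (0.5, 0.5)
```

Running this per frame for each color (one set of bounds for red, one for yellow) and appending the returned centroids to a list gives exactly the center-of-mass paths shown in the image.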
The next step is to determine pinch and release events for the two colors. Once that is complete, we can determine the trajectory and force for our Kiwis!
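One straightforward way to get those events is to watch the distance between the two tracked centroids and apply a little hysteresis, so noise near the threshold doesn't cause the pinch state to flicker. A sketch of that idea, where `PinchDetector` and the pixel thresholds are hypothetical names and values, not anything from the prototype:

```python
import math

# Hypothetical thresholds in pixels; real values would be tuned on-device.
PINCH_DIST = 30.0
RELEASE_DIST = 45.0  # larger than PINCH_DIST, giving a hysteresis band

class PinchDetector:
    """Turn per-frame centroid pairs into discrete pinch/release events."""
    def __init__(self):
        self.pinched = False

    def update(self, red, yellow):
        d = math.dist(red, yellow)
        if not self.pinched and d < PINCH_DIST:
            self.pinched = True
            return "pinch"
        if self.pinched and d > RELEASE_DIST:
            self.pinched = False
            return "release"
        return None  # no state change this frame

det = PinchDetector()
events = [det.update(r, y) for r, y in [
    ((0, 0), (100, 0)),  # fingers apart
    ((0, 0), (20, 0)),   # fingers close together
    ((0, 0), (35, 0)),   # inside the hysteresis band: no flicker
    ((0, 0), (80, 0)),   # fingers apart again
]]
print(events)  # → [None, 'pinch', None, 'release']
```

On a release event, the recent history of the pinched centroid's positions gives a direction and speed, which is what we'd map to trajectory and launch force.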