Anyone who has worked with Kinect will tell you that the information you get out of the sensor is somewhat noisy. This is particularly problematic if you’re doing avateering: you can’t just apply the raw skeletal data directly to a model - you’ll need to do some filtering, such as Holt double exponential smoothing.
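For context, here is a minimal sketch of what that kind of per-joint smoothing looks like, assuming a plain Holt double exponential filter in Python. The `HoltFilter` class, the alpha/beta values, and the tuple-based joint representation are illustrative assumptions, not the parameters or code we actually shipped:

```python
# Holt double exponential smoothing applied per joint coordinate.
# One filter instance per tracked joint, fed a new position every frame.

class HoltFilter:
    """Smooths a stream of 3D joint positions given as (x, y, z) tuples."""

    def __init__(self, alpha=0.5, beta=0.5):
        self.alpha = alpha    # data smoothing factor (0..1)
        self.beta = beta      # trend smoothing factor (0..1)
        self.level = None     # smoothed position estimate
        self.trend = None     # smoothed velocity (trend) estimate

    def update(self, position):
        if self.level is None:
            # First sample: initialise the level, assume no trend yet.
            self.level = tuple(position)
            self.trend = (0.0, 0.0, 0.0)
            return self.level

        new_level = []
        new_trend = []
        for x, s_prev, b_prev in zip(position, self.level, self.trend):
            # Standard Holt update: blend the new sample with the
            # previous estimate projected forward by its trend.
            s = self.alpha * x + (1 - self.alpha) * (s_prev + b_prev)
            b = self.beta * (s - s_prev) + (1 - self.beta) * b_prev
            new_level.append(s)
            new_trend.append(b)

        self.level = tuple(new_level)
        self.trend = tuple(new_trend)
        return self.level


# Usage: hypothetical noisy samples for a single joint.
left_knee = HoltFilter(alpha=0.4, beta=0.2)
for raw_position in [(0.10, 0.52, 2.1), (0.12, 0.50, 2.1), (0.45, 0.51, 2.1)]:
    print(left_knee.update(raw_position))
```

Because the filter also tracks a trend (velocity) term, it follows genuine movement with less lag than simple exponential smoothing while still damping frame-to-frame jitter.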
We were nearing the deadline for an interactive installation when we got a very odd report: the client said that while everything was working perfectly in 99% of cases, when one particular QA user tried the application the avatar’s legs suddenly started kicking around wildly.
This made no sense to us; the algorithms we were using were in no way person-specific. I asked them to send us a video of how the render looked, and sure enough, the legs were flailing around like an epileptic octopus.
Intrigued, we moved to the next step in debugging and checked a recording of the raw Kinect data. This immediately showed us what the problem was: Kinect was seeing the user as a floating torso.
What gives?