Chant KinesicsKit v3 - Build Humanlike Interfaces for Natural User Experience

Humanize your application interfaces with movement tracking using natural user interface (NUI) technology. Movement tracking is the process of mapping movement from image and positional data captured by cameras. It can be used to control data processing and data entry. As a new modality for a natural user interface, it can augment traditional input methods applications use with keyboard, mouse, touch, and voice.

What is Movement Management?
Movement management enables you to:

- capture image data and detect movement,
- map and process movement data for application processing,
- persist movement data for playback, editing, and analytics, and
- integrate sensor device configuration and control as part of deployed applications.

Application benefits include:

- touch-free modality for control and input,
- added flexibility to run on demand based on movement detection, and
- expanded deployment scenarios for interactive processing in non-traditional application environments.

What is KinesicsKit?
Chant KinesicsKit handles the complexities of tracking movement with Microsoft Kinect for Windows.

It simplifies the process of managing movement with Kinect sensors and the Microsoft Natural User Interface API (NAPI). You can process audio and visual data to map movement directly within software you develop and deploy.

KinesicsKit includes C++, C++Builder, Delphi, Java, and .NET Framework class library formats to support your preferred programming language, along with sample projects for popular IDEs such as the latest Visual Studio from Microsoft and RAD Studio from Embarcadero.

The class libraries can be integrated with 32-bit and 64-bit applications.

Movement Management Component Architecture
The KinesicsKit class library includes a movement management class that provides a simple way to track and map movement with Microsoft Kinect for Windows.

The movement management class, ChantKM, enables you to start and stop color, depth, body, and audio data collection with Microsoft Kinect sensors. Your application can also use the KinectSensor and adjunct classes to manage low-level functions if desired.

With the ChantKM class, you can detect movement, process color, depth, and body data, and record audio to a file. The ChantKM class manages the activities of interacting with the Microsoft Kinect sensor on behalf of your application, managing the resources and interacting directly with the Natural User Interface API (NAPI) runtime.
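As a rough illustration, the sketch below shows the kind of start/stop calls such a class implies. The header name and the method names (StartBodyData, StartColorData, StopBodyData, StopColorData) are assumptions for illustration only, not the verified KinesicsKit API:

    // Hypothetical C++ sketch: header and method names are illustrative
    // assumptions, not the documented KinesicsKit API.
    #include "ChantKM.h"    // assumed KinesicsKit header

    int main()
    {
        ChantKM km;             // movement management object

        km.StartBodyData();     // begin collecting body (skeleton) frames
        km.StartColorData();    // begin collecting color frames

        // ... application maps and processes movement data here ...

        km.StopColorData();     // stop the color stream
        km.StopBodyData();      // stop the body stream
        return 0;
    }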

Your application receives status notifications through event callbacks.
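For example, a callback-based notification might be wired up as in the following sketch; the handler signature, event-argument type, and registration method are illustrative assumptions rather than the documented API:

    // Hypothetical C++ sketch: the handler signature and registration
    // call are illustrative assumptions, not the documented API.
    #include "ChantKM.h"

    // Invoked by ChantKM when a new body frame is available (assumed event).
    void OnBodyFrame(ChantKM* sender, BodyFrameEventArgs* args)
    {
        // Inspect tracked joint positions from args here.
    }

    int main()
    {
        ChantKM km;
        km.RegisterBodyFrameHandler(OnBodyFrame); // assumed registration call
        km.StartBodyData();
        // ... run the application's message loop ...
        km.StopBodyData();
        return 0;
    }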

Speech recognition and synthesis are supported with the SpeechKit ChantSR and ChantTTS classes. See SpeechKit for more information about integrating speech technology.

The ChantKM class encapsulates the NAPI functions, making movement tracking with Microsoft Kinect sensors simple and efficient; it handles the low-level sensor activities directly so your application does not have to.

You instantiate a ChantKM class object when you want to start tracking movement within your application, and you destroy the object to release its resources when you no longer need movement tracking.
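A minimal lifecycle sketch follows, under the same naming assumptions as the earlier examples (only the ChantKM class name comes from this page):

    // Hypothetical C++ sketch of the create/use/destroy lifecycle.
    #include "ChantKM.h"

    void TrackMovementSession()
    {
        ChantKM* km = new ChantKM(); // instantiate before tracking starts

        km->StartBodyData();         // assumed start call
        // ... track and map movement ...
        km->StopBodyData();          // assumed stop call

        delete km;                   // destroy to release sensor resources
    }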
