By DigitalRune Team on Tuesday, May 01, 2012

This is a guest post by Kairat Aitpayev.

Augmented Reality (AR) technology allows adding virtual objects to the real world using special markers. By merging it with the Microsoft Kinect, I have created a new approach that lets users use their own body as a collision object in the 3D world and interact with augmented objects.
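The core idea can be sketched as follows: each tracked limb (a pair of Kinect skeleton joints plus a thickness) is treated as a capsule, and virtual objects are tested against it. This is only an illustrative Python sketch, not the project's actual C#/DigitalRune code; the joint positions and radii are hypothetical.

```python
import math

def closest_point_on_segment(a, b, p):
    """Closest point to p on the segment from a to b (3D tuples)."""
    ab = tuple(b[i] - a[i] for i in range(3))
    ap = tuple(p[i] - a[i] for i in range(3))
    ab_len2 = sum(c * c for c in ab)
    t = 0.0 if ab_len2 == 0 else max(0.0, min(1.0, sum(ap[i] * ab[i] for i in range(3)) / ab_len2))
    return tuple(a[i] + t * ab[i] for i in range(3))

def capsule_hits_sphere(joint_a, joint_b, limb_radius, center, sphere_radius):
    """Treat a limb (two tracked joints + thickness) as a capsule and
    test it against a virtual sphere in the augmented scene."""
    q = closest_point_on_segment(joint_a, joint_b, center)
    return math.dist(q, center) <= limb_radius + sphere_radius

# Hypothetical tracked positions for a forearm (elbow -> wrist), in meters:
elbow, wrist = (0.0, 1.2, 2.0), (0.3, 1.0, 2.0)
print(capsule_hits_sphere(elbow, wrist, 0.05, (0.2, 1.1, 2.0), 0.1))  # True: the ball touches the arm
```

In the real project a physics engine (such as DigitalRune Physics) would handle these tests and the collision response; the sketch only shows why two joints per limb are enough to build a usable collision body.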


Here is a presentation of the demo application, which is itself part of a larger project related to online sport education:

(A full video of the entire project is also available.)

The project was created in the MultiMedia Lab of Kazakh-British Technical University (Kazakhstan) in collaboration with University of Technology of Belfort-Montbeliard (France).

DigitalRune libraries are used to improve the interaction between the real and the virtual world.

By DigitalRune Team on Tuesday, April 03, 2012

Our last post about real-time motion capture using Kinect seems to be pretty popular. Considering the small amount of time (only a few hours) we put into that example, the results were satisfying – but we can do better! We have updated the project and included a brand new sample. The new sample uses skeleton mapping to animate 3D models using Kinect. The code is much simpler, and swapping out the 3D model is a lot easier. We have also updated the example application to use Kinect SDK v1.0 instead of the Kinect SDK Beta 2. You can download the sample project (including the source code) at the end of this post.
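The essence of skeleton mapping is a name-to-name table from Kinect joints to the model's bones, so changing the 3D model only means changing the table. The sketch below is a simplified Python illustration, not the actual C# sample; the bone names are hypothetical, and a real mapper (like DigitalRune's skeleton mapping) must also retarget bone orientations between the two skeletons' rest poses.

```python
# Hypothetical mapping from Kinect joint names to model bone names.
KINECT_TO_MODEL = {
    "ShoulderLeft": "L_UpperArm",
    "ElbowLeft": "L_Forearm",
    "HipCenter": "Pelvis",
}

def map_pose(kinect_pose, model_pose):
    """Copy each tracked joint's rotation onto the mapped model bone.
    Poses are dicts of name -> rotation (Euler angles here, for brevity)."""
    for kinect_joint, bone in KINECT_TO_MODEL.items():
        if kinect_joint in kinect_pose:
            model_pose[bone] = kinect_pose[kinect_joint]
    return model_pose

# One frame of (hypothetical) tracked data: a bent left elbow.
skeleton = map_pose({"ElbowLeft": (90.0, 0.0, 0.0)}, {})
print(skeleton)  # {'L_Forearm': (90.0, 0.0, 0.0)}
```

Because the mapping is data, not code, exchanging the model reduces to editing the table to match the new rig's bone names.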


Here are the results of the new sample application:

By DigitalRune Team on Tuesday, January 03, 2012

A while ago we discussed in the forum how to use the Microsoft Kinect SDK to animate a character model in real time and how to apply movement constraints (e.g. joint rotation limits). Yesterday our company got a new Kinect device, and I figured I would spend the rest of the afternoon trying to write a small XNA sample.
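The simplest form of such a movement constraint is clamping each joint's rotation to per-axis limits before applying it to the model, which keeps noisy Kinect data from bending joints in anatomically impossible ways. A minimal sketch (in Python rather than C#; the elbow limits below are hypothetical, not medically sourced):

```python
def clamp_joint(angles, limits):
    """Clamp per-axis joint angles (degrees) to their [min, max] limits."""
    return tuple(max(lo, min(hi, a)) for a, (lo, hi) in zip(angles, limits))

# Hypothetical elbow limits: flexion 0..150 degrees, almost no twist or side bend.
ELBOW_LIMITS = [(0, 150), (-10, 10), (-5, 5)]
print(clamp_joint((170, 0, -20), ELBOW_LIMITS))  # (150, 0, -5)
```

A production solution would clamp quaternion rotations relative to the bind pose (e.g. via swing/twist decomposition) rather than raw Euler angles, but the idea is the same.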

Here is the result: (You can download the source code below the video.)