For the last couple of months I’ve been working with Formiga Design, helping out on a couple of projects. One of these was a mapping show for Coca-Cola’s 70th anniversary, with both real-time and pre-rendered content. Here is a video presenting some of the goodies that made up the day & night versions.
Since the release of OpenNI, I’ve wanted to find a way to keep the skeleton-tracking features while still having access to things such as motor control, LED options, etc. As OpenNI is an API for the PrimeSensor, it makes sense that it doesn’t include any of these features; after all, it’s not their problem (speculation). Well, recently I found a thread in the OpenNI Groups from a guy named Michael Kuzmin that made me feel stupid about how simple it is to get both working together. Windows only, though.
And I quote:
“Simply install OpenNI and after that CL NUI. Then all three components (audio, motor, camera) should be managed by CL NUI.
Now just go to your hardware configuration and delete the driver for the NUI Camera. When you have done that, the OpenNI driver should be found by your OS.
If your OS installs the NUI driver again, just go to your CL NUI installation dir and rename the “driver” folder, so it can’t find it again.”
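The last step of the quote can be scripted. Below is a minimal sketch of the folder-rename trick in Python; the `disable_cl_nui_driver` helper and its behaviour are my own invention, and the actual CL NUI install path is machine-specific, so it is passed in as a parameter rather than guessed.

```python
import os

def disable_cl_nui_driver(install_dir):
    """Rename the CL NUI "Driver" folder so Windows can no longer
    find and re-install the camera driver (the trick quoted above).

    `install_dir` is your CL NUI installation directory; the exact
    location varies per machine, so no default is assumed here.
    """
    driver_dir = os.path.join(install_dir, "Driver")
    if not os.path.isdir(driver_dir):
        return None  # nothing to do: folder absent or already renamed
    disabled = driver_dir + "_disabled"
    os.rename(driver_dir, disabled)
    return disabled
```

After running this (and deleting the NUI Camera driver in Device Manager), the OpenNI driver should be the only one the OS can find.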
Another of my recent experiments with the Kinect. Since I keep wanting to play with particles running on the GPU, I thought it would be much more fun to connect them to the Kinect. I came up with this: a million particles form the user and disintegrate as time goes by. It ends up looking like a man made of sand. Have a look:
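The actual effect runs on the GPU, but the core idea can be sketched on the CPU in a few lines: spawn a particle per depth pixel belonging to the user, then let each one drift and fade. Everything below (the `Particle` class, the depth threshold, the drift constants) is a hypothetical illustration, not the real shader code.

```python
import random

class Particle:
    def __init__(self, x, y, z):
        self.x, self.y, self.z = x, y, z
        self.life = 1.0  # fades to 0 as the figure "disintegrates"

    def step(self, dt):
        # Sand-like motion: small random jitter plus a slow fall.
        self.x += random.uniform(-0.5, 0.5) * dt
        self.y -= 0.98 * dt
        self.life -= 0.2 * dt

def emit_from_depth(depth, threshold=1500):
    """Spawn one particle per depth pixel that belongs to the user.

    `depth` is a row-major 2D list of millimetre values; pixels closer
    than `threshold` are treated as the user (an assumption -- real
    user segmentation comes from the tracking middleware).
    """
    particles = []
    for y, row in enumerate(depth):
        for x, z in enumerate(row):
            if 0 < z < threshold:
                particles.append(Particle(x, y, z))
    return particles

def update(particles, dt):
    # Age every particle and drop the ones that have fully faded away.
    for p in particles:
        p.step(dt)
    return [p for p in particles if p.life > 0]
```

On the GPU the same logic lives in a compute or vertex shader, with the depth map sampled as a texture; the CPU version just makes the flow readable.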
This is how it all started:
During my Christmas holidays I had time to spare and some ideas to develop, so, with all the party feeling and booze, I started working on the one I wanted the most. This is a cloth simulation running on the CPU that you can interact with using the Kinect. With the 3D information from the Kinect I’m able to deform the cloth in some cool ways. The most interesting is plain and simple: offset each connection point by the depth at that point. With that done, I moved on to the visual side (fun, fun!), playing with per-pixel lighting and some cool patterns.
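That “plain and simple” deformation can be sketched in a few lines. The grid representation and function names below are my own, assuming the depth map has already been resampled to the cloth’s resolution:

```python
def make_grid(w, h, spacing=1.0):
    # Cloth as a grid of connection points; each point is [x, y, z].
    return [[[x * spacing, y * spacing, 0.0] for x in range(w)]
            for y in range(h)]

def apply_depth_offset(grid, depth, scale=0.001):
    """Push each connection point along z by the Kinect depth value
    sampled at that point. `depth` is assumed to be a 2D array of
    millimetre values matching the grid resolution; `scale` converts
    millimetres into the cloth's world units.
    """
    for gy, row in enumerate(grid):
        for gx, point in enumerate(row):
            point[2] = depth[gy][gx] * scale
    return grid
```

In the real simulation this offset would feed into the constraint solver each frame, so the rest of the cloth reacts to the displaced points.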
Then I added self-shadowing to the mixture, plus some color control just for the fun and looks of it.
In the end I thought it would be nice to take my carpet out for a cleaning. Happy holidays!
OpenNI is a library created by PrimeSense, quite famous for its ability to use the Kinect to track users and feed us a usable skeleton per user. Since I wanted to give skeleton tracking a try and play around with it, I had to have this. So I did. I ended up with a wrapper around the API. It’s open-source and available to everyone. It’s not perfect and needs more work and more features, but it’s a start. Use it at your own risk and with your favourite creative framework. It’s possible.