Equipment/Kinect

From London Hackspace Wiki

A page all about our lovely Kinect and the things it can do.

Kinect for Windows XP, Vista or 7

After countless attempts at getting the Kinect to work with Windows, and all the drivers, patches and programs you have to install, I came across this bundle installer that does it all for you. It also gives you some demos, plus the ability to control the mouse, mouse buttons and key presses using gestures.

This is the website where you can download the package and the readme.

[1]

You might then need to re-install the drivers once you plug the Kinect in. This is done simply by pointing the "found new hardware" wizard at 'C:/Program Files/Prime Sense/Sensor/Driver' for the Kinect motor. Ignore the Kinect microphone, but repeat the same steps for the Kinect sensor when it shows up as new hardware.

Hope this helps.

Chris Paton Chris-robot

Libfreenect / OSX / libusb

https://github.com/OpenKinect/libfreenect has the skinny on the actual driver. Basically, you do need a patched, older version of libusb. This was true of the Cinder Kinect block and ofxKinect as well. The Homebrew install doesn't seem to work, but I can't be sure of that.

You will need to pull the matching version of libusb for this patch. This is NOT v1.0.8; it is a change based on the repo head as of 2010-10-16. To get a tar.gz snapshot of the repo at that point, use the link below.

http://git.libusb.org/?p=libusb.git;a=snapshot;h=7da756e09fd97efad2b35b5cee0e2b2550aac2cb;sf=tgz;js=1

Once you've downloaded that tarball and unpacked it somewhere, patch it using the files in platform/osx/. Just go to the root directory of the libusb source and run

patch -p1 < [path_to_OpenKinectRepo]/platform/osx/libusb-osx-kinect.diff

You need to tell configure to include some necessary frameworks: ./configure LDFLAGS="-framework IOKit -framework CoreFoundation"

Recompile libusb and put it wherever CMake will look (/usr/local/lib, /usr/lib, etc.). If you're using a package manager like fink, MacPorts, or Homebrew, I'm going to expect you know what you're doing and can deal with this. If not, ask in the IRC channel.
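
Pulling the above together, here is a minimal sketch of the whole libusb step, using the snapshot URL and patch command from above. The output filename and install prefix are assumptions, and whether curl fetches the gitweb snapshot cleanly is untested, so adjust to taste:

# fetch the libusb snapshot linked above (output filename is an assumption)
curl -L -o libusb-snapshot.tar.gz "http://git.libusb.org/?p=libusb.git;a=snapshot;h=7da756e09fd97efad2b35b5cee0e2b2550aac2cb;sf=tgz;js=1"
tar xzf libusb-snapshot.tar.gz
cd libusb-*
# apply the Kinect patch from the OpenKinect checkout
patch -p1 < [path_to_OpenKinectRepo]/platform/osx/libusb-osx-kinect.diff
# configure with the OSX frameworks, then build and install to /usr/local so CMake can find it
./configure LDFLAGS="-framework IOKit -framework CoreFoundation"
make
sudo make install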

OpenNI

OpenNI is a newfangled system for the creation of fancy user interfaces. It accepts modules and many other things (middleware called NITE, apparently).

SensorKinect is one such project; it was originally made by PrimeSense for Windows and Linux and has since been ported to OSX.

Obviously, the openFrameworks community got right on it and there is some pretty sneaky stuff here :)


Steps to Reproduce on OSX

To begin with, there are a few things to install. Basic OpenNI and NITE need to be installed first; both are already available as beta downloads on the OSX page. When installing the NITE binaries, you need to enter 0KOIk2JeIBYClPWVnMoRKn5cdY4= as the licence key (which is needed for some reason! :S). There is source code for OpenNI, and apparently for NITE as well, but I've not managed to get that properly built yet.
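
If the NITE installer doesn't prompt for the key, it can usually be registered by hand with OpenNI's niLicense tool; a minimal sketch, assuming the OpenNI install put niLicense on your PATH:

# register the NITE licence key with OpenNI (assumes niLicense is installed and takes vendor + key)
sudo niLicense PrimeSense 0KOIk2JeIBYClPWVnMoRKn5cdY4=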

I never tried the actual prebuilt samples contained in OpenNI! That might have saved a lot of time. They don't work now, but maybe they can be made to work.

Once these are installed, we can then attack the 3 projects above in the order they are given in the instructions.

# get repositories
mkdir openni_dev
cd openni_dev
mkdir nite
git clone git@github.com:roxlu/OpenNI.git
git clone git@github.com:roxlu/SensorKinect.git
# compile OpenNI
cd OpenNI/Platform/Mac/Bin/openFrameworks
./build.sh
make
sudo make install
cd ../../../../..    # back to openni_dev
# compile SensorKinect
cd SensorKinect/Platform/Mac/Bin/openFrameworks
./build.sh
make
sudo make install
cd ../../../../..    # back to openni_dev
# do *something* with NITE
cp OpenNI/Platform/Mac/Bin/openFrameworks/nite* nite/
cd nite
./nite_copy_to_openframeworks.sh
./nite_change_rpaths


Compiling OpenNI can be an issue with CMake, as it looks for sample/ofxOpenNI, which doesn't exist. Comment this out in the CMakeLists.txt file, then run the build scripts and make; it builds OK.
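
A rough sketch of doing that edit from the command line; the file location and the exact wording of the offending line are assumptions, so check your checkout first:

cd OpenNI/Platform/Mac/Bin/openFrameworks
grep -n ofxOpenNI CMakeLists.txt                        # find the offending reference
sed -i.bak 's|^\(.*ofxOpenNI.*\)|# \1|' CMakeLists.txt  # comment it out, keeping a .bak copy
./build.sh
make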

With these two built, you'll see that an OpenNI_openFrameworks directory has been created. This contains most of the bits you need, such as the libraries.

It is recommended that you compile and run two test programs, Sample-NiViewer and Sample-NiUserTracker. The first should show the streams from the two sensors working, if all is well. The second actually loads the middleware, I believe, and performs the skeleton tracking. This is the important one to get working, and it didn't work until I installed the proper NITE binaries.
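
A rough sketch of running them once built; the output directory is an assumption, so your samples may land somewhere slightly different in the tree:

cd OpenNI/Platform/Mac/Bin/Release   # assumption: built samples end up here
./Sample-NiViewer                    # should show the depth and image streams if all is well
./Sample-NiUserTracker               # loads the NITE middleware and does the skeleton tracking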

The ofxOpenNI plugin is fairly straightforward. Move it to your apps/myapps folder and open the project. You need to add the libOpenNI.dylib that you built in the previous section. Before I installed the Mac NITE binaries, this compiled and ran but didn't work.

ROS

This is a Linux (and MacPorts) system for robotics. It's been touted by Ladyada and a few others as being rather cool. Probably should look into it. Some interesting resources are:

There are probably more. It's installed on my VM; it's quite a large package.

3D Stereo

This YouTube clip has some thoughts on using the Kinect for stereo images. Apparently you can buy the film here.

Is this similar to a DS display? Probably.

Kinect and Vuzix

Adafruit have this video (http://www.adafruit.com/blog/2010/12/21/vr-kinect/) showing the Vuzix and Kinect working together. Again, it relies on stereo tracking and similar techniques.