Which sensor backend?


RGBDemo supports several sensors:

  • PMD camera. Enable the NESTK_USE_PMDSDK cmake variable.
  • Kinect for Xbox. You can use freenect or OpenNI. Freenect is lighter, but it does not provide skeleton tracking and is less tested on Windows.
  • Kinect for Windows. You can use OpenNI with the latest avin2 drivers, or the Kinect for Windows SDK from Microsoft, which is recommended if you are on Windows. In that case, enable the NESTK_USE_KIN4WIN cmake variable (see the example after this list).
  • Asus Xtion Pro Live. Only the OpenNI backend supports it.
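
Each backend corresponds to a cmake variable, so you can also switch backends when invoking cmake directly. This is only a sketch; it assumes a build directory that has already been configured by the platform scripts described below:

# Example: build against the Kinect for Windows SDK instead of OpenNI.
cmake -DNESTK_USE_KIN4WIN=ON -DNESTK_USE_OPENNI=OFF build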

Installing OpenNI


In the recent github codebase, OpenNI is not included anymore, so you need to install it manually.

You need to install several things: OpenNI itself, the SensorKinect (avin2) driver, and NITE, in that order.
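
On Linux, for instance, the binary packages typically ship an install.sh script at their root. The following is only a sketch, with the archive names left as globs since they depend on the versions you downloaded:

# Run the installers in order: OpenNI, then the SensorKinect driver, then NITE.
cd OpenNI-Bin-Dev-Linux-*/ && sudo ./install.sh && cd ..
cd Sensor-Bin-Linux-*/ && sudo ./install.sh && cd ..
cd NITE-Bin-Dev-Linux-*/ && sudo ./install.sh && cd ..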

Compilation on Linux (Ubuntu)


  • OpenCV >= 2.2 is required. It can be downloaded from the OpenCV website. Note: OpenCV 2.3.1 is not supported in RGBDemo 0.6.0; it is supported in the latest github version, though.
  • An optional install of PCL (in that case, enable the NESTK_USE_PCL cmake variable; see the example after this list)
  • Install the required packages, e.g. on Ubuntu 10.10:
sudo apt-get install libboost-all-dev libusb-1.0-0-dev libqt4-dev libgtk2.0-dev cmake libglew1.5-dev libgsl0-dev libglut3-dev libxmu-dev
  • Untar the source, then use the provided scripts to run cmake and compile:
tar xvfz rgbdemo-0.6.1-Source.tar.gz
cd rgbdemo-0.6.1-Source
./linux_configure.sh
./linux_build.sh
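
If you enabled the optional PCL support, one way is to pass the variable to cmake and rebuild. This is a sketch, assuming the configure script created a build directory (as in the libusb example in the Mac section):

cmake -DNESTK_USE_PCL=ON build
./linux_build.sh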

Compilation on Mac


You will need:

  • An install of Qt
  • An install of OpenCV >= 2.2.
  • An optional install of PCL (in that case, enable the NESTK_USE_PCL cmake variable)
  • Note: as of version 0.5.0, libusb is bundled with the library, so there is no need to install it.

Then run the following commands:

tar xvfz rgbdemo-0.6.0-Source.tar.gz
cd rgbdemo-0.6.0-Source
./macosx_configure.sh
./macosx_build.sh

The configure script might ask for libusb installation. Say yes if you don’t have it installed.

If you still experience some issues with libusb, or have a custom install, you can try:

cmake -DLIBUSB_1_INCLUDE_DIR=$HOME/libusb/include -DLIBUSB_1_LIBRARY=$HOME/libusb/lib/libusb-1.0.dylib build

supposing that you have it installed in $HOME/libusb.

Compilation on Windows


Compilation has been tested with MinGW and Visual Studio 2010 so far. Note that the OpenNI backend is NOT available for MinGW.

You cannot use both the libfreenect and OpenNI backends on Windows; you have to choose one of them. By default, the OpenNI backend will be compiled.
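
For example, to get the libfreenect backend instead, disable the OpenNI variable when configuring. A sketch, assuming an already-configured build directory:

cmake -DNESTK_USE_OPENNI=OFF build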

You might also want to check the nice tutorial contributed by David Jones: http://razorvision.tumblr.com/post/15056357377/how-to-compile-rgbdemo-0-6-1 .

UPDATE: An updated tutorial for Windows Compilation has been contributed by Eduard Kosel: http://www.midnight-tech.de/component/content/article/1-kinect/1-compiling-rgbdemo-07-easily

If you want to compile with the libfreenect backend, you will first need to install the libfreenect drivers from the OpenKinect Windows page.

If you want to compile using Visual Studio 2008:

  • Install the Qt binaries for MSVC 2008.
  • Install OpenCV >= 2.2.0 from source.
  • An optional install of PCL (in that case, enable the NESTK_USE_PCL cmake variable)
  • Install OpenNI, SensorKinect, and NITE (in that order).
  • Add the Qt bin path to the Path environment variable, or specify the QMAKE path in CMake.
  • Run CMake (see the example after this list).
  • Open the generated solution in Visual Studio.
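
A sketch of the CMake step for Visual Studio 2008, run from a command prompt; the Qt path and source directory below are placeholders for your own locations:

cd rgbdemo-0.6.1-Source
mkdir build
cd build
cmake -G "Visual Studio 9 2008" -DQT_QMAKE_EXECUTABLE=C:/Qt/4.7.4/bin/qmake.exe ..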

If you want to compile using Visual Studio 2010:

Here is a step-by-step procedure for MinGW, in case you want to use libfreenect:

  • Install the Qt open source package for Windows. This will also install MinGW.
  • Add C:\Qt\2010.05\MinGW\bin to the Path environment variable
  • Install CMake and run it on the rgbdemo sources
  • Disable the NESTK_USE_OPENNI cmake variable
  • Open the CMakeLists.txt in Qt Creator, or compile manually using mingw32-make (see the example below).
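
A sketch of the manual MinGW route, run from the rgbdemo source directory; the build directory name is just a convention:

mkdir build
cd build
cmake -G "MinGW Makefiles" -DNESTK_USE_OPENNI=OFF ..
mingw32-make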