Kinect RGB Demo v0.5.0

Research.KinectRgbDemoV5 History


July 19, 2011, at 07:42 PM by 87.217.161.246 -
Added lines 3-6:
! [[KinectRgbDemoV6|%red%NEW: version 0.6]]%%

(:redirect KinectRgbDemoV6:)

June 14, 2011, at 01:56 AM by 180.24.85.237 -
Changed lines 25-26 from:
Please send your questions, patches, ... to mailto:rgbdemo@groups.google.com .
to:
Please send your questions, patches, ... to mailto:rgbdemo@googlegroups.com .
May 03, 2011, at 08:09 AM by 163.117.150.243 -
Changed line 35 from:
* OpenNI / Nite backend support. No more fun with the chessboard-based calibration, sorry. Thanks to Diererick/Roxlu for the initial CMake integration.
to:
* `OpenNI / Nite backend support. No more fun with the chessboard-based calibration, sorry. Thanks to Diererick/Roxlu for the initial CMake integration.
Changed lines 56-57 from:
You can get MacOSX universal (Intel only) binaries from there: [[http://sourceforge.net/projects/roboticslab/files/RGBDemo-0.5.0-Darwin.dmg/download|RGBDemo-0.5.0-Darwin.dmg]] (LGPL License).
to:
You can get `MacOSX universal (Intel only) binaries from there: [[http://sourceforge.net/projects/roboticslab/files/RGBDemo-0.5.0-Darwin.dmg/download|RGBDemo-0.5.0-Darwin.dmg]] (LGPL License).
Changed lines 107-110 from:
It has been tested with MinGW and Visual Studio 10 so far. Note that OpenNI backend is NOT available for Mingw.

You cannot use both libfreenect and OpenNI backends on Windows. You have to choose between one of them. By default, OpenNI backend will be compiled.
to:
It has been tested with `MinGW and Visual Studio 10 so far. Note that `OpenNI backend is NOT available for Mingw.

You cannot use both libfreenect and `OpenNI backends on Windows. You have to choose between one of them. By default, `OpenNI backend will be compiled.
Changed lines 115-117 from:
* Install OpenNI, SensorKinect, and Nite (in this order).
* Add QT bin path to the Path environment variable, or specify @@QMAKE@@ path in CMake
* Run CMake
to:
* Install `OpenNI, `SensorKinect, and Nite (in this order).
* Add QT bin path to the Path environment variable, or specify @@QMAKE@@ path in `CMake
* Run `CMake
Changed lines 122-124 from:
* Install OpenNI, SensorKinect, and Nite (in this order).
* Add QT bin path to the Path environment variable, or specify @@QMAKE@@ path in CMake
* Run CMake
to:
* Install `OpenNI, `SensorKinect, and Nite (in this order).
* Add QT bin path to the Path environment variable, or specify @@QMAKE@@ path in `CMake
* Run `CMake
Changed lines 132-133 from:
* Open the CMakeLists.txt in @@QtCreator@@ or compile manually using mingw-make.
to:
* Open the `CMakeLists.txt in @@QtCreator@@ or compile manually using mingw-make.
April 15, 2011, at 05:58 AM by 87.217.161.32 -
Changed lines 62-65 from:
* You will have to install OpenNI/Nite drivers. You can download them from [[http://openni.org|OpenNI]] website, or you can use the copy provided in the @@Drivers@@ directory.

* '''Important:''' You first need to install OpenNI, then SensorKinect, then Nite, in this order. You can find the license key for Nite on the same website. The free license for Kinect devices is @@0KOIk2JeIBYClPWVnMoRKn5cdY4=@@
to:
* You will have to install `OpenNI/Nite drivers. You can download them from the [[http://openni.org|OpenNI]] website, or you can use the copy provided in the @@Drivers@@ directory.

* '''Important:''' You first need to install `OpenNI, then `SensorKinect, then Nite, in this order. You can find the license key for Nite on the same website. The free license for Kinect devices is @@0KOIk2JeIBYClPWVnMoRKn5cdY4=@@
Changed line 70 from:
* The source includes a copy of OpenCV since Ubuntu packages are buggy. If you want to use an external OpenCV installation (>= 2.2), disable the USE_EXTERNAL_OPENCV flag in CMake or directly use the @@./linux_configure_external_opencv.sh@@ script.
to:
* The source includes a copy of `OpenCV since Ubuntu packages are buggy. If you want to use an external OpenCV installation (>= 2.2), disable the USE_EXTERNAL_OPENCV flag in CMake or directly use the @@./linux_configure_external_opencv.sh@@ script.
April 11, 2011, at 07:12 PM by 87.217.161.180 -
Changed line 73 from:
sudo apt-get install libboost1.42-dev libusb-1.0-0-dev libqt4-dev libgtk2.0-dev cmake libglew1.5-dev libgsl0-dev libglut3-dev libxmu-dev
to:
sudo apt-get install libboost-all-dev libusb-1.0-0-dev libqt4-dev libgtk2.0-dev cmake libglew1.5-dev libgsl0-dev libglut3-dev libxmu-dev
March 23, 2011, at 05:59 PM by 163.117.150.79 -
Added lines 62-65:
* You will have to install OpenNI/Nite drivers. You can download them from [[http://openni.org|OpenNI]] website, or you can use the copy provided in the @@Drivers@@ directory.

* '''Important:''' You first need to install OpenNI, then SensorKinect, then Nite, in this order. You can find the license key for Nite on the same website. The free license for Kinect devices is @@0KOIk2JeIBYClPWVnMoRKn5cdY4=@@

March 23, 2011, at 03:40 PM by 163.117.86.191 -
Changed line 69 from:
sudo apt-get install libusb-1.0-0-dev libqt4-dev libgtk2.0-dev cmake libglew1.5-dev libcv-dev libhighgui-dev libcvaux-dev libgsl0-dev libglut3-dev libxmu-dev
to:
sudo apt-get install libboost1.42-dev libusb-1.0-0-dev libqt4-dev libgtk2.0-dev cmake libglew1.5-dev libgsl0-dev libglut3-dev libxmu-dev
March 23, 2011, at 03:39 PM by 163.117.86.191 -
Changed lines 259-260 from:
* You can try an experimental interactive scene reconstruction mode using the @@build/bin/rgbd-reconstructor@@ program. This is similar to the interactive mapping of [[http://ils.intel-research.net/projects/rgbd|Intel RGBD]] but still in a very preliminary stage. The relative pose between image captures is determined using feature point matching and least-squares minimization.
to:
* You can try an experimental interactive scene reconstruction mode using the @@build/bin/rgbd-reconstructor@@ program. This is similar to the interactive mapping of [[http://ils.intel-research.net/projects/rgbd|Intel RGBD]] but still in a preliminary stage. The relative pose between image captures is determined using feature point matching and least-squares minimization.
Added lines 263-264:
* '''Note: ''' As of version 0.5.0, you can enable ICP refinement if @@NESTK_USE_PCL@@ is enabled (by default on Linux) and using the @@--icp@@ option.
March 23, 2011, at 02:56 PM by 163.117.150.79 -
Changed lines 50-51 from:
Attach:skeletor.png
to:
%height=320px% Attach:skeletor.png
March 23, 2011, at 02:55 PM by 163.117.150.79 -
Added lines 50-51:
Attach:skeletor.png
March 23, 2011, at 02:49 PM by 163.117.150.79 -
Changed lines 40-43 from:
You can have a look at the 3D freehand reconstruction on the following video:

(:youtube 2ml8GiUPTao:)
to:
You can have a look at the new 3D freehand reconstruction on the following video:

(:youtube Cldf7UdFq1k:)
March 22, 2011, at 11:22 AM by 163.117.150.243 -
Added lines 268-272:

!!! Body tracking and gesture recognition

* Launch @@rgbd-skeletor@@.
If you make the calibration pose, you should be able to see your joints. If you are interested in a minimal body tracking example, you can have a look at @@nestk/tests/test-nite.cpp@@. Enable the @@NESTK_BUILD_TESTS@@ cmake variable to compile it.
March 22, 2011, at 10:13 AM by 163.117.150.243 -
Changed lines 82-83 from:
* An install of libusb. You can install it manually from [[http://www.as3kinect.org/distribution/osx/libusb_1.0.pkg|AS3 Download]] or using the provided @@macosx_configure.sh@@ script below.
to:
* '''Note:''' as of version 0.5.0, libusb is included in the library, so no need to install it.
March 22, 2011, at 10:12 AM by 163.117.150.243 -
Added lines 236-239:
!!! Moving the Tilt motor

This is only possible with the @@libfreenect@@ backend. Open the @@Filters@@ window and you can set the Kinect tilt on the bottom slider.

March 22, 2011, at 10:11 AM by 163.117.150.243 -
Changed lines 163-164 from:
!!! Calibrating your Kinect
to:
!!! Calibrating your Kinect (libfreenect backend)
Changed lines 191-192 from:
!!! Running the viewer with calibration
to:
!!!! Running the viewer with calibration
March 22, 2011, at 10:10 AM by 163.117.150.243 -
Changed lines 50-52 from:

!!! Compilation on Linux (Ubuntu)
to:
!!! Running test programs from binaries

!!!! Mac binaries [[#macbin]]

You can get MacOSX universal (Intel only) binaries from there: [[http://sourceforge.net/projects/roboticslab/files/RGBDemo-0.5.0-Darwin.dmg/download|RGBDemo-0.5.0-Darwin.dmg]] (LGPL License).

!!!! Windows binaries [[#winbin]]

You can get Win32 binaries from there: [[http://sourceforge.net/projects/roboticslab/files/RGBDemo-0.5.0-Win32.zip/download|RGBDemo-0.5.0-Win32.zip]] (LGPL License).

!!! Compiling from source

!!!! Compilation on Linux (Ubuntu)
Changed lines 78-79 from:
!!! Compilation on Mac
to:
!!!! Compilation on Mac
Changed lines 99-104 from:
!!! Mac binaries [[#macbin]]

You can get MacOSX universal (Intel only) binaries from there: [[http://sourceforge.net/projects/roboticslab/files/RGBDemo-0.5.0-Darwin.dmg/download|RGBDemo-0.5.0-Darwin.dmg]] (LGPL License).

!!! Compilation on Windows
to:
!!!! Compilation on Windows
Deleted lines 127-131:
!!! Windows binaries [[#winbin]]

You can get Win32 binaries from there: [[http://sourceforge.net/projects/roboticslab/files/RGBDemo-0.5.0-Win32.zip/download|RGBDemo-0.5.0-Win32.zip]] (LGPL License).

March 22, 2011, at 10:08 AM by 163.117.150.243 -
Changed lines 94-102 from:
It has been tested with MinGW and Visual Studio 10 so far. OpenNI backend is not available for Mingw.

Here is a step-by-step procedure for MinGW, in case you want to use libfreenect:
* Install QT opensource for Windows. This will also install MinGW.
* Add C:\Qt\2010.05\MinGW\bin to the Path environment variable
* Install and run @@cmake@@ on rgbdemo
* Disable the NESTK_USE_OPENNI cmake variable
* Open the CMakeLists.txt in @@QtCreator@@ or compile manually using mingw-make.
to:
It has been tested with MinGW and Visual Studio 10 so far. Note that OpenNI backend is NOT available for Mingw.

You cannot use both libfreenect and OpenNI backends on Windows. You have to choose between one of them. By default, OpenNI backend will be compiled.

If you want to compile with libfreenect backend, you will first need to install the libfreenect drivers from [[http://openkinect.org/wiki/Getting_Started#Driver_installation|OpenKinect Windows]].
Changed lines 105-106 from:
* Open the generated solution in @@MSVC2010@@.
to:
* Open the generated solution in @@Visual Studio@@.
Changed lines 108-109 from:
*
to:
* Recompile QT for MSVC2010. Binaries provided for MSVC2008 unfortunately do not work with VS2010 (Runtime Error).
* Install OpenNI, SensorKinect, and Nite (in this order).
* Add QT bin path to the Path environment variable, or specify @@QMAKE@@ path in CMake
* Run CMake
* Open the generated solution in @@MSVC2010@@.

Here is a step-by-step procedure for MinGW, in case you want to use libfreenect:
* Install QT opensource for Windows. This will also install MinGW.
* Add C:\Qt\2010.05\MinGW\bin to the Path environment variable
* Install and run @@cmake@@ on rgbdemo
* Disable the NESTK_USE_OPENNI cmake variable
* Open the CMakeLists.txt in @@QtCreator@@ or compile manually using mingw-make.

Changed lines 125-126 from:
You will first need to install the libfreenect drivers. They are shipped with the archive, in the @@drivers@@ directory. When you plug your Kinect, specify the drivers location manually to the @@drivers\xbox nui motor@@ directory. If you missed it, then go into the device manager and update the drivers giving this location. More details there [[http://openkinect.org/wiki/Getting_Started#Driver_installation|OpenKinect Windows]].
to:
March 22, 2011, at 10:04 AM by 163.117.150.243 -
Changed lines 115-116 from:
You can get Win32 binaries from there: [[http://sourceforge.net/projects/roboticslab/files/RGBDemo-0.4.0-Win32.zip/download|RGBDemo-0.4.0-Win32.zip]] (LGPL License).
to:
You can get Win32 binaries from there: [[http://sourceforge.net/projects/roboticslab/files/RGBDemo-0.5.0-Win32.zip/download|RGBDemo-0.5.0-Win32.zip]] (LGPL License).
Added line 188:
Added line 190:
March 22, 2011, at 10:02 AM by 163.117.150.243 -
Changed lines 119-120 from:
!!! Running the viewer without calibration
to:
!!! Running the viewer
Changed lines 140-141 from:
!!! Switching between backends
to:
!!!! Switching between backends
Changed lines 147-148 from:
!!! High resolution mode
to:
!!!! High resolution mode
Changed lines 156-157 from:
Note: this is only necessary if you want to use the libfreenect backend.
to:
'''Note:''' this is only necessary if you want to use the libfreenect backend.
March 22, 2011, at 10:01 AM by 163.117.150.243 -
Added lines 140-153:
!!! Switching between backends

There are two supported backends for Kinect devices, @@libfreenect@@ and @@OpenNI/Nite@@. By default, if the @@NESTK_USE_OPENNI@@ CMake variable is enabled, demo programs will choose the OpenNI backend. If you want to switch to the libfreenect backend, you can use the @@freenect@@ command line option:
[@
build/bin/rgbd-viewer --freenect
@]

!!! High resolution mode

When using the OpenNI backend, you can enable high RGB resolution mode to get 1280x1024 color images @ 10Hz with the @@highres@@ option:
[@
build/bin/rgbd-viewer --highres
@]

Added lines 156-157:
Note: this is only necessary if you want to use the libfreenect backend.
Changed lines 189-190 from:
to:
'''New since RGBDemo v0.5.0''': if you are using the OpenNI backend, then the calibration parameters will be determined automatically.
Changed lines 223-224 from:
to:
'''Note: this is currently only available with libfreenect backend'''
Added lines 238-239:
* '''Note:''' You will also need a calibration file if you used the OpenNI backend to grab the images. You can get one by running the viewer and selecting @@File/Save calibration parameters@@.
March 22, 2011, at 09:52 AM by 163.117.150.243 -
Added line 20:
* Demo of gesture recognition and skeleton tracking using Nite
Changed lines 23-26 from:
to:
!!! Support

Please send your questions, patches, ... to mailto:rgbdemo@groups.google.com .

Added lines 48-50:
And a snapshot of the skeleton and hand point tracking here:

March 17, 2011, at 08:26 PM by 87.217.161.181 -
Changed lines 9-11 from:
This software was partly developed in the [[http://roboticslab.uc3m.es | RoboticsLab]] and aims at providing a simple toolkit to start playing with Kinect data and develop standalone programs. Features include:
to:
This software was partly developed in the [[http://roboticslab.uc3m.es | RoboticsLab]] and aims at providing a simple toolkit to start playing with Kinect data and develop standalone computer vision programs without the hassle of integrating existing libraries. The project is divided into a library called @@nestk@@ and some demo programs using it. The library itself is easy to integrate into an existing project using @@cmake@@: just copy the nestk folder as a subfolder of your project and you should be able to start working with Kinect data. You can get more information on the [[KinectUseNestk|nestk page]].

Current features include:
Changed lines 20-23 from:
* Linux, MacOSX and (partial) Windows support

The project is divided in a library called @@nestk@@ and some demo programs using it. The library itself is easy to integrate to an existing project using @@cmake@@: just copy the nestk folder as a subfolder of your project and you should be able to start working with Kinect data. You can get more information on the [[KinectUseNestk|nestk page]].

to:
* Linux, MacOSX and Windows support

Changed line 31 from:
* OpenNI / Nite backend support. No more fun with the chessboard-based calibration.
to:
* OpenNI / Nite backend support. No more fun with the chessboard-based calibration, sorry. Thanks to Diererick/Roxlu for the initial CMake integration.
Changed line 33 from:
* Much improved 3D freehand reconstruction.
to:
* Much improved 3D freehand reconstruction with optional ICP refinement. Thanks to Cristobal Belles.
Changed lines 54-55 from:
tar xvfz rgbdemo-0.4.0rc1-Source.tar.gz
cd rgbdemo-0.4.0rc1-Source
to:
tar xvfz rgbdemo-0.5.0-Source.tar.gz
cd rgbdemo-0.5.0-Source
Deleted lines 59-65:
* Note from Stéphane Magnenat about compilation on Ubuntu 10.04 :
[@
> For information, to compile KinectRgbDemo from [1] under Ubuntu 10.04, I had
> to add png12 to all target_link_libraries(...) as well as to had GLU to
> target_link_libraries(...) of rgbd-viewer.
@]

Changed lines 68-69 from:
tar xvfz rgbdemo-0.4.0rc1-Source.tar.gz
cd rgbdemo-0.4.0rc1-Source
to:
tar xvfz rgbdemo-0.5.0-Source.tar.gz
cd rgbdemo-0.5.0-Source
Changed lines 83-84 from:
You can get MacOSX universal (Intel only) binaries from there: [[http://sourceforge.net/projects/roboticslab/files/RGBDemo-0.4.0-Darwin.dmg/download|RGBDemo-0.4.0-Darwin.dmg]] (LGPL License).
to:
You can get MacOSX universal (Intel only) binaries from there: [[http://sourceforge.net/projects/roboticslab/files/RGBDemo-0.5.0-Darwin.dmg/download|RGBDemo-0.5.0-Darwin.dmg]] (LGPL License).
Changed lines 87-89 from:
It has been tested with MinGW and Visual Studio 10 so far. Here is a step-by-step procedure for MinGW:
to:
It has been tested with MinGW and Visual Studio 10 so far. OpenNI backend is not available for Mingw.

Here is a step-by-step procedure for MinGW, in case you want to use libfreenect:
Added line 93:
* Disable the NESTK_USE_OPENNI cmake variable
Added lines 96-105:
If you want to compile using Visual Studio 2008:
* Install QT binaries for MSVC2008.
* Install OpenNI, SensorKinect, and Nite (in this order).
* Add QT bin path to the Path environment variable, or specify @@QMAKE@@ path in CMake
* Run CMake
* Open the generated solution in @@MSVC2010@@.

If you want to compile using Visual Studio 2010:
*

March 13, 2011, at 07:21 PM by 87.217.161.181 -
Added lines 1-218:
(:title [=Kinect RGBDemo v0.5.0=]:)

(:htoc:)

[[<<]]

!! Demo software to visualize, calibrate and process Kinect cameras output

This software was partly developed in the [[http://roboticslab.uc3m.es | RoboticsLab]] and aims at providing a simple toolkit to start playing with Kinect data and develop standalone programs. Features include:
* Grab Kinect images and visualize / replay them
* Support for [[http://openkinect.org|libfreenect]] and [[http://openni.org|OpenNI/Nite]] backends
* Extract skeleton data / hand point position (Nite backend)
* Integration with [[http://opencv.willowgarage.com|OpenCV]] and [[http://pcl.ros.org|PCL]]
* Calibrate the camera to get point clouds in metric space (libfreenect)
* Export to meshlab/blender using .ply files
* Demo of 3D scene reconstruction using a freehand Kinect
* Demo of people detection and localization
* Linux, MacOSX and (partial) Windows support

The project is divided into a library called @@nestk@@ and some demo programs using it. The library itself is easy to integrate into an existing project using @@cmake@@: just copy the nestk folder as a subfolder of your project and you should be able to start working with Kinect data. You can get more information on the [[KinectUseNestk|nestk page]].

!!! Download

* Source code as an archive [[http://sourceforge.net/projects/roboticslab/files/RGBDemo-0.5.0-Source.tar.gz/download|RGBDemo-0.5.0-Source.tar.gz]] (LGPL License)
* Source code on [[https://github.com/nburrus/rgbdemo|github]]
* [[#macbin|MacOSX Intel binaries]]
* [[#winbin|Windows binaries]]

!!! New features since v0.4.0
* OpenNI / Nite backend support. No more fun with the chessboard-based calibration.
* Basic skeleton / gesture support using Nite.
* Much improved 3D freehand reconstruction.
* PCL integration.

You can have a look at the 3D freehand reconstruction on the following video:

(:youtube 2ml8GiUPTao:)

And at the people detection feature on the following video:

(:youtube nnCDOKLuu0g:)

!!! Compilation on Linux (Ubuntu)

* The source includes a copy of OpenCV since Ubuntu packages are buggy. If you want to use an external OpenCV installation (>= 2.2), disable the USE_EXTERNAL_OPENCV flag in CMake or directly use the @@./linux_configure_external_opencv.sh@@ script.
* Install required packages, e.g. on Ubuntu 10.10:
[@
sudo apt-get install libusb-1.0-0-dev libqt4-dev libgtk2.0-dev cmake libglew1.5-dev libcv-dev libhighgui-dev libcvaux-dev libgsl0-dev libglut3-dev libxmu-dev
@]

* Untar the source, use the provided scripts to launch cmake and compile:
[@
tar xvfz rgbdemo-0.4.0rc1-Source.tar.gz
cd rgbdemo-0.4.0rc1-Source
./linux_configure.sh
./linux_build.sh
@]

* Note from Stéphane Magnenat about compilation on Ubuntu 10.04 :
[@
> For information, to compile KinectRgbDemo from [1] under Ubuntu 10.04, I had
> to add png12 to all target_link_libraries(...) as well as to add GLU to
> target_link_libraries(...) of rgbd-viewer.
@]

!!! Compilation on Mac

You will need:
* An install of [[http://qt.nokia.com/downloads/qt-for-open-source-cpp-development-on-mac-os-x|QT]]
* An install of libusb. You can install it manually from [[http://www.as3kinect.org/distribution/osx/libusb_1.0.pkg|AS3 Download]] or using the provided @@macosx_configure.sh@@ script below.

Then run the following commands:
[@
tar xvfz rgbdemo-0.4.0rc1-Source.tar.gz
cd rgbdemo-0.4.0rc1-Source
./macosx_configure.sh
./macosx_build.sh
@]
The configure script might ask for libusb installation. Say yes if you don't have it installed.

If you still experience some issues with libusb, or have a custom install, you can try:
[@
cmake -DLIBUSB_1_INCLUDE_DIR=$HOME/libusb/include -DLIBUSB_1_LIBRARY=$HOME/libusb/lib/libusb-1.0.dylib build
@]
supposing that you have it installed in @@$HOME/libusb@@.

!!! Mac binaries [[#macbin]]

You can get MacOSX universal (Intel only) binaries from there: [[http://sourceforge.net/projects/roboticslab/files/RGBDemo-0.4.0-Darwin.dmg/download|RGBDemo-0.4.0-Darwin.dmg]] (LGPL License).

!!! Compilation on Windows

It has been tested with MinGW and Visual Studio 10 so far. Here is a step-by-step procedure for MinGW:
* Install QT opensource for Windows. This will also install MinGW.
* Add C:\Qt\2010.05\MinGW\bin to the Path environment variable
* Install and run @@cmake@@ on rgbdemo
* Open the CMakeLists.txt in @@QtCreator@@ or compile manually using mingw-make.

!!! Windows binaries [[#winbin]]

You can get Win32 binaries from there: [[http://sourceforge.net/projects/roboticslab/files/RGBDemo-0.4.0-Win32.zip/download|RGBDemo-0.4.0-Win32.zip]] (LGPL License).

You will first need to install the libfreenect drivers. They are shipped with the archive, in the @@drivers@@ directory. When you plug in your Kinect, point the driver installation manually at the @@drivers\xbox nui motor@@ directory. If you missed that step, go into the Device Manager and update the drivers with this location. More details at [[http://openkinect.org/wiki/Getting_Started#Driver_installation|OpenKinect Windows]].

!!! Running the viewer without calibration

* Binaries are in the @@build/bin/@@ directory, you can give it a try without calibration using:
[@
build/bin/rgbd-viewer
@]

If you get an error such as:

[@
libusb couldn't open USB device /dev/bus/usb/001/087: Permission denied.
libusb requires write access to USB device nodes.
FATAL failure: freenect_open_device() failed
@]

Give access rights to your user with:
[@
sudo chmod 666 /dev/bus/usb/001/087
@]
Or install the udev rules provided by libfreenect.
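Since the device node (@@/dev/bus/usb/001/087@@ above) changes every time the Kinect is re-plugged, a udev rule is more durable than a one-off @@chmod@@. Below is a minimal sketch of such a rule; the file name @@51-kinect.rules@@ and the Microsoft vendor/product IDs (045e:02ae for the camera, 045e:02b0 for the motor) are assumptions to check against your @@lsusb@@ output, and the file must be copied into @@/etc/udev/rules.d/@@ as root:

```shell
# Sketch of a udev rule giving all users read/write access to the Kinect
# USB devices. The vendor/product IDs are assumptions; verify with `lsusb`.
# The file is written locally here; install it to /etc/udev/rules.d/ as
# root, then re-plug the Kinect.
cat > 51-kinect.rules <<'EOF'
SUBSYSTEM=="usb", ATTR{idVendor}=="045e", ATTR{idProduct}=="02ae", MODE="0666"
SUBSYSTEM=="usb", ATTR{idVendor}=="045e", ATTR{idProduct}=="02b0", MODE="0666"
EOF
cat 51-kinect.rules
```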

!!! Calibrating your Kinect

A sample calibration file is provided in @@data/kinect_calibration.yml@@. However, you should be able to get a more accurate mapping by estimating new parameters for each Kinect. Below is the procedure I follow.

'''1.''' Build a calibration pattern as shown in [[KinectCalibration]]. You can use the @@Chessboard_A4.pdf@@ or @@Chessboard_A3.pdf@@ file in the @@data/@@ directory for this. I recommend printing the chessboard on a sheet of paper and gluing it onto a piece of cardboard. It is no longer necessary to cut the cardboard around the paper.

'''2.''' Grab some images of your chessboard using the viewer (File / Grab frame or Ctrl-G). WARNING: you need to grab images in Dual IR/RGB mode (enable it in the Capture menu). By default it will save them into directories @@grab1/view????@@. These directories contain the raw files @@raw/color.png@@, @@raw/depth.yml@@ and @@raw/intensity.png@@, which correspond to the color image, the depth image (in meters), and the IR image normalized to grayscale. You will also get an additional @@raw/depth.png@@, which is the depth image normalized to grayscale.
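For reference, the per-view layout described above can be mocked up and inspected from the shell. The file names come from the text; @@grab1/view0000@@ is just an example frame name (the viewer itself creates all of this when you grab a frame):

```shell
# Mock up the raw files of one grabbed frame, following the layout the
# text describes. This is only a sketch of the directory convention.
mkdir -p grab1/view0000/raw
touch grab1/view0000/raw/color.png      # color image
touch grab1/view0000/raw/depth.yml      # depth image, in meters
touch grab1/view0000/raw/intensity.png  # IR image normalized to grayscale
touch grab1/view0000/raw/depth.png      # depth normalized to grayscale
ls grab1/view0000/raw
```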

To get an optimal calibration, grabbed images should ensure the following:
* Cover as much of the image area as possible. In particular, check for coverage of the image corners.
* Try to get the chessboard as close as possible to the camera to get better precision.
* For depth calibration, you will need some images with IR and depth. But for stereo calibration, the depth information is not required, so feel free to cover the IR projector and get very close to the camera to better estimate IR intrinsics and stereo parameters. The calibration algorithm will automatically determine which grabbed images can be used for depth calibration.
* Move the chessboard with various angles.
* I usually grab a set of 30 images to average the errors.
* Typical reprojection error is < 1 pixel. If you get significantly higher values, it means the calibration failed.

'''3.''' Run the calibration program:
[@
build/bin/calibrate_kinect_ir --pattern-size 0.025 grab1
@]
The pattern size corresponds to the size in meters of one chessboard square. It should be 0.025 (25 mm) for the A4 model.

This will generate the @@kinect_calibration.yml@@ file storing the parameters for the viewer, and two files @@calibration_rgb.yaml@@ and @@calibration_depth.yaml@@ for ROS compatibility.

'''Note with Mac binaries''': if there is a @@grab1@@ directory in the current directory, it will be loaded automatically.

!!! Running the viewer with calibration

* Just give it the path to the calibration file:
[@
build/bin/rgbd-viewer --calibration kinect_calibration.yml
@]
'''New since RGBDemo v0.4.0''': if there is a @@kinect_calibration.yml@@ file in the current directory, it will be loaded automatically.

* You should get a window similar to this:
%height=240px% Attach:viewer_output_main_v2.png

* The main frame is the color-encoded depth image. By moving the mouse, you can see the distance in meters towards a particular pixel. Images are now undistorted.

* You can filter out some values and normalize the depth color range with the filter window (Show / Filters). The Edge filter is recommended.
%height=240px% Attach:viewer_output_filters.png

* You can get a very simple depth-threshold based segmentation with Show / Object Detector
%height=240px% Attach:viewer_output_detection.png

* You can get a 3D view in Show / 3D Window.
%height=240px% Attach:viewer_output_view3d_cloud.png

* By default you get a grayscale point cloud. You can activate color:
%height=240px% Attach:viewer_output_view3d_cloud_color.png

* And finally textured triangles:
%height=240px% Attach:viewer_output_view3d_triangles.png

* You can also save the mesh using the @@Save current mesh@@ button; it will store it into a @@current_mesh.ply@@ file that you can open with [[http://meshlab.sourceforge.net/|Meshlab]]:
%height=320px% Attach:viewer_output_meshlab.png

* Or import into [[http://www.blender.org/|Blender]]:
%height=320px% Attach:viewer_output_blender.png

* The associated texture is written into a @@current_mesh.ply.texture.png@@ file and can be loaded into the UV editor in Blender.

!!! Getting Infrared Images

* You can activate the IR mode in the capture menu. There is also a dual RGB/IR mode alternating between the two modes.
%height=320px% Attach:viewer_output_ir.png

!!! Replay mode

* You can grab RGBDImages using the @@File/Grab Frame@@ command. This stores the files into @@viewXXXX@@ directories (see the Calibration section), which can be replayed later using the fake image grabber. This can be activated using the @@--image@@ option:
[@
build/bin/rgbd-viewer --calibration kinect_calibration.yml --image grab1/view0000
@]

* You can also replay a sequence of images stored in a directory with the @@--directory@@ option:
[@
build/bin/rgbd-viewer --calibration kinect_calibration.yml --directory grab1
@]
This will cycle through the set of viewXXXX images inside the @@grab1@@ directory.
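The cycling order follows the numeric view names, which can be previewed with a plain shell glob. A minimal sketch, with directory names assumed to follow the @@viewXXXX@@ convention above:

```shell
# Simulate a grab directory with three numbered views, then list them in
# the order the --directory option would cycle through them.
mkdir -p grab1/view0000 grab1/view0001 grab1/view0002
for v in grab1/view????; do
  echo "replaying $v"
done
```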

!!! Interactive scene reconstruction

* You can try an experimental interactive scene reconstruction mode using the @@build/bin/rgbd-reconstructor@@ program. This is similar to the interactive mapping of [[http://ils.intel-research.net/projects/rgbd|Intel RGBD]] but still in a very preliminary stage. The relative pose between image captures is determined using feature point matching and least-squares minimization.

In this mode, point clouds will progressively be aggregated in a single reference frame using a Surfel representation.

!!! People detection

* Launch @@rgbd-people-tracker@@. You need to specify a configuration file. Here is an example of a full command line:
[@
build/bin/rgbd-people-tracker --calibration kinect_calibration.yml --config data/tracker_config.yml
@]
Calibration and config files will be loaded automatically if they are in the current directory.