A tutorial video on the Kinect package's new intrinsic camera calibration procedure. It uses a semi-transparent grid as a calibration target, and users manually fit virtual grids to the observed grids in both the color and depth image streams. By capturing a number of grids from a variety of distances and viewing angles, the software can calculate the projection parameters of both cameras, resulting in a physically accurate 3D reconstruction and proper mapping of the color image onto the reconstructed 3D geometry.
The calibration application is RawKinectViewer from the Kinect package.
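To illustrate what the calibrated projection parameters are used for, here is a minimal C++ sketch of the standard pinhole-camera math behind the reconstruction and color-mapping steps described above: unprojecting a depth pixel into 3D space with the depth camera's intrinsics, then reprojecting that point into the color image. This is not the Kinect package's actual API; all type and parameter names are hypothetical, and the numeric values are placeholders.

```cpp
#include <cstdio>

// Hypothetical container for the intrinsic parameters recovered by calibration.
struct Intrinsics {
    double fx, fy;   // focal lengths in pixels
    double cx, cy;   // principal point in pixels
};

struct Point3 { double x, y, z; };

// Unproject a depth pixel (u, v) with metric depth z into 3D camera space.
Point3 unprojectDepth(const Intrinsics& depth, double u, double v, double z) {
    Point3 p;
    p.x = (u - depth.cx) / depth.fx * z;
    p.y = (v - depth.cy) / depth.fy * z;
    p.z = z;
    return p;
}

// Project a 3D point into the color image to find the matching color pixel.
// (A real pipeline would first apply the depth-to-color extrinsic transform;
// here both cameras are assumed to share one frame for simplicity.)
void projectColor(const Intrinsics& color, const Point3& p, double& u, double& v) {
    u = p.x / p.z * color.fx + color.cx;
    v = p.y / p.z * color.fy + color.cy;
}

int main() {
    Intrinsics depthCam = {570.0, 570.0, 320.0, 240.0};  // example values only
    Intrinsics colorCam = {525.0, 525.0, 320.0, 240.0};  // example values only

    Point3 p = unprojectDepth(depthCam, 400.0, 300.0, 1.5);  // pixel at 1.5 m depth
    double cu, cv;
    projectColor(colorCam, p, cu, cv);
    std::printf("3D point (%.3f, %.3f, %.3f) maps to color pixel (%.1f, %.1f)\n",
                p.x, p.y, p.z, cu, cv);
    return 0;
}
```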
Update: after publishing the video, I found out that the format of the Kinect's factory calibration data has been reverse-engineered. I am updating my Kinect software to use it in lieu of custom calibration data. More details here: http://doc-ok.org/?p=313
More information at:
http://doc-ok.org/?p=289
http://idav.ucdavis.edu/~okreylos/Res...
http://idav.ucdavis.edu/~okreylos/Res...