Published on Mar 15, 2012
Fourth-year ECE project at Western University.
The Android application running on the tablet connects to a server hosting an embedded Flash application, which displays a live video stream from an Xbox Kinect mounted at the robot arm's point of view (the Kinect was chosen as a cheap source of depth perception). The user selects an object, and the object's coordinates are transmitted to an image processor, which outlines the object using a region growing technique. The outlined image is then composited into the live webcam feed and streamed back through the server.

Once the user confirms that the correct object was highlighted, the server sends a "go" signal to the data processor. The data processor reads the object's coordinates from an SQL database and forwards them to the robotic arm controller, which runs an inverse kinematics algorithm in Matlab. The arm then converges on the object and (hopefully) grabs it!
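The project's actual image processor isn't shown here, but the region growing step it describes can be sketched in a few lines: starting from the pixel the user tapped, flood outward and accept neighbouring pixels whose intensity is close to the seed's. This is an illustrative Python sketch, not the project's code; the function name, tolerance parameter, and toy image are all invented for the example.

```python
from collections import deque

def region_grow(img, seed, tol=10):
    """Simple region growing: flood out from the seed pixel,
    accepting 4-connected neighbours whose intensity is within
    `tol` of the seed's intensity. Returns a boolean mask of
    the grown region (the object's outline can be taken from
    its boundary)."""
    h, w = len(img), len(img[0])
    sy, sx = seed
    seed_val = img[sy][sx]
    mask = [[False] * w for _ in range(h)]
    mask[sy][sx] = True
    queue = deque([(sy, sx)])
    while queue:
        y, x = queue.popleft()
        for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            ny, nx = y + dy, x + dx
            if (0 <= ny < h and 0 <= nx < w and not mask[ny][nx]
                    and abs(img[ny][nx] - seed_val) <= tol):
                mask[ny][nx] = True
                queue.append((ny, nx))
    return mask

# Toy "image": a bright 2x2 object on a dark background,
# with the user's tap landing on the object at (1, 1).
img = [
    [0,   0,   0,   0],
    [0, 200, 205,   0],
    [0, 198, 202,   0],
    [0,   0,   0,   0],
]
mask = region_grow(img, (1, 1), tol=10)
grown = sum(row.count(True) for row in mask)  # 4 object pixels found
```

With a Kinect depth map instead of intensities, the same idea segments an object by growing over pixels at a similar depth, which is one reason depth perception makes selection robust.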
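The final step, solving inverse kinematics so the arm can reach the object's coordinates, was done in Matlab in the project. As a hedged illustration of the idea, here is the standard closed-form solution for a 2-link planar arm in Python; the function name, link lengths, and target point are assumptions for the example, and a real arm would have more joints and use a numerical solver.

```python
import math

def ik_2link(x, y, l1, l2):
    """Analytic inverse kinematics for a 2-link planar arm:
    given a target (x, y) and link lengths l1, l2, return the
    joint angles (elbow-up solution) in radians."""
    r2 = x * x + y * y
    # Law of cosines gives the elbow angle.
    c2 = (r2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    if not -1.0 <= c2 <= 1.0:
        raise ValueError("target out of reach")
    theta2 = math.acos(c2)
    theta1 = math.atan2(y, x) - math.atan2(l2 * math.sin(theta2),
                                           l1 + l2 * math.cos(theta2))
    return theta1, theta2

# Forward-kinematics check: the returned angles should place the
# end effector back on the requested target point.
t1, t2 = ik_2link(1.0, 1.0, 1.0, 1.0)
ex = math.cos(t1) + math.cos(t1 + t2)
ey = math.sin(t1) + math.sin(t1 + t2)
```

Verifying the answer with forward kinematics, as in the last lines, is a cheap sanity check before commanding real hardware.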