Humanoid Multimodal Tactile Sensors (HD)

Published on Jan 25, 2011

Part I - Tactile Module:
Time: 00:00-00:09
------------------------------
In the first part we show the newly developed Tactile Module in comparison with a human hand.

On the back side, the local controller is visible next to the four combined power-and-data ports. On the front side, one BMA150 3-axis accelerometer (black box, middle), six PT1000 temperature sensors (blue boxes) and four GP2S60 proximity sensors (outer black boxes) are visible.
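The per-module sensor layout above can be summarized in a small data structure. This is only an illustrative sketch; the class and field names are assumptions, not a published API, and only the sensor counts (one 3-axis accelerometer, six temperature sensors, four proximity sensors) come from the description.

```python
from dataclasses import dataclass

# Hypothetical container for one sample from a Tactile Module. The field
# sizes mirror the hardware described above: one BMA150 3-axis
# accelerometer, six PT1000 temperature sensors, four GP2S60 proximity
# sensors. Names are made up for illustration.
@dataclass
class TactileSample:
    accel: tuple      # (x, y, z) acceleration from the BMA150
    temps: tuple      # six PT1000 temperature readings
    proximity: tuple  # four GP2S60 proximity readings

    def validate(self) -> bool:
        """Check that the sample matches the module's sensor counts."""
        return (len(self.accel) == 3
                and len(self.temps) == 6
                and len(self.proximity) == 4)

sample = TactileSample(accel=(0.0, 0.0, 9.81),
                       temps=(22.1,) * 6,
                       proximity=(0.0, 0.1, 0.0, 0.0))
print(sample.validate())  # True for a well-formed sample
```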

Part II - Robot reaction towards different modalities:
Time: 00:10-00:29
---------------------------------------------------------------------
In the second part we show the reactions of a KUKA robotic arm to the different modalities of a single sensor: proximity, acceleration and temperature.

The robot was programmed to evade when the corresponding sensor value changed and to return to its home position otherwise. In the proximity and temperature experiments, only an evasive reaction was programmed. In the impact-detection experiment we also implemented an inversion of the reaction, which is why the robot stays within the new boundaries. Please note that only one sensor is mounted, on the visible side of the robot. The control loop of these experiments runs at 1 kHz, and we actuate only the base joint.
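One tick of the evade-and-return behaviour described above could be sketched as follows. This is a simplified illustration running on the base joint only; the threshold, step size, gain and the `invert` flag are assumptions for illustration, not the authors' controller.

```python
# Illustrative sketch of the 1 kHz evade-and-return loop described above,
# acting on the base joint only. All constants are assumed values.
EVADE_THRESHOLD = 0.2   # assumed normalized sensor change that triggers evasion
HOME = 0.0              # base-joint home position (rad)
EVADE_STEP = 0.05       # assumed step away from the stimulus per tick (rad)
RETURN_GAIN = 0.1       # assumed proportional gain for returning home

def base_joint_command(position, sensor_delta, invert=False):
    """Return the next base-joint setpoint for one control tick.

    `invert=True` flips the evasion direction, sketching the inverted
    reaction used in the impact-detection experiment.
    """
    if abs(sensor_delta) > EVADE_THRESHOLD:
        direction = -1.0 if (sensor_delta > 0) != invert else 1.0
        return position + direction * EVADE_STEP
    # No stimulus: drift back toward the home position.
    return position + RETURN_GAIN * (HOME - position)

print(base_joint_command(0.0, 0.5))   # stimulus -> evade away
print(base_joint_command(0.3, 0.0))   # no stimulus -> move toward home
```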

Part III - Robot reaction towards multiple modules:
Time: 00:30-01:02
-------------------------------------------------------------------
In the last part we show the performance of a network of connected modules driving multiple inverse kinematic chains of the KUKA robot.

In this part we always show two states of the experiment: one with our tactile controller turned off, the other with the controller turned on. Please note that the KUKA lightweight robotic arm has impedance-control capabilities, which are used in the orientation experiment to push the robot around.
In the first experiment we show our multi-touch controller reacting to proximity sensor input. In contrast to a purely joint-force-controlled robot, we can detect the location of touch even when multiple forces on one segment cancel out, and we can interact without applying any force.
In the second experiment we show the stabilization of a segment equipped with our sensor module against a superposition of movement acceleration and gravity. This helps, for example, to stabilize a loosely placed cup on a tray. In contrast to other possible solutions, we need neither a fixed base nor a very accurate kinematic model. The control loop of these experiments runs at 0.1 kHz, and we use one inverse kinematic chain per segment. The joint excitations of all inverse kinematic chains are summed and sent to the robot.
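The summation step at the end of the paragraph above can be sketched in a few lines. The chain outputs here are made-up numbers standing in for real inverse-kinematics results; only the idea of summing per-joint excitations across chains comes from the description.

```python
# Minimal sketch of combining the outputs of several inverse kinematic
# chains, as described above: each segment's chain proposes a joint
# excitation vector, and the per-joint contributions are summed before
# being sent to the robot. Values are illustrative, not real IK output.
def combine_chain_outputs(chain_outputs):
    """Sum per-joint excitations from several inverse kinematic chains."""
    n_joints = len(chain_outputs[0])
    return [sum(chain[j] for chain in chain_outputs) for j in range(n_joints)]

chain_a = [0.01, -0.02, 0.00, 0.03]   # excitation proposed by segment A's chain
chain_b = [0.00, 0.01, -0.01, 0.00]   # excitation proposed by segment B's chain
print(combine_chain_outputs([chain_a, chain_b]))
```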

P. Mittendorfer and G. Cheng, "Humanoid multi-modal tactile sensing modules," IEEE Transactions on Robotics, vol. 27, no. 3, pp. 401–410, June 2011.
