Collaborative Monocular SLAM with Multiple Micro Aerial Vehicles

Published on Aug 23, 2013

This video demonstrates a framework for collaborative localization and mapping with multiple Micro Aerial Vehicles (MAVs) in unknown environments. Each MAV estimates its own motion using an onboard, monocular visual-odometry algorithm. The system of MAVs acts as a distributed preprocessor that streams only the features of selected keyframes and relative-pose estimates to a centralized ground station. The ground station builds an individual map for each MAV and merges the maps whenever it detects overlaps between them. This allows the MAVs to express their positions in a common, global coordinate frame. The key to real-time performance is the design of data structures and processes that allow multiple threads to concurrently read and modify the same map. The framework is tested in both indoor and outdoor environments with up to three MAVs. To the best of our knowledge, this is the first work on real-time collaborative monocular SLAM, and the first to apply it to MAVs.

For more details, see the paper: http://rpg.ifi.uzh.ch/docs/IROS13_For...
C. Forster, S. Lynen, L. Kneip, D. Scaramuzza, Collaborative Monocular SLAM with Multiple Micro Aerial Vehicles, IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS'13, 2013.
More info at: http://rpg.ifi.uzh.ch/
