Safe Exploration in Finite Markov Decision Processes with Gaussian Processes (NIPS 2016 Spotlight)

Published on Nov 14, 2016

Poster session at NIPS 2016 (Monday, Dec 5th, 2016, 6-9.30pm):
https://nips.cc/Conferences/2016/Sche...

Extended version of the paper:
https://arxiv.org/abs/1606.04753

Code:
https://github.com/befelix/SafeMDP

Authors:
Matteo Turchetta, Felix Berkenkamp, Andreas Krause

Abstract:
In classical reinforcement learning, when exploring an environment, agents accept arbitrary short-term loss for long-term gain. This is infeasible for safety-critical applications, such as robotics, where even a single unsafe action may cause system failure. In this paper, we address the problem of safely exploring finite Markov decision processes (MDPs). We define safety in terms of an a priori unknown safety constraint that depends on states and actions. We aim to explore the MDP under this constraint, assuming that the unknown function satisfies regularity conditions expressed via a Gaussian process prior. We develop a novel algorithm for this task and prove that it is able to completely explore the safely reachable part of the MDP without violating the safety constraint. To achieve this, it cautiously explores safe states and actions in order to gain statistical confidence about the safety of unvisited state-action pairs from noisy observations collected while navigating the environment. Moreover, the algorithm explicitly considers reachability when exploring the MDP, ensuring that it does not get stuck in any state with no safe way out. We demonstrate our method on digital terrain models for the task of exploring an unknown map with a rover.
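
To make the confidence-based safety classification concrete, below is a minimal sketch in Python using scikit-learn rather than the authors' released SafeMDP code: a Gaussian process models the unknown safety function from noisy observations at visited states, and a state is certified safe only when the GP's lower confidence bound clears the safety threshold h. The 1-D grid, kernel, noise level, beta, and h are illustrative assumptions; the full algorithm additionally handles state-action pairs, reachability, and returnability, which this sketch omits.

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)

# States of a small 1-D "terrain"; the safety value is unknown to the agent.
states = np.linspace(0.0, 10.0, 50).reshape(-1, 1)
true_safety = np.cos(states).ravel()   # hidden ground truth (illustrative)
h = -0.25                              # safety threshold: safe iff safety(s) >= h
beta = 2.0                             # confidence-interval width multiplier (assumed)

# Noisy safety observations at states visited so far (an initial safe seed set).
visited = states[:5]
observations = true_safety[:5] + 0.05 * rng.standard_normal(5)

# GP prior encoding regularity of the safety function; alpha is the noise variance.
gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), alpha=0.05**2)
gp.fit(visited, observations)

# Classify every state pessimistically: certified safe only if the GP's
# lower confidence bound lies above the threshold h.
mean, std = gp.predict(states, return_std=True)
lower = mean - beta * std
safe = lower >= h

print(f"{safe.sum()} of {len(states)} states currently certified safe")

In the paper's setting, the agent would then visit states from this certified-safe set (subject to reachability), update the GP with the new observations, and repeat until no further states can be safely added.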

More information:
https://las.ethz.ch
http://berkenkamp.me
