Existential Risks and Extreme Opportunities | Stuart Armstrong | TEDxAthens

2,873 views

Published on Apr 6, 2015

What are existential risks? They are the risks that threaten the very survival of the human species, or that could dramatically curtail its potential. There are many, from asteroid impact to engineered pandemics to artificial intelligence (AI), and almost all of them are understudied. AI risk is the least understood, but potentially the deadliest of all, as AIs could be extremely powerful agents with insufficiently safe motivations and goals. The problem is very difficult, philosophically and programmatically. If these obstacles are overcome, however, humanity can expect to look forward to a world of dramatic abundance in health, wealth, and happiness.

Stuart Armstrong’s research at the Future of Humanity Institute centres on formal decision theory, the risks and possibilities of Artificial Intelligence, the long term potential for intelligent life, and anthropic (self-locating) probability. He is particularly interested in finding decision processes that give the “correct” answer under situations of anthropic ignorance and ignorance of one’s own utility function, ways of mapping humanity’s partially defined values onto an artificial entity, and the interaction between various existential risks. He aims to improve the understanding of the different types and natures of uncertainties surrounding human progress in the mid-to-far future.

This talk was given at a TEDx event using the TED conference format but independently organized by a local community. Learn more at http://ted.com/tedx
