Published on Jul 26, 2017
Description
Biases are bugs. They need to be found, fixed, and learnt from. A mix of good ethics and good engineering practices can get us a long way towards that goal.
In this talk, you'll learn what biases are, what software tools can help, and how to adopt engineering practices that can make your algorithms fairer.
Abstract
Algorithms can make decisions, and these decisions can have an impact on people's lives. When we feed biased data into these algorithms, they can reproduce or amplify our societal biases and make unfair decisions.
Biases are bugs. They need to be found, fixed, and learnt from. A mix of good ethics and good engineering practices can get us a long way towards that goal.
In this talk, you will learn what biases are, see examples of algorithms gone wrong, and explore software tools you can use and engineering practices you can adopt in your own work to make your algorithms fairer.
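As a minimal illustration of the kind of check such tools perform (this sketch is not from the talk; the function name and data are hypothetical), one common fairness metric is the demographic parity difference: how much the positive-prediction rate differs between groups.

```python
# Hypothetical sketch: demographic parity difference, i.e. the gap in
# positive-prediction rates between groups. A large gap can signal that
# a model treats groups unequally and deserves a closer look.

def demographic_parity_difference(predictions, groups):
    """Absolute gap between the highest and lowest per-group
    positive-prediction rates."""
    counts = {}  # group -> (total, positives)
    for pred, group in zip(predictions, groups):
        total, positives = counts.get(group, (0, 0))
        counts[group] = (total + 1, positives + (1 if pred else 0))
    rates = [positives / total for total, positives in counts.values()]
    return max(rates) - min(rates)

# Illustrative binary predictions for two groups "a" and "b".
preds = [1, 0, 1, 1, 0, 0, 0, 1]
groups = ["a", "a", "a", "a", "b", "b", "b", "b"]
# Group "a" rate: 3/4; group "b" rate: 1/4; gap: 0.5.
print(demographic_parity_difference(preds, groups))
```

A value near 0 means the groups receive positive predictions at similar rates; auditing metrics like this is one of the engineering practices the talk advocates.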
PyData is an educational program of NumFOCUS, a 501(c)(3) non-profit organization in the United States. PyData provides a forum for the international community of users and developers of data analysis tools to share ideas and learn from each other. The global PyData network promotes discussion of best practices, new approaches, and emerging technologies for data management, processing, analytics, and visualization. PyData communities approach data science using many languages, including (but not limited to) Python, Julia, and R.
PyData conferences aim to be accessible and community-driven, with novice to advanced level presentations. PyData tutorials and talks bring attendees the latest project features along with cutting-edge use cases.