Published on Nov 22, 2011
Real-Time Crowd Support for People with Disabilities
Tuesday, November 15, 2011
Jeffrey Bigham, University of Rochester
Co-sponsored by the CS Colloquium and ISTS

Abstract
The past few decades have seen the development of wonderful new intelligent technology that serves as sensors and agents onto an otherwise inaccessible world for people with disabilities, but it remains both too prone to errors and too limited in scope to reliably address many problems faced by people with disabilities in their everyday lives. We have been developing approaches to crowdsourcing that work in real time to overcome these problems. In this talk, I'll discuss the following recent projects that use real-time crowdsourcing:

VizWiz, an accessible iPhone application that blind people use to take a picture, speak a question, and receive answers from the crowd in under a minute. More than 20,000 questions have been asked so far, giving us insight into the types of questions blind people want answered.

Legion, a system that lets dynamic groups collaboratively control existing user interfaces using a VNC-like setup.

These applications collectively inform a new model of human-computer interaction in which a dynamic group of unreliable individuals acts as a single reliable user.

Bio

Jeffrey P. Bigham is an Assistant Professor in the Department of Computer Science at the University of Rochester, where he directs ROC HCI. His work spans Access Technology, Human Computation, and Intelligent User Interfaces. He is specifically interested in technology that engages the crowd to assist people with disabilities in their everyday lives. Professor Bigham received his Ph.D. in Computer Science and Engineering in 2009 from the University of Washington, working with Dr. Richard Ladner, and his B.S.E. from Princeton in 2003. Jeffrey has received a number of awards for his work, including the Andrew W. Mellon Foundation Award for Technology Collaboration, the MIT Technology Review Top 35 Innovators Under 35 Award, two ASSETS Best Student Paper Awards, and the UIST 2010 Best Paper Award.
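The core idea behind Legion — collapsing many unreliable crowd workers into a single reliable user — can be illustrated with a minimal input-mediation sketch. The abstract does not specify Legion's actual mediation algorithm, so the windowed majority vote below is only one plausible strategy, and all names here (`mediate`, `window_s`) are hypothetical, not from the system itself.

```python
from collections import Counter

def mediate(inputs, window_s=1.0):
    """Collapse timestamped crowd inputs into one command stream.

    inputs: list of (timestamp, worker_id, command) tuples.
    Within each time window, keep at most one vote per worker and
    forward the majority command, so no single unreliable worker
    controls the interface alone.
    """
    if not inputs:
        return []
    inputs = sorted(inputs)          # process in time order
    commands = []
    window_start = inputs[0][0]
    votes = {}                       # worker_id -> latest command this window
    for t, worker, cmd in inputs:
        if t - window_start >= window_s:
            # window closed: emit the majority command and reset
            commands.append(Counter(votes.values()).most_common(1)[0][0])
            votes = {}
            window_start = t
        votes[worker] = cmd
    if votes:
        commands.append(Counter(votes.values()).most_common(1)[0][0])
    return commands

# Three workers vote in the first second; two agree, so "left" wins.
stream = mediate([(0.0, "a", "left"), (0.1, "b", "left"),
                  (0.2, "c", "right"), (1.2, "a", "up")])
```

With these inputs the mediator forwards "left" for the first window and "up" for the second, filtering out the dissenting worker's "right".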