
Reading Your Mind: Interfaces for Wearable Computing


Uploaded on Mar 12, 2008

Google Tech Talks
March 6, 2008

ABSTRACT

Today's mobile devices have inherited many of the characteristics of desktop computing, including the assumptions that the user's full attention can be focused on the interface and that the user has the manual dexterity to spare for it. These assumptions result in users who run into doorways while typing an e-mail on their mobile phones. When faced with these interface difficulties in our experiments, users sometimes exclaim, "I want my device to read my mind!" In this talk, we demonstrate several prototypes that exploit pattern recognition and good interface design to simulate reading the user's mind by guessing their intent. In addition, we describe preliminary work on an actual brain-computer interface.

Informed by our own wearable computer use since 1993, my group investigates what mobile users claim to do with their devices, what they actually do with them, what they want to do, and the mobile interface challenges that interfere with the fulfillment of those desires. We are currently exploring a successful modern incarnation of the wearable computer, the RIM BlackBerry equipped with a Bluetooth earpiece, focusing on its mini-QWERTY keyboard. We have developed a technique called Automatic Whiteout++ that can eliminate 25% of mini-QWERTY users' "fat finger" typing errors without using a dictionary. We will also discuss Dual Purpose Speech agents, which "listen in" on the user's conversation to help schedule appointments, remember small "notable" pieces of information, and communicate with remote assistants. Finally, we will describe our preliminary research on BrainSign, a direct brain interface through which the user communicates in natural language.
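The abstract does not describe how Automatic Whiteout++ works internally, but the core idea of dictionary-free "fat finger" correction can be sketched: on a mini-QWERTY keyboard, an accidental insertion usually means two physically adjacent keys were struck within a very short interval, so the spurious keystroke can be detected and dropped from keystroke timing and key adjacency alone, with no dictionary lookup. The neighbor map, timing threshold, and letter-frequency tie-breaker below are illustrative assumptions, not the published algorithm (which uses learned classifiers over richer keystroke features):

```python
# Illustrative sketch of dictionary-free "fat finger" correction.
# Assumption: an accidental insertion shows up as two physically
# adjacent keys pressed within a short time window; we keep the
# more frequent English letter of the pair. This is NOT the actual
# Automatic Whiteout++ algorithm, only a toy version of the idea.

# Approximate English letter frequencies (percent), used as a
# dictionary-free tie-breaker between the two candidate keys.
FREQ = {'e': 12.7, 't': 9.1, 'a': 8.2, 'o': 7.5, 'i': 7.0, 'n': 6.7,
        's': 6.3, 'h': 6.1, 'r': 6.0, 'd': 4.3, 'l': 4.0, 'c': 2.8,
        'u': 2.8, 'm': 2.4, 'w': 2.4, 'f': 2.2, 'g': 2.0, 'y': 2.0,
        'p': 1.9, 'b': 1.5, 'v': 1.0, 'k': 0.8, 'j': 0.15, 'x': 0.15,
        'q': 0.10, 'z': 0.07}

# Partial physical-adjacency map for a QWERTY layout (illustrative).
NEIGHBORS = {
    't': {'r', 'y', 'f', 'g'},
    'h': {'g', 'j', 'y', 'u', 'b', 'n'},
    'e': {'w', 'r', 's', 'd'},
}

def are_neighbors(a, b):
    """True if keys a and b are physically adjacent on the layout."""
    return b in NEIGHBORS.get(a, set()) or a in NEIGHBORS.get(b, set())

def correct(keystrokes, max_gap_ms=60):
    """Drop likely fat-finger insertions from (char, timestamp_ms) pairs.

    Two adjacent keys struck less than max_gap_ms apart are treated as
    one intended keystroke; the more frequent letter is kept.
    """
    out = []
    i = 0
    while i < len(keystrokes):
        ch, t = keystrokes[i]
        if i + 1 < len(keystrokes):
            ch2, t2 = keystrokes[i + 1]
            if t2 - t < max_gap_ms and are_neighbors(ch, ch2):
                # Likely accidental double-strike: keep one key only.
                out.append(ch if FREQ.get(ch, 0) >= FREQ.get(ch2, 0) else ch2)
                i += 2
                continue
        out.append(ch)
        i += 1
    return ''.join(out)

# Typing "the", but 'j' is accidentally grazed 40 ms after 'h':
print(correct([('t', 0), ('h', 150), ('j', 190), ('e', 350)]))  # the
```

The real system reportedly makes such decisions with trained classifiers over keystroke features rather than fixed thresholds, but the sketch shows why no dictionary is needed: timing and key geometry alone carry most of the signal.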

Speaker: Thad Starner
Bio:

Thad Starner is an Associate Professor in Georgia Institute of Technology's School of Interactive Computing. Thad was perhaps the first to integrate a wearable computer into his everyday life as an intelligent personal assistant, and his work as a PhD student helped found the field of wearable computing. His group's prototypes and patents on mobile MP3 players, mobile instant messaging and e-mail, gesture-based interfaces, and mobile context-based search foreshadowed now-commonplace devices and services. Thad has authored over 100 scientific publications with over 100 co-authors on mobile human-computer interaction (HCI), pattern discovery, human power generation for mobile devices, and gesture recognition, and he is a founder and current co-chair of the IEEE Technical Committee on Wearable Information Systems. His work has been featured in media outlets in the United States and internationally, including CNN, NPR, the BBC, CBS's 60 Minutes, The New York Times, Nikkei Science, The London Independent, The Bangkok Post, and The Wall Street Journal.
