Brain interface for iRobot

838 views

Uploaded on Jun 10, 2010

A demonstration of a steady-state visual evoked potential (SSVEP) based brain-computer interface used to navigate an iRobot platform. The operator selects one of four commands for the robot by looking at a checkerboard; EEG signal-processing algorithms detect which checkerboard, and therefore which command, is selected; the commands are transmitted over the internet to a laptop that controls the robot via a remote connection; and the laptop on the robot sends video from the robot's-view webcam back to the operator via Skype.
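In an SSVEP interface, each checkerboard flickers at a distinct frequency, and the command is identified by finding which frequency dominates the EEG spectrum. The sketch below illustrates that idea with a simple FFT peak-power detector; the sampling rate, the four flicker frequencies, and the detection method are illustrative assumptions, not details taken from this project:

```python
import numpy as np

def detect_ssvep_command(eeg, fs, target_freqs):
    """Return the index of the flicker frequency with the highest
    spectral power in a single-channel EEG segment.

    eeg          -- 1-D array of EEG samples (hypothetical single channel)
    fs           -- sampling rate in Hz (assumed value below)
    target_freqs -- candidate checkerboard flicker frequencies in Hz
    """
    n = len(eeg)
    # Windowed power spectrum of the segment
    spectrum = np.abs(np.fft.rfft(eeg * np.hanning(n))) ** 2
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    powers = []
    for f in target_freqs:
        # Peak power in a narrow band around each candidate frequency
        band = (freqs >= f - 0.25) & (freqs <= f + 0.25)
        powers.append(spectrum[band].max())
    return int(np.argmax(powers))

# Synthetic 4-second EEG segment dominated by a 10 Hz component
fs = 256                      # assumed sampling rate
t = np.arange(0, 4, 1.0 / fs)
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(len(t))

cmd = detect_ssvep_command(eeg, fs, [8.0, 10.0, 12.0, 15.0])
# cmd is the index of the detected command (here, the 10 Hz target)
```

A real system would use longer windows, multiple occipital channels, and a more robust detector (e.g., canonical correlation analysis), but the core mapping from flicker frequency to command is the same.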

Credits:
This is a capstone project completed by four undergraduate ECE seniors at Northeastern University.
Capstone Group: Saumitra Dasgupta, Mike Fanton, Jonathan Pham, Mike Willard
Advisers: Deniz Erdogmus, Bahram Shafai

Acknowledgments: This material is based upon work supported by the National Science Foundation under Grants 0934509 & 0914808. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation.
