SharedPhys: Live Physiological Sensing, Whole-Body Interaction, and Large-Screen Visualization

Published on Sep 29, 2016

A video supplement to our IDC 2016 paper "SharedPhys: Live Physiological Sensing, Whole-Body Interaction, and Large-Screen Visualizations to Support Shared Inquiry Experiences"
Seokbin Kang, Leyla Norooz, Vanessa Oguamanam, Angelisa C. Plane, Tamara L. Clegg, Jon E. Froehlich.

Paper: http://dl.acm.org/citation.cfm?id=293...
Personal: http://www.livehighkang.com/
Makeability Lab: https://makeabilitylab.umiacs.umd.edu/
HCIL: http://hcil.umd.edu/

We present and evaluate a new mixed-reality tool called SharedPhys, which tightly integrates real-time physiological sensing, whole-body interaction, and responsive large-screen visualizations to support new forms of embodied interaction and collaborative learning. While our primary content area is the human body, we use the body and physical activity as a pathway to other STEM areas such as biology, health, and mathematics. We describe our participatory design process with 20 elementary school teachers, the development of three contrasting SharedPhys prototypes, and results from six exploratory evaluations in two after-school programs. Our findings suggest that the tight coupling between physical interaction, sensing, and visualization in a multi-user environment helps promote engagement, allows children to easily explore cause-and-effect relationships, supports and shapes social interactions, and promotes playful experiences.
