Published on Jan 14, 2014
Accepted as Poster in Proceedings of the IEEE Virtual Reality Conference, 2 pages, Minneapolis, MN, USA, March, 2014.
The goal of our work is to create highly realistic graphics for Augmented Reality on mobile phones. One of the greatest challenges here is to provide realistic lighting of the virtual objects that matches the real-world lighting. This becomes even more difficult given the limited capabilities of mobile GPUs. Our approach differs from previous attempts in the following important aspects: (1) most have relied on rasterizer approaches, while our approach is based on raytracing; (2) we perform distributed rendering to address the limited mobile GPU capabilities; (3) we use image-based lighting from a precaptured panorama to incorporate real-world lighting. We utilize two markers: one for object tracking and one for registering the panorama. Our initial results are encouraging, as the visual quality resembles both real objects and the reference renderings that were created offline. However, we still need to validate our approach in human subject studies, especially with regard to the trade-off between the latency of remote rendering and visual quality.
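The abstract does not spell out how the precaptured panorama is sampled for image-based lighting. A common ingredient of such a scheme is mapping a world-space direction (e.g. a reflected ray) to texture coordinates on an equirectangular panorama; the sketch below illustrates that mapping only, with axis and orientation conventions that are assumptions, not details from the paper.

```python
import math

def direction_to_equirect_uv(dx, dy, dz):
    """Map a unit direction vector to (u, v) coordinates on an
    equirectangular panorama. Conventions (y up, -z forward) are
    illustrative assumptions, not taken from the paper."""
    # Azimuth around the vertical axis, in [-pi, pi].
    theta = math.atan2(dx, -dz)
    # Elevation above the horizon, in [-pi/2, pi/2];
    # clamp guards against rounding on nearly unit vectors.
    phi = math.asin(max(-1.0, min(1.0, dy)))
    # Normalize both angles into [0, 1] texture space.
    u = 0.5 + theta / (2.0 * math.pi)
    v = 0.5 - phi / math.pi
    return u, v
```

For example, the forward direction (0, 0, -1) lands at the panorama center (0.5, 0.5), and straight up (0, 1, 0) maps to the top row (v = 0). In a raytracing pipeline such as the one described, each shading ray's direction would be converted this way and the panorama texel used as the incident radiance.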