archAR: an archaeological augmented reality experience (17 March 2015)
Abstract
We present an application for Android phones or tablets called “archAR” that uses augmented reality as an alternative, portable way of viewing archaeological information from UCSD’s Levantine Archaeology Laboratory. archAR provides a unique experience of flying through an archaeological dig site in the Levantine area and exploring the artifacts uncovered there. Using a Google Nexus tablet and Qualcomm’s Vuforia API, we use an image target as a map and overlay a three-dimensional model of the dig site onto it, augmenting reality such that we are able to interact with the plotted artifacts. The user can physically move the Android device around the image target and see the dig site model from any perspective. The user can also move the device closer to the model in order to “zoom” into the view of a particular section of the model and its associated artifacts. This is especially useful, as the dig site model and the collection of artifacts are very detailed. The artifacts are plotted as points, colored by type. The user can touch the virtual points to trigger a popup information window that contains details of the artifact, such as photographs, material descriptions, and more.
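The abstract describes touching a plotted artifact point to open its information popup. One common way to implement this on a touchscreen is to hit-test the touch coordinates against the screen-space projections of the artifact points. The sketch below is a minimal, hedged illustration of that idea in plain Java; the class and field names (`ArtifactPicker`, `Artifact`, `screenX`, `screenY`) are hypothetical and not taken from archAR or the Vuforia API.

```java
import java.util.ArrayList;
import java.util.List;

// Hedged sketch: pick the artifact point nearest a screen touch.
// Assumes artifact positions have already been projected to pixels
// (e.g., by the AR framework's model-view-projection transform).
public class ArtifactPicker {
    public static class Artifact {
        final String id;
        final float screenX, screenY; // projected position, in pixels
        Artifact(String id, float x, float y) {
            this.id = id;
            this.screenX = x;
            this.screenY = y;
        }
    }

    // Returns the artifact closest to (touchX, touchY) within radiusPx,
    // or null if no artifact is close enough to count as a hit.
    public static Artifact pick(List<Artifact> artifacts,
                                float touchX, float touchY, float radiusPx) {
        Artifact best = null;
        float bestD2 = radiusPx * radiusPx;
        for (Artifact a : artifacts) {
            float dx = a.screenX - touchX;
            float dy = a.screenY - touchY;
            float d2 = dx * dx + dy * dy;
            if (d2 <= bestD2) { // squared distances avoid a sqrt per point
                bestD2 = d2;
                best = a;
            }
        }
        return best;
    }

    public static void main(String[] args) {
        List<Artifact> pts = new ArrayList<>();
        pts.add(new Artifact("pottery-17", 100f, 200f));
        pts.add(new Artifact("flint-03", 300f, 120f));
        Artifact hit = pick(pts, 105f, 195f, 24f);
        System.out.println(hit == null ? "none" : hit.id); // prints "pottery-17"
    }
}
```

In a real AR view the `(screenX, screenY)` values would be recomputed every frame as the device moves around the image target; a generous touch radius helps when points are small or densely clustered.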
© (2015) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Bridgette Wiley and Jürgen P. Schulze, "archAR: an archaeological augmented reality experience", Proc. SPIE 9392, The Engineering Reality of Virtual Reality 2015, 939203 (17 March 2015); https://doi.org/10.1117/12.2083449
Proceedings paper, 9 pages.