8 October 2015 Perform light and optic experiments in Augmented Reality
Proceedings Volume 9793, Education and Training in Optics and Photonics: ETOP 2015; 97930H (2015) https://doi.org/10.1117/12.2223069
Event: Education and Training in Optics and Photonics: ETOP 2015, 2015, Bordeaux, France
In many scientific study programs, lens experiments are part of the curriculum. The experiments are meant to give students a basic understanding of the laws of optics and their applications. Most of the experiments require special hardware such as an optical bench, light sources, apertures, and different lens types. It is therefore not possible for students to conduct any of the experiments outside the university's laboratory. Simple optical software simulators that let students virtually perform lens experiments already exist, but they are mostly desktop or web-browser based. Augmented Reality (AR) is a special case of the mediated and mixed reality concepts, in which computers are used to add, subtract, or modify one's perception of reality. As a result of the success and widespread availability of handheld mobile devices such as tablet computers and smartphones, mobile augmented reality applications are easy to use. Augmented reality lends itself well to visualizing a simulated optical bench. Students can interactively modify properties such as lens type, lens curvature, lens diameter, lens refractive index, and the positions of the instruments in space. Light rays can be visualized, promoting an additional understanding of the laws of optics. An AR application like this is ideally suited to preparing for the actual laboratory sessions and/or recapping the teaching content. The authors present their experience with handheld augmented reality applications and their possibilities for light and optics experiments without the need for specialized optical hardware.
Wozniak, Vauderwange, Curticapean, Javahiraly, and Israel: Perform light and optic experiments in Augmented Reality



The classic teaching style is a teacher-centered approach. Scientific study programs in particular add laboratory exercises in which students can apply their theoretical knowledge. It is often this applied experience that deepens the understanding of the theory. In most cases the experiments must be conducted in correspondingly equipped laboratories; physical or chemical experiments, for example, often need special equipment and are thereby limited to the laboratory premises and usually also to the lecture times. To prepare for the practical exercises, students use written tutorials and physics textbooks. To prepare and also review practical experiments, students could use software that simulates the exercises. We demonstrate our approach with an Augmented Reality (AR) app that allows students to use their ubiquitous mobile devices to prepare or review exercises on their own. The simulated virtual exercises correspond closely to the real ones they have to carry out. It is reasonable to use software to simulate small interactive experiments based on the real exercises. Of course a simulation is not a fully adequate substitute for a real experiment, but it can still contribute to the study and comprehension of the main principles.

With almost ubiquitous mobile devices, students are able to prepare and learn whenever and wherever they like. Ubiquitous learning thus becomes a real possibility and makes knowledge accessible, although it cannot substitute for the learning process itself.



The idea of using AR for learning applications is not new, and various publications describe approaches to incorporating it for learning purposes. For example, [1] shows the use of an ARToolKit-based mobile AR app that uses different cards with printed markers to simulate virtual tools. Those tools can be used to simulate physical effects. Among other things, it is possible to point a virtual laser pointer, and thus the laser beam, at a virtual lens that refracts the virtual light. The authors' approach requires specifically printed cards with markers on them, as these are moved by the user in space to make the tools interact. The publication describes "interactive AR" as "very unique and funny" and sees a "great potential for edutainment".

In publication [2] the authors demonstrate their approach to AR for a virtual lens experiment in a physics course. Their setup consisted of markers placed on a rail that resembles a real optical bench. Students can use the experimental setup like an ordinary lens experiment, e.g. move the lens along the rail and observe the change of the image on the virtual screen. The authors use a regular computer display (data projector) to make the outcome of the AR experiment visible. They conducted the experiments with two eighth-grade classes, one using the AR version and the other a regular setup. Their studies showed that the group using the AR experimental setup did not score significantly better in post-tests, but had a positive memory of the experience itself.

Another publication describes an AR application for conducting various experiments in the field of mechanical physics. Users can observe virtual bodies behaving in a physically correct manner. The application uses a physics engine to provide a realistic simulation of masses and forces. The user has to wear a head-mounted video-see-through display (VSD) and is tracked externally by a professional motion-tracking system. While the application principally aims at aided learning, no evaluation of its effectiveness for learning had been conducted [3]. Due to the professional tracking system, this kind of AR application is currently limited to specially equipped laboratories.



Augmented Reality

The term Augmented Reality describes the concept of extending our perception of reality with virtual elements and content. Azuma defines AR, independently of the technology used, as a method of combining real and virtual elements interactively, in real time, and in three dimensions [4]. Another popular definition is Milgram's reality-virtuality continuum [5]. This continuum spans between reality and virtuality and allows every form of mixed reality in between (Figure 1). AR lies closer to the real end of the continuum, while Augmented Virtuality lies closer to the virtual end, whose extreme is also referred to as Virtual Reality (VR).

Figure 1

Milgram's reality-virtuality continuum


Normally the user needs some kind of display and a computer that generates and provides the virtual content and also handles the necessary image registration of virtual and real elements. Correct image registration in real time is of the utmost importance for a realistic AR experience (Figure 2) [6]. In order to register virtual and real image parts correctly, the virtual content must be rendered in the correct perspective; this requires knowing the position and orientation of the AR display in relation to the real-world scenery and the user's point of view.

Figure 2

Left side: visualization of the image registration process of a virtual cube on a marker. Right side: AR app running on a smartphone acting as a video-see-through display.


The process of recognizing and continuously updating the relative position and orientation (pose) is called tracking. Depending on the technology and setup used, this process can be more or less complicated. AR displays can be classified into two categories. Optical-see-through displays are transparent and do not completely occlude the user's view of the real world; the virtual content is overlaid onto that view. It is therefore necessary to take the user's point of view into account in order to achieve correct image registration. With video-see-through displays, users perceive a captured image on some kind of screen. The latter are easier to implement, because it is not mandatory to account for the user's point of view in relation to the AR display and thus to the real-world scene.

To simplify the tracking effort, on VSD systems the user's point of view is usually taken to be the point of view of the camera. This comes at the expense of decreased realism, as the user has to accept the missing parallax shift and thus forgo a perfectly realistic perspective.

Smartphones or tablet computers can be used as such simplified VSD AR devices. The screen is usually mounted opposite to the built-in camera. The user holds the mobile device in his or her outstretched hand and sees the captured and augmented camera picture on the screen. The device acts as a “window” to the augmented reality.

There are different approaches to tracking the position and orientation of the device. Using the GPS receiver to obtain the position and the inertial sensors to obtain the orientation is not accurate enough to augment a close object, but it is usually accurate enough to augment a landmark on the horizon. To augment the close scenery around a mobile device, the device needs a notion of the state of its surroundings. This problem is nontrivial and therefore still part of ongoing research [7]. To simplify the tracking process, many established algorithms use known or otherwise predefined markers. A marker is an artificial or natural picture placed on a physical object that can be identified within a camera image. From it, the relative pose of the camera towards the filmed marker can be inferred, which allows rendering and registering the virtual content with the camera image of the real scenery (Figure 3). These markers are usually flat and are placed onto flat objects like tables or printed in magazines, but it is also possible to use predefined three-dimensional objects as markers.

Figure 3

A pose can describe the relative position and orientation between two reference systems
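Once the camera pose relative to the marker is known, registering virtual content amounts to projecting 3D points into the camera image. The following Python sketch illustrates this with a simple pinhole camera model; all names and parameter values are illustrative assumptions, not code from the presented app.

```python
def project_point(point, rotation, translation, fx, fy, cx, cy):
    """Project a 3D point given in marker coordinates to pixel coordinates
    using a pinhole camera model.

    rotation    -- 3x3 rotation matrix (list of rows) of the camera pose
    translation -- 3-vector from marker space to camera space
    fx, fy      -- focal lengths in pixels
    cx, cy      -- principal point in pixels
    """
    # Transform the point from marker space into camera space.
    x, y, z = (
        sum(rotation[i][j] * point[j] for j in range(3)) + translation[i]
        for i in range(3)
    )
    # Perspective division and scaling to pixel coordinates.
    return fx * x / z + cx, fy * y / z + cy

# Identity pose: the camera looks straight at the marker from 2 m away.
R = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
t = [0.0, 0.0, 2.0]

u, v = project_point([0.0, 0.0, 0.0], R, t, fx=800, fy=800, cx=320, cy=240)
print(u, v)   # marker origin lands at the principal point: 320.0 240.0

u, v = project_point([0.5, 0.0, 0.0], R, t, fx=800, fy=800, cx=320, cy=240)
print(u, v)   # a point 0.5 m to the right appears at 520.0 240.0
```

A tracking SDK such as Vuforia performs this projection internally every frame; the sketch only makes explicit why the pose (rotation and translation) is the quantity the tracker has to estimate.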


Of course, it is possible to implement all tracking functionality oneself, which offers the greatest flexibility in terms of customizability. In order to minimize development effort, however, it is advisable to use an off-the-shelf SDK that offers the required functionality.

Content for AR

As AR applications are interactive and usually consist of real-time 3D graphics, the same rules as for 3D game development apply. Any 3D modeling tool can be used to create 3D content for AR applications. The 3D models should not consist of too many polygons, and the textures should not be too big, so that they fit into the graphics memory of the device. On the other hand, mobile devices are getting more and more powerful, so performance issues are becoming less and less important.

3D model databases on the internet can be used to save the time of creating new content. Simple predefined animations can be created directly within the 3D modeling tool. Animations requiring interaction are usually scripted and must be implemented by the developer. Further non-AR content such as videos or descriptive text could be provided within HTML pages that can be linked to or viewed within the app.



In the course "Medientechnik" (English: "media technology") our students work on practical physics exercises. Among other experiments, they have to carry out a lens experiment on an optical bench. They have to prepare on the basis of a written script that describes the setup and execution of the experiment. To understand the underlying physical laws and transfer them into an individual mental model, the students need to engage with the experimental setup [8]. The students' varying prior knowledge and experience makes this process more or less difficult. In addition, local access to the corresponding experimental arrangements with the lenses and optical bench is limited in time. Moreover, many students find it difficult to prepare properly for a highly interactive and action-oriented exercise on the basis of static learning resources consisting of text and images.

Within the important preparation phase, our AR app is meant to support the construction of knowledge. According to the theory of situated learning, new knowledge is formed through active engagement within learning situations; knowledge is therefore highly context-sensitive [9]. To transfer knowledge from learning situations to application scenarios, it is helpful when the two resemble each other to a great extent. The static media used for preparation are augmented by our app and thereby aligned with the exercise scenario. Additionally, it is possible to access information on screen that is not available in real life, such as the visualization of light rays and refraction.

Users often describe simulations as particularly helpful if they offer the possibility of interaction. To match the exercise scenario, where it is for example possible to replace lenses and choose different focal lengths, our simulation offers the same options. With smartphones and other mobile devices becoming omnipresent, it is a plausible option to utilize them for learning. The goal of our prototypical AR app for mobile devices is to demonstrate a possible usage scenario of AR technology as a ubiquitous learning tool. We want to show how AR can be used to supplement traditional learning methods and lab exercises.



Our AR learning application is designed as a supplementary learning tool, intended to be used in addition to existing printed learning materials. For instance, specific illustrations in a physics textbook could be used as markers linking to corresponding content in our AR app. A student studying the book could start the AR app at any time, point the mobile device at the illustration, and instantly get additional interactive content and information on the learning topic. The AR app content should never replace or simply duplicate the book, but rather help to elaborate on its contents. The app could provide small interactive exercises based on the real laboratory or book exercises.

At our school the students have to perform simple lens experiments. For that they need to prepare and to know the different lens types, their characteristics, and the experiment setup. As a first step, we decided to have our AR app simulate a simple optical bench experiment with one lens, one light source, and one screen. The lens type and characteristics can be altered interactively on the screen. Corresponding to the chosen lens parameters, the path of the light rays is calculated and refracted differently by the virtual lens. The three-dimensional visualization illustrates the functioning of the subsequent real experimental setup in the laboratory and thus makes it easier for the students to become familiar with it (Figure 4).

Figure 4

AR lens experiment app running on a smartphone. (QR-Code with link to video.)
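The physics the app's ray calculation rests on is the thin-lens and lensmaker's equations. A minimal Python sketch of these relations follows; it is an illustration under the thin-lens approximation, not the app's actual Unity3D code, and all numeric values are examples.

```python
def focal_length(n, r1, r2):
    """Lensmaker's equation for a thin lens in air:
    1/f = (n - 1) * (1/R1 - 1/R2)."""
    return 1.0 / ((n - 1.0) * (1.0 / r1 - 1.0 / r2))

def image_distance(f, d_obj):
    """Thin-lens equation: 1/f = 1/d_obj + 1/d_img."""
    return 1.0 / (1.0 / f - 1.0 / d_obj)

# Symmetric biconvex lens: n = 1.5, |R1| = |R2| = 0.1 m
# (R2 is negative by the usual sign convention).
f = focal_length(1.5, 0.1, -0.1)   # about 0.1 m

# An object 0.3 m in front of the lens forms a real image about
# 0.15 m behind it, inverted and at half size.
d_img = image_distance(f, 0.3)
magnification = -d_img / 0.3
print(f, d_img, magnification)
```

Changing `n` or the radii in such a model is exactly the kind of interactive parameter variation the app exposes on screen: the refraction, and with it the image position and size, updates accordingly.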


To make the physical model more understandable, we are planning to visualize the same experimental setup as a 2D cross-section. The illustration of a horizontal axis with an aligned lens symbol can be used as a marker that gets augmented interactively by our app. This visualization type is borrowed from textbooks and therefore already familiar to the students, and it is well suited to illustrating the relationship between the lens used, its characteristics, and the expected refraction of the light (Figure 5).

Figure 5

AR app visualizing light refraction by a convergent lens (visualization of 2D cross-section view).
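The refraction at each lens surface in such a cross-section view follows Snell's law, n1 sin(θ1) = n2 sin(θ2). A short illustrative Python function (the values are examples, not taken from the app):

```python
import math

def refraction_angle(n1, n2, incidence_deg):
    """Return the refraction angle in degrees via Snell's law,
    n1 * sin(theta1) = n2 * sin(theta2), or None when total
    internal reflection occurs."""
    s = n1 / n2 * math.sin(math.radians(incidence_deg))
    if abs(s) > 1.0:
        return None  # total internal reflection, no refracted ray
    return math.degrees(math.asin(s))

# A ray entering crown glass (n ~ 1.5) from air at 30 degrees is
# bent towards the normal:
print(round(refraction_angle(1.0, 1.5, 30.0), 2))   # 19.47

# Leaving glass at a steep angle: total internal reflection.
print(refraction_angle(1.5, 1.0, 60.0))             # None
```

Visualizing exactly this bending per ray is what the 2D cross-section view makes visible to the students, including edge cases a static textbook figure cannot show interactively.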


Unlike printed lecture books the app could enable the students to interactively tweak the lens characteristics and make the changed refraction visible. The interactivity is an important part of the application. The goal is to provide the students a deeper understanding of the optical characteristics of the real experiment setup and enable them to transfer their virtually acquired experience into the real laboratory.


There are various ways to create an app for mobile devices. We prefer Unity3D in conjunction with Qualcomm's AR SDK Vuforia, as it allows building Android and iOS apps from one code base. Unity3D is a game development environment for creating interactive 3D graphics applications on various platforms. The Vuforia SDK is a software library for AR offered by Qualcomm that supports the development of mobile AR apps. It supports various tracking algorithms, including marker-based ones [10][11]. Furthermore, it has an excellent Unity3D plugin that simplifies application development even more.

In short, the development process consists of creating a Unity3D project and importing the Vuforia SDK Unity3D plugin. Afterwards it is necessary to define one or more AR markers and to set up a 3D scene consisting of the virtual marker counterpart, the 3D content that should later be visible on the marker, and of course scripts containing the program code that enables the interactivity.



Unlike the solutions in the publications discussed above, ours requires no specially printed AR markers, head-mounted displays, or further tracking equipment. Most already available printed learning materials can be incorporated into our app and used as AR markers. With Unity3D and Vuforia it is possible to create mobile AR apps with little effort. One of the biggest advantages lies in the fact that one code base can be used to compile iOS and Android executables. One could argue that using a proprietary SDK increases vendor dependence, but the same risk arises when developing natively for Android or iOS as well.

With our prototypical app we demonstrated a possible application of AR for learning scenarios. It is easy to add further learning material to the application and thus extend it to various other learning domains. The biggest work package, though, is developing interactive content that amplifies the existing learning materials in a meaningful way.



Lai, C. and Wang, C. L., "Mobile edutainment with interactive augmented reality using adaptive marker tracking", The 18th IEEE International Conference on Parallel and Distributed Systems (ICPADS 2012), Proceedings, pp. 124–131 (2012).


Cai, S., Chiang, F. and Wang, X., "Using the Augmented Reality 3D Technique for a Convex Imaging Experiment in a Physics Course", International Journal of Engineering Education, Vol. 29, No. 4, pp. 856–865 (2013).


Kaufmann, H. and Meyer, B., "Simulating Educational Physical Experiments in Augmented Reality", Proceedings of ACM SIGGRAPH ASIA 2008 Educators Program (2008).


Azuma, R. T., "A Survey of Augmented Reality", Presence: Teleoperators and Virtual Environments, 6(4), 355–385 (August 1997).


Mehler-Bicher, A., Reiß, M. and Steiger, L., "Augmented Reality: Theorie und Praxis", Oldenbourg Verlag, München (2011).


State, A., Hirota, G., Chen, D. T., Garrett, W. F. and Livingston, M. A., "Superior Augmented Reality Registration by Integrating Landmark Tracking and Magnetic Tracking", Proceedings of SIGGRAPH 96 (New Orleans, LA, August 4–9, 1996), ACM SIGGRAPH, 429–438 (1996).


Mekni, M. and Lemieux, A., "Augmented Reality: Applications, Challenges and Future Trends", Proceedings of the 13th International Conference on Applied Computer and Applied Computational Science (ACACOS '14) (April 2014).


Seel, N. M., "Model-Centered Learning Environments", Technology, Instruction, Cognition and Learning, 1(3), 242–251 (2003).


Mandl, H., Gruber, H. and Renkl, A., "Situiertes Lernen in multimedialen Umgebungen", in Issing, L. and Klimsa, P. (eds.), Information und Lernen mit Multimedia, 167–178 (1997).


Amin, D. and Govilkar, S., "Comparative Study of Augmented Reality SDK's", International Journal on Computational Sciences & Applications (IJCSA), Vol. 5, No. 1 (February 2015).


© (2015) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Peter Wozniak, Oliver Vauderwange, Dan Curticapean, Nicolas Javahiraly, Kai Israel, "Perform light and optic experiments in Augmented Reality", Proc. SPIE 9793, Education and Training in Optics and Photonics: ETOP 2015, 97930H (8 October 2015); doi: 10.1117/12.2223069; https://doi.org/10.1117/12.2223069
