Introduction to Advanced AR
In this session, we introduce the key concepts of Augmented Reality, including a modern definition of AR itself and such concepts as augmentation, detectables, tracking systems, and sensors. We describe key standard AR architectures that can be used as a reference model for the development of AR applications. We also present examples of AR applications in different areas.
Introduction 1/4: Definition
In this video, we give a definition of Augmented Reality and explain what it means. We refer to seminal works of Caudell and Mizell (1992) who first defined the term. We also look at the definition given by Azuma (1997) in the most cited work in the field. We discuss how philosophy and in particular representational realism helps to overcome some of the flaws in the early definitions. We then compose a modern definition that covers all four dimensions we can manipulate for Augmenting Reality: reality itself, the delivery system, our perception, and the user's experience.
Introduction 2/4: Concepts
In this video, we present the key concepts needed when dealing with AR. We also introduce a taxonomy for both hardware and software components.
With regard to the key concepts, we cover the following:
- Augmentation and augmentation primitives, such as audio, images, animation, labels, and vibro-tactile patterns.
- Delivery and visual delivery systems, including head-attached, body-attached, and spatial displays. We review the four basic types of optical see-through head-mounted display technologies (prism, beam splitter, waveguide, and laser beam scanning).
- The tracking subsystem, a key component that uses sensors to detect the position and orientation of real-world points of interest.
- Sensors, defined as devices that detect or measure physical characteristics and communicate the generated data digitally. We present an overview of the different sensor systems available for tracking, including optical, acoustical, electromagnetic, and mechanical tracking, depth sensors, GPS, compasses, and gyroscopes, as well as tracking with sensor fusion.
- Detectables, an umbrella term for markers, image targets, and world anchors. Detectables link physical real-world features with data on how to recognize them automatically, thus enabling their tracking in the sensor processing subsystem.
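To give a flavour of what sensor fusion means in practice, the sketch below shows a complementary filter, one of the simplest fusion techniques used in orientation tracking: it blends a gyroscope's short-term, drift-prone rate signal with an accelerometer's noisy but drift-free angle estimate. All readings and parameter values here are hypothetical, chosen only for illustration.

```python
def complementary_filter(pitch, gyro_rate, accel_pitch, dt, alpha=0.98):
    """Fuse gyroscope and accelerometer readings into one pitch estimate.

    The gyroscope integrates accurately over short intervals but drifts;
    the accelerometer is noisy but drift-free. Weighting the integrated
    gyro estimate by alpha and the accelerometer estimate by (1 - alpha)
    keeps the best properties of both sensors.
    """
    return alpha * (pitch + gyro_rate * dt) + (1 - alpha) * accel_pitch

# Hypothetical readings: gyro reports 2 deg/s, accelerometer estimates 1 deg.
pitch = 0.0
for _ in range(100):  # 100 update steps of 10 ms each
    pitch = complementary_filter(pitch, 2.0, 1.0, 0.01)
print(round(pitch, 2))
```

Real AR tracking subsystems typically use more sophisticated fusion (e.g. Kalman filtering over full 6-DoF pose), but the principle of weighting complementary sensor strengths is the same.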
We offer a taxonomy of the different hardware components used to create Augmented Reality, such as display systems, optics, and other types of sensors. We also provide a taxonomy of software and production tools, such as object tracking, object recognition, interaction libraries, and various authoring tools.
Introduction 3/4: Reference Architectures and Standards
There are several standard architectures available that can serve as a reference model for the development of Augmented Reality applications. In this video, we introduce the three key reference models.
- Mixed and Augmented Reality Reference Model (Edited by Kim, Perey & Preda)
- ETSI Augmented Reality Framework - ARF (WG chair: Muriel Deschanel)
- Standard for an Augmented Reality Learning Experience Model - ARLEM, P1589 (chaired by Wild & Perey)
Introduction 4/4: Examples
In this video, we present examples of Augmented Reality applications in different areas.
- We demonstrate ATLAS - a unique art project (by Yann Deval & Marie-G. Losseau).
- We present how Augmented Reality is used for training astronauts on a Mars surface simulator (article by Ravagnolo et al. and WEKIT).
- We present how Augmented Reality is applied in workplace training, using the example of aircraft maintenance (by WEKIT).
- And we present how Augmented Reality can be used in gaming, with a bowling application (by Will Guest).