Augmented Reality Curriculum

Code Reality and ITiCSE WG AR Curriculum collaboration

Lesson Plans

The AR curriculum is split into core and elective content, so the course implementation can be adjusted to the target audience and local needs. Lectures are designed to cover theory and foundations, while the tutorials complement them with selected practicals. The Advanced course builds on the Foundations course. The Foundations course can be included in an undergraduate/bachelor's degree, while the Advanced course fits a postgraduate/master's programme.

Lesson Plan ‘Foundations of Augmented Reality’

This course presents an introduction to Augmented Reality, with emphasis on designing and developing Augmented Reality applications. The course starts with a comprehensive introduction to the field, also covering its history, with early precursors dating back to the 19th century and more than half a century of serious technology R&D. The course then introduces the state of the art in hardware and software, with practical opportunities to try out smart glasses, interactive clothing, and other futuristic technology.

The course covers all necessary material about Spatial Computing, Human Computer Interaction, Perception, Design Thinking, and Application Development. A rich mix of theory and practice is complemented with methodology and hands-on development and evaluation. Insights into specialist application areas and job perspectives help sharpen the students’ skill set.

Lecture 1: Introduction to AR

This lecture guides students through the history of AR (from its beginnings 400 years ago to modern software and hardware platforms), covers the key concepts in AR, introduces the augmentation pipeline, explains the new idea of “Experience”, and surveys the state of the art in AR technologies.

Lecture 2: Human Perception and Processing

This lecture introduces the foundations of research and design methodologies in Human Computer Interaction (HCI), the Human Perception requirements for AR, and why different methods should be used for designing AR experiences.

Lecture 3: Technology Overview

This lecture covers the understanding and evaluation of technology alternatives from end-user development to high- and low-level software engineering. It develops a deeper understanding of component-level technology such as display hardware and introduces available sensor hardware, interaction technology, and delivery systems.

Lecture 4: Experience Engineering

This lecture covers Design Thinking in detail: the importance of context, user-oriented design, best practices in experience design, and examples of “how not to”.

Lecture 5: HCI evaluation methodologies

This lecture covers experiment design and evaluation methodologies, such as technology acceptance and use, system usability, and user satisfaction with both the hardware and the interaction itself.
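
As a concrete illustration of one such instrument, the widely used System Usability Scale (SUS) can be scored in a few lines; this sketch is an aside for instructors, not part of the lesson plan itself:

```python
def sus_score(responses):
    """Compute the System Usability Scale score from ten 1-5 ratings.

    Odd-numbered items are positively worded (score = response - 1),
    even-numbered items negatively worded (score = 5 - response);
    the summed contributions are scaled to a 0-100 range.
    """
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS needs ten ratings on a 1-5 scale")
    contributions = [
        (r - 1) if i % 2 == 0 else (5 - r)  # items 1,3,5,... sit at even indices
        for i, r in enumerate(responses)
    ]
    return sum(contributions) * 2.5

# All-neutral answers (3 everywhere) yield the midpoint score.
print(sus_score([3] * 10))  # → 50.0
```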

Lecture 6: Spatial Computing

In this lecture, we introduce the concept of Spatial Computing and provide several best and worst practices of Augmented Reality designs, illustrating them by examples of solutions and applications.

[Elective] Lecture 7: Software Development Methods

The lecture covers agile software-engineering models versus traditional methods. It explains Scrum processes and Kanban in detail. Moreover, agile practices like code style guides (definition of done), source code reviews after sprints, source code management (git), and DevOps basics are addressed.

[Elective] Lecture 8: Geometric Algebra

This lecture reviews the mathematics needed for AR and computer graphics: tools for modelling rotations, the geometric interpretation of complex numbers, and the pros and cons of using matrices, complex numbers, Euler angles, or quaternions.
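
The rotation trade-off usually comes down to quaternions composing cheaply and avoiding gimbal lock. A minimal sketch of quaternion rotation in plain Python (illustrative only; real engines ship this functionality):

```python
import math

def quat_from_axis_angle(axis, angle):
    """Unit quaternion (w, x, y, z) for a rotation of `angle` radians about `axis`."""
    ax, ay, az = axis
    n = math.sqrt(ax*ax + ay*ay + az*az)
    s = math.sin(angle / 2) / n
    return (math.cos(angle / 2), ax*s, ay*s, az*s)

def quat_mul(q, r):
    """Hamilton product of two quaternions."""
    qw, qx, qy, qz = q
    rw, rx, ry, rz = r
    return (qw*rw - qx*rx - qy*ry - qz*rz,
            qw*rx + qx*rw + qy*rz - qz*ry,
            qw*ry - qx*rz + qy*rw + qz*rx,
            qw*rz + qx*ry - qy*rx + qz*rw)

def rotate(q, v):
    """Rotate vector v by unit quaternion q via q * (0, v) * q^-1."""
    w, x, y, z = q
    conj = (w, -x, -y, -z)
    _, rx, ry, rz = quat_mul(quat_mul(q, (0.0, *v)), conj)
    return (rx, ry, rz)

# Rotating the x axis 90° about z yields the y axis.
q = quat_from_axis_angle((0, 0, 1), math.pi / 2)
print([round(c, 6) for c in rotate(q, (1, 0, 0))])  # → [0.0, 1.0, 0.0]
```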

[Elective] Lecture 9: Storytelling

Basic principles in storytelling are covered and tips and tricks on how to engineer an experience are provided.

[Elective] Lecture 10: Careers in AR

The lecture gives an overview of the job market and what employers want, along with a skills framework that helps students track and develop sought-after talents.

[Elective] Lecture 11: Design Inspiration

Typically provided by local industry, this talk introduces real client projects, highlighting how they were developed and designed.

[Elective] Lecture 12: Research Directions

This lecture needs constant updating, incorporating new Grand Challenges and the latest achievements, as available, for example, from the IEEE ISMAR conference annual summaries and highlights. At the time of writing, this lecture provides an outlook on future developments and novel tools such as consumer-grade volumetric capture, hearables, voice-activated graphics, or brain-computer interfaces, and innovative application areas such as Digital Assistants, Diminished Reality, Deep Fakes, Programmable Synthetic Hallucinations, Umwelt Hacking, Perceptual Adaptation, Sensory Augmentation, Neuro Design, or Wearable Skin.

[Elective] Lecture 13: Spatial Audio

This lecture covers an introduction to audio displays and how sound is captured or generated in order to make the user perceive it as coming from specific locations in the room.

[Elective] Tutorial 1: Unity Introduction

The tutorial introduces the basic concepts of the Unity 3D engine that are required to start developing Augmented Reality applications. If the local study program does not already cover Unity in other mandatory modules, we recommend including this generic, non-AR-focused introduction.

[Elective] Tutorial 2: Modelling AR UI/UX

A three-step, paper-based activity for brainstorming ideas for AR applications. It includes modeling the audience, the message, and the worlds to answer the questions of who the users are, what the main concept or purpose is, and what should happen in the application.

[Elective] Tutorial 3: New Business Development

In this tutorial, learners start thinking from a customer perspective. They learn to understand customer needs and organize them with social requirements-engineering processes and tools on the Web. Basics of markets, minimum viable products, and sales are introduced. Advanced tools like the Requirements Bazaar and the House of Quality (HoQ) can be utilized in practical sessions.

Tutorial 4: Markers: with demo of authoring tool

The tutorial provides practical instruction on creating a marker-based Augmented Reality experience, including marker creation, packaging in Unity 3D, and deployment to devices.

Tutorial 5: 3D modelling

Combines presentation, demonstrations and assisted experimentation to teach the fundamentals of 3D modelling and covers modelling, rigging, texturing/materials/lighting, animating, and exporting to Unity 3D for real-time application.

Tutorial 6: Gesture Interaction

Explains how to add a gesture recogniser, handle tap events, and manage hands that are visible to and tracked by the HoloLens, and gives a framework for a deeper understanding of hand-tracking and gaze outputs.

Tutorial 7: Voice Interaction

Describes the basics of how spatial sound can add realism, direct the user’s gaze, and provide gesture feedback, through creating simple voice commands and interaction sounds for AR objects in Unity.

Tutorial 8: Gaze Interaction

A detailed presentation of the concept of gaze interaction and how it is used to select objects. The tutorial teaches how to implement gaze interaction using prewritten scripts provided by the MixedRealityToolkit.
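
Under the hood, gaze selection is typically a raycast from the head pose against scene colliders. A toy sketch of the underlying ray-sphere test (illustrative only; the MixedRealityToolkit scripts handle this for you):

```python
import math

def gaze_hit(origin, direction, center, radius):
    """Return the distance along a normalized gaze ray to a sphere, or None on a miss.

    Solves |origin + t*direction - center|^2 = radius^2 for the nearest t >= 0.
    """
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    b = 2 * (direction[0]*ox + direction[1]*oy + direction[2]*oz)
    c = ox*ox + oy*oy + oz*oz - radius*radius
    disc = b*b - 4*c  # direction is unit length, so the quadratic's a == 1
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2
    return t if t >= 0 else None

# Looking straight down -z from a head at the origin:
targets = {"cube": ((0, 0, -3), 0.5), "sphere": ((2, 0, -3), 0.5)}
hits = {name: gaze_hit((0, 0, 0), (0, 0, -1), c, r) for name, (c, r) in targets.items()}
print(hits)  # the cube is hit at distance 2.5; the sphere is missed
```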

Tutorial 9: Spatial Mapping

Covers the general features of HoloLens spatial-mapping technology, its functions and limitations, a HoloLens setup guide, an indoor spatial-mapping try-out, and processing of the resulting map in Blender.

[Elective] Tutorial 10: 3D scans, wrapping, & animation

Construction of a 3D character by 3D-scanning a human, converting the scan into a low-poly 3D model, cleaning the geometry and textures, and animating it so that it is ready for an Augmented Reality application.

Lesson Plan ‘Advanced Augmented Reality’

The advanced course follows the software development cycle from inception, to implementation, to validation. It adds design thinking and user-experience guidelines, as well as advanced storytelling. The course covers creative tools and methods for outlining and substantiating the AR application idea. The ‘Designing AR Workflows’ tutorial equips students with the theory and practice required for building AR applications.

The implementation-focused technologies advance from the foundational course to cover spatial understanding (on top of spatial mapping), abstraction for cross-platform/multi-user/multi-device support, artificial-intelligence dialogue understanding, OpenCV foundations, wearable technology and making things talk, and volumetric video capture. Finally, evaluating AR introduces the methodologies available for verifying and validating applications. Insights into specialist application areas and job perspectives will help students to sharpen their skill set.

Lecture 1: Introduction to Advanced AR

This lecture introduces the key concepts of AR, such as a modern definition of AR and the concepts of augmentation, detectables, tracking systems, and sensors. It also covers key standard AR architectures and presents examples of AR applications in different areas.

Lecture 2: AR UX

This lecture discusses how AR apps are created from the design point of view and how to make them effective and scalable, both mobile and HMD-based. It covers user-experience engineering processes, spatial user interface guidelines and design principles, and how modern usability methods can be used to evaluate AR products.

[Elective] Lecture 3: AR Storytelling

This lecture describes the meaning of storytelling, its importance for AR apps, and the components of storytelling. It discusses several examples of AR apps and games that utilize storytelling, outlining specific methods and techniques.

[Elective] Lecture 4: Transreality Storyboarding

This lecture shows how to create experiences or tell stories across different immersive realities, as well as other media, using a transreality storyboarding framework.

[Elective] Lecture 5: AR Research Directions

One-hour panel discussion where three principal investigators share their experience, vision, and the Grand Challenges they face in the areas of new user interfaces, holographic AIs, volumetric capture, web AR, robotics, and specific application areas such as TEL, Industry 4.0, BIM, or AEC.

[Elective] Lecture 6: Wearable Computing

An introduction to the field of Wearable Computing, the practice of inventing, designing, building, or using body-worn computational and sensory devices, the functions and applications of wearable computers, design principles, and the underlying materials and technologies.

Lecture 7: Computer Vision

This lecture covers key algorithms for motion tracking, object detection, and object tracking. It looks at SDKs capabilities and existing open source solutions for features detection, description, and matching, also covering camera model and lens calibration.
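
As a reminder of the camera model the lecture builds on, the pinhole projection of a 3D point into pixel coordinates can be sketched as follows (the intrinsic values here are made up for illustration):

```python
def project(point_cam, fx, fy, cx, cy):
    """Project a 3D point in camera coordinates to pixel coordinates
    with a pinhole model: u = fx * X/Z + cx, v = fy * Y/Z + cy."""
    X, Y, Z = point_cam
    if Z <= 0:
        raise ValueError("point is behind the camera")
    return (fx * X / Z + cx, fy * Y / Z + cy)

# A point one metre ahead and 0.1 m to the right lands right of the image centre.
print(project((0.1, 0.0, 1.0), fx=800, fy=800, cx=320, cy=240))
```

Lens calibration then estimates fx, fy, cx, cy (plus distortion coefficients) from observations of a known pattern.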

[Elective] Lecture 8: AI dialog understanding

Some say that AI assistants are the key ingredient of future (spatial) user interfaces, so this lecture reviews what the field of artificial intelligence has to offer, also surveying the research history of relevant technologies and services for speech recognition and generation, machine translation, and dialogue understanding.

[Elective] Lecture 9: AR Applications and Industry Cases

This lecture is well suited to inviting local industry to showcase apps and projects, demonstrating to students what is possible in a professional context as well as offering opportunities for networking.

Lecture 10: Evaluation

The lecture provides an overview of suitable methodologies for evaluating AR applications, distinguishing between primary and secondary user-experience metrics that can be used to evaluate and improve AR products. It gives an in-depth introduction to SPINE, a new universal method for evaluating the usability of spatial user interfaces.

[Elective] Lecture 11: Spatial Audio

This lecture covers an introduction to audio displays and how sound is captured or generated, including the relevant production pipelines and toolkits, working principles of human audio processing such as interaural time differences and spectral filtering, and production principles such as head-related transfer functions.
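
For instance, the interaural time difference can be approximated with Woodworth's spherical-head model; the head radius below is a common textbook average, not a prescribed constant:

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at 20 °C
HEAD_RADIUS = 0.0875    # m, an average adult head radius

def interaural_time_difference(azimuth_deg):
    """Woodworth's spherical-head model: ITD = r/c * (theta + sin(theta)),
    with theta the source azimuth measured from straight ahead."""
    theta = math.radians(azimuth_deg)
    return HEAD_RADIUS / SPEED_OF_SOUND * (theta + math.sin(theta))

# A source directly to one side (90°) gives the maximum delay, roughly 0.65 ms.
print(round(interaural_time_difference(90) * 1000, 3), "ms")
```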

[Elective] Lecture 12: Mobile and Web

This lecture covers mobile platforms, discussing the available frameworks and UX design principles for this form factor.

[Elective] Lecture 13: Volumetric Video

This lecture covers volumetric video, a novel format of interactive media that allows live-action content to be brought into VR and AR. It introduces the content-creation pipeline, including multiview capture and other sensors, basic computer vision tasks such as segmentation/keying, calibration, and colour correction, as well as typical 3D reconstruction approaches such as shape-from-silhouette, visual hulls, and structure-from-motion. Data representation, coding, and streaming as dynamic 3D meshes or point clouds are covered next. Finally, immersive visualization, interaction, and application scenarios are outlined.
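
The idea behind shape-from-silhouette can be illustrated with a deliberately tiny 2D analogue, where a cell of the "hull" survives only if every view's silhouette covers it (a toy sketch, not a production reconstruction pipeline):

```python
def visual_hull_2d(sil_x, sil_y):
    """Carve a 2D occupancy grid from two orthographic 1D silhouettes.

    sil_x[i] is True if the view along y sees matter at column i;
    sil_y[j] likewise for row j.  A cell is kept only if both views agree.
    """
    return [[sil_x[i] and sil_y[j] for i in range(len(sil_x))]
            for j in range(len(sil_y))]

sil_x = [False, True, True, False]   # silhouette seen from the front
sil_y = [False, False, True, True]   # silhouette seen from the side
hull = visual_hull_2d(sil_x, sil_y)
print(sum(cell for row in hull for cell in row))  # → 4 occupied cells
```

Real pipelines do the same intersection in 3D, with calibrated perspective cameras instead of axis-aligned views.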

[Elective] Lecture 14: Advanced Computer Graphics for AR

This lecture covers advanced graphics techniques for AR, such as low level OpenGL Shader and GPU programming, photorealistic and non-photorealistic rendering, light capture and relighting techniques, visualization methods, occlusion, graphics optimization and parallelization, x-ray techniques, and the latest graphics research methods in AR.

[Elective] Tutorial 1: AR UX

Reviews example AR applications, discussing their features and issues.

[Elective] Tutorial 2: Designing AR Workflows

Introduces the workflows used in creating Augmented Reality apps and helps students create a more coherent, encompassing design and production workflow for AR.

Tutorial 3: Collaboration and Sharing

Synchronous collaboration in Mixed Reality is possible in a shared virtual environment: the displayed 3D objects are synchronized between the participants and changes are broadcast in real time. This tutorial introduces the Photon library, a communication library for online games, as a tool for creating sharing and collaboration experiences in mixed reality, with a running example of a two-person checkers game.

[Elective] Tutorial 4: Making Things Talk

Guides through the steps to link an AR project to microcontroller devices through Wi-Fi and Bluetooth, and demonstrates the full development pipeline from an Arduino sketch to a Unity 3D client that interacts with the HoloLens.

Tutorial 5: Computer Vision

The tutorial provides hands-on practice with the key computer vision algorithms in OpenCV.

Tutorial 6: AI dialog understanding

The tutorial guides students through the steps required to create a responsive 3D character in Unity using IBM’s Watson APIs, starting with how to set up the needed cloud accounts, instantiate the corresponding services, and configure service credentials in Unity. It shows how to work with dialogue understanding, speech-to-text, and text-to-speech with an example holographic AI.
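
The dialogue-understanding step can be illustrated, in heavily simplified form, by a keyword-based intent matcher; the intent names and keywords below are invented placeholders, not the Watson API:

```python
# A minimal keyword-based intent matcher — a stand-in for a cloud dialogue
# service, reduced to set intersection for illustration.
INTENTS = {
    "greet": {"hello", "hi", "hey"},
    "place_object": {"place", "put", "spawn"},
    "dismiss": {"bye", "goodbye", "close"},
}

def classify(utterance):
    """Return the intent whose keyword set overlaps the utterance most, or None."""
    words = set(utterance.lower().split())
    best = max(INTENTS, key=lambda intent: len(INTENTS[intent] & words))
    return best if INTENTS[best] & words else None

print(classify("Hey, put a cube here"))  # → place_object
```

A real service adds trained language models, entities, and dialogue state on top of this basic intent-classification idea.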

[Elective] Tutorial 7: Careers in AR: Direction

Following an introduction, a series of interactive exercises lets participants explore which careers are possible and which they might be interested in.

[Elective] Tutorial 8: Volumetric video capture

The tutorial guides the students through setting up and using the camera rigs and software required for capturing volumetric video. It guides them through the post processing tool chain and shows how to integrate results into the AR app.

Tutorial 9: Spatial Understanding

An overview of spatial understanding, with a demonstration of the solvers’ use and how these control the way objects behave in the playspace.

[Elective] Tutorial 10: WebXR

WebXR is a collection of standards that support rendering of 3D scenes in Augmented Reality on eyeglasses as well as handheld devices. This tutorial guides students through their first AR app built with web technology.

[Elective] Tutorial 11: AR Evaluation

This tutorial uses Google Forms to administer three evaluation instruments for AR, namely SPINE, TAMARA, and TRUST, conducting a test evaluation jointly with the participants. An associated Overleaf LaTeX document uses knitr to integrate the R code required for the analysis, presenting results live at the click of a button.

[Elective] Tutorial 12: AR portals using shaders

The tutorial is a simple introduction to Unity shaders, demonstrating how two shaders can be used to mask 3D objects in a way that they become visible only when looking through a portal window.

[Elective] Tutorial 13: Geo-Location based AR

The tutorial shows how to work with GPS in AR to create Pokémon GO-like apps.
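
The core geometric step in such apps is deciding whether the player is close enough to a geo-anchored object, e.g. with the haversine great-circle distance (a sketch; the coordinates and 50 m radius below are arbitrary examples):

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two latitude/longitude pairs."""
    R = 6371000.0  # mean Earth radius in metres
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

def in_range(player, poi, radius_m=50.0):
    """Spawn the AR object only when the player is within radius_m of the point of interest."""
    return haversine_m(*player, *poi) <= radius_m

print(in_range((63.4305, 10.3951), (63.4306, 10.3951)))  # ~11 m apart → True
```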

[Elective] Tutorial 14: Performance profiling

Monitoring and optimising the performance of an AR application is important for ensuring a pleasant user experience. Performance profiling of AR applications is not only concerned with the responsiveness of the app; developers also need to look at further measures, such as the application’s frame rate. This tutorial introduces performance metrics, performance profiling tools, and performance optimization.
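
A frame-rate metric of the kind such tools report can be sketched as a rolling average of frame times (a simplified illustration, not a real profiler; the 60-frame window is an arbitrary choice):

```python
from collections import deque

class FrameProfiler:
    """Rolling average of frame times over a fixed window of recent frames."""

    def __init__(self, window=60):
        self.samples = deque(maxlen=window)  # old samples fall off automatically

    def record(self, frame_time_s):
        self.samples.append(frame_time_s)

    @property
    def fps(self):
        """Average frames per second over the current window."""
        return len(self.samples) / sum(self.samples) if self.samples else 0.0

profiler = FrameProfiler()
for _ in range(60):
    profiler.record(1 / 60)   # a steady 16.7 ms frame time
print(round(profiler.fps))    # → 60
```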

[Elective] Tutorial 15: AR Unit Testing

Unity, for example, ships with a test runner that uses the NUnit framework, which is part of the xUnit family of testing frameworks. It is the official solution from Unity and is actively used by a growing number of developers building modules for the package manager introduced in Unity 2018. This tutorial explores the possibilities of the NUnit framework.
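
The arrange-act-assert pattern that NUnit (and the whole xUnit family) follows looks the same in any language; here it is sketched with Python's unittest, with a made-up clamp function as the unit under test:

```python
import unittest

def clamp(value, low, high):
    """Example unit under test: constrain a value to the range [low, high]."""
    return max(low, min(high, value))

class ClampTests(unittest.TestCase):
    def test_within_range_is_unchanged(self):
        self.assertEqual(clamp(5, 0, 10), 5)

    def test_values_are_clamped_to_bounds(self):
        self.assertEqual(clamp(-3, 0, 10), 0)
        self.assertEqual(clamp(42, 0, 10), 10)

# Run the suite programmatically, as a test runner would.
result = unittest.TextTestRunner(verbosity=0).run(
    unittest.defaultTestLoader.loadTestsFromTestCase(ClampTests))
print(result.wasSuccessful())  # → True
```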

The AR curriculum presented on this page has been developed by the working group Augmented Reality Curriculum of the 25th annual conference on Innovation and Technology in Computer Science Education (ITiCSE) 2020. The lesson plans presented here are part of the full paper submitted for review at ITiCSE. The objectives of the working group can be found in the ACM Digital Library, in a short paper:

Mikhail Fominykh, Fridolin Wild, Ralf Klamma, Mark Billinghurst, Lisandra S. Costiner, Andrey Karsakov, Eleni Mangina, Judith Molka-Danielsen, Ian Pollock, Marius Preda, and Aljosa Smolic: “Developing a Model Augmented Reality Curriculum,” in the 25th annual conference on Innovation and Technology in Computer Science Education (ITiCSE 2020), Trondheim, Norway (online), June 17–18, 2020, ACM, ISBN: 978-1-4503-6874-2, pp. 508–509. DOI: 10.1145/3341525.3394991.