Handheld AR App Development with Unity
- Laying the AR Foundation with Unity's AR Foundation Package
- This module will give you a brief history of augmented reality technology and introduce you to the concept of SLAM (Simultaneous Localization and Mapping). We'll give you an overview of the technologies used for mobile AR tracking and the Unity components used to work with mobile AR devices. In the project work, you will create a Unity AR project from scratch using the Unity AR Foundation package and wire up enough functionality to get it running on an ARKit- or ARCore-compatible smartphone. The app will allow you to pan your smartphone around to see live video of your environment on the display.
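As a preview of the project work, a minimal AR Foundation scene typically pairs an `ARSession` with an AR camera; a small bootstrap script like the sketch below can check device support before enabling the session. This is a hedged sketch, not course-provided code, and it assumes the `ARSession` component is assigned in the Inspector.

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Sketch: verify AR support on the device before starting the AR session.
public class SessionBootstrap : MonoBehaviour
{
    [SerializeField] ARSession session; // assumed: assigned in the Inspector

    IEnumerator Start()
    {
        // Asks ARKit/ARCore whether this device can run AR at all.
        yield return ARSession.CheckAvailability();

        if (ARSession.state == ARSessionState.Unsupported)
            Debug.Log("AR is not supported on this device.");
        else
            session.enabled = true; // begin tracking and camera passthrough
    }
}
```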
- Architecting AR Space - Pose Tracking and Environment Detection
- In this module, you will learn how to interpret and visualize the information the AR subsystem generates about the real-world geometry it has detected. In the project work, you will add trackable managers and visualizers to your scene so that you can see what the AR vision system is detecting and tracking.
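To illustrate what a trackable manager exposes, the hedged sketch below subscribes to an `ARPlaneManager` and logs planes as they are detected; the component reference is an assumption (assigned in the Inspector), and visualizer prefabs would be configured on the manager itself.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Sketch: observe planes as the AR subsystem detects and updates them.
public class PlaneLogger : MonoBehaviour
{
    [SerializeField] ARPlaneManager planeManager; // assumed: assigned in the Inspector

    void OnEnable()  => planeManager.planesChanged += OnPlanesChanged;
    void OnDisable() => planeManager.planesChanged -= OnPlanesChanged;

    void OnPlanesChanged(ARPlanesChangedEventArgs args)
    {
        // Each newly detected plane is a trackable with a stable id and alignment
        // (horizontal, vertical, etc.) that a visualizer can render.
        foreach (var plane in args.added)
            Debug.Log($"Detected plane {plane.trackableId}, alignment {plane.alignment}");
    }
}
```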
- Designing the UX in AR - Raycast, Light Estimation, Physics and Occlusion
- In this module, you will learn how to use light estimation, in addition to the geometry generated by the AR system, to create realistic and immersive occlusion effects. You will also learn techniques that let Unity physics objects interact with the detected geometry, driven by screen touches or physics. The result will be that the robot in the virtual scene matches the environment more realistically, as the scene lighting is adjusted to correspond to measured lighting conditions. You will also be able to move the robot around using the touchscreen, and place the robot behind surfaces so that it's partially occluded.
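The touch interaction described above is usually built on an AR raycast against detected planes. The sketch below is one plausible shape for it, not the course's exact code: it assumes an `ARRaycastManager` and a `robot` transform assigned in the Inspector, and moves the robot to wherever the user taps on a detected surface.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Sketch: tap the screen to place an object on detected real-world geometry.
public class TapToPlace : MonoBehaviour
{
    [SerializeField] ARRaycastManager raycastManager; // assumed: assigned in the Inspector
    [SerializeField] Transform robot;                 // assumed: the object to move

    static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();

    void Update()
    {
        if (Input.touchCount == 0) return;
        Touch touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Began) return;

        // Cast from the screen point into the AR world, against detected planes.
        if (raycastManager.Raycast(touch.position, hits, TrackableType.PlaneWithinPolygon))
        {
            // The closest hit carries a pose on the real-world surface.
            Pose pose = hits[0].pose;
            robot.SetPositionAndRotation(pose.position, pose.rotation);
        }
    }
}
```

Light estimation values (e.g. average brightness and color temperature) arrive per frame via the `ARCameraManager.frameReceived` event and can be applied to the scene's directional light to match measured conditions.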
- Advanced AR
- In this module, we will look at features of ARCore and ARKit that are not yet supported by AR Foundation. In our final lesson, we'll learn about AR design best practices recommended by Unity.