Build Your Mixed Reality Game & Publish it on Meta's App Lab
Want to make Mixed Reality (MR/VR) games and apps in Unity?
Are you keen to expand your skillset to include Meta’s Presence Platform features so you can create engaging Virtual Reality and MR games and experiences?
Then this course is for you!
Meta has added a wealth of features that make creating Mixed Reality apps and games easy, allowing you to bring your ideas to life.
In this course, you’ll learn how to harness the power of Meta’s Presence Platform to build immersive XR experiences.
Get hands-on experience designing and implementing:
- Passthrough
- Scene Understanding
- Smart Object Positioning
- Hand Tracking
- Controller Support
- Interactions
- Movement
- Body Tracking
- Eye Tracking
- Face Tracking
- Voice and Audio Detection
All in all, you will learn to provide your app or game users with intuitive and engaging XR interactions.
You’ll get the most from this course if you have some familiarity with the Unity editor.
The course creators are qualified, experienced mixed reality developers who will keep you entertained along the way.
You’ll also gain access to our community, where you can discuss topics course-wide or down to the individual video lesson. Get plugged into our XR Creators community of amazing developers on Discord (nearly 7,000 strong) and our student chat group.
Don’t hesitate to share your MR project on App Lab and get your portfolio project reviewed!
Dive into the extraordinary world of Mixed Reality experiences and begin your journey today.
By the way, the best submissions will be reviewed for entry into the Meta Quest Store and a chance to win a free Meta Quest Pro headset!
- 1. Master Companion Document (Text lesson)
- 2. Prerequisites (Text lesson)
- 3. Meta Presence Platform (Video lesson)
In this module, we’ll be introduced to the Meta Presence Platform - the backbone of Virtual and Mixed Reality development. We will learn about important platform capabilities such as:
- Passthrough
- Spatial Anchors
- Scene Understanding
- Hand Tracking
- Movement
- Voice
- 4. Prep 1.1 Materials (Text lesson)
- 5. Mixed Reality Features of Meta's Presence Platform (Video lesson)
In this chapter, we cover the Mixed Reality features of the Meta Presence Platform.
- 6. Prep 1.2 Materials (Text lesson)
- 7. Your GorillaZilla Project (Video lesson)
In this chapter, we’re introduced to the GorillaZilla application.
- 8. Quiz
- 9. Introduction: Unity and Oculus Integration (Video lesson)
- 10. Prep 2.1 Materials (Text lesson)
- 11. Set up Your Own Unity Project (Video lesson)
Prerequisites
Unity Hub
Unity 2022.3 LTS (this course used 2022.3.5f1)
Unity Android Support
Unity Windows Support
Important Links
Install Unity Hub: https://unity.com/download
Unity Download Archive: https://unity.com/releases/editor/archive
Description
In this chapter we’ll get Unity installed and configured for Meta Quest devices.
If you’re installing Unity for the first time, be sure to install a 2022.3 LTS version, and during setup select the Android Build Support and Windows Build Support modules to enable both platforms.
You can refer back to the prerequisites document for additional details on setting up Unity.
When Sean instructs you to select the project type “3D (URP)”, you may not have it installed on your machine. That’s OK: just click the “Download template” button to install it, then continue.
- 12. Prep 2.2 Materials (Text lesson)
- 13. Import and Configure the Meta SDK (Video lesson)
In this chapter, we’ll import the Meta SDK and get the project ready to build and deploy to a Quest device.
When Sean instructs you to add the Meta package in the Unity Package Manager, here are the details:
Package Name: com.meta.xr.sdk.all
Package Version: 59.0.0
NOTE: If the version number is left blank, Unity will install the latest version of the SDK. Though the content of this course should still be relevant with future SDK versions, it was recorded with version 59.
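If you’d rather script the install than use the Package Manager window, here’s a minimal editor sketch using Unity’s UnityEditor.PackageManager API. The class name and menu path are our own invention; only the package identifier and version come from the lesson. Place the file in an Editor folder.

```csharp
// Editor-only sketch: installs the Meta XR SDK package via the UPM scripting API.
// Assumes Unity 2022.3; the menu path and class name are hypothetical.
using UnityEditor;
using UnityEditor.PackageManager;
using UnityEditor.PackageManager.Requests;
using UnityEngine;

public static class MetaSdkInstaller
{
    static AddRequest request;

    [MenuItem("Tools/Install Meta XR SDK")]
    static void Install()
    {
        // Pin the version the course was recorded with; drop "@59.0.0" for latest.
        request = Client.Add("com.meta.xr.sdk.all@59.0.0");
        EditorApplication.update += Progress;
    }

    static void Progress()
    {
        if (!request.IsCompleted) return;
        Debug.Log(request.Status == StatusCode.Success
            ? $"Installed {request.Result.packageId}"
            : $"Install failed: {request.Error.message}");
        EditorApplication.update -= Progress;
    }
}
```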
- 14. Prep 2.3 Materials (Text lesson)
- 15. Your First MR Scene Using Meta's Building Blocks (Video lesson)
In this chapter, we’ll create our first scene and get it ready for VR using the Camera Rig Building Block.
- 16. Prep 2.4 Materials (Text lesson)
- 17. Deploy Your MR Project to Meta Quest (Video lesson)
Prerequisites
Quest headset must be in Developer Mode.
We recommend using the Meta Quest Developer Hub to enable Developer Mode and make it easier to deploy apps to the headset.
Important Links
Getting Started with Meta Quest Developer Hub https://developer.oculus.com/documentation/unity/ts-odh/
Deploy Build on Headset https://developer.oculus.com/documentation/unity/ts-odh-deploy-build/
Description
In this chapter, we will complete our first build and deploy it to the device for testing.
Before you can deploy an app, your device must be set to Developer Mode. We recommend following the steps in Getting Started with Meta Quest Developer Hub to install the hub and enable Developer Mode on your device.
When it’s time to deploy your app for testing, see Deploy Build on Headset for the steps to install the package.
Once the application is deployed to the device, go to the App Library and select ‘Unknown Sources’ to find and launch your application.
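As a companion to the manual Build Settings flow shown in the video, here’s a hedged editor sketch that kicks off an Android build from a menu item. The scene path and output path are placeholders you’d adjust for your own project.

```csharp
// Editor sketch: builds an Android .apk for Quest from a menu item.
// Scene and output paths below are hypothetical; adjust for your project.
using UnityEditor;
using UnityEngine;

public static class QuestBuild
{
    [MenuItem("Tools/Build for Quest")]
    public static void Build()
    {
        var options = new BuildPlayerOptions
        {
            scenes = new[] { "Assets/Scenes/OurFirstScene.unity" }, // hypothetical path
            locationPathName = "Builds/GorillaZilla.apk",
            target = BuildTarget.Android,
            options = BuildOptions.None
        };
        var report = BuildPipeline.BuildPlayer(options);
        Debug.Log($"Build result: {report.summary.result}");
    }
}
```

The resulting .apk can then be installed through the Meta Quest Developer Hub as described in the links above.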
- 18. Prep 2.5 Materials (Text lesson)
- 19. Import the GorillaZilla Project (Video lesson)
Important Links
GorillaZilla Package https://github.com/XRBootcamp/GorillaZilla/releases
XR Gizmos https://github.com/darktable/XRGizmos.git
Toon Shader https://github.com/Delt06/urp-toon-shader.git?path=Packages/com.deltation.toon-shader
Description
In this chapter, we’ll import the GorillaZilla assets into our project so that we can begin building and learning about components, blocks, and scripts.
If you haven’t already done so, please download the GorillaZilla package from the link above. Make sure you get the asset called GorillaZillaImport.unitypackage.
When adding the XR Gizmos package, here is the information:
Where: Add package from git URL
URL: https://github.com/darktable/XRGizmos.git
When adding the Toon Shader package, here is the information:
Where: Add package from git URL
URL: https://github.com/Delt06/urp-toon-shader.git?path=Packages/com.deltation.toon-shader
When Sean opens the "BuildingBlocksVersion" scene, your view may look a little different. This is because Unity defaults to a large-icons view, but Sean uses a details view. At the bottom of the Project window there is a slider; dragging it to the left will give you the same details view Sean has on his screen.
- 20. Quiz
- 21. Mixed Reality and Passthrough Overview (Video lesson)
- 22. Prep 3.1 Materials (Text lesson)
- 23. Introduction to Passthrough Mixed Reality (Video lesson)
Let’s learn about the Passthrough camera system, which allows our virtual objects to appear within our physical environment.
- 24. Prep 3.2 Materials (Text lesson)
- 25. Add the Passthrough Building Block (Video lesson)
In this chapter, we will add passthrough support to our scene.
IMPORTANT: Before you begin this chapter, please rename OurFirstScene to DevScene and open it. Also, if you created a test cube in this scene please delete it now.
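To give you a feel for what the Passthrough Building Block wires up, here’s a rough runtime sketch using the SDK’s OVRPassthroughLayer component. This is our own illustration, not the course’s exact setup - the Building Block configures all of this for you in the Inspector.

```csharp
// Runtime sketch of what a passthrough setup roughly looks like in code.
// Assumes the Meta XR SDK (OVRPassthroughLayer, OVROverlay) is imported and
// passthrough support is enabled on the OVRManager (the Building Block does this).
using UnityEngine;

public class PassthroughSetup : MonoBehaviour
{
    void Start()
    {
        // Render the camera feed behind all virtual content (underlay).
        var layer = gameObject.AddComponent<OVRPassthroughLayer>();
        layer.overlayType = OVROverlay.OverlayType.Underlay;
        layer.textureOpacity = 1f; // fully visible camera feed
    }
}
```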
- 26. Prep 3.3 Materials (Text lesson)
- 27. Customizing Passthrough Visualization (Video lesson)
In this chapter, we’ll learn how to change the color of the passthrough. This provides a neat effect for important game events.
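As a taste of what’s possible, here’s a small sketch (our own, not the course script) that tints passthrough edges during a game event using OVRPassthroughLayer properties from the Meta XR SDK.

```csharp
// Sketch: highlight passthrough edges in red during a game event
// (e.g., when GorillaZilla roars). Assumes an OVRPassthroughLayer in the scene.
using UnityEngine;

public class PassthroughFlash : MonoBehaviour
{
    [SerializeField] OVRPassthroughLayer passthrough;

    public void FlashRed()
    {
        passthrough.edgeRenderingEnabled = true;
        passthrough.edgeColor = Color.red; // tint object edges in the camera feed
    }

    public void ClearEffect()
    {
        passthrough.edgeRenderingEnabled = false;
    }
}
```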
- 28. Scene Understanding Introduction (Video lesson)
- 29. Prep 3.5 Materials (Text lesson)
- 30. Completing Space Setup (Video lesson)
In this chapter, we’ll complete your Space Setup!
- 31. Prep 3.6 Materials (Text lesson)
- 32. Smart Object Positioning with Scene Understanding (Video lesson)
In this chapter, we’ll learn about Smart Object Positioning with Scene Understanding.
- 33. Prep 3.7 Materials (Text lesson)
- 34. Access Free Locations with the Room Manager Script (Video lesson)
In this chapter, we’ll add the Room Manager component to our scene and learn how it helps us lay out a city in our physical space.
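The course’s Room Manager script ships with the GorillaZilla assets, so the sketch below is only a simplified illustration of the general idea: query scene anchors, check their semantic classification, and place content on surfaces that match. The class and prefab field are hypothetical.

```csharp
// Simplified sketch of smart object positioning: spawn content on anchors
// classified as FLOOR. OVRSceneAnchor and OVRSemanticClassification are
// Meta XR SDK types; the spawner itself is our own illustration.
using UnityEngine;

public class FloorSpawner : MonoBehaviour
{
    [SerializeField] GameObject buildingPrefab; // hypothetical prefab

    public void PlaceOnFloor()
    {
        foreach (var anchor in FindObjectsOfType<OVRSceneAnchor>())
        {
            var classification = anchor.GetComponent<OVRSemanticClassification>();
            if (classification != null &&
                classification.Contains(OVRSceneManager.Classification.Floor))
            {
                Instantiate(buildingPrefab, anchor.transform.position, Quaternion.identity);
            }
        }
    }
}
```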
- 35. Quiz
- 36. Introduction to Controllers and Hand Tracking (Video lesson)
In this module, we’ll learn about the Meta Interaction SDK, a library that allows users to interact with digital worlds through controllers and hands. We’ll learn about controller tracking, hand tracking, and interactions like grab and throw.
- 37. Prep 4.1 Materials (Text lesson)
- 38. Add Controller Support to Your Scene (Video lesson)
Prerequisites
Quest Link or Air Link
Sensor streaming over Link enabled in the Oculus App
Important Links
Quest Link https://www.meta.com/help/quest/articles/headsets-and-accessories/oculus-link/connect-link-with-quest-2/
Air Link https://www.meta.com/help/quest/articles/headsets-and-accessories/oculus-link/connect-with-air-link
Description
In this chapter, we will add support for controllers to our scene. We will also start using Unity “Play Mode” to rapidly test and iterate on our experience.
Please note that Sean moved DevScene from Assets/Scenes to Assets/_Core/Scenes. Feel free to move your DevScene, but it’s also okay to leave it where it is.
IMPORTANT: If you haven’t already done so, please follow the instructions below to enable Quest Link and enable sensor streaming from your device.
Configure Headset for Quest Link or Air Link
Quest Link allows you to play VR titles made for the PC on your Meta Quest. Quest Link also allows you to rapidly develop in Unity since you’ll be able to press “Play” and immediately start running your experience on the headset without having to build an application and deploy it to the device.
Quest Link works over a high-speed USB-C cable. Air Link is similar to Quest Link but works wirelessly over WiFi. Please note that Air Link depends heavily on having a high-speed, low-latency router, so we recommend using Quest Link for this course.
Please set either Quest Link or Air Link:
Setup Quest Link (recommended)
Setup Air Link (optional)
Enable Developer Features Over Link
Testing applications quickly in Unity is tremendously powerful, but there’s one more important step to complete before it will work properly. By default, sensor data from your Quest headset isn’t allowed to stream to your desktop PC for privacy reasons. You must opt in to allow this streaming of data. To do so:
Go into the Oculus app, then click on Settings > Beta (tab).
Turn on Developer Runtime Features, then turn on all of the toggles that allow sensor data to be sent over Link.
- 39. Prep 4.2 Materials (Text lesson)
- 40. Add Hand Tracking Support to Your Scene (Video lesson)
In this chapter, we enable hand tracking, and we learn how to make objects grabbable and throwable.
- 41. Prep 4.3 Materials (Text lesson)
- 42. Configure Interactions: Pinch, Grab and Throw (Video lesson)
In this chapter, we build on hand tracking and learn how to configure pinch, grab, and throw interactions.
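To make the mechanics concrete, here’s a minimal sketch that polls pinch state through the SDK’s OVRHand component; the logging class is our own and purely for illustration.

```csharp
// Sketch: polling pinch state with OVRHand (Meta XR SDK). Assumes an OVRHand
// component on one of the hand anchors under the camera rig.
using UnityEngine;

public class PinchLogger : MonoBehaviour
{
    [SerializeField] OVRHand hand;

    void Update()
    {
        if (hand.IsTracked && hand.GetFingerIsPinching(OVRHand.HandFinger.Index))
        {
            float strength = hand.GetFingerPinchStrength(OVRHand.HandFinger.Index);
            Debug.Log($"Index pinch at strength {strength:0.00}");
        }
    }
}
```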
- 43. Prep 4.4 Materials (Text lesson)
- 44. Adding Poke Interactions for UI (Video lesson)
In this chapter, we’ll look at the Pokable Item block and how it can be used to make buttons for our menu.
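If you want to react to pokes from code rather than the Inspector, the Interaction SDK’s InteractableUnityEventWrapper component surfaces interactable state changes as UnityEvents. A minimal sketch follows; the class and log message are our own.

```csharp
// Sketch: reacting to a poked button via the Interaction SDK's
// InteractableUnityEventWrapper (Oculus.Interaction namespace).
using Oculus.Interaction;
using UnityEngine;

public class MenuButton : MonoBehaviour
{
    [SerializeField] InteractableUnityEventWrapper events;

    void OnEnable()  => events.WhenSelect.AddListener(OnPressed);
    void OnDisable() => events.WhenSelect.RemoveListener(OnPressed);

    void OnPressed() => Debug.Log("Button poked!");
}
```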
- 45. Prep 4.5 Materials (Text lesson)
- 46. Add Monster Hands to Your GorillaZilla Game (Video lesson)
In this chapter, we’ll add hand tracking support to GorillaZilla, including pokable menus and cool monster hands!
- 47. Quiz
- 48. Introduction to Movement SDK (Video lesson)
In this module, we’ll learn about the Movement SDK and how we can use it to enable features like face and eye tracking.
- 49. Prep 5.1 Materials (Text lesson)
- 50. Import and Configure Movement SDK Samples (Video lesson)
Important Links
Movement SDK Overview https://developer.oculus.com/documentation/unity/move-overview/
Unity Movement Package on GitHub https://github.com/oculus-samples/Unity-Movement.git#v3.1.1
Description
In this chapter, we’ll import the Movement Samples from GitHub and create the layers necessary for the samples to function.
When Sean adds the sample package from GitHub, here’s the URL you can copy and paste into Package Manager:
https://github.com/oculus-samples/Unity-Movement.git#v3.1.1
Note that later versions should work, but this class was recorded with package version 3.1.1.
- 51. Prep 5.2 Materials (Text lesson)
- 52. Set Up Body Tracking and Enhance User Immersion (Video lesson)
In this chapter, we’ll create a sample scene to test out the Meta avatar rig and see the Movement SDK in action.
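For orientation, here’s a small sketch (ours, not the course’s) that reads tracked joints from an OVRSkeleton once body tracking is running; drawing debug rays is just a quick way to visualize the data.

```csharp
// Sketch: reading tracked body joints from an OVRSkeleton (Meta XR SDK).
// Assumes a skeleton configured for body tracking exists in the scene.
using UnityEngine;

public class BoneReader : MonoBehaviour
{
    [SerializeField] OVRSkeleton skeleton;

    void Update()
    {
        if (!skeleton.IsInitialized) return;
        foreach (var bone in skeleton.Bones)
        {
            // Visualize each joint's position and facing in the Scene view.
            Debug.DrawRay(bone.Transform.position, bone.Transform.forward * 0.05f, Color.green);
        }
    }
}
```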
- 53. Prep 5.3 Materials (Text lesson)
- 54. Eye Tracking, Face Tracking and Movement Samples (Video lesson)
In this chapter, we’ll explore the various Body Tracking samples and some of the use cases for body, face, and eye tracking.
- 55. Quiz
- 56. Introduction to Meta Voice SDK (Video lesson)
In this module, we learn about speech capabilities in the Voice SDK. We’ll learn about Intents, and we’ll use Text to Speech to assist users during the GorillaZilla tutorial.
- 57. Prep 6.1 Materials (Text lesson)
- 58. Meta Voice SDK (Video lesson)
In this chapter, we’ll add Text to Speech to our scene and leverage it to help users through the tutorial.
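A minimal sketch of the idea, assuming a configured TTSSpeaker component from the Voice SDK is already in the scene (the narrator class is our own):

```csharp
// Sketch: triggering Text to Speech with the Voice SDK's TTSSpeaker component.
// Assumes a TTSSpeaker (and its TTS service) is configured in the scene.
using Meta.WitAi.TTS.Utilities;
using UnityEngine;

public class TutorialNarrator : MonoBehaviour
{
    [SerializeField] TTSSpeaker speaker;

    public void SayStep(string instruction)
    {
        speaker.Speak(instruction); // queues synthesis and playback
    }
}
```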
- 59. Prep 6.2 Materials (Text lesson)
- 60. Use Mic Volume Detection - the GorillaZilla ‘Roaaaaar’ (Video lesson)
In this chapter, we’ll learn about the microphone capabilities in the Voice SDK. We’ll also use the volume sensing capabilities of the SDK to adjust the “Roar” destruction.
IMPORTANT: When you test the game at the end of this chapter, make sure your system is set up to use the microphone from the Quest. This is only necessary when testing over Link; when you deploy the game to the headset, it will of course use the built-in microphone.
On Windows, you can configure the microphone under Settings > System > Sound > Input.
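If you’d like to experiment with volume detection outside the Voice SDK, here’s a self-contained sketch using Unity’s plain Microphone API to estimate loudness via RMS. The 0.3 threshold is an arbitrary starting point you’d tune for your own “Roar”.

```csharp
// Sketch: plain Unity Microphone-based loudness estimate, as a stand-in for the
// Voice SDK's mic volume sensing. Computes RMS over the most recent samples.
using UnityEngine;

public class RoarMeter : MonoBehaviour
{
    AudioClip micClip;
    const int SampleWindow = 1024;

    void Start()
    {
        // null = default microphone; loop so recording never stops.
        micClip = Microphone.Start(null, true, 10, 44100);
    }

    void Update()
    {
        int pos = Microphone.GetPosition(null) - SampleWindow;
        if (pos < 0) return;

        var samples = new float[SampleWindow];
        micClip.GetData(samples, pos);

        float sum = 0f;
        foreach (var s in samples) sum += s * s;
        float rms = Mathf.Sqrt(sum / SampleWindow);

        if (rms > 0.3f) Debug.Log("ROAAAAR detected!"); // threshold is a guess; tune it
    }
}
```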
- 61. Quiz