Unity supports XR development through its plug-in framework and a set of feature and tool packages. Use the XR Plug-in Management settings page to enable XR support in a Unity project and to choose the plug-ins for the XR devices your project supports. Use the Package Manager to install the additional feature packages.
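By default, XR Plug-in Management can initialize XR automatically on startup. If you need to control start-up and shutdown yourself, the package also exposes a runtime loader API. The following is a minimal sketch, assuming the XR Plug-in Management package (`UnityEngine.XR.Management`) is installed and at least one provider plug-in is enabled; the `ManualXRControl` class name is only for illustration:

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.XR.Management;

// Sketch of manual XR start-up/shutdown through the XR Plug-in Management
// loader. Assumes "Initialize XR on Startup" is disabled in the settings page.
public class ManualXRControl : MonoBehaviour
{
    IEnumerator Start()
    {
        XRManagerSettings manager = XRGeneralSettings.Instance.Manager;

        // Initialize the first loader whose provider plug-in is enabled
        // in Project Settings > XR Plug-in Management.
        yield return manager.InitializeLoader();

        if (manager.activeLoader == null)
        {
            Debug.LogError("Initializing XR failed. No loader could be started.");
            yield break;
        }

        // Start the subsystems registered by the active provider plug-in.
        manager.StartSubsystems();
    }

    void OnDestroy()
    {
        XRManagerSettings manager = XRGeneralSettings.Instance.Manager;
        if (manager != null && manager.isInitializationComplete)
        {
            manager.StopSubsystems();
            manager.DeinitializeLoader();
        }
    }
}
```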
The following diagram illustrates the current Unity XR plug-in framework structure and how it works with platform provider implementations:
Plug-ins for an XR platform implement the subsystem interfaces defined in the Unity XR SDK. The Unity Engine and XR packages use the subsystem interfaces to provide features to XR apps.
An XR provider plug-in is a Unity plug-in that supports an XR device platform. For example, the ARCore plug-in supports the Android AR platform on handheld Android devices, while the Oculus plug-in supports Oculus devices on both Windows and Android.
An XR provider plug-in implements interfaces defined by the Unity XR SDK. These interfaces are called subsystems in Unity XR, and a plug-in that implements one or more subsystems is called a provider plug-in. Typically, a provider plug-in uses the device maker's own native libraries to implement the Unity interfaces for its devices. The provider plug-in registers the subsystems it provides so that the Unity XR code knows which features are available when it loads the plug-in.
Unity code in the engine and packages uses the subsystem interfaces to communicate with the active providers and to provide XR features to your application. Because of this common interface, you can generally reuse the same code across many XR devices. (Of course, if a provider or device doesn't support a particular feature, that feature won't work when using that provider or on devices lacking the required hardware or software support.)
XR subsystems provide XR features to your Unity application. The Unity XR SDK defines a common interface for subsystems so that all provider plug-ins implementing a feature generally work the same way in your app. Often you can change the active provider and rebuild your app to run on a different XR platform, as long as the platforms are largely similar.
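As a small illustration of this common interface, the sketch below lists the display subsystem instances that the loaded provider plug-in has registered; the provider-specific details stay behind the `XRDisplaySubsystem` interface. It assumes a Unity version (2020.2 or newer) in which `SubsystemManager.GetSubsystems` is available; older versions expose a similar `SubsystemManager.GetInstances` call:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;

// Sketch: list the display subsystem instances created by the active provider.
// The same query works no matter which provider plug-in supplied them.
public class ListDisplaySubsystems : MonoBehaviour
{
    void Start()
    {
        var displays = new List<XRDisplaySubsystem>();
        SubsystemManager.GetSubsystems(displays);

        foreach (XRDisplaySubsystem display in displays)
        {
            Debug.Log($"Display subsystem found, running: {display.running}");
        }
    }
}
```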
The Unity Engine defines a set of fundamental XR subsystems. Unity packages provide additional subsystems. For example, the AR Subsystems package contains many of the AR-specific subsystem interfaces.
The subsystems defined in the Unity Engine include:
| Subsystem | Description |
| --- | --- |
| Display | Stereo XR display. |
| Input | Spatial tracking and controller input. |
| Meshing | Generate 3D meshes from environment scans. |
Note: Unity applications typically do not interact with subsystems directly. Instead, the features provided by a subsystem are exposed to the application through an XR plug-in or package. For example, the ARMeshManager component in the AR Foundation package lets you add the meshes created by the Meshing subsystem to a scene.
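For example, spatial tracking data produced by the Input subsystem is normally read through the engine-level `UnityEngine.XR.InputDevices` API rather than from the subsystem itself. A minimal sketch (the `HeadPoseLogger` component name is only for illustration):

```csharp
using UnityEngine;
using UnityEngine.XR;

// Sketch: read the head pose that the Input subsystem tracks, using the
// engine-level InputDevices API instead of the subsystem itself.
public class HeadPoseLogger : MonoBehaviour
{
    void Update()
    {
        InputDevice head = InputDevices.GetDeviceAtXRNode(XRNode.Head);
        if (!head.isValid)
            return;

        if (head.TryGetFeatureValue(CommonUsages.devicePosition, out Vector3 position) &&
            head.TryGetFeatureValue(CommonUsages.deviceRotation, out Quaternion rotation))
        {
            Debug.Log($"Head pose: {position} / {rotation.eulerAngles}");
        }
    }
}
```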