Unity 6 XR development guide. Use when building VR, AR, or MR experiences. Covers XR Interaction Toolkit, XR Plug-in Management, OpenXR, hand tracking, controllers, haptics, AR Foundation (plane detection, anchors, image tracking), and platform-specific setup (Meta Quest, Apple Vision Pro). Based on Unity 6.3 LTS documentation.
> Source: Unity 6.3 LTS Documentation (6000.3)
Unity defines XR as an umbrella term covering three application types: Virtual Reality (VR), Augmented Reality (AR), and Mixed Reality (MR).

The XR stack in Unity 6 consists of XR Plug-in Management, provider plug-ins (OpenXR, ARCore, ARKit, and platform-specific plug-ins), and the high-level packages built on top of them: the XR Interaction Toolkit and AR Foundation.
Configure via Edit > Project Settings > XR Plug-in Management.
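When "Initialize XR on Startup" is disabled in XR Plug-in Management, initialization can also be driven from script. A minimal sketch using the XR Plug-in Management API (`UnityEngine.XR.Management`):

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.XR.Management;

public class ManualXRControl : MonoBehaviour
{
    IEnumerator Start()
    {
        // Attempt to initialize the first available XR loader
        yield return XRGeneralSettings.Instance.Manager.InitializeLoader();

        if (XRGeneralSettings.Instance.Manager.activeLoader == null)
        {
            Debug.LogError("XR initialization failed; no loader could start.");
            yield break;
        }

        // Start display/input subsystems once a loader is active
        XRGeneralSettings.Instance.Manager.StartSubsystems();
    }

    void OnDestroy()
    {
        if (XRGeneralSettings.Instance.Manager.isInitializationComplete)
        {
            XRGeneralSettings.Instance.Manager.StopSubsystems();
            XRGeneralSettings.Instance.Manager.DeinitializeLoader();
        }
    }
}
```

Manual control is useful when XR is optional in your app (e.g. a flat-screen mode with an "Enter VR" button).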
| Runtime | Target | Graphics |
|---|---|---|
| Windows Mixed Reality | Win64 | DX11 |
| Oculus PC + Link | Win64 | DX11 |
| Meta Quest | Android arm64 | Vulkan |
| Magic Leap 2 | Android x64 | Vulkan |
| SteamVR | Win64 | DX11 |
```csharp
using UnityEngine;
using UnityEngine.XR.OpenXR;
using UnityEngine.XR.OpenXR.Features.Mock; // MockRuntime
// The feature-set query below is Editor-only and additionally requires:
// using UnityEditor;                    // BuildTargetGroup
// using UnityEditor.XR.OpenXR.Features; // OpenXRFeatureSetManager

// Iterate all OpenXR features
var features = OpenXRSettings.Instance.GetFeatures();
foreach (var feature in features)
{
    Debug.Log($"Feature: {feature.name}, Enabled: {feature.enabled}");
}

// Get a specific feature by type
var mockRuntime = OpenXRSettings.Instance.GetFeature<MockRuntime>();
if (mockRuntime != null)
{
    Debug.Log($"MockRuntime Enabled: {mockRuntime.enabled}");
}

// Iterate feature groups (Editor only)
var featureSets = OpenXRFeatureSetManager.FeatureSetsForBuildTarget(
    BuildTargetGroup.Standalone
);
foreach (var featureSet in featureSets)
{
    Debug.Log($"Feature Set ID: {featureSet.featureSetId}");
}
```
Package: com.unity.xr.interaction.toolkit (v3.3.1 for Unity 6000.3)
A high-level, component-based interaction system for VR and AR. Three foundational elements:
| State | Description |
|---|---|
| Hover | Interactable is a valid target; indicates intention |
| Select | User input (button/trigger) triggers active interaction (grab) |
| Focus | Persists after selection until another interactable is selected |
| Activate | Secondary contextual action mapped to additional controls |
State changes dispatch paired events in a fixed order: exiting events first (OnSelectExiting, OnHoverExiting, OnFocusExiting), then entering events (OnSelectEntering, OnHoverEntering, OnFocusEntering). Interactors are always notified before Interactables for both processing and state changes.
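These state changes surface as Unity events on each Interactable. A sketch of hooking the select events on a grab interactable (namespaces follow the XRI 3.x layout, where interactables live under `UnityEngine.XR.Interaction.Toolkit.Interactables`; adjust if your project uses an earlier package version):

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;
using UnityEngine.XR.Interaction.Toolkit.Interactables;

public class GrabLogger : MonoBehaviour
{
    XRGrabInteractable m_Interactable;

    void OnEnable()
    {
        m_Interactable = GetComponent<XRGrabInteractable>();
        m_Interactable.selectEntered.AddListener(OnSelectEntered);
        m_Interactable.selectExited.AddListener(OnSelectExited);
    }

    void OnDisable()
    {
        m_Interactable.selectEntered.RemoveListener(OnSelectEntered);
        m_Interactable.selectExited.RemoveListener(OnSelectExited);
    }

    void OnSelectEntered(SelectEnterEventArgs args)
    {
        // args.interactorObject is the Interactor that grabbed this object
        Debug.Log($"Grabbed by {args.interactorObject.transform.name}");
    }

    void OnSelectExited(SelectExitEventArgs args)
    {
        Debug.Log("Released");
    }
}
```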
| Component | Purpose |
|---|---|
| XR Direct Interactor | Close-range (touch/grab) interaction |
| XR Ray Interactor | Ray-based distance interaction |
| XR Poke Interactor | Poking/touching interaction |
| Near-Far Interactor | Hybrid proximity/distance |
| XR Gaze Interactor | Eye-gaze based interaction |
| XR Socket Interactor | Snap-point interaction |
| Climb Teleport Interactor | Climbing locomotion |
| Component | Purpose |
|---|---|
| XR Grab Interactable | Object grabbing and manipulation |
| XR Simple Interactable | Basic interaction events |
| Climb Interactable | Climbable surfaces |
| Teleportation Anchor | Fixed teleport destination |
| Teleportation Area | Area-based teleport target |
| Teleportation Multi-Anchor Volume | Volume with multiple anchors |
Interactors and Interactables report a normalized [0.0, 1.0] analog selection strength, processed after state changes.
A Group contains multiple member Interactors sorted by priority and only allows one Interactor in the Group to interact at a time. Supports nested Groups.
See skills/unity-xr/references/xr-toolkit.md for full API details.
| Provider | Description |
|---|---|
| Teleportation Provider | Point-and-teleport movement |
| Snap Turn Provider | Rotates user by fixed angles |
| Continuous Turn Provider | Smooth rotation over time |
| Continuous Move Provider | Smooth movement over time |
| Grab Move Provider | Moves user counter to controller movement |
| Two-Handed Grab Move Provider | Dual-controller movement, rotation, scaling |
| Climb Provider | Movement while selecting Climb Interactables |
| Gravity Provider | Gravitational effects with grounded state detection |
Transformations are applied sequentially based on ascending priority; same-priority transformations apply in queue order.
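Scripted locomotion should go through a provider rather than moving the XR Origin transform directly, so the transformation ordering above still applies. A hedged sketch of queueing a teleport (the `Locomotion.Teleportation` namespace is the XRI 3.x layout; earlier versions keep these types in the root toolkit namespace):

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit.Locomotion.Teleportation;

public class ScriptedTeleport : MonoBehaviour
{
    [SerializeField] TeleportationProvider m_Provider; // assign in Inspector

    public void TeleportTo(Vector3 destination)
    {
        var request = new TeleportRequest
        {
            destinationPosition = destination,
            matchOrientation = MatchOrientation.WorldSpaceUp
        };
        // The provider applies the request through the locomotion system
        m_Provider.QueueTeleportRequest(request);
    }
}
```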
Package: com.unity.xr.arfoundation (v6.3.3 for Unity 6000.3)
AR Foundation uses a two-package architecture:
| Platform | Provider Plug-in |
|---|---|
| Android | Google ARCore XR Plug-in |
| iOS | Apple ARKit XR Plug-in |
| visionOS | Apple visionOS XR Plug-in |
| HoloLens 2 | OpenXR Plug-in |
| Meta Quest | Unity OpenXR: Meta |
| Android XR | Unity OpenXR: Android XR |
| Feature | Manager Component |
|---|---|
| Session Management | ARSession |
| Device Tracking | ARTrackedPoseDriver |
| Camera | ARCameraManager |
| Plane Detection | ARPlaneManager |
| Image Tracking | ARTrackedImageManager |
| Face Tracking | ARFaceManager |
| Body Tracking | ARHumanBodyManager |
| Object Tracking | ARTrackedObjectManager |
| Point Clouds | ARPointCloudManager |
| Ray Casting | ARRaycastManager |
| Anchors | ARAnchorManager |
| Meshing | ARMeshManager |
| Environment Probes | AREnvironmentProbeManager |
| Occlusion | AROcclusionManager |
| Bounding Box Detection | ARBoundingBoxManager |
| Participants | ARParticipantManager |
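A common pattern combining several of these managers is tap-to-place: ray cast from a screen touch against detected planes, then spawn content at the hit pose. A sketch under the assumption that the serialized references are wired up in your scene (`m_Prefab` is a placeholder for your own content; the legacy touch API is used here for brevity):

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

public class TapToPlace : MonoBehaviour
{
    [SerializeField] ARRaycastManager m_RaycastManager;
    [SerializeField] GameObject m_Prefab; // placeholder: your content prefab

    static readonly List<ARRaycastHit> s_Hits = new List<ARRaycastHit>();

    void Update()
    {
        if (Input.touchCount == 0) return;
        Touch touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Began) return;

        // Ray cast against detected planes only
        if (m_RaycastManager.Raycast(touch.position, s_Hits,
                TrackableType.PlaneWithinPolygon))
        {
            Pose hitPose = s_Hits[0].pose; // hits are sorted closest-first
            Instantiate(m_Prefab, hitPose.position, hitPose.rotation);
        }
    }
}
```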
Hand-tracking devices expose the HandTracking characteristic and the `CommonUsages.handData` feature:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;

Hand handData;
if (device.TryGetFeatureValue(CommonUsages.handData, out handData))
{
    // Get wrist bone
    Bone rootBone;
    if (handData.TryGetRootBone(out rootBone))
    {
        Vector3 wristPosition;
        rootBone.TryGetPosition(out wristPosition);
    }

    // Get finger bones (up to 21 bones per hand)
    List<Bone> fingerBones = new List<Bone>();
    if (handData.TryGetFingerBones(HandFinger.Index, fingerBones))
    {
        foreach (Bone bone in fingerBones)
        {
            Vector3 bonePosition;
            bone.TryGetPosition(out bonePosition);
        }
    }
}
```
Each Bone contains: position, orientation, parent/child references.
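For instance, reading a bone's pose and traversing to its parent and children (these `Bone` accessors are part of `UnityEngine.XR`):

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;

// Given a Bone (e.g. one returned by TryGetFingerBones above),
// read its pose and walk the bone hierarchy.
void InspectBone(Bone bone)
{
    if (bone.TryGetPosition(out Vector3 position) &&
        bone.TryGetRotation(out Quaternion rotation))
    {
        Debug.Log($"Bone at {position}, rotation {rotation}");
    }

    if (bone.TryGetParentBone(out Bone parent))
    {
        // The parent bone can be inspected the same way
    }

    var children = new List<Bone>();
    if (bone.TryGetChildBones(children))
    {
        Debug.Log($"Child bones: {children.Count}");
    }
}
```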
```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;

// Get devices by characteristics (InputDeviceRole is deprecated)
var devices = new List<InputDevice>();
InputDevices.GetDevicesWithCharacteristics(
    InputDeviceCharacteristics.Controller | InputDeviceCharacteristics.Right,
    devices);

if (devices.Count > 0)
{
    var device = devices[0];

    // Read trigger button
    bool triggerPressed;
    if (device.TryGetFeatureValue(CommonUsages.triggerButton,
        out triggerPressed) && triggerPressed)
    {
        // Handle trigger press
    }

    // Read joystick axis
    Vector2 axis;
    if (device.TryGetFeatureValue(CommonUsages.primary2DAxis, out axis))
    {
        // Use axis.x, axis.y
    }
}
```
| Usage | Type | Description |
|---|---|---|
| `primary2DAxis` | `Vector2` | Thumbstick/touchpad |
| `trigger` | `float` | Trigger axis (0-1) |
| `grip` | `float` | Grip axis (0-1) |
| `triggerButton` | `bool` | Trigger pressed |
| `gripButton` | `bool` | Grip pressed |
| `primaryButton` | `bool` | A/X button |
| `secondaryButton` | `bool` | B/Y button |
| `menuButton` | `bool` | Menu button |
| `userPresence` | `bool` | User wearing headset |
```csharp
HapticCapabilities capabilities;
if (device.TryGetHapticCapabilities(out capabilities))
{
    if (capabilities.supportsImpulse)
    {
        uint channel = 0;
        float amplitude = 0.5f; // 0-1
        float duration = 1.0f;  // seconds
        device.SendHapticImpulse(channel, amplitude, duration);
    }
}
```
```csharp
Eyes eyeData;
if (device.TryGetFeatureValue(CommonUsages.eyesData, out eyeData))
{
    // Access left/right eye positions, gaze direction, blink amounts
}
```
```csharp
// Subscribe to connection events
InputDevices.deviceConnected += OnDeviceConnected;
InputDevices.deviceDisconnected += OnDeviceDisconnected;

// Always validate before use
if (device.isValid)
{
    // Safe to read features
}
```
```csharp
var subsystems = new List<XRInputSubsystem>();
SubsystemManager.GetInstances<XRInputSubsystem>(subsystems);
if (subsystems.Count > 0)
{
    subsystems[0].TrySetTrackingOriginMode(TrackingOriginModeFlags.Floor);

    // Get play area boundary
    List<Vector3> boundaryPoints = new List<Vector3>();
    if (subsystems[0].TryGetBoundaryPoints(boundaryPoints))
    {
        // Points are floor-level, clockwise-ordered
    }
}
```
On the grabbable object:

1. Add a Rigidbody
2. Add a Collider
3. Add an XR Grab Interactable component

Configure Movement Type based on needs:

- Instantaneous: lowest latency, no physics
- Kinematic: physics-synced, moderate latency
- Velocity Tracking: full physics, potential lag

A typical hierarchy:

```
Grab Interactable (Rigidbody, XRGrabInteractable)
├── Visuals (MeshFilter, MeshRenderer)
├── Collider (BoxCollider, MeshCollider)
└── Visual Feedback (Affordance providers)
```
Use the XR Interaction Simulator component to test interactions in the Editor without physical hardware.
- `Input.GetAxis()` will not work for XR controller input; use `UnityEngine.XR.InputDevices` or the Input System instead.
- Check `InputDevice.isValid` before reading features; devices can disconnect at any time.
- Verify compute shader support with `SystemInfo.supportsComputeShaders` where required.

| Class | Namespace | Purpose |
|---|---|---|
| `XROrigin` | `Unity.XR.CoreUtils` | Camera rig and tracking space |
| `XRInteractionManager` | `UnityEngine.XR.Interaction.Toolkit` | Coordinates interactions |
| `XRBaseInteractor` | `UnityEngine.XR.Interaction.Toolkit` | Base interactor class |
| `XRBaseInteractable` | `UnityEngine.XR.Interaction.Toolkit` | Base interactable class |
| `XRGrabInteractable` | `UnityEngine.XR.Interaction.Toolkit` | Grabbable object |
| `InputDevice` | `UnityEngine.XR` | Physical XR device |
| `InputDevices` | `UnityEngine.XR` | Device discovery |
| `CommonUsages` | `UnityEngine.XR` | Standard input feature names |
| `OpenXRSettings` | `UnityEngine.XR.OpenXR` | OpenXR configuration |
| `ARSession` | `UnityEngine.XR.ARFoundation` | AR session management |
| `ARPlaneManager` | `UnityEngine.XR.ARFoundation` | Plane detection |
| `ARAnchorManager` | `UnityEngine.XR.ARFoundation` | Spatial anchors |
- unity-input -- Input System package, action maps, bindings
- unity-graphics -- Rendering pipelines, shader considerations for XR