From godot-prompter
Sets up OpenXR for VR/AR/XR apps in Godot 4.3+: XROrigin3D, controllers, hand tracking, passthrough, input handling, Meta Quest deployment.
Install:

npx claudepluginhub jame581/godotprompter --plugin godot-prompter

This skill uses the workspace's default tool permissions.
All examples target Godot 4.3+ with no deprecated APIs. GDScript is shown first, then C#.
Related skills: 3d-essentials for 3D rendering and environment, physics-system for 3D physics interactions, input-handling for non-XR input patterns, export-pipeline for platform exports.
| Setting | Value |
|---|---|
| XR Interface | OpenXR (or OpenXR Plugin depending on version) |
| XR > OpenXR > Enabled | true |
| XR > Shaders > Enabled | true (for XR shader support) |
| Display > Window > V-Sync Mode | Disabled (the XR runtime controls frame timing) |

Main (Node3D)
├── XROrigin3D ← Player's physical space origin
│ ├── XRCamera3D ← Head-mounted display
│ ├── XRController3D (left) ← Left controller
│ │ └── LeftHandModel (MeshInstance3D or hand tracking)
│ ├── XRController3D (right) ← Right controller
│ │ └── RightHandModel
│ └── (XRBodyTracker via XRServer — optional full body tracking)
├── WorldEnvironment
└── GameWorld (Node3D)
└── ... level geometry
extends Node3D

func _ready() -> void:
    var xr_interface: XRInterface = XRServer.find_interface("OpenXR")
    if xr_interface and xr_interface.is_initialized():
        get_viewport().use_xr = true
    else:
        push_error("OpenXR not available")
public partial class XRMain : Node3D
{
    public override void _Ready()
    {
        var xrInterface = XRServer.FindInterface("OpenXR");
        if (xrInterface != null && xrInterface.IsInitialized())
            GetViewport().UseXr = true;
        else
            GD.PushError("OpenXR not available");
    }
}
XRController3D automatically tracks the physical controller's position and rotation.
extends XRController3D

func _ready() -> void:
    button_pressed.connect(_on_button_pressed)
    button_released.connect(_on_button_released)
    input_float_changed.connect(_on_float_changed)

func _on_button_pressed(button_name: String) -> void:
    match button_name:
        "trigger_click":
            shoot()
        "grip_click":
            grab()
        "ax_button":  # A or X depending on hand
            jump()
        "by_button":  # B or Y depending on hand
            open_menu()

func _on_button_released(button_name: String) -> void:
    if button_name == "grip_click":
        release()

func _on_float_changed(float_name: String, value: float) -> void:
    if float_name == "trigger":
        # Analog trigger value 0.0–1.0
        update_trigger_visual(value)
public partial class XRControllerHandler : XRController3D
{
    public override void _Ready()
    {
        ButtonPressed += OnButtonPressed;
        ButtonReleased += OnButtonReleased;
        InputFloatChanged += OnFloatChanged;
    }

    private void OnButtonPressed(string buttonName)
    {
        switch (buttonName)
        {
            case "trigger_click": Shoot(); break;
            case "grip_click": Grab(); break;
            case "ax_button": Jump(); break;      // A or X depending on hand
            case "by_button": OpenMenu(); break;  // B or Y depending on hand
        }
    }

    private void OnButtonReleased(string buttonName)
    {
        if (buttonName == "grip_click") Release();
    }

    private void OnFloatChanged(string floatName, double value)
    {
        if (floatName == "trigger") UpdateTriggerVisual((float)value);
    }
}
| Signal Name | Physical Button |
|---|---|
| trigger_click | Index finger trigger (digital) |
| trigger | Index finger trigger (analog float) |
| grip_click | Side grip |
| grip | Side grip (analog float) |
| ax_button | A (right) or X (left) |
| by_button | B (right) or Y (left) |
| primary | Thumbstick click |
| primary_x/y | Thumbstick axes (-1 to 1) |
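Controllers can also play haptic feedback through the action map. A minimal sketch, assuming the default OpenXR action map (which includes a "haptic" output action); the `_on_grab_success` method name is illustrative:

# XRController3D inherits trigger_haptic_pulse() from XRNode3D.
# Passing frequency 0.0 lets the runtime choose a default frequency.
extends XRController3D

func _on_grab_success() -> void:
    # trigger_haptic_pulse(action_name, frequency_hz, amplitude [0..1],
    #                      duration_sec, delay_sec)
    trigger_haptic_pulse("haptic", 0.0, 0.5, 0.1, 0.0)

Keep pulses short (under ~0.2 s) for button-style feedback; longer pulses read as rumble.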
extends XROrigin3D

@export var move_speed: float = 2.0
@export var turn_speed: float = 2.0
@export var snap_turn_degrees: float = 30.0

@onready var left_controller: XRController3D = $LeftController
@onready var camera: XRCamera3D = $XRCamera3D

func _physics_process(delta: float) -> void:
    # Smooth locomotion from left thumbstick
    var input := Vector2(
        left_controller.get_float("primary_x"),
        left_controller.get_float("primary_y")
    )
    if input.length() > 0.1:
        # Move in camera's forward direction (ignore vertical)
        var forward := -camera.global_basis.z
        forward.y = 0.0
        forward = forward.normalized()
        var right := camera.global_basis.x
        right.y = 0.0
        right = right.normalized()
        var movement: Vector3 = (forward * input.y + right * input.x) * move_speed * delta
        global_position += movement
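The locomotion script exports snap_turn_degrees but never uses it. A minimal snap-turn sketch, assuming a $RightController node (hypothetical name) and rotation around the origin's Y axis:

extends XROrigin3D

@export var snap_turn_degrees: float = 30.0
@export var snap_deadzone: float = 0.8

@onready var right_controller: XRController3D = $RightController

var _turn_ready: bool = true  # require the stick to recenter between turns

func _physics_process(_delta: float) -> void:
    var x := right_controller.get_float("primary_x")
    if absf(x) < 0.2:
        _turn_ready = true
    elif _turn_ready and absf(x) > snap_deadzone:
        _turn_ready = false
        # Rotate the whole play space in discrete steps
        rotate_y(deg_to_rad(-signf(x) * snap_turn_degrees))

For comfort, many projects rotate around the camera's current position rather than the origin node; Godot XR Tools ships a ready-made version of this.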
OpenXR hand tracking provides skeletal hand data without controllers.
extends XROrigin3D

@onready var left_hand: XRController3D = $LeftController
@onready var right_hand: XRController3D = $RightController

func _process(_delta: float) -> void:
    # Check if OpenXR is available
    var xr_interface: XRInterface = XRServer.find_interface("OpenXR")
    if not xr_interface:
        return
    # Hand tracking data is accessed through the XRHandTracker, registered
    # as "/user/hand_tracker/left" and "/user/hand_tracker/right"
    # (note: "left_hand" is the controller tracker, not the hand tracker)
    var left_tracker: XRHandTracker = XRServer.get_tracker("/user/hand_tracker/left") as XRHandTracker
    if left_tracker:
        # Check for pinch gesture
        if left_tracker.get_hand_joint_flags(XRHandTracker.HAND_JOINT_INDEX_TIP) & XRHandTracker.HAND_JOINT_FLAG_POSITION_TRACKED:
            var thumb_tip: Vector3 = left_tracker.get_hand_joint_transform(XRHandTracker.HAND_JOINT_THUMB_TIP).origin
            var index_tip: Vector3 = left_tracker.get_hand_joint_transform(XRHandTracker.HAND_JOINT_INDEX_TIP).origin
            var pinch_distance: float = thumb_tip.distance_to(index_tip)
            if pinch_distance < 0.02:  # 2cm threshold
                _on_pinch_detected()
public partial class XRHandTrackingOrigin : XROrigin3D
{
    public override void _Process(double delta)
    {
        var xrInterface = XRServer.FindInterface("OpenXR");
        if (xrInterface == null)
            return;
        // Hand trackers are registered as "/user/hand_tracker/left" and
        // "/user/hand_tracker/right"; "left_hand" is the controller tracker.
        var leftTracker = XRServer.GetTracker("/user/hand_tracker/left") as XRHandTracker;
        if (leftTracker != null)
        {
            var flags = leftTracker.GetHandJointFlags(XRHandTracker.HandJoint.IndexTip);
            if (flags.HasFlag(XRHandTracker.HandJointFlags.PositionTracked))
            {
                Vector3 thumbTip = leftTracker.GetHandJointTransform(XRHandTracker.HandJoint.ThumbTip).Origin;
                Vector3 indexTip = leftTracker.GetHandJointTransform(XRHandTracker.HandJoint.IndexTip).Origin;
                float pinchDistance = thumbTip.DistanceTo(indexTip);
                if (pinchDistance < 0.02f) // 2cm threshold
                    OnPinchDetected();
            }
        }
    }
}
Note: Hand tracking availability depends on the XR headset. Meta Quest, Apple Vision Pro, and some SteamVR setups support it. Always fall back to controller input.
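One way to implement the fallback is to toggle between hand and controller visuals based on whether the hand tracker currently has data. A sketch, assuming XRHandTracker.has_tracking_data (Godot 4.3+) and hypothetical LeftHandModel/ControllerModel child nodes:

extends XROrigin3D

@onready var left_hand_model: Node3D = $LeftController/LeftHandModel
@onready var left_controller_model: Node3D = $LeftController/ControllerModel

func _process(_delta: float) -> void:
    var tracker := XRServer.get_tracker("/user/hand_tracker/left") as XRHandTracker
    # has_tracking_data is true only while the runtime is delivering joints
    var hands_active: bool = tracker != null and tracker.has_tracking_data
    left_hand_model.visible = hands_active
    left_controller_model.visible = not hands_active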
extends XRController3D

var _held_object: RigidBody3D = null
var _grab_joint: Generic6DOFJoint3D = null

@onready var grab_area: Area3D = $GrabArea  # small Area3D at controller position
# Joint3D endpoints must be PhysicsBody3D nodes, and XRController3D is not one.
# This assumes a kinematic AnimatableBody3D child named GrabBody to anchor the joint.
@onready var grab_body: AnimatableBody3D = $GrabBody

func _ready() -> void:
    button_pressed.connect(_on_button_pressed)
    button_released.connect(_on_button_released)

func _on_button_pressed(button_name: String) -> void:
    if button_name == "grip_click" and _held_object == null:
        _try_grab()

func _on_button_released(button_name: String) -> void:
    if button_name == "grip_click" and _held_object != null:
        _release()

func _try_grab() -> void:
    var bodies: Array[Node3D] = grab_area.get_overlapping_bodies()
    for body in bodies:
        if body is RigidBody3D:
            _held_object = body
            # Create a joint to attach the object to the controller's body
            _grab_joint = Generic6DOFJoint3D.new()
            add_child(_grab_joint)
            _grab_joint.node_a = grab_body.get_path()
            _grab_joint.node_b = _held_object.get_path()
            break

func _release() -> void:
    if _grab_joint:
        _grab_joint.queue_free()
        _grab_joint = null
    if _held_object:
        # Apply controller velocity to the thrown object
        var pose := get_pose()
        if pose:
            _held_object.linear_velocity = pose.linear_velocity
            _held_object.angular_velocity = pose.angular_velocity
        _held_object = null
public partial class XRGrabController : XRController3D
{
    private RigidBody3D _heldObject = null;
    private Generic6DOFJoint3D _grabJoint = null;

    [Export] public Area3D GrabArea { get; set; }
    // Joint3D endpoints must be PhysicsBody3D nodes, and XRController3D is not
    // one. Assumes a kinematic AnimatableBody3D child to anchor the joint to.
    [Export] public AnimatableBody3D GrabBody { get; set; }

    public override void _Ready()
    {
        ButtonPressed += OnButtonPressed;
        ButtonReleased += OnButtonReleased;
    }

    private void OnButtonPressed(string buttonName)
    {
        if (buttonName == "grip_click" && _heldObject == null)
            TryGrab();
    }

    private void OnButtonReleased(string buttonName)
    {
        if (buttonName == "grip_click" && _heldObject != null)
            Release();
    }

    private void TryGrab()
    {
        foreach (var body in GrabArea.GetOverlappingBodies())
        {
            if (body is RigidBody3D rigidBody)
            {
                _heldObject = rigidBody;
                _grabJoint = new Generic6DOFJoint3D();
                AddChild(_grabJoint);
                _grabJoint.NodeA = GrabBody.GetPath();
                _grabJoint.NodeB = _heldObject.GetPath();
                break;
            }
        }
    }

    private void Release()
    {
        if (_grabJoint != null)
        {
            _grabJoint.QueueFree();
            _grabJoint = null;
        }
        if (_heldObject != null)
        {
            // Apply controller velocity to the thrown object
            var pose = GetPose();
            if (pose != null)
            {
                _heldObject.LinearVelocity = pose.LinearVelocity;
                _heldObject.AngularVelocity = pose.AngularVelocity;
            }
            _heldObject = null;
        }
    }
}
Standard Godot Control nodes don't work directly in 3D XR. Use a SubViewport rendered onto a MeshInstance3D:
UIPanel (StaticBody3D)
├── MeshInstance3D (QuadMesh, material with SubViewport texture)
├── CollisionShape3D (for ray interaction)
└── SubViewport (size = 1024x768)
└── Control (your UI scene)
extends XRController3D

@onready var ray: RayCast3D = $RayCast3D  # pointing forward from controller

func _physics_process(_delta: float) -> void:
    if ray.is_colliding():
        var collider: Object = ray.get_collider()
        if collider.has_method("xr_hover"):
            collider.xr_hover(ray.get_collision_point())
        if get_float("trigger") > 0.8:
            if collider.has_method("xr_click"):
                collider.xr_click(ray.get_collision_point())
public partial class XRPointerController : XRController3D
{
    [Export] public RayCast3D Ray { get; set; }

    public override void _PhysicsProcess(double delta)
    {
        if (Ray.IsColliding())
        {
            var collider = Ray.GetCollider();
            if (collider is GodotObject obj)
            {
                if (obj.HasMethod("xr_hover"))
                    obj.Call("xr_hover", Ray.GetCollisionPoint());
                if (GetFloat("trigger") > 0.8f)
                {
                    if (obj.HasMethod("xr_click"))
                        obj.Call("xr_click", Ray.GetCollisionPoint());
                }
            }
        }
    }
}
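On the UI panel side, the ray hit has to be converted into SubViewport coordinates and pushed as an input event for Control nodes to react. A sketch, assuming a 1.0 x 0.75 m QuadMesh mapped to a 1024x768 SubViewport; the node names and sizes are illustrative:

# Receives the xr_hover duck-typed call from the pointer controller above.
extends StaticBody3D

const QUAD_SIZE := Vector2(1.0, 0.75)  # must match the QuadMesh dimensions

@onready var viewport: SubViewport = $SubViewport

func xr_hover(world_point: Vector3) -> void:
    # Convert the world-space hit into the quad's local space (quad faces +Z,
    # local origin at its center), then into 0..1 UV space.
    var local: Vector3 = to_local(world_point)
    var uv := Vector2(
        local.x / QUAD_SIZE.x + 0.5,
        0.5 - local.y / QUAD_SIZE.y
    )
    var event := InputEventMouseMotion.new()
    event.position = uv * Vector2(viewport.size)
    viewport.push_input(event)

The same pattern works for clicks: build an InputEventMouseButton at the computed position and push it when the trigger fires.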
Tip: Use the community addon Godot XR Tools for production-ready interaction systems, locomotion, and UI helpers.
Passthrough blends the real world with virtual content (AR/MR).
func enable_passthrough() -> void:
    var xr_interface: XRInterface = XRServer.find_interface("OpenXR")
    if xr_interface:
        # Request passthrough blend mode
        xr_interface.environment_blend_mode = XRInterface.XR_ENV_BLEND_MODE_ALPHA_BLEND
        # Make the background transparent
        get_viewport().transparent_bg = true
        RenderingServer.set_default_clear_color(Color(0, 0, 0, 0))
private void EnablePassthrough()
{
    var xrInterface = XRServer.FindInterface("OpenXR");
    if (xrInterface != null)
    {
        // Request passthrough blend mode
        xrInterface.EnvironmentBlendMode = XRInterface.EnvironmentBlendModeEnum.AlphaBlend;
        // Make the background transparent
        GetViewport().TransparentBg = true;
        RenderingServer.SetDefaultClearColor(new Color(0, 0, 0, 0));
    }
}
Passthrough support: Meta Quest 3/Pro, Apple Vision Pro, Varjo XR-4. Not all headsets support it.
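Because support varies, it is safer to query the runtime before switching blend modes. A sketch using XRInterface.get_supported_environment_blend_modes():

func enable_passthrough_safely() -> void:
    var xr_interface: XRInterface = XRServer.find_interface("OpenXR")
    if xr_interface == null:
        return
    # Only request alpha blending if the runtime actually supports it
    var modes: Array = xr_interface.get_supported_environment_blend_modes()
    if XRInterface.XR_ENV_BLEND_MODE_ALPHA_BLEND in modes:
        xr_interface.environment_blend_mode = XRInterface.XR_ENV_BLEND_MODE_ALPHA_BLEND
        get_viewport().transparent_bg = true
    else:
        push_warning("Passthrough not supported on this runtime")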
Export setup:

- Plugin: Godot OpenXR Vendors (or install from AssetLib)
- XR Mode: OpenXR
- Hand Tracking: Optional or Required
- Passthrough: Optional (if needed)

| Setting | Recommended Value |
|---|---|
| Renderer | Mobile |
| MSAA | 2x or 4x (VR needs antialiasing) |
| Texture Compression | ETC2/ASTC |
| Target FPS | 72 (Quest 2) or 90 (Quest 3) |
Critical: VR must maintain consistent frame rate. Dropped frames cause nausea. Profile aggressively and keep draw calls low.
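Part of hitting that frame budget is running at the refresh rate the headset actually offers. A sketch using OpenXRInterface's display refresh rate API, which requires the runtime to expose the refresh-rate extension (Meta headsets do):

func apply_best_refresh_rate() -> void:
    var xr_interface := XRServer.find_interface("OpenXR") as OpenXRInterface
    if xr_interface == null:
        return
    # Empty if the runtime doesn't support the refresh rate extension
    var rates: Array = xr_interface.get_available_display_refresh_rates()
    if rates.size() > 0:
        rates.sort()
        # Pick the highest rate the headset offers (e.g. 90 on Quest 3)
        xr_interface.display_refresh_rate = rates[-1]

Higher refresh rates reduce perceived latency but shrink the per-frame GPU budget, so only pick the maximum if you can sustain it.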
On Windows, Godot 4.5 adds a Direct3D 12 rendering backend for OpenXR. Previously, Windows XR builds required Vulkan. D3D12 is relevant for Quest Link and SteamVR on machines with good D3D12 driver support.
Enable in Project Settings:
- Rendering Device > Driver: D3D12 (Windows only)

When to use D3D12: If your target audience runs Quest Link or SteamVR on Windows and encounters Vulkan driver issues, D3D12 can be more stable. Vulkan remains the default and is typically preferred for performance.
Standalone VR headsets (Meta Quest, Pico) running the Mobile Vulkan renderer now support foveated rendering via the VK_EXT_fragment_density_map extension. The peripheral view is rendered at lower resolution while the foveal region stays sharp, significantly reducing GPU load.
Enable via the OpenXR Vendors plugin:
- Set the foveation level (LOW, MEDIUM, HIGH).
- The VK_EXT_fragment_density_map device extension is requested automatically by the plugin at runtime — no Vulkan code needed.

Note: Foveated rendering requires the Mobile Vulkan renderer. It has no effect on the Forward+ renderer or on PC VR headsets. Test GPU utilization with and without it on device using Meta's OVR Metrics Tool or Pico's equivalents.
Application SpaceWarp is a frame-synthesis technique supported on Meta Quest and Pico headsets. The GPU renders every other frame at half rate; the SpaceWarp runtime synthesizes the missing frames using motion vectors. This halves the rendering budget while maintaining perceived smoothness.
Enable via the OpenXR Vendors plugin:
Caution: SpaceWarp can introduce ghosting artifacts on fast-moving objects. Disable per-object if you see visual glitches. Test thoroughly on device.
The OpenXR Render Models extension lets the platform supply animated, branded controller meshes at runtime. You no longer need to bundle Quest Touch or Pico controller meshes in your project.
Enable:
- Add OpenXRRenderModel nodes as children of each XRController3D — the plugin populates them with the platform-native mesh automatically.

# The render model node auto-populates; no manual mesh assignment needed.
# You can show/hide it to toggle between your own model and the platform model:
@onready var render_model: Node3D = $XRController3D/OpenXRRenderModel

func _ready() -> void:
    render_model.visible = true  # Use platform-native controller model
// The render model node auto-populates; no manual mesh assignment needed.
[Export] public Node3D RenderModel { get; set; }

public override void _Ready()
{
    RenderModel.Visible = true; // Use platform-native controller model
}
Availability: Render Models require the platform runtime to support a render model extension (such as XR_FB_render_model or equivalent). Confirmed on Meta Quest and Pico runtimes. Fall back to bundled meshes when the node has no mesh loaded.
Godot 4.5 adds native visionOS (Apple Vision Pro) export via the unified "Apple Embedded" platform driver. Windowed visionOS apps (running in the Shared Space) can be exported directly without additional tooling beyond Xcode.
Setup:
- Select visionOS in the preset options.
- Enable the ARKit + Passthrough entitlements.

Note: visionOS Full Space XR (immersive mode) requires Apple's com.apple.developer.arkit entitlement and is distinct from windowed Shared Space mode. The OpenXR path on visionOS mirrors the passthrough workflow in Section 6.
Godot 4.6 ships with native OpenXR 1.1 runtime support. Devices and runtimes that implement OpenXR 1.1 automatically unlock 1.1 features (improved compositor layers, updated interaction profiles, etc.) without any project-level change. No API change is required — the engine negotiates the spec version with the runtime at startup.
Note: OpenXR 1.1 was introduced in Godot 4.6 (beta as of this writing). API behaviour may evolve before the stable release — see https://godotengine.org/article/dev-snapshot-godot-4-6-beta-1/ for current details.
Godot 4.6 stabilises the XR Spatial Entities extension, enabling:

- Persistent spatial anchors (XRSpatialAnchor)
- Plane tracking
- Marker tracking

Basic spatial anchor usage:
# Requires: OpenXR Spatial Entities extension enabled in the vendor plugin.
# XRSpatialAnchor is a Node3D placed in your scene that the runtime keeps
# locked to a real-world location.
extends Node3D

@export var anchor_scene: PackedScene  # Scene containing XRSpatialAnchor

func place_anchor_at(world_position: Vector3) -> void:
    var anchor: XRSpatialAnchor = XRSpatialAnchor.new()
    anchor.position = world_position
    add_child(anchor)
    # The XR runtime takes over tracking once the node is added to the scene tree.
    # Persist the anchor UUID to restore it on next launch (platform-specific API).
// Requires: OpenXR Spatial Entities extension enabled in the vendor plugin
public partial class SpatialAnchorManager : Node3D
{
    public void PlaceAnchorAt(Vector3 worldPosition)
    {
        var anchor = new XRSpatialAnchor();
        anchor.Position = worldPosition;
        AddChild(anchor);
        // The XR runtime takes over tracking once added to the scene tree.
        // Persist anchor UUID via platform-specific API for cross-session recall.
    }
}
Note: XRSpatialAnchor, plane tracking, and marker tracking APIs were introduced in Godot 4.6 (beta as of this writing). The full API surface — especially persistence, query callbacks, and plane/marker node types — may change before the stable release. See https://godotengine.org/article/dev-snapshot-godot-4-6-beta-1/ for current signatures and the Godot OpenXR Vendors plugin for platform-specific spatial entity setup.
| Symptom | Cause | Fix |
|---|---|---|
| Black screen in headset | use_xr = true not set on viewport | Set in _ready() after checking XR interface |
| Controller input not firing | Wrong signal name for the platform | Check OpenXR action map bindings in Project Settings |
| Objects scale wrong in VR | Scene not built at real-world scale | Use 1 unit = 1 meter throughout the scene |
| Motion sickness from locomotion | Smooth rotation | Use snap turning (30° increments) or add a vignette during movement |
| UI unreadable in VR | Panel too far away or too small | Place UI at 1–2m distance, use SubViewport at 1024+ resolution |
| Hand tracking jittery | Raw joint data used directly | Apply smoothing (lerp toward new position each frame) |
| Export fails on Quest | Missing Android build template or wrong architecture | Install Android Build Template; enable arm64; set API level 29+ |
- XROrigin3D → XRCamera3D + XRController3D hierarchy
- get_viewport().use_xr = true after interface check
- Controller input via signals (trigger_click, grip_click, etc.)
- XRSpatialAnchor and vendor plugin spatial entities extension (Godot 4.6+)