Develop complete apps for Even Realities G2 smart glasses using Even Hub SDK: scaffold Vite+TypeScript projects and templates, build precise UIs and layouts for greyscale displays, integrate hardware like mic/IMU/battery, handle touchpad inputs and state persistence, reference CLI/SDK APIs, automate simulator testing, and package .ehpk bundles for submission.
Implement background state persistence for Even Hub G2 plugins — automatically analyzes existing plugin code, identifies state that needs to survive background/foreground transitions, and inserts setBackgroundState + onBackgroundRestore calls. Use when a plugin loses state after the phone goes to the background and returns.
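The persistence pattern this skill inserts can be sketched roughly as below. The names `setBackgroundState` and `onBackgroundRestore` come from the description above, but their exact signatures are assumptions; here they are stubbed locally so the sketch runs on its own, whereas a real plugin would call the SDK's versions.

```typescript
// Hypothetical stand-ins for the Even Hub SDK persistence hooks.
// Real signatures may differ; only the call names come from the skill
// description above.
type AppState = { screen: number; scrollOffset: number };

let saved: string | null = null;

// Stub: in a real plugin these would come from the SDK, not be local.
function setBackgroundState(snapshot: string): void {
  saved = snapshot;
}
function onBackgroundRestore(handler: (snapshot: string) => void): void {
  if (saved !== null) handler(saved);
}

let state: AppState = { screen: 2, scrollOffset: 40 };

// Before the phone backgrounds the plugin, snapshot what must survive.
setBackgroundState(JSON.stringify(state));

// Simulate losing in-memory state while backgrounded.
state = { screen: 0, scrollOffset: 0 };

// On return to foreground, rehydrate from the saved snapshot.
onBackgroundRestore((raw) => {
  state = JSON.parse(raw) as AppState;
});

console.log(state.screen, state.scrollOffset); // → 2 40
```

The key design point is serializing only the state that must survive (screen index, scroll position) rather than the whole app object, so the snapshot stays small and version-tolerant.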
Package and deploy an Even Hub G2 app — validate app.json, build, pack into .ehpk, and prepare for submission. Use when packaging, deploying, publishing, or submitting an Even Hub app.
Even Hub CLI command reference — login, init, qr, and pack commands with all options. Use when running CLI commands, generating QR codes, initializing projects, or packaging apps.
UI/UX design guidelines for Even Hub G2 smart glasses — display constraints, layout patterns, icon design, Unicode characters, and community resources. Use when designing glasses app interfaces or planning layouts.
Use G2 hardware features in Even Hub apps — microphone audio capture, IMU motion data, device info, user info, and local storage. Use when working with audio, IMU, battery, wearing detection, or persistent storage.
Pixel-accurate font measurement for Even Realities G2 glasses — predict text layout dimensions matching the LVGL rendering engine. Use when sizing text containers precisely.
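To illustrate what "predicting text layout dimensions" means, here is a deliberately simplified wrapper: it greedily wraps words into a fixed-pixel-width container using a made-up constant glyph advance. The real skill uses LVGL's per-glyph metrics for the G2 fonts; the 8px-per-character width below is a monospace assumption for demonstration only.

```typescript
// Simplified pixel-based line wrapping. The real font-measurement skill
// uses LVGL's actual per-glyph widths; CHAR_W = 8 is a made-up
// monospace assumption so the math is easy to follow.
const CHAR_W = 8; // assumed glyph advance in px

function wrapText(text: string, maxWidthPx: number): string[] {
  const maxChars = Math.floor(maxWidthPx / CHAR_W);
  const lines: string[] = [];
  let line = "";
  for (const word of text.split(/\s+/)) {
    const candidate = line ? `${line} ${word}` : word;
    if (candidate.length <= maxChars) {
      line = candidate; // word still fits on the current line
    } else {
      if (line) lines.push(line); // flush and start a new line
      line = word;
    }
  }
  if (line) lines.push(line);
  return lines;
}

// Wrap into a narrow 160px container → 20 characters per line.
const lines = wrapText("hello from the glasses display", 160);
console.log(lines); // → ["hello from the", "glasses display"]
```

The same greedy loop works with real metrics: replace `candidate.length * CHAR_W` reasoning with a summed per-glyph width and the container count stays pixel-accurate.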
Build glasses display UI for Even Hub G2 apps — text containers, lists, images, page lifecycle, and layout patterns on the 576x288 canvas. Use when creating or updating glasses display content.
Handle user input and events in Even Hub G2 apps — touchpad gestures, ring input, scroll, foreground/background lifecycle, and event routing. Use when implementing user interaction or event handling.
Scaffold a new Even Hub G2 smart glasses app from scratch with Vite, TypeScript, SDK, simulator, and CLI. Use when creating a new Even Hub project, starting a glasses app, or bootstrapping development.
Complete Even Hub SDK API reference — all methods, types, interfaces, enums, and event models for G2 smart glasses development. Use when looking up specific API signatures, parameters, return types, or type definitions.
Automate the EvenHub glasses simulator via its HTTP API. Use when testing or controlling the simulator programmatically — sending glasses input (up, down, click, double click), capturing screenshots, or reading browser console logs.
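A thin HTTP client is usually all such automation needs. The base URL and route names below (`/input`, `/screenshot`, `/logs`) are illustrative guesses, not the simulator's documented endpoints; consult the simulator-automation skill for the real API before relying on them.

```typescript
// Minimal client sketch for the simulator's HTTP API. The port and
// route paths are ASSUMPTIONS for illustration; check the
// simulator-automation skill for the actual endpoints.
class SimulatorClient {
  constructor(private baseUrl = "http://localhost:8080") {}

  url(path: string): string {
    return `${this.baseUrl}${path}`;
  }

  // Send a glasses input event: "up" | "down" | "click" | "doubleClick".
  async sendInput(action: string): Promise<void> {
    await fetch(this.url("/input"), {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ action }),
    });
  }

  // Capture the current glasses frame as raw image bytes.
  async screenshot(): Promise<ArrayBuffer> {
    const res = await fetch(this.url("/screenshot"));
    return res.arrayBuffer();
  }

  // Read recent browser console logs for debugging.
  async logs(): Promise<string[]> {
    const res = await fetch(this.url("/logs"));
    return res.json();
  }
}

const sim = new SimulatorClient();
console.log(sim.url("/input")); // → http://localhost:8080/input
```

A typical test loop is then: `sendInput("click")`, `screenshot()` to diff against a golden frame, and `logs()` to assert no runtime errors were emitted.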
Scaffold a new Even Hub G2 project by cloning one of the starter templates from [`even-realities/evenhub-templates`](https://github.com/even-realities/evenhub-templates) via `degit`. Unlike `/quickstart` (which bootstraps a blank Vite app from scratch), this skill drops the user into a template that already has the wiring they asked for — mic pipeline, image container, paginated reader, etc.
Test and debug Even Hub G2 apps using the desktop simulator — launch, configure, debug, take screenshots, and understand simulator vs hardware differences. Use when testing apps without physical glasses.
Everything EvenHub is a Claude Code skill set for Even Realities G2 smart glasses app development. It provides 12 AI-assisted skills covering the full development lifecycle — from project scaffolding to UI composition, input handling, device features, simulation testing, font measurement, and SDK/CLI reference lookups.
In Claude Code, run:
/plugin marketplace add even-realities/everything-evenhub
/plugin install everything-evenhub@everything-evenhub
The skills will be available after installation. To update later:
/plugin marketplace update everything-evenhub
After installation, try these in any Claude Code session:
# Scaffold a new G2 app from scratch (blank Vite base)
/quickstart my-weather-app
# Or scaffold from a curated starter template — pick the one closest to what you're building
/template my-reader --text-heavy
/template --asr my-transcription-app
/template --image photo-frame
/template --minimal hello-glasses
# Build and package for distribution
/build-and-deploy
# Look up SDK APIs
/sdk-reference createStartUpPageContainer
# Look up CLI commands
/cli-reference evenhub qr
# Get design guidance
/design-guidelines settings screen with 5 options
During development, use these skills to implement features:
# Build glasses display UI
/glasses-ui "show a 3-item menu with a title bar"
# Add input handling
/handle-input "single press cycles screens, double press exits"
# Use hardware features (audio, IMU, storage)
/device-features "toggle microphone recording on click"
# Measure text for pixel-accurate layouts
/font-measurement "size a text container for a long paragraph with 8px padding"
# Test with the simulator
/test-with-simulator "debug my app with glow effect"
# Automate simulator testing
/simulator-automation "take a screenshot and verify text is displayed"
| Tier | Skill | Description |
|---|---|---|
| Tier 1 — One-Click | quickstart | Scaffold a blank G2 app from scratch (Vite + TS + SDK) |
| Tier 1 — One-Click | template | Scaffold from a curated starter (minimal, asr, image, text-heavy) via degit |
| Tier 1 — One-Click | build-and-deploy | Package and publish app to Even Hub |
| Tier 2 — Core Development | glasses-ui | Build glasses display UI with containers, text, images, and lists |
| Tier 2 — Core Development | handle-input | Handle touchpad gestures, ring input, and lifecycle events |
| Tier 2 — Core Development | device-features | Use audio capture, IMU, device info, and local storage |
| Tier 2 — Core Development | test-with-simulator | Run and debug your app in the Even Hub Simulator |
| Tier 2 — Core Development | simulator-automation | Automate the simulator via its HTTP API — screenshots, input, console logs |
| Tier 2 — Core Development | font-measurement | Pixel-accurate text and list measurement matching LVGL firmware rendering |
| Tier 3 — Reference | sdk-reference | Look up Even Hub SDK APIs and types |
| Tier 3 — Reference | cli-reference | Look up Even Hub CLI commands |
| Tier 3 — Reference | design-guidelines | G2 display design constraints and best practices |
Each skill includes a harness test to verify it produces correct output when used by an AI agent. Run a test with:
/harness quickstart
See harness/README.md for details on adding tests for new skills.
MIT