LAURA-agent/reachy-mini-plugin

Automatic emotion-based movements for Reachy Mini robot during Claude Code conversations
This Claude Code plugin enables Reachy Mini to automatically perform emotion-based movements synchronized with Claude's responses, creating natural multimodal communication without requiring explicit commands.
Key Features:
- Automatic detection of movement markers in Claude's responses via Stop hooks
- Single-move gestures (`<!-- MOVE: emotion_name -->`) for precise one-off reactions
- Continuous mood loops (`<!-- MOOD: mood_category -->`) synchronized with TTS playback
- 82 emotions from the Pollen Robotics library
Prerequisites:

Reachy Mini robot with daemon running:

```bash
mjpython -m reachy_mini.daemon.app.main --fastapi-port 8100
```
Claude Code installed and configured
Python 3.8+ with the requests library:

```bash
pip install requests
```
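Before moving on, it can help to confirm the daemon is actually reachable. A minimal sketch using requests, assuming the daemon's FastAPI server is on port 8100 as started above; the `/docs` route is FastAPI's default interactive docs page, and the helper name is illustrative, not part of the plugin:

```python
import requests

def daemon_reachable(base_url="http://localhost:8100", timeout=2.0):
    """Return True if the Reachy Mini daemon answers on its FastAPI port.

    /docs is FastAPI's built-in docs page; the daemon's own API routes
    may differ, so this only checks that something is listening.
    """
    try:
        resp = requests.get(f"{base_url}/docs", timeout=timeout)
        return resp.status_code == 200
    except requests.RequestException:
        return False
```

If this returns False, restart the daemon before loading the plugin.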
Installation:

Clone this repository into your Claude Code plugins directory:
```bash
cd ~/.claude/plugins
git clone https://github.com/LAURA-agent/reachy-mini-plugin.git reachy-mini
```
Restart Claude Code to load the plugin
Verify installation:
```bash
# Plugin should appear in available plugins list
ls ~/.claude/plugins/reachy-mini
```
Usage example (single movement):

```markdown
<!-- MOVE: thoughtful1 -->
Let me analyze this code carefully...
```
Result: Reachy performs a thoughtful gesture once
Usage example (continuous mood with TTS):

```markdown
<!-- TTS: "The build passed! All tests are green and deployment is complete." -->
<!-- MOOD: celebratory -->
Build successful! All tests passing, zero errors, deployed in under 2 minutes.
```
Result: Reachy continuously performs celebratory emotions (success, proud, cheerful) until TTS finishes speaking
How It Works:

The plugin uses Stop hooks to automatically detect and process movement markers in Claude's responses:
Single move:

```markdown
<!-- MOVE: emotion_name -->
```

Continuous mood:

```markdown
<!-- MOOD: mood_category -->
```
Markers are invisible in rendered output - they only appear in the raw response text.
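The marker formats above are simple enough to extract with a regular expression. A minimal sketch of the detection step; the function name and return shape are illustrative, not the plugin's actual hook code:

```python
import re

# Patterns matching the marker formats above; emotion and mood names
# are assumed to be single word-character tokens (e.g. thoughtful1).
MOVE_RE = re.compile(r"<!--\s*MOVE:\s*(\w+)\s*-->")
MOOD_RE = re.compile(r"<!--\s*MOOD:\s*(\w+)\s*-->")

def parse_markers(raw_response: str) -> dict:
    """Collect MOVE and MOOD markers from the raw response text."""
    return {
        "moves": MOVE_RE.findall(raw_response),
        "moods": MOOD_RE.findall(raw_response),
    }
```

For example, `parse_markers('<!-- MOVE: surprised1 -->Found it!')` yields `{"moves": ["surprised1"], "moods": []}`.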
Use when: Short responses, specific emotional reactions, precise control
Command: `/reachy-mini:move`
Marker: `<!-- MOVE: emotion_name -->`
Behavior: Plays the specified emotion a single time when Claude's response completes.

Example:

```markdown
<!-- MOVE: surprised1 -->
Found it! This callback is firing 15,000 times per second.
```
Use when: Longer explanations, sustained presence, TTS-synchronized communication
Command: `/reachy-mini:mood`
Marker: `<!-- MOOD: mood_category -->`
Behavior: Loops through the mood category's emotions continuously, polling the daemon until TTS playback finishes (is_playing: false).

Example:
```markdown
<!-- TTS: "Let me explain async patterns. Promises handle future values, async await makes them readable." -->
<!-- MOOD: thoughtful -->
Let me break down async patterns step by step:
- Promises represent future values
- Async/await makes them readable
- Proper error handling prevents silent failures
```
Result: Thoughtful emotions (thoughtful1, curious1, attentive1, etc.) play continuously during the ~30-second TTS playback
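The poll-until-finished behavior can be sketched as a small loop. This is an illustration of the logic, not the plugin's actual code: `is_playing` and `play_emotion` stand in for calls to the daemon (whose real endpoints are not shown here):

```python
import time

def loop_mood(emotions, is_playing, play_emotion, poll_interval=0.5):
    """Cycle through a mood's emotions while TTS is still speaking.

    is_playing: callable returning the daemon's TTS status; the loop
    exits once it reports is_playing: false.
    play_emotion: callable that triggers one emotion on the robot.
    """
    count = 0
    while is_playing():
        # Rotate through the mood's emotion list so long speech
        # doesn't repeat a single gesture.
        play_emotion(emotions[count % len(emotions)])
        count += 1
        time.sleep(poll_interval)
    return count
```

With a celebratory mood, for example, this would alternate success, proud, and cheerful gestures until the daemon reports playback has stopped.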
82 total emotions from the Pollen Robotics library: