AI-powered development tools configuration and usage
Provides AI-powered coding assistance through aichat, aider, LocalAI, and AGiXT platforms.
npx claudepluginhub flexnetos/ripple-env

This skill inherits all available tools. When active, it can use any tool Claude has access to.
This environment includes AI-powered development tools to enhance your ROS2 development workflow.
There are two ways to access AI tools in this environment:
| Method | Commands Available | Requires |
|---|---|---|
| Devshell | ai, pair | nom develop or direnv allow |
| Home-Manager | All aliases (ai-code, pair-voice, etc.) | Module enablement |
Note: The devshell commands are always available when you enter the development environment. The extended aliases require home-manager module configuration.
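To see which access method is active, you can probe for the commands from the table above (a throwaway sketch, not part of the environment):

```shell
# Report which AI commands are currently on PATH; "missing" means you have
# not entered the devshell or enabled the home-manager modules yet
for cmd in ai pair ai-code pair-voice; do
  if command -v "$cmd" >/dev/null 2>&1; then
    echo "$cmd: available"
  else
    echo "$cmd: missing"
  fi
done
```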
aichat is the default AI assistant - a tiny, provider-agnostic CLI that works with multiple AI providers.
| Provider | Model Examples | API Key Env Var |
|---|---|---|
| Anthropic | claude-3-opus, claude-3-sonnet | ANTHROPIC_API_KEY |
| OpenAI | gpt-4, gpt-4-turbo, gpt-3.5-turbo | OPENAI_API_KEY |
| Google | gemini-pro, gemini-1.5-pro | GOOGLE_API_KEY |
| Ollama | llama2, codellama, mistral | (local, no key) |
| Azure OpenAI | gpt-4, gpt-35-turbo | AZURE_OPENAI_API_KEY |
# Set your API key (choose your provider)
export ANTHROPIC_API_KEY="your-key-here"
# or
export OPENAI_API_KEY="your-key-here"
# Basic usage
ai "explain what ROS2 topics are"
# Code assistance
ai-code "write a ROS2 publisher node in Python"
# Code review
ai-review "review this launch file for best practices"
# Explain code
cat src/my_node.py | ai-explain
| Alias | Command | Purpose |
|---|---|---|
| ai | aichat | General AI chat |
| ai-code | aichat --role coder | Code generation |
| ai-explain | aichat --role explain | Code explanation |
| ai-review | aichat --role reviewer | Code review |
aichat stores configuration in ~/.config/aichat/config.yaml:
# Example configuration
model: claude # Short model name (aichat resolves to latest)
save: true
highlight: true
temperature: 0.7
# Custom roles
roles:
- name: ros2-expert
prompt: |
You are a ROS2 expert. Help with:
- Node development (Python/C++)
- Launch files
- Message/Service definitions
- Best practices for robotics
Note: aichat uses short model names (e.g., claude, gpt-4, gemini-pro) and automatically resolves to the latest available version.
For offline/private AI assistance:
# Install Ollama (if not already)
curl -fsSL https://ollama.com/install.sh | sh
# Pull a coding model
ollama pull codellama
# Use with aichat
aichat --model ollama:codellama "write a ROS2 subscriber"
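Before pointing aichat at Ollama, it helps to confirm the daemon is actually up; Ollama's HTTP API listens on port 11434 by default:

```shell
# Probe the default Ollama endpoint; /api/tags lists locally pulled models.
# The guard keeps this a clean no-op when Ollama is not running.
if curl -s --max-time 2 http://localhost:11434/api/tags >/dev/null; then
  echo "ollama reachable; models:"
  ollama list
else
  echo "ollama not running (start it with: ollama serve)"
fi
```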
# Explain a ROS2 concept
ai "explain ROS2 QoS profiles"
# Generate a launch file
ai-code "create a launch file that starts a camera node and image processor"
# Debug an error
ai "why am I getting 'could not find package' in colcon build"
# Review code
cat src/robot_controller/robot_controller/controller.py | ai-review
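aichat can also keep context across invocations via named sessions; a sketch (guarded so it is a no-op where aichat is not installed, and the session name is arbitrary):

```shell
# Each -s invocation with the same name appends to one conversation history
if command -v aichat >/dev/null 2>&1; then
  aichat -s ros2-help "what is a ROS2 lifecycle node?"
  aichat -s ros2-help "show a minimal Python skeleton for one"
else
  echo "aichat not on PATH; enter the devshell first"
fi
```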
Tips:
- Pipe code in for context: cat file.py | ai "explain this"
- Use ai-code for generation, ai-review for feedback
- Use aichat -s session-name to continue conversations

Aider is a Git-integrated AI pair programmer that edits code in your repo with automatic commits.
# Start aider in current directory
pair
# Work on specific files
pair src/my_package/my_node.py
# Voice-to-code mode (requires portaudio)
pair-voice
# Watch mode - auto-commit on file changes
pair-watch
# Use specific model
aider --model claude-3-sonnet-20240229
aider --model gpt-4-turbo
| Feature | Description |
|---|---|
| Git Integration | Auto-commits changes with descriptive messages |
| Repo Mapping | Understands your entire codebase structure |
| Voice Mode | Speak your coding requests |
| Watch Mode | Monitors files and auto-commits changes |
| 100+ Languages | Python, C++, Rust, TypeScript, etc. |
| Alias | Command | Purpose |
|---|---|---|
| pair | aider | Start AI pair programming |
| pair-voice | aider --voice | Voice-to-code mode |
| pair-watch | aider --watch | Auto-commit on changes |
| pair-claude | aider --model claude-3-sonnet-20240229 | Use Claude |
| pair-gpt4 | aider --model gpt-4-turbo | Use GPT-4 |
# Edit a ROS2 node
pair src/my_robot/my_robot/controller.py
> "Add a service server that accepts velocity commands"
# Modify launch files
pair src/my_robot/launch/robot.launch.py
> "Add a parameter for robot_name"
# Update CMakeLists.txt
pair src/my_robot/CMakeLists.txt
> "Add the new action interface dependency"
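Aider can also run non-interactively, which is handy in scripts; a sketch using its --message flag (guarded so it degrades cleanly when aider is not on PATH; the file path and request are illustrative):

```shell
# --message sends a single request, applies the edit, commits, and exits
if command -v aider >/dev/null 2>&1; then
  aider --message "Add a docstring to main()" \
    src/my_robot/my_robot/controller.py
else
  echo "aider not found: enter the devshell or enable programs.aider"
fi
```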
Create ~/.aider.conf.yml:
# Default model
model: claude-3-sonnet-20240229
# Auto-commit settings
auto-commits: true
auto-lint: true
# Editor integration
edit-format: diff
# Voice settings (if using --voice)
voice-language: en
# Dark mode for terminal
dark-mode: true
# API keys (set in .envrc or shell profile)
export ANTHROPIC_API_KEY="sk-ant-..." # For Claude
export OPENAI_API_KEY="sk-..." # For GPT-4
export DEEPSEEK_API_KEY="..." # For DeepSeek
Voice-to-code requires:
- portaudio (included in devshell)

Add to your shell profile or .envrc:
# Choose one provider
export ANTHROPIC_API_KEY="sk-ant-..."
export OPENAI_API_KEY="sk-..."
export GOOGLE_API_KEY="..."
# Optional: Set default model
export AICHAT_MODEL="claude-3-sonnet-20240229"
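Instead of hard-coding AICHAT_MODEL, an .envrc snippet can derive it from whichever key is present (an illustrative convention, not something the environment sets up for you):

```shell
# Prefer Claude, then GPT-4, falling back to a local Ollama model when
# no hosted API key is configured
if [ -n "${ANTHROPIC_API_KEY:-}" ]; then
  export AICHAT_MODEL="claude-3-sonnet-20240229"
elif [ -n "${OPENAI_API_KEY:-}" ]; then
  export AICHAT_MODEL="gpt-4-turbo"
else
  export AICHAT_MODEL="ollama:codellama"
fi
echo "AICHAT_MODEL=$AICHAT_MODEL"
```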
If using home-manager, enable the AI modules to get all aliases:
{
# Enable aichat with aliases (ai, ai-code, ai-explain, ai-review)
programs.aichat = {
enable = true;
settings = {
model = "claude"; # Short model name
save = true;
highlight = true;
};
};
# Enable aider with aliases (pair, pair-voice, pair-watch, pair-claude, pair-gpt4)
programs.aider = {
enable = true;
settings = {
model = "claude-3-sonnet-20240229";
auto-commits = true;
dark-mode = true;
};
};
}
Module Locations:
- modules/common/ai/aichat.nix - aichat configuration
- modules/common/ai/aider.nix - aider configuration
- modules/common/ai/default.nix - AI module aggregator

LocalAI provides an OpenAI-compatible API server for running LLMs locally. It's the recommended inference backend for this environment.
# Start LocalAI server
localai start
# Check status
localai status
# List available models
localai models
# Stop server
localai stop
| Feature | Description |
|---|---|
| OpenAI API | Drop-in replacement for OpenAI API |
| P2P Federation | Distributed inference across multiple machines |
| Model Formats | GGUF, GGML, Safetensors, HuggingFace |
| GPU Support | CUDA, ROCm, Metal acceleration |
| No Internet | Fully offline capable |
LocalAI uses the models directory at ~/.local/share/localai/models.
# Set custom models path
export LOCALAI_MODELS_PATH="/path/to/models"
# Download a model (example)
curl -L "https://huggingface.co/TheBloke/Mistral-7B-v0.1-GGUF/resolve/main/mistral-7b-v0.1.Q4_K_M.gguf" \
-o ~/.local/share/localai/models/mistral-7b.gguf
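LocalAI can load a GGUF file directly, but a model definition YAML next to it gives clients a stable model name. A minimal sketch (field names follow LocalAI's model config format; the backend and context size here are assumptions you may need to adjust):

```yaml
# ~/.local/share/localai/models/mistral-7b.yaml
name: mistral-7b          # the model name clients pass in API requests
backend: llama-cpp        # inference backend for GGUF files
parameters:
  model: mistral-7b.gguf  # the file downloaded above, relative to the models dir
context_size: 4096
```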
# Use LocalAI with aichat
export OPENAI_API_BASE="http://localhost:8080/v1"
aichat --model local-model "Hello"
# Use LocalAI with aider
OPENAI_API_BASE=http://localhost:8080/v1 aider
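Because the API is OpenAI-compatible, anything that speaks the /v1/chat/completions shape works; a direct curl sketch (the model name assumes a model you have installed, and the fallback message makes the snippet degrade cleanly when the server is down):

```shell
# Standard OpenAI-style chat payload, sent to the local server
payload='{"model": "mistral-7b", "messages": [{"role": "user", "content": "Say hello"}]}'
curl -s --max-time 5 http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d "$payload" \
  || echo "LocalAI not reachable on :8080 (run: localai start)"
```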
| Port | Service |
|---|---|
| 8080 | LocalAI API |
Documentation: See docs/adr/adr-006-agixt-integration.md for architecture decisions.
AGiXT is a powerful AI Agent Automation Platform that enables building and orchestrating complex AI workflows.
# Ensure LocalAI is running first
localai start
# Start AGiXT services
agixt up
# Check service status
agixt status
# View logs
agixt logs
# Stop services
agixt down
┌─────────────────────────────────────────────────────────────┐
│ AGiXT Stack │
├─────────────┬─────────────┬─────────────┬───────────────────┤
│ AGiXT API │ AGiXT UI │ PostgreSQL │ MinIO │
│ :7437 │ :3437 │ :5432 │ :9000/:9001 │
└──────┬──────┴──────┬──────┴──────┬──────┴─────────┬─────────┘
│ │ │ │
└─────────────┴─────────────┴────────────────┘
│
┌───────┴───────┐
│ LocalAI │
│ :8080 │
└───────────────┘
| Port | Service |
|---|---|
| 7437 | AGiXT API |
| 3437 | AGiXT UI |
| 5432 | PostgreSQL |
| 9000 | MinIO API |
| 9001 | MinIO Console |
| 8080 | LocalAI (on host) |
# .env.agixt or exported
export AGIXT_URL="http://localhost:7437"
export AGIXT_API_KEY="agixt-dev-key"
export LOCALAI_URL="http://localhost:8080"
# Full command reference
agixt up # Start all services
agixt down # Stop all services
agixt logs # Follow logs
agixt status # Show container status
agixt shell # Shell into AGiXT container
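A quick way to see which parts of the stack are actually listening is a throwaway probe over the ports from the table above:

```shell
# Probe each service port; "down" means the container (or LocalAI) is not up
for svc in "AGiXT-API:7437" "AGiXT-UI:3437" "LocalAI:8080"; do
  name="${svc%%:*}"; port="${svc##*:}"
  if curl -s --max-time 2 "http://localhost:${port}/" >/dev/null; then
    echo "${name} (:${port}) up"
  else
    echo "${name} (:${port}) down"
  fi
done
```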
The AGiXT Rust SDK bridge (rust/agixt-bridge/) enables ROS2 nodes to communicate with AGiXT:
# Build the bridge
cd rust/agixt-bridge
cargo build
# Run example
cargo run --example basic_chat
Key files:
- rust/agixt-bridge/ - Rust SDK integration
- docker-compose.agixt.yml - Docker Compose configuration
- .env.agixt.example - Environment template
- docs/adr/adr-006-agixt-integration.md - Architecture decision record