Create and scaffold DAPR AI agents with the dapr-agents framework
Scaffolds DAPR AI agents with LLM integration, tools, and durable workflows.
Install the plugin from the marketplace:

```
/plugin marketplace add Sahib-Sawhney-WH/sahibs-claude-plugin-marketplace
/plugin install dapr@sahib-claude-marketplace
```

Create intelligent, durable AI agents powered by LLMs using the DAPR Agents framework.
When the user runs /dapr:agent:

1. Determine Agent Type
2. Generate Agent Code
3. Create Supporting Files
| Argument | Description |
|---|---|
| `assistant <name>` | Create a basic AssistantAgent |
| `durable <name>` | Create a workflow-backed DurableAgent |
| `service <name>` | Create a headless AgentService |
| `multi <name>` | Create a multi-agent system |
| `--tools` | Comma-separated list of tools to include |
| `--memory` | Memory type: `short-term`, `long-term`, `vector` |
| `--llm` | LLM provider: `openai`, `azure`, `anthropic`, `ollama` |
```
/dapr:agent assistant weather-bot
```

Creates a conversational agent with LLM integration and tool calling.
```
/dapr:agent durable order-processor --tools "inventory,payment,shipping"
```

Creates a fault-tolerant agent with inventory, payment, and shipping tools orchestrated by a durable workflow.
```
/dapr:agent service research-agent --memory vector
```

Creates a REST API agent with vector memory for retrieval-augmented generation.
```
/dapr:agent multi customer-support --agents "triage,technical,billing"
```

Creates coordinated triage, technical, and billing agents that communicate over pub/sub.
```
weather-bot/
├── agent.py             # AssistantAgent implementation
├── tools.py             # Tool definitions
├── requirements.txt     # dapr-agents dependencies
├── components/
│   ├── statestore.yaml  # Memory persistence
│   └── conversation.yaml # LLM component
└── Dockerfile
```
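The generated conversation.yaml configures the LLM backend through Dapr's conversation building block. A hedged sketch for OpenAI is below; the exact metadata keys vary by provider, so treat the `key` and `model` fields as illustrative and check the Dapr component reference for your provider:

```yaml
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: conversation
spec:
  type: conversation.openai
  version: v1
  metadata:
    # Pulled from the environment via Dapr's built-in env-var secret store
    # in self-hosted mode (illustrative; a Kubernetes secret store also works).
    - name: key
      secretKeyRef:
        name: OPENAI_API_KEY
        key: OPENAI_API_KEY
    - name: model
      value: "gpt-4"
```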
```
order-processor/
├── agent.py             # DurableAgent with workflow
├── workflow.py          # Agent workflow definition
├── tools.py             # Tool definitions
├── activities.py        # Workflow activities
├── requirements.txt
├── components/
│   ├── statestore.yaml
│   ├── conversation.yaml
│   └── resiliency.yaml  # Retry policies
└── Dockerfile
```
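The retry policies in resiliency.yaml follow Dapr's standard resiliency spec. A minimal sketch that retries failed state-store operations (the policy and component names here are illustrative):

```yaml
apiVersion: dapr.io/v1alpha1
kind: Resiliency
metadata:
  name: agent-resiliency
spec:
  policies:
    retries:
      # Retry every 5 seconds, up to 10 attempts, before surfacing the error.
      agentRetry:
        policy: constant
        duration: 5s
        maxRetries: 10
  targets:
    components:
      statestore:
        outbound:
          retry: agentRetry
```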
```
customer-support/
├── dapr.yaml            # Multi-app configuration
├── agents/
│   ├── triage/
│   │   └── agent.py
│   ├── technical/
│   │   └── agent.py
│   └── billing/
│       └── agent.py
├── orchestrator/
│   └── workflow.py      # Agent orchestration
├── components/
│   ├── statestore.yaml
│   ├── pubsub.yaml      # Agent communication
│   └── conversation.yaml
└── requirements.txt
```
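Agent-to-agent communication in pubsub.yaml rides on a standard Dapr pub/sub component. A sketch using Redis, the usual default for local development:

```yaml
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: pubsub
spec:
  type: pubsub.redis
  version: v1
  metadata:
    - name: redisHost
      value: localhost:6379
```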
Basic agent with LLM integration and tool calling:
```python
from dapr_agents import AssistantAgent, tool

@tool
def get_weather(city: str) -> str:
    """Get weather for a city."""
    return f"Weather in {city}: Sunny, 72°F"

agent = AssistantAgent(
    name="weather-bot",
    role="Weather Assistant",
    instructions="Help users with weather queries",
    tools=[get_weather],
)
```
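For context, the `@tool` decorator pattern above can be approximated in plain Python. This is a sketch of the underlying idea, not the dapr-agents implementation (which also derives a JSON schema from type hints for the LLM): the decorator registers the function's name, docstring, and parameters so the runtime can advertise tools to the model and dispatch calls by name.

```python
import inspect

# Stand-in for the framework's tool registry: name -> callable + metadata.
TOOL_REGISTRY = {}

def tool(fn):
    """Register a function as an agent tool, capturing its docstring and parameters."""
    TOOL_REGISTRY[fn.__name__] = {
        "fn": fn,
        "description": inspect.getdoc(fn) or "",
        "parameters": list(inspect.signature(fn).parameters),
    }
    return fn

@tool
def get_weather(city: str) -> str:
    """Get weather for a city."""
    return f"Weather in {city}: Sunny, 72°F"

# The runtime can now look the tool up by name when the LLM requests a call.
spec = TOOL_REGISTRY["get_weather"]
print(spec["description"])   # "Get weather for a city."
print(spec["parameters"])    # ["city"]
print(spec["fn"]("Paris"))   # "Weather in Paris: Sunny, 72°F"
```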
Workflow-backed agent with fault tolerance:
```python
from dapr_agents import DurableAgent

agent = DurableAgent(
    name="order-processor",
    role="Order Processing Agent",
    workflow_name="order_workflow",
)
```
Headless agent as REST service:
```python
from dapr_agents import AgentService

service = AgentService(
    agent=agent,
    port=8000,
    enable_memory=True,
)

service.start()
```
```bash
# Run with DAPR sidecar
dapr run --app-id weather-bot --app-port 8000 --resources-path ./components -- python agent.py

# Or with dapr.yaml for multi-agent
dapr run -f dapr.yaml
```
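For multi-agent projects, the generated dapr.yaml is a Dapr multi-app run template that launches each agent with its own sidecar. A minimal sketch (the app IDs and paths mirror the customer-support layout above and are illustrative):

```yaml
version: 1
common:
  resourcesPath: ./components  # shared statestore, pubsub, conversation components
apps:
  - appID: triage
    appDirPath: ./agents/triage
    command: ["python", "agent.py"]
  - appID: technical
    appDirPath: ./agents/technical
    command: ["python", "agent.py"]
  - appID: billing
    appDirPath: ./agents/billing
    command: ["python", "agent.py"]
```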
Set environment variables for your LLM provider:
```bash
# OpenAI
export OPENAI_API_KEY="sk-..."

# Azure OpenAI
export AZURE_OPENAI_API_KEY="..."
export AZURE_OPENAI_ENDPOINT="https://..."
export AZURE_OPENAI_DEPLOYMENT="gpt-4"

# Ollama
export OLLAMA_MODEL="llama3"
```
- `short-term`: Default conversation history, cleared on restart.
- `long-term`: Persistent memory using a Dapr state store.
- `vector`: Semantic memory with vector embeddings for retrieval-augmented generation.
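Under the hood, vector-memory retrieval comes down to embedding the query and ranking stored memories by similarity. A minimal sketch with toy hand-written embeddings (a real agent would call an embedding model and persist vectors in a Dapr state store):

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Toy memory store of (text, embedding) pairs; the vectors are made up.
memory = [
    ("User prefers Celsius", [0.9, 0.1, 0.0]),
    ("User lives in Paris", [0.1, 0.9, 0.2]),
]

def retrieve(query_embedding, k=1):
    """Return the k stored memories most similar to the query embedding."""
    ranked = sorted(memory, key=lambda m: cosine(query_embedding, m[1]), reverse=True)
    return [text for text, _ in ranked[:k]]

print(retrieve([0.2, 0.8, 0.1]))  # ["User lives in Paris"]
```

The retrieved memories are then prepended to the LLM prompt, which is what makes the retrieval "augment" the generation.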