Higher capability. Lower cost. Faster agents.
Deploy locally or to the cloud in one command. Same code, any scale.
LithosAI · Cloud · Docs · Quickstart · Examples · Contributing · Slack
About
Agentic inference is exploding. Motus is an open-source agent serving project that delivers higher capability, lower cost, and faster agents, while keeping deployment simple across local and cloud environments at any scale.
Use with your coding agent
The fastest way to get started is to let your coding agent handle building, serving, and deploying with Motus.
Motus works out of the box with any coding agent (e.g., Claude Code, Codex, or Cursor). Install the plugin with one command:
curl -fsSL https://www.lithosai.com/motus/install.sh | sh
Then use it directly in your workflow:
/motus # activate Motus skills
build your agent # start building your agent
/motus serve # serve locally
/motus deploy # deploy to the cloud
See plugins/motus/README.md for marketplace installs and more details.
Serve & deploy any agent
Install Motus to serve agents locally and deploy them to Motus Cloud. Motus supports agents built with:
- Motus
- OpenAI Agents SDK
- Anthropic SDK
- Google ADK
- Plain Python
Install the Motus Python library and CLI tool
Using uv:
uv add lithosai-motus
Or with pip:
pip install lithosai-motus
Serve locally and deploy to the cloud
# Serve locally
motus serve start myapp:agent --port 8000
# Chat with your local agent
motus serve chat http://localhost:8000 "Hello!"
# Deploy to Motus Cloud
motus deploy --name myapp myapp:agent
# Chat with your deployed agent
motus serve chat https://myapp.lithosai.com "Hello!"
Build with Motus
Motus provides a complete agent toolkit (agents, tools, memory, guardrails, and tracing), powered by a runtime that automatically converts Python code into parallel, resilient workflows. Everything is designed to be simple, intuitive, and customizable.
Build an agent
from motus.agent import ReActAgent
from motus.models import OpenAIChatClient
from motus.runtime import resolve
from motus.tools import tool
@tool # define a simple tool
async def search(query: str) -> str:
"""Search the web for information."""
return f"Results for: {query}"
# define a ReAct agent
agent = ReActAgent(client=OpenAIChatClient(), model_name="gpt-4o", tools=[search])
print(resolve(agent("Hello World!")))
Start simple, and explore the agents documentation for more advanced usage.
Build a workflow
Example: fetch an article, summarize it and extract hashtags in parallel, then publish:
from motus.runtime import resolve
from motus.runtime.agent_task import agent_task
@agent_task # wrap functions as tasks in your workflow
async def summarize(article): ... # just a normal function
@agent_task
async def extract(article): ... # extract hashtags
@agent_task(retries=3, timeout=10.0) # augment tasks with retries and timeouts
async def fetch(url): ...
@agent_task
async def publish(summary, hashtags): ... # publish on LinkedIn
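Conceptually, the retry, timeout, and fan-out behavior that `agent_task` provides resembles the following plain-asyncio sketch. This is an illustration of the pattern, not Motus internals; all names and bodies below are hypothetical stand-ins.

```python
import asyncio

def with_retries(retries=3, timeout=10.0):
    """Illustrative retry/timeout decorator (hypothetical, not a Motus API)."""
    def decorate(fn):
        async def wrapper(*args, **kwargs):
            last_exc = None
            for _ in range(retries):
                try:
                    # bound each attempt by `timeout` seconds
                    return await asyncio.wait_for(fn(*args, **kwargs), timeout)
                except Exception as exc:
                    last_exc = exc
            raise last_exc
        return wrapper
    return decorate

@with_retries(retries=3, timeout=10.0)
async def fetch(url):
    return f"article from {url}"          # stand-in for a real fetch

async def summarize(article):
    return article.upper()                 # stand-in for an LLM summary

async def extract(article):
    return ["#news"]                       # stand-in for hashtag extraction

async def publish(summary, hashtags):
    return f"{summary} {' '.join(hashtags)}"

async def pipeline(url):
    article = await fetch(url)
    # summarize and extract run concurrently; publish waits on both
    summary, hashtags = await asyncio.gather(summarize(article), extract(article))
    return await publish(summary, hashtags)

print(asyncio.run(pipeline("https://example.com")))
```

With Motus, the `@agent_task` decorator and the runtime handle this concurrency and resilience for you; you write ordinary sequential-looking Python and the runtime parallelizes independent tasks.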