Build AI gateway services for routing and managing LLM requests. Use when implementing API proxies, rate limiting, or multi-provider AI services.
Install via the plugin marketplace:

```
/plugin marketplace add astoeffer/moodle-plugin-marketplace
/plugin install cloodle-ai-integration@astoeffer-dev-plugins
```
Multi-provider AI configuration for the Cloodle platform. Set `AI_PROVIDER` to select a backend and supply the matching variables for that provider:

```bash
# Ollama (local development)
AI_PROVIDER=ollama
OLLAMA_BASE_URL=http://localhost:11434
OLLAMA_MODEL=llama3.2

# Anthropic
AI_PROVIDER=anthropic
ANTHROPIC_API_KEY=sk-ant-...
ANTHROPIC_MODEL=claude-sonnet-4-20250514

# HuggingFace
AI_PROVIDER=huggingface
HF_API_KEY=hf_...
HF_MODEL=gpt-oss-20b
```
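Before constructing a client it can help to fail fast when the selected provider is missing configuration. The following is a minimal sketch, assuming `python-dotenv` is installed; the `REQUIRED_VARS` mapping and `check_provider_config` helper are illustrative, not part of the shipped configuration:

```python
import os
from dotenv import load_dotenv  # assumes python-dotenv is available

# Illustrative mapping of provider -> environment variables it needs.
REQUIRED_VARS = {
    "ollama": ["OLLAMA_BASE_URL", "OLLAMA_MODEL"],
    "anthropic": ["ANTHROPIC_API_KEY", "ANTHROPIC_MODEL"],
    "huggingface": ["HF_API_KEY", "HF_MODEL"],
}

def check_provider_config() -> None:
    """Raise early if the active provider is missing required variables."""
    load_dotenv()  # load variables from a .env file into the environment
    provider = os.getenv("AI_PROVIDER", "ollama")
    missing = [v for v in REQUIRED_VARS.get(provider, []) if not os.getenv(v)]
    if missing:
        raise RuntimeError(f"{provider}: missing environment variables: {missing}")
```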
```python
import os

def get_llm():
    """Return a LangChain client for the provider selected by AI_PROVIDER."""
    provider = os.getenv("AI_PROVIDER", "ollama")

    if provider == "ollama":
        from langchain_ollama import ChatOllama
        return ChatOllama(
            base_url=os.getenv("OLLAMA_BASE_URL"),
            model=os.getenv("OLLAMA_MODEL", "llama3.2"),
        )
    elif provider == "anthropic":
        from langchain_anthropic import ChatAnthropic
        # ANTHROPIC_API_KEY is read from the environment by the client.
        return ChatAnthropic(
            model=os.getenv("ANTHROPIC_MODEL"),
        )
    elif provider == "huggingface":
        from langchain_huggingface import HuggingFaceEndpoint
        return HuggingFaceEndpoint(
            repo_id=os.getenv("HF_MODEL"),
            huggingfacehub_api_token=os.getenv("HF_API_KEY"),
        )
    else:
        raise ValueError(f"Unsupported AI_PROVIDER: {provider}")
```
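Callers do not need to know which provider is active; each client exposes LangChain's `invoke` interface. A short usage sketch (the prompt text is illustrative):

```python
llm = get_llm()

# Chat models (Ollama, Anthropic) return a message object, while
# HuggingFaceEndpoint returns a plain string, so normalise the result.
result = llm.invoke("Summarise the key deadlines in this course outline.")
text = result.content if hasattr(result, "content") else result
print(text)
```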
| Use Case | Provider | Model |
|---|---|---|
| Development | Ollama | llama3.2 |
| Production Chat | Anthropic | claude-sonnet |
| Cost Sensitive | HuggingFace | gpt-oss-20b |
| High Quality | Anthropic | claude-opus |
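The table above can also be expressed as a routing hint in code. This is an illustrative sketch only; the `USE_CASE_DEFAULTS` mapping and its keys mirror the table and are not an existing Cloodle API:

```python
# Illustrative defaults mirroring the table above; adjust model IDs as needed.
USE_CASE_DEFAULTS = {
    "development": ("ollama", "llama3.2"),
    "production_chat": ("anthropic", "claude-sonnet"),
    "cost_sensitive": ("huggingface", "gpt-oss-20b"),
    "high_quality": ("anthropic", "claude-opus"),
}

def provider_for(use_case: str) -> tuple[str, str]:
    """Return (provider, model) for a use case, defaulting to local Ollama."""
    return USE_CASE_DEFAULTS.get(use_case, ("ollama", "llama3.2"))
```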
The environment file for the multi-agent RAG system lives at `/opt/cloodle/tools/ai/multi_agent_rag_system/.env`. To confirm a local Ollama server is reachable and list its installed models:

```bash
curl http://localhost:11434/api/tags  # Ollama: lists locally available models
```
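The same check can be run from Python before routing requests to the local provider. A minimal sketch, assuming the `requests` package is available; the timeout value is an arbitrary choice:

```python
import requests

def ollama_available(base_url: str = "http://localhost:11434") -> bool:
    """Return True if the Ollama server answers on /api/tags."""
    try:
        resp = requests.get(f"{base_url}/api/tags", timeout=2)
        return resp.ok
    except requests.RequestException:
        return False
```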