Configure local AI providers for Warpio
Configures local AI providers for Warpio including LM Studio, Ollama, and custom APIs
Install the plugin with:

/plugin marketplace add akougkas/claude-code-4-science
/plugin install warpio@iowarp-scientific-computing

If you prefer to use Ollama instead of LM Studio:
# Update your .env file
echo "LOCAL_AI_PROVIDER=ollama" >> .env
echo "OLLAMA_API_URL=http://localhost:11434/v1" >> .env
echo "OLLAMA_MODEL=your-model-name" >> .env
To use a different model in LM Studio:
# Update your .env file
echo "LMSTUDIO_MODEL=your-new-model-name" >> .env
For other OpenAI-compatible APIs:
# Update your .env file
echo "LOCAL_AI_PROVIDER=custom" >> .env
echo "CUSTOM_API_URL=your-api-url" >> .env
echo "CUSTOM_API_KEY=your-api-key" >> .env
echo "CUSTOM_MODEL=your-model-name" >> .env
After making changes, test with:
/warpio-local-test
The following variables control local AI behavior:
LOCAL_AI_PROVIDER - Provider type (lmstudio/ollama/custom)
LMSTUDIO_API_URL - LM Studio API endpoint
LMSTUDIO_MODEL - LM Studio model name
OLLAMA_API_URL - Ollama API endpoint
OLLAMA_MODEL - Ollama model name
CUSTOM_API_URL - Custom provider URL
CUSTOM_API_KEY - Custom provider API key
CUSTOM_MODEL - Custom provider model

Set the desired configuration in the .env file, then run /warpio-local-test to verify it and /warpio-local-status to review the current provider setup.
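For reference, a complete .env for the Ollama case might look like the following; the model name is a placeholder you should replace with one of your installed models:

LOCAL_AI_PROVIDER=ollama
OLLAMA_API_URL=http://localhost:11434/v1
OLLAMA_MODEL=your-model-name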