Use local models via Ollama (private, data stays local)
```
/plugin marketplace add gopherguides/gopher-ai
/plugin install gopherguides-llm-tools-plugins-llm-tools@gopherguides/gopher-ai
```

<prompt>
# Use Local Models via Ollama

**If `$ARGUMENTS` is empty or not provided:** Display usage information and ask for input:

This command runs prompts through local models via Ollama. Your data stays on your machine.

**Usage:** `/ollama <prompt>`

**Examples:**

| Command | Description |
|---------|-------------|
| `/ollama review this authentication code` | Code review |
| `/ollama explain this concurrent pattern` | Code explanation |
| `/ollama suggest Go idioms for this function` | Best practices |
| `/ollama what security issues do you see` | Security analysis |

**Recommended Models fo...