Run GGUF models locally with Mozilla Llamafile. The plugin launches an OpenAI-compatible API server and covers configuring GPU/CPU inference, SDK integration, installation, startup, and connection troubleshooting for offline setups.
Install the plugin:

npx claudepluginhub jamie-bitflight/claude_skills --plugin llamafile
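Once a llamafile is running in server mode, it exposes an OpenAI-compatible HTTP API. The sketch below builds a /v1/chat/completions request against a local server using only the standard library; the base URL assumes llamafile's default port 8080, and the model name is a placeholder (llamafile serves whichever model it was launched with).

```python
import json
import urllib.request

# Assumed default: llamafile's built-in server listens on port 8080
# (adjust if you launched it with a different --port).
BASE_URL = "http://localhost:8080/v1"

def build_chat_request(prompt, model="local-model", temperature=0.7):
    """Build an OpenAI-style chat completion request (no network I/O here)."""
    payload = {
        "model": model,  # placeholder; the local server serves its loaded model
        "messages": [{"role": "user", "content": prompt}],
        "temperature": temperature,
    }
    req = urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    return req, payload

# With a server running, send it via:
#   resp = urllib.request.urlopen(req)
#   answer = json.load(resp)["choices"][0]["message"]["content"]
```

Because the endpoint shape matches OpenAI's, the official OpenAI SDK can also be pointed at the local server by overriding its base URL.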
Use this skill when calling LLM APIs from Python, connecting to llamafile or other local LLM servers, switching between OpenAI, Anthropic, and local providers, implementing retry/fallback logic for LLM calls, or when code imports litellm or uses completion() patterns.