Install and configure Ollama for local AI models
# Setup Ollama - Local AI Installation

I'll help you install and configure Ollama for free, local AI model deployment.

## Step 1: Detect Operating System

Let me check your system:

## Step 2: Install Ollama

### For macOS:

### For Linux:

### For Windows:

Download the installer from: https://ollama.com/download/windows

## Step 3: Verify Installation

## Step 4: Pull Recommended Models

## Step 5: Test Model

## Step 6: Configure API Access

Ollama runs on `http://localhost:11434` by default. Test the API:

## Optional: GPU Acceleration

### NVIDIA GPU:

### Apple Silicon:

##...
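Steps 1 through 5 above can be sketched as a single script. This is a minimal sketch, not a definitive installer: it assumes the official install channels at ollama.com (the Homebrew formula on macOS, the `install.sh` script on Linux), and `llama3.2` is just an example model name.

```shell
#!/bin/sh
# Sketch: detect the OS and suggest the standard Ollama install command.
case "$(uname -s)" in
  Darwin) echo "Install with: brew install ollama  (or the macOS app from ollama.com/download)" ;;
  Linux)  echo "Install with: curl -fsSL https://ollama.com/install.sh | sh" ;;
  *)      echo "Windows: download the installer from https://ollama.com/download/windows" ;;
esac

# Steps 3-5: verify, pull a model, and run a quick smoke test
# (guarded so the script is safe to run before Ollama is installed).
if command -v ollama >/dev/null 2>&1; then
  ollama --version                             # verify installation
  ollama pull llama3.2                         # pull a recommended model (example name)
  ollama run llama3.2 "Why is the sky blue?"   # quick smoke test
else
  echo "ollama not installed yet; run the command suggested above first"
fi
```

The `command -v` guard means the same script works both before installation (it only prints the right install command for your OS) and after (it verifies the binary and exercises a model).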
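For Step 6, the API on `http://localhost:11434` can be exercised with a small client sketch. It uses Ollama's `/api/generate` endpoint with `stream: false` for a one-shot response; the model name `llama3.2` is an example and should match whatever you pulled.

```python
import json
import urllib.error
import urllib.request

OLLAMA_HOST = "http://localhost:11434"  # Ollama's default address


def ollama_generate(prompt, model="llama3.2"):
    """Send a one-shot (non-streaming) request to a local Ollama server.

    Returns the model's response text, or None if the server is unreachable.
    """
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False})
    req = urllib.request.Request(
        f"{OLLAMA_HOST}/api/generate",
        data=payload.encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    try:
        with urllib.request.urlopen(req, timeout=120) as resp:
            return json.loads(resp.read())["response"]
    except urllib.error.URLError:
        return None  # Ollama is not running / not reachable


if __name__ == "__main__":
    reply = ollama_generate("Why is the sky blue?")
    if reply is None:
        print("Ollama is not reachable at", OLLAMA_HOST)
    else:
        print(reply)
```

Because the function returns `None` on connection failure instead of raising, it doubles as a quick "is Ollama up?" check after installation.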