DeepSeek AI large language model API via curl. Use this skill for chat completions, reasoning, and code generation with OpenAI-compatible endpoints.
```
/plugin marketplace add vm0-ai/api0
/plugin install api0@api0
```

This skill inherits all available tools. When active, it can use any tool Claude has access to.
Use the DeepSeek API via direct curl calls to access powerful AI language models for chat, reasoning, and code generation.
Official docs: https://api-docs.deepseek.com/
Use this skill when you need to:

- Chat completions with DeepSeek models
- Complex, multi-step reasoning
- Code generation and completion

Set your API key as an environment variable:

```shell
export DEEPSEEK_API_KEY="your-api-key"
```
| Type | Price (per 1M tokens) |
|---|---|
| Input (cache hit) | $0.028 |
| Input (cache miss) | $0.28 |
| Output | $0.42 |
DeepSeek does not enforce strict rate limits. They will try to serve every request. During high traffic, connections are maintained with keep-alive signals.
Important: When using `$VAR` in a command that pipes to another command, wrap the command containing `$VAR` in `bash -c '...'`. Due to a Claude Code bug, environment variables are silently cleared when pipes are used directly.

```shell
bash -c 'curl -s "https://api.example.com" -H "Authorization: Bearer $API_KEY"' | jq .
```
All examples below assume you have `DEEPSEEK_API_KEY` set.
The base URL for the DeepSeek API is:

- `https://api.deepseek.com` (recommended)
- `https://api.deepseek.com/v1` (OpenAI-compatible)

Send a simple chat message:
```shell
bash -c 'curl -s "https://api.deepseek.com/chat/completions" -X POST -H "Content-Type: application/json" -H "Authorization: Bearer ${DEEPSEEK_API_KEY}" -d "{\"model\": \"deepseek-chat\", \"messages\": [{\"role\": \"system\", \"content\": \"You are a helpful assistant.\"}, {\"role\": \"user\", \"content\": \"Hello, who are you?\"}]}"' | jq .
```
Available models:

- `deepseek-chat`: DeepSeek-V3.2 non-thinking mode (128K context, 8K max output)
- `deepseek-reasoner`: DeepSeek-V3.2 thinking mode (128K context, 64K max output)

Adjust creativity/randomness with temperature:
```shell
bash -c 'curl -s "https://api.deepseek.com/chat/completions" -X POST -H "Content-Type: application/json" -H "Authorization: Bearer ${DEEPSEEK_API_KEY}" -d "{\"model\": \"deepseek-chat\", \"messages\": [{\"role\": \"user\", \"content\": \"Write a short poem about coding.\"}], \"temperature\": 0.7, \"max_tokens\": 200}"' | jq -r '.choices[0].message.content'
```
Parameters:

- `temperature` (0-2, default 1): higher = more creative, lower = more deterministic
- `top_p` (0-1, default 1): nucleus sampling threshold
- `max_tokens`: maximum tokens to generate

Get real-time token-by-token output:
```shell
curl -s "https://api.deepseek.com/chat/completions" -X POST -H "Content-Type: application/json" -H "Authorization: Bearer ${DEEPSEEK_API_KEY}" -d '{"model": "deepseek-chat", "messages": [{"role": "user", "content": "Explain quantum computing in simple terms."}], "stream": true}'
```
Streaming returns Server-Sent Events (SSE) with delta chunks, ending with `data: [DONE]`.
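The delta format can be exercised offline. Below is a minimal sketch of reassembling streamed content from SSE chunks; the `data:` lines are made-up samples shaped like the real delta events, not output from a live call:

```shell
# Reassemble streamed content from SSE chunks (sample chunks, not a live call)
printf '%s\n' \
  'data: {"choices":[{"delta":{"content":"Hel"}}]}' \
  'data: {"choices":[{"delta":{"content":"lo"}}]}' \
  'data: [DONE]' |
  sed -n 's/^data: //p' |                      # strip the SSE "data: " prefix
  grep -v '^\[DONE\]$' |                       # drop the stream terminator
  jq -rj '.choices[0].delta.content // empty'  # concatenate the delta text
echo  # prints: Hello
```

The same `sed`/`grep`/`jq` stages can be applied to the live stream from the streaming curl above.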
Use the reasoner model for complex reasoning tasks:
```shell
bash -c 'curl -s "https://api.deepseek.com/chat/completions" -X POST -H "Content-Type: application/json" -H "Authorization: Bearer ${DEEPSEEK_API_KEY}" -d "{\"model\": \"deepseek-reasoner\", \"messages\": [{\"role\": \"user\", \"content\": \"What is 15 * 17? Show your work.\"}]}"' | jq -r '.choices[0].message.content'
```
The reasoner model excels at math, logic, and multi-step problems.
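Alongside `content`, `deepseek-reasoner` responses carry the chain of thought in a separate `reasoning_content` field on the message. A sketch of splitting the two from a saved response; the JSON below is a made-up sample in that shape:

```shell
# Made-up sample of a deepseek-reasoner response body
cat > /tmp/reasoner_sample.json << 'EOF'
{"choices":[{"message":{"reasoning_content":"15 * 17 = 15 * 10 + 15 * 7 = 150 + 105","content":"255"}}]}
EOF
jq -r '.choices[0].message.reasoning_content' /tmp/reasoner_sample.json  # the model's working
jq -r '.choices[0].message.content' /tmp/reasoner_sample.json           # the final answer: 255
```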
Force the model to return valid JSON:
```shell
bash -c 'curl -s "https://api.deepseek.com/chat/completions" -X POST -H "Content-Type: application/json" -H "Authorization: Bearer ${DEEPSEEK_API_KEY}" -d "{\"model\": \"deepseek-chat\", \"messages\": [{\"role\": \"system\", \"content\": \"You are a JSON generator. Always respond with valid JSON.\"}, {\"role\": \"user\", \"content\": \"List 3 programming languages with their main use cases.\"}], \"response_format\": {\"type\": \"json_object\"}}"' | jq -r '.choices[0].message.content' | jq .
```
Continue a conversation with message history:
```shell
bash -c 'curl -s "https://api.deepseek.com/chat/completions" -X POST -H "Content-Type: application/json" -H "Authorization: Bearer ${DEEPSEEK_API_KEY}" -d "{\"model\": \"deepseek-chat\", \"messages\": [{\"role\": \"user\", \"content\": \"My name is Alice.\"}, {\"role\": \"assistant\", \"content\": \"Nice to meet you, Alice.\"}, {\"role\": \"user\", \"content\": \"What is my name?\"}]}"' | jq -r '.choices[0].message.content'
```
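For longer sessions, re-typing the history gets unwieldy. One sketch is to keep the `messages` array in a file and grow it with jq; the file paths here are arbitrary:

```shell
# Start a history file with the first user turn
echo '[{"role": "user", "content": "My name is Alice."}]' > /tmp/history.json

# After each response, append the assistant reply and the next user turn
jq '. + [{"role": "assistant", "content": "Nice to meet you, Alice."},
         {"role": "user", "content": "What is my name?"}]' \
  /tmp/history.json > /tmp/history.new.json && mv /tmp/history.new.json /tmp/history.json

# Wrap the history into a request body ready for -d @
jq '{model: "deepseek-chat", messages: .}' /tmp/history.json > /tmp/chat_request.json
jq '.messages | length' /tmp/chat_request.json  # prints: 3
```

POST the result with the same curl pattern, using `-d @/tmp/chat_request.json`.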
Use Fill-in-the-Middle for code completion (beta endpoint):
```shell
bash -c 'curl -s "https://api.deepseek.com/beta/completions" -X POST -H "Content-Type: application/json" -H "Authorization: Bearer ${DEEPSEEK_API_KEY}" -d "{\"model\": \"deepseek-chat\", \"prompt\": \"def add(a, b):\\n \", \"max_tokens\": 20}"' | jq -r '.choices[0].text'
```
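The beta completions endpoint also accepts a `suffix` field, so the model completes the gap between the prompt (prefix) and the suffix. Building the body with `jq -n` avoids hand-escaping the embedded newlines; the code snippet being completed is an arbitrary example:

```shell
# Build a FIM request body with a prompt (prefix) and a suffix
jq -n \
  --arg prompt $'def add(a, b):\n    return ' \
  --arg suffix $'\n\nprint(add(1, 2))' \
  '{model: "deepseek-chat", prompt: $prompt, suffix: $suffix, max_tokens: 20}' \
  > /tmp/fim_request.json
jq -r '.model' /tmp/fim_request.json  # prints: deepseek-chat
```

Send it with `-d @/tmp/fim_request.json` against `https://api.deepseek.com/beta/completions`.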
FIM is useful for:

- Completing code inside an existing file
- Filling the gap between a known prefix and suffix

Define functions the model can call:
```shell
bash -c 'curl -s "https://api.deepseek.com/chat/completions" -X POST -H "Content-Type: application/json" -H "Authorization: Bearer ${DEEPSEEK_API_KEY}" -d "{\"model\": \"deepseek-chat\", \"messages\": [{\"role\": \"user\", \"content\": \"What is the weather in Tokyo?\"}], \"tools\": [{\"type\": \"function\", \"function\": {\"name\": \"get_weather\", \"description\": \"Get the current weather for a location\", \"parameters\": {\"type\": \"object\", \"properties\": {\"location\": {\"type\": \"string\", \"description\": \"The city name\"}}, \"required\": [\"location\"]}}}]}"' | jq .
```
The model will return a `tool_calls` array when it wants to use a function.
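Note that `function.arguments` arrives as a JSON-encoded string, so it needs a second parse. A sketch against a saved sample; the response below is made up, in the OpenAI-compatible shape:

```shell
# Made-up sample of a response containing a tool call
cat > /tmp/tool_sample.json << 'EOF'
{"choices":[{"message":{"tool_calls":[{"id":"call_1","type":"function","function":{"name":"get_weather","arguments":"{\"location\": \"Tokyo\"}"}}]}}]}
EOF
jq -r '.choices[0].message.tool_calls[0].function.name' /tmp/tool_sample.json  # prints: get_weather
# arguments is a string -> parse it with fromjson before indexing
jq -r '.choices[0].message.tool_calls[0].function.arguments | fromjson | .location' /tmp/tool_sample.json  # prints: Tokyo
```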
Extract token usage information from the response:
```shell
bash -c 'curl -s "https://api.deepseek.com/chat/completions" -X POST -H "Content-Type: application/json" -H "Authorization: Bearer ${DEEPSEEK_API_KEY}" -d "{\"model\": \"deepseek-chat\", \"messages\": [{\"role\": \"user\", \"content\": \"Hello\"}]}"' | jq '.usage'
```
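Assuming the prices in the table above are per 1M tokens, the usage block can be turned into a rough cost estimate with jq. The token counts below are made-up sample values, and cache hits are ignored for simplicity:

```shell
# Estimate cost in dollars from a usage block (made-up token counts)
echo '{"usage": {"prompt_tokens": 1000000, "completion_tokens": 500000, "total_tokens": 1500000}}' |
  jq '.usage | (.prompt_tokens * 0.28 + .completion_tokens * 0.42) / 1e6'
```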
Response includes:

- `prompt_tokens`: input token count
- `completion_tokens`: output token count
- `total_tokens`: sum of both

DeepSeek is fully compatible with OpenAI SDKs. Just change the base URL:
Python:

```python
from openai import OpenAI

client = OpenAI(api_key="your-deepseek-key", base_url="https://api.deepseek.com")
```
Node.js:

```javascript
import OpenAI from 'openai';

const client = new OpenAI({ apiKey: 'your-deepseek-key', baseURL: 'https://api.deepseek.com' });
```
For complex requests with nested JSON (like function calling), use a temp file to avoid shell escaping issues:
```shell
# Write JSON to temp file
cat > /tmp/deepseek_request.json << 'EOF'
{
  "model": "deepseek-chat",
  "messages": [{"role": "user", "content": "What is the weather in Tokyo?"}],
  "tools": [{
    "type": "function",
    "function": {
      "name": "get_weather",
      "description": "Get current weather",
      "parameters": {
        "type": "object",
        "properties": {"location": {"type": "string"}},
        "required": ["location"]
      }
    }
  }]
}
EOF

# Make request using the file
bash -c 'curl -s "https://api.deepseek.com/chat/completions" -X POST -H "Content-Type: application/json" -H "Authorization: Bearer ${DEEPSEEK_API_KEY}" -d @/tmp/deepseek_request.json' | jq .
```
- Use `deepseek-chat` for general tasks, `deepseek-reasoner` for complex reasoning
- For guaranteed JSON output, set `response_format` and include JSON instructions in the system message
- FIM completion uses the beta endpoint at `api.deepseek.com/beta`
- For complex payloads, use `-d @filename` to avoid shell quoting issues