From arize-skills
Manages Arize AI integrations for LLM providers like OpenAI, Anthropic, Gemini: create, list, update, delete credentials. Use for evaluators and Arize LLM features.
npx claudepluginhub arize-ai/arize-skills --plugin arize-skills

This skill uses the workspace's default tool permissions.
> **`SPACE`** — Most `--space` flags and the `ARIZE_SPACE` env var accept a space **name** (e.g., `my-workspace`) or a base64 space **ID** (e.g., `U3BhY2U6...`). Find yours with `ax spaces list`.
Note: `ai-integrations create` does not accept `--space` (AI integrations are account-scoped). Use `--space` only with `list`, `get`, `update`, and `delete`.
Proceed directly with the task — run the ax command you need. Do NOT check versions, env vars, or profiles upfront.
If an ax command fails, troubleshoot based on the error:
- `command not found` or version error → see references/ax-setup.md
- 401 Unauthorized / missing API key → run `ax profiles show` to inspect the current profile. If the profile is missing or the API key is wrong, follow references/ax-profiles.md to create or update it. If the user doesn't have their key, direct them to https://app.arize.com/admin > API Keys
- Unknown space → run `ax spaces list` to pick by name, or ask the user
- Missing LLM provider key → run `ax ai-integrations list --space SPACE` to check for platform-managed credentials. If none exist, ask the user to provide the key or create an integration via the arize-ai-provider-integration skill

Do not read env files or search the filesystem for credentials. Use `ax profiles` for Arize credentials and `ax ai-integrations` for LLM provider keys. If credentials are not available through these channels, ask the user.

List all integrations accessible in a space:
ax ai-integrations list --space SPACE
Filter by name (case-insensitive substring match):
ax ai-integrations list --space SPACE --name "openai"
Paginate large result sets:
# Get first page
ax ai-integrations list --space SPACE --limit 20 -o json
# Get next page using cursor from previous response
ax ai-integrations list --space SPACE --limit 20 --cursor CURSOR_TOKEN -o json
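The two commands above can be chained into a loop that follows the cursor until the last page. The sketch below is hypothetical: `fetch_page` stands in for the real `ax ai-integrations list --space SPACE --limit 20 --cursor ... -o json` call, and the `items` and `next_cursor` field names are assumptions about the JSON shape, not the documented schema.

```shell
# Hypothetical pagination loop. fetch_page is a stub standing in for the
# real ax call; the items/next_cursor field names are assumed, not documented.
fetch_page() {
  if [ -z "$1" ]; then
    echo '{"items": ["int-a", "int-b"], "next_cursor": "tok1"}'
  else
    echo '{"items": ["int-c"], "next_cursor": null}'
  fi
}

cursor=""
while :; do
  page=$(fetch_page "$cursor")
  # print this page's items
  echo "$page" | python3 -c 'import json,sys; [print(i) for i in json.load(sys.stdin)["items"]]'
  # pull the next cursor; an empty string (null cursor) ends the loop
  cursor=$(echo "$page" | python3 -c 'import json,sys; print(json.load(sys.stdin)["next_cursor"] or "")')
  [ -n "$cursor" ] || break
done
```

With the real CLI, replace the stub body with the `ax` invocation and adjust the field names to match the actual `-o json` output.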
Key flags:
| Flag | Description |
|---|---|
| `--space` | Space name or ID to filter integrations |
| `--name` | Case-insensitive substring filter on integration name |
| `--limit` | Max results (1–100, default 15) |
| `--cursor` | Pagination token from a previous response |
| `-o, --output` | Output format: `table` (default) or `json` |
Response fields:
| Field | Description |
|---|---|
| `id` | Base64 integration ID — copy this for downstream commands |
| `name` | Human-readable name |
| `provider` | LLM provider enum (see Supported Providers below) |
| `has_api_key` | `true` if credentials are stored |
| `model_names` | Allowed model list, or `null` if all models are enabled |
| `enable_default_models` | Whether default models for this provider are allowed |
| `function_calling_enabled` | Whether tool/function calling is enabled |
| `auth_type` | Authentication method: `default`, `proxy_with_headers`, or `bearer_token` |
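The `has_api_key` field is the quickest way to spot integrations that were created without credentials. The sketch below parses a sample payload with python3; the list-of-objects shape is an assumption based on the fields above, not the documented schema — real data would come from `ax ai-integrations list --space SPACE -o json`.

```shell
# Hypothetical sketch: list integrations with has_api_key == false.
# The sample payload is illustrative, not real ax output.
cat > /tmp/integrations.json <<'EOF'
[
  {"id": "aW50LTE=", "name": "openai-prod", "has_api_key": true},
  {"id": "aW50LTI=", "name": "anthropic-dev", "has_api_key": false}
]
EOF
python3 - <<'EOF'
import json
with open("/tmp/integrations.json") as f:
    items = json.load(f)
for item in items:
    if not item["has_api_key"]:
        # these integrations need an update with --api-key or --provider-metadata
        print(item["id"], item["name"])
EOF
```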
ax ai-integrations get NAME_OR_ID
ax ai-integrations get NAME_OR_ID -o json
ax ai-integrations get NAME_OR_ID --space SPACE # required when using name instead of ID
Use this to inspect an integration's full configuration or to confirm its ID after creation.
Before creating, always list integrations first — the user may already have a suitable one:
ax ai-integrations list --space SPACE
If no suitable integration exists, create one. The required flags depend on the provider.
ax ai-integrations create \
--name "My OpenAI Integration" \
--provider openAI \
--api-key $OPENAI_API_KEY
ax ai-integrations create \
--name "My Anthropic Integration" \
--provider anthropic \
--api-key $ANTHROPIC_API_KEY
ax ai-integrations create \
--name "My Azure OpenAI Integration" \
--provider azureOpenAI \
--api-key $AZURE_OPENAI_API_KEY \
--base-url "https://my-resource.openai.azure.com/"
AWS Bedrock uses IAM role-based auth. Provide the ARN of the role Arize should assume via --provider-metadata:
ax ai-integrations create \
--name "My Bedrock Integration" \
--provider awsBedrock \
--provider-metadata '{"role_arn": "arn:aws:iam::123456789012:role/ArizeBedrockRole"}'
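Because `--provider-metadata` takes inline JSON, shell quoting mistakes are a common failure. A defensive pattern is to build the JSON in a variable and validate it before calling `ax`; the ARN below is an example, not a real role.

```shell
# Hypothetical sketch: build and validate the --provider-metadata JSON
# before passing it to ax, so quoting errors fail fast. Example ARN only.
ROLE_ARN="arn:aws:iam::123456789012:role/ArizeBedrockRole"
METADATA=$(printf '{"role_arn": "%s"}' "$ROLE_ARN")
echo "$METADATA" | python3 -m json.tool >/dev/null && echo "metadata OK"
# then pass it along:
#   ax ai-integrations create --name "My Bedrock Integration" \
#     --provider awsBedrock --provider-metadata "$METADATA"
```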
Vertex AI uses GCP service account credentials. Provide the GCP project and region via --provider-metadata:
ax ai-integrations create \
--name "My Vertex AI Integration" \
--provider vertexAI \
--provider-metadata '{"project_id": "my-gcp-project", "location": "us-central1"}'
ax ai-integrations create \
--name "My Gemini Integration" \
--provider gemini \
--api-key $GEMINI_API_KEY
ax ai-integrations create \
--name "My NVIDIA NIM Integration" \
--provider nvidiaNim \
--api-key $NVIDIA_API_KEY \
--base-url "https://integrate.api.nvidia.com/v1"
ax ai-integrations create \
--name "My Custom Integration" \
--provider custom \
--base-url "https://my-llm-proxy.example.com/v1" \
--api-key $CUSTOM_LLM_API_KEY
| Provider | Required extra flags |
|---|---|
| `openAI` | `--api-key <key>` |
| `anthropic` | `--api-key <key>` |
| `azureOpenAI` | `--api-key <key>`, `--base-url <azure-endpoint>` |
| `awsBedrock` | `--provider-metadata '{"role_arn": "<arn>"}'` |
| `vertexAI` | `--provider-metadata '{"project_id": "<gcp-project>", "location": "<region>"}'` |
| `gemini` | `--api-key <key>` |
| `nvidiaNim` | `--api-key <key>`, `--base-url <nim-endpoint>` |
| `custom` | `--base-url <endpoint>` |
Optional flags:

| Flag | Description |
|---|---|
| `--model-name` | Allowed model name (repeat for multiple, e.g. `--model-name gpt-4o --model-name gpt-4o-mini`); omit to allow all models |
| `--enable-default-models` | Enable the provider's default model list |
| `--function-calling-enabled` | Enable tool/function calling support |
| `--auth-type` | Authentication type: `default`, `proxy_with_headers`, or `bearer_token` |
| `--headers` | Custom headers as JSON object or file path (for proxy auth) |
| `--provider-metadata` | Provider-specific metadata as JSON object or file path |
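Since `--headers` accepts a file path, proxy-style auth is easiest to manage with a headers file. The sketch below is hypothetical: the header names and values are illustrative placeholders, not a documented schema.

```shell
# Hypothetical sketch: a headers file for --auth-type proxy_with_headers.
# Header names/values are placeholders, not a documented schema.
cat > headers.json <<'EOF'
{"X-Proxy-Token": "replace-me", "X-Org-Id": "my-org"}
EOF
python3 -m json.tool headers.json >/dev/null && echo "headers OK"
# then pass: --auth-type proxy_with_headers --headers headers.json
```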
Capture the returned integration ID (e.g., TGxtSW50ZWdyYXRpb246MTI6YUJjRA==) — it is needed for evaluator creation and other downstream commands. If you missed it, retrieve it:
ax ai-integrations list --space SPACE -o json
# or by name/ID directly:
ax ai-integrations get NAME_OR_ID
update is a partial update — only the flags you provide are changed. Omitted fields stay as-is.
# Rename
ax ai-integrations update NAME_OR_ID --name "New Name"
# Rotate the API key
ax ai-integrations update NAME_OR_ID --api-key $OPENAI_API_KEY
# Change the model list (replaces all existing model names)
ax ai-integrations update NAME_OR_ID --model-name gpt-4o --model-name gpt-4o-mini
# Update base URL (for Azure, custom, or NIM)
ax ai-integrations update NAME_OR_ID --base-url "https://new-endpoint.example.com/v1"
Add --space SPACE when using a name instead of ID. Any flag accepted by create can be passed to update.
Warning: Deletion is permanent. Evaluators that reference this integration will no longer be able to run.
ax ai-integrations delete NAME_OR_ID --force
ax ai-integrations delete NAME_OR_ID --space SPACE --force # required when using name instead of ID
Omit --force to get a confirmation prompt instead of deleting immediately.
| Problem | Solution |
|---|---|
| `ax: command not found` | See references/ax-setup.md |
| 401 Unauthorized | API key may not have access to this space. Verify key and space ID at https://app.arize.com/admin > API Keys |
| No profile found | Run `ax profiles show --expand`; set the `ARIZE_API_KEY` env var or write `~/.arize/config.toml` |
| Integration not found | Verify with `ax ai-integrations list --space SPACE` |
| `has_api_key: false` after create | Credentials were not saved — re-run `update` with the correct `--api-key` or `--provider-metadata` |
| Evaluator runs fail with LLM errors | Check integration credentials with `ax ai-integrations get INT_ID`; rotate the API key if needed |
| Provider mismatch | The provider cannot be changed after creation — delete and recreate with the correct provider |
Related skills: arize-evaluator, arize-experiment.

See references/ax-profiles.md § Save Credentials for Future Use.