# Providers

Agentix supports five LLM backends. All provider SDKs are bundled in the core install; no extra `pip install` is needed.
## Supported providers
| Provider | Models | Default model |
|---|---|---|
| `anthropic` | Claude 3.x / 4.x | `claude-sonnet-4-20250514` |
| `openai` | GPT-4o, o-series, etc. | `gpt-4o` |
| `openai_compatible` | Any OpenAI API–compatible model | `gpt-oss-120b` |
| `gemini` | Gemini 2.x Flash/Pro | `gemini-2.5-flash` |
| `deepseek` | DeepSeek Chat / Coder | `deepseek-chat` |
## Anthropic
```python
from agentix import AgentixAgentOptions

options = AgentixAgentOptions(
    provider="anthropic",
    model="claude-sonnet-4-20250514",
)
```

API key: `AGENTIX_API_KEY`
### Extended thinking (`effort`)

Claude models support extended reasoning. Use `effort` as a shorthand, or configure `thinking` directly:
```python
# Shorthand: maps to a thinking token budget
options = AgentixAgentOptions(
    provider="anthropic",
    effort="high",  # low=2k | medium=5k | high=10k | max=32k tokens
)
```

```python
# Explicit
options = AgentixAgentOptions(
    provider="anthropic",
    thinking={"enabled": True, "budget_tokens": 10000},
)
```
`ThinkingBlock` entries appear in `AssistantMessage.content` when thinking is active.
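To separate thinking traces from answer text, you can filter on block type. A minimal sketch, assuming `ThinkingBlock` is importable from the top-level package and carries a text field (both assumptions; only the class name appears above):

```python
from agentix import ThinkingBlock  # assumed importable; only the class name is documented above

def print_thinking(message) -> None:
    """Split thinking traces from answer blocks in an AssistantMessage."""
    for block in message.content:
        if isinstance(block, ThinkingBlock):
            print("[thinking]", getattr(block, "thinking", block))  # field name is a guess
        else:
            print(block)
```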
### Beta features
```python
options = AgentixAgentOptions(
    provider="anthropic",
    betas=["interleaved-thinking-2025-05-14"],
)
```
### Key rotation
For high-throughput use cases, supply a callback that returns a fresh API key before each LLM call:
```python
import itertools

keys = itertools.cycle(["sk-ant-key1", "sk-ant-key2", "sk-ant-key3"])

options = AgentixAgentOptions(
    provider="anthropic",
    key_provider=lambda: next(keys),
)
```
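The callback can source keys from anywhere. For example, a sketch that cycles through keys supplied in a comma-separated environment variable (the variable name `ANTHROPIC_KEYS` is made up for this example):

```python
import itertools
import os

from agentix import AgentixAgentOptions

# Hypothetical variable holding "sk-ant-key1,sk-ant-key2,sk-ant-key3"
keys = itertools.cycle(os.environ["ANTHROPIC_KEYS"].split(","))

options = AgentixAgentOptions(
    provider="anthropic",
    key_provider=lambda: next(keys),
)
```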
## OpenAI
```python
options = AgentixAgentOptions(
    provider="openai",
    model="gpt-4o",
)
```

API key: `AGENTIX_API_KEY`
### o-series reasoning models
```python
options = AgentixAgentOptions(
    provider="openai",
    model="o3",
    effort="high",  # maps to reasoning_effort="high" for o-series
)
```
## OpenAI-compatible
Use this provider for any endpoint that speaks the OpenAI API — Ollama, vLLM, LiteLLM, Cerebras, Together AI, Groq, and others.
`base_url` is required and must be set either in `llm_options` or via `AGENTIX_BASE_URL`.
```python
# Ollama (local)
options = AgentixAgentOptions(
    provider="openai_compatible",
    model="llama3.2",
    llm_options={
        "base_url": "http://localhost:11434/v1",
        "api_key": "ollama",  # placeholder; Ollama ignores this
    },
)
```

```python
# vLLM (hosted)
options = AgentixAgentOptions(
    provider="openai_compatible",
    model="meta-llama/Meta-Llama-3-70B-Instruct",
    llm_options={
        "base_url": "https://my-vllm-host/v1",
        "api_key": "...",
    },
)
```
Via environment variables:

```bash
export AGENTIX_PROVIDER=openai_compatible
export AGENTIX_BASE_URL=http://localhost:11434/v1
export AGENTIX_MODEL=llama3.2
```
`AGENTIX_BASE_URL` is only respected by the `openai_compatible` and `deepseek` providers. It is ignored for `anthropic`, `openai`, and `gemini`.
## Google Gemini
```python
options = AgentixAgentOptions(
    provider="gemini",
    model="gemini-2.5-flash",
)
```

API key: `AGENTIX_API_KEY`
## DeepSeek
```python
options = AgentixAgentOptions(
    provider="deepseek",
    model="deepseek-chat",
)
```

API key: `AGENTIX_API_KEY`
The default `base_url` points to DeepSeek's official API. Override it to use a self-hosted or proxy endpoint:
```python
options = AgentixAgentOptions(
    provider="deepseek",
    llm_options={"base_url": "https://my-proxy/v1"},
)
```
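The same override works via the environment, since `AGENTIX_BASE_URL` is honored by the `deepseek` provider (the URL is illustrative):

```bash
export AGENTIX_PROVIDER=deepseek
export AGENTIX_BASE_URL=https://my-proxy/v1
```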
## Selecting a provider via environment
Set the provider and model without touching code — useful for deploying the same agent to different backends:
```bash
export AGENTIX_PROVIDER=openai
export AGENTIX_MODEL=gpt-4o
export AGENTIX_API_KEY=sk-...
```
Then use a minimal options object:

```python
options = AgentixAgentOptions(name="my-agent")
# provider="openai", model="gpt-4o" filled from environment
```
Environment variables are only applied when the corresponding field hasn't been explicitly set in code. Explicit fields always win.
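For example, an explicit `model` overrides `AGENTIX_MODEL`, while the unset `provider` still falls back to the environment. A sketch (whether resolution happens at options construction or at agent startup is an implementation detail):

```python
import os

from agentix import AgentixAgentOptions

os.environ["AGENTIX_PROVIDER"] = "openai"
os.environ["AGENTIX_MODEL"] = "gpt-4o"

# model="o3" is explicit and wins over AGENTIX_MODEL;
# provider is unset in code, so it resolves to "openai" from the environment.
options = AgentixAgentOptions(model="o3")
```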
## Fallback model
Configure a fallback for when the primary model fails after all retries:
```python
options = AgentixAgentOptions(
    provider="anthropic",
    model="claude-opus-4-20250514",
    fallback_model="claude-sonnet-4-20250514",  # used after retry exhaustion
)
```
## Inline API key
Pass an API key directly to avoid relying on environment variables. Agentix redacts keys in logs and `__repr__` automatically:
```python
options = AgentixAgentOptions(
    provider="anthropic",
    llm_options={"api_key": "sk-ant-..."},
)
```
Prefer environment variables in production. Inline keys are convenient for short scripts but easy to accidentally commit or log.
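A middle ground is to resolve the key at runtime and pass it inline, so nothing is hard-coded. The variable name below is hypothetical:

```python
import os

from agentix import AgentixAgentOptions

# Key injected at deploy time (e.g., by a secrets manager); never committed to source.
options = AgentixAgentOptions(
    provider="anthropic",
    llm_options={"api_key": os.environ["VAULT_ANTHROPIC_KEY"]},
)
```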