# Integrations
AgentXchain is agent-, IDE-, and LLM-agnostic by design. Its five adapters (`manual`, `local_cli`, `api_proxy`, `mcp`, `remote_agent`) cover virtually every connection pattern. These guides show exactly how to connect your preferred platform.
## How it works
Every agent connects through a runtime configured in your `agentxchain.json`. Each runtime uses one of five adapter types:
| Adapter | When to use | Transport |
|---|---|---|
| `manual` | Human-in-the-loop roles | Operator reads prompt, writes result file |
| `local_cli` | Local CLI tools (Claude Code, Codex, Cursor) | Subprocess stdin/argv |
| `api_proxy` | Direct LLM API calls | HTTP to provider endpoint |
| `mcp` | MCP-compatible agent servers | stdio or Streamable HTTP |
| `remote_agent` | External agent services (Devin, custom) | HTTP POST/response |
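As an illustration, a runtime entry in `agentxchain.json` pairs a role with one of these adapters. The exact schema is documented per guide; the field names below (`runtimes`, `adapter`, `command`, `endpoint`) are hypothetical placeholders, not the authoritative format:

```json
{
  "runtimes": {
    "coder": {
      "adapter": "local_cli",
      "command": ["claude", "-p"]
    },
    "reviewer": {
      "adapter": "api_proxy",
      "endpoint": "http://localhost:11434/v1"
    }
  }
}
```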
Pick the guide for your platform below. Each is standalone — you only need the guides relevant to your setup.
## IDE / Agent Platforms
- Claude Code — `local_cli` via `claude -p`
- OpenAI Codex CLI — `local_cli` via `codex`
- Cursor — `local_cli`
- VS Code — Extension + `local_cli`
- Windsurf (Codeium) — `local_cli`
- Google Jules — `api_proxy` (Google)
- Devin — `remote_agent` (HTTP)
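To see the `local_cli` transport in miniature: the adapter launches the tool as a subprocess, passes the prompt on stdin or argv, and captures stdout as the result. A minimal sketch, using `cat` as a stand-in so it runs without any agent installed (the function name is illustrative, not part of AgentXchain's API):

```python
import subprocess

def run_local_cli(command: list[str], prompt: str) -> str:
    """Send `prompt` to a CLI tool over stdin and return its stdout.

    `command` is the tool's argv, e.g. ["claude", "-p"] for Claude Code.
    ["cat"] is used below as a stand-in so the sketch runs anywhere.
    """
    result = subprocess.run(
        command,
        input=prompt,          # prompt delivered on stdin
        capture_output=True,   # collect stdout as the agent's result
        text=True,
        check=True,            # raise if the tool exits non-zero
    )
    return result.stdout

print(run_local_cli(["cat"], "Summarize the open PRs"))
```

Swapping `["cat"]` for your tool's real argv is the whole integration surface on the local side.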
## Local Model Runners
- Ollama — `api_proxy` via `localhost:11434/v1`
- MLX (Apple Silicon) — `api_proxy` via `mlx-lm.server`
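Both runners expose an OpenAI-compatible chat endpoint, which is what the `api_proxy` adapter targets. A minimal sketch of the request shape; the helper names and model name are illustrative, and the actual POST is commented out since it needs a running server:

```python
import json
import urllib.request

def build_chat_request(model: str, prompt: str) -> dict:
    """Assemble an OpenAI-style chat completion payload."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }

def send_chat(base_url: str, payload: dict) -> dict:
    """POST the payload to an OpenAI-compatible endpoint, e.g. Ollama."""
    req = urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

payload = build_chat_request("llama3", "Say hello in one word.")
# send_chat("http://localhost:11434/v1", payload)  # requires Ollama running locally
print(json.dumps(payload, indent=2))
```

Pointing `base_url` at `mlx-lm.server`'s address instead works the same way, since both speak the same wire format.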
## API Providers
- Anthropic — Claude Opus 4.6, Sonnet 4.6, Haiku 4.5
- OpenAI — GPT-5.4, GPT-5.3-Codex, GPT-OSS
- Google — Gemini 3.1 Pro, Flash, Gemma 4
- DeepSeek — V3.2, R2, Coder-V3
- Mistral AI — Devstral 2, Codestral, Leanstral
- xAI — Grok 4.20 Beta 2, Grok Code Fast 1
- Amazon Bedrock — Nova 2 Pro, Nova 2 Lite, Nova Premier
- Qwen (Alibaba) — Qwen3-Coder-480B, Qwen3.6-Plus
- Groq — Ultra-fast inference for GPT-OSS, Qwen3, Llama 4
- Cohere — Command A Reasoning, Command R+
## Protocol Native
- MCP (Model Context Protocol) — Any MCP-compatible agent via stdio or HTTP