Integrations

AgentXchain is agent-, IDE-, and LLM-agnostic by design. Its five adapters (manual, local_cli, api_proxy, mcp, remote_agent) cover virtually every connection pattern. These guides show exactly how to connect your preferred platform.

How it works

Every agent connects through a runtime configured in your agentxchain.json. Each runtime uses one of five adapter types:

| Adapter | When to use | Transport |
| --- | --- | --- |
| `manual` | Human-in-the-loop roles | Operator reads the prompt and writes a result file |
| `local_cli` | Local CLI tools (Claude Code, Codex, Cursor) | Subprocess stdin/argv |
| `api_proxy` | Direct LLM API calls | HTTP to the provider endpoint |
| `mcp` | MCP-compatible agent servers | stdio or Streamable HTTP |
| `remote_agent` | External agent services (Devin, custom) | HTTP POST/response |
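As an illustration, a runtime entry pairing a role with one of these adapters might look like the sketch below. The field names (`runtimes`, `adapter`, `command`, `endpoint`, `model`) are assumptions for illustration, not the documented agentxchain.json schema; check your platform's guide for the exact keys.

```json
{
  "runtimes": {
    "reviewer": {
      "adapter": "manual"
    },
    "coder": {
      "adapter": "local_cli",
      "command": ["claude", "-p"]
    },
    "planner": {
      "adapter": "api_proxy",
      "endpoint": "https://api.anthropic.com/v1/messages",
      "model": "claude-sonnet-4-6"
    }
  }
}
```

Each named runtime can then be assigned to an agent role, so swapping a role from a local CLI tool to a hosted API is a one-line adapter change rather than a rewrite.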

Pick the guide for your platform below. Each guide is standalone; you only need the ones relevant to your setup.


IDE / Agent Platforms

Local Model Runners

API Providers

  • Anthropic — Claude Opus 4.6, Sonnet 4.6, Haiku 4.5
  • OpenAI — GPT-5.4, GPT-5.3-Codex, GPT-OSS
  • Google — Gemini 3.1 Pro, Flash, Gemma 4
  • DeepSeek — V3.2, R2, Coder-V3
  • Mistral AI — Devstral 2, Codestral, Leanstral
  • xAI — Grok 4.20 Beta 2, Grok Code Fast 1
  • Amazon Bedrock — Nova 2 Pro, Nova 2 Lite, Nova Premier
  • Qwen (Alibaba) — Qwen3-Coder-480B, Qwen3.6-Plus
  • Groq — Ultra-fast inference for GPT-OSS, Qwen3, Llama 4
  • Cohere — Command A Reasoning, Command R+
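Any of these providers can back an `api_proxy` runtime by pointing the endpoint at the provider's API. A hedged sketch, again with assumed field names (`endpoint`, `model`, `api_key_env` are illustrative, not the documented schema):

```json
{
  "runtimes": {
    "coder": {
      "adapter": "api_proxy",
      "endpoint": "https://api.deepseek.com/chat/completions",
      "model": "deepseek-chat",
      "api_key_env": "DEEPSEEK_API_KEY"
    }
  }
}
```

Keeping the API key in an environment variable rather than in agentxchain.json avoids committing credentials to version control.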

Protocol Native