██╗  ██╗ ██████╗██╗      █████╗ ██╗   ██╗██████╗ ███████╗
╚██╗██╔╝██╔════╝██║     ██╔══██╗██║   ██║██╔══██╗██╔════╝
 ╚███╔╝ ██║     ██║     ███████║██║   ██║██║  ██║█████╗
 ██╔██╗ ██║     ██║     ██╔══██║██║   ██║██║  ██║██╔══╝
██╔╝ ██╗╚██████╗███████╗██║  ██║╚██████╔╝██████╔╝███████╗
╚═╝  ╚═╝ ╚═════╝╚══════╝╚═╝  ╚═╝ ╚═════╝ ╚═════╝ ╚══════╝
Claude Code — adapted for every model.
OpenAI, Gemini, DeepSeek, Groq, Ollama and 200+ other providers.
Same powerful workflow. Your API key. Your choice.
Works with OpenAI · Gemini · DeepSeek · Groq · Ollama · GitHub Models · Azure · AWS Bedrock · Vertex AI
Choose your provider and start coding. No account needed for local models.
Cloud or local. Fast or powerful. Cheap or free. Pick what fits.
No rewrites. No compromises. Full Claude Code toolkit — minus the lock-in.
Bash, file edit, glob, grep, web fetch, web search, agent spawning, MCP — all tools work with every provider.
Route different agents to different models. Use a fast, cheap model for exploration and a powerful one for coding.
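The routing idea above can be sketched as a simple role-to-model map. Note this is an illustrative sketch only: the role names, providers, and model IDs below are assumptions for the example, not Xclaude's actual configuration schema.

```typescript
// Hypothetical per-agent routing table: cheap model for exploration,
// strong model for code generation.
type ModelRoute = { provider: string; model: string };

const routes: Record<string, ModelRoute> = {
  explore: { provider: "groq", model: "llama-3.1-8b-instant" },
  code: { provider: "openai", model: "gpt-4o" },
};

function routeFor(agent: string): ModelRoute {
  // Agents without an explicit route fall back to the coding model.
  return routes[agent] ?? routes["code"];
}

console.log(routeFor("explore").provider); // groq
```

The key design point is that routing is a lookup at spawn time, so swapping an agent's model is a one-line config change rather than a code change.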
200+ built-in commands. /provider, /memory, /commit, /review and more — all work out of the box.
Connect any Model Context Protocol server. Databases, APIs, file systems — extend Xclaude's reach without changing providers.
All Anthropic analytics, tracking, GrowthBook flags and auto-updater calls are stripped at build time. Your data stays local.
Automatic retry with exponential backoff on 429/500/503. Respects Retry-After headers for all providers.
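The retry behavior described above amounts to a standard pattern; here is a minimal sketch of it. The function names are illustrative, not Xclaude's internal API, and the delay constants (1s base, 60s cap, 5 attempts) are assumed defaults.

```typescript
// Status codes considered transient and worth retrying.
const RETRYABLE = new Set([429, 500, 503]);

// Delay before the next attempt: honor the server's Retry-After hint
// when present, otherwise double from 1s per attempt, capped at 60s.
function backoffMs(attempt: number, retryAfterSec?: number): number {
  if (retryAfterSec !== undefined) return retryAfterSec * 1000;
  return Math.min(1000 * 2 ** attempt, 60_000);
}

async function withRetry<T>(
  call: () => Promise<T>,
  maxAttempts = 5
): Promise<T> {
  for (let attempt = 0; ; attempt++) {
    try {
      return await call();
    } catch (err: any) {
      // Give up on non-retryable errors or once attempts are exhausted.
      if (attempt + 1 >= maxAttempts || !RETRYABLE.has(err?.status)) throw err;
      await new Promise((r) =>
        setTimeout(r, backoffMs(attempt, err?.retryAfterSec))
      );
    }
  }
}
```

Preferring the `Retry-After` value over the computed backoff matters for rate-limited providers: the server's hint is authoritative, and guessing shorter only burns more quota.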
Persistent session state, task tracking, and auto-memory — works the same regardless of which LLM you're using.
Vim keybindings, VS Code extension support, and a rich terminal UI powered by React + Ink.