Autopilot AI orchestration — describe your intent, and Orchex auto-generates a plan, parallelizes it into streams with file-ownership enforcement, self-heals failures, and routes across six LLM providers (Claude, OpenAI, Gemini, DeepSeek, Ollama, AWS Bedrock).
Overview
Orchex is an MCP server that orchestrates multi-LLM workflows. Describe what you want — Orchex plans, parallelizes, and executes safely.
Features:
- Auto-plan — describe your goal, get a structured execution plan with parallel streams
- 6 LLM providers — Claude, OpenAI, Gemini, DeepSeek, Ollama, AWS Bedrock
- Self-healing — automatic error categorization and fix stream generation
- Self-improving — learns from execution reports to improve future plans
- Ownership enforcement — each stream owns its files, preventing conflicts
- 12 MCP tools — init, add_stream, status, execute, complete, recover, learn, init-plan, auto, reset-learning, rollback-stream, reload
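The ownership-enforcement idea above can be sketched in a few lines. This is a hypothetical illustration, not Orchex's actual implementation: each stream declares the files it owns, and a plan is rejected when two streams claim the same file, so parallel streams can never write-conflict. The `Stream` type and `checkOwnership` function are names invented for this sketch.

```typescript
// Hypothetical sketch of file-ownership enforcement (not Orchex's real code).
// Each stream declares the files it owns; if two streams claim the same file,
// the plan is invalid and the conflicts are reported.

type Stream = { id: string; ownedFiles: string[] };

function checkOwnership(streams: Stream[]): string[] {
  const owner = new Map<string, string>(); // file path -> owning stream id
  const conflicts: string[] = [];
  for (const s of streams) {
    for (const f of s.ownedFiles) {
      const prev = owner.get(f);
      if (prev !== undefined && prev !== s.id) {
        conflicts.push(`${f} claimed by both ${prev} and ${s.id}`);
      } else {
        owner.set(f, s.id);
      }
    }
  }
  return conflicts; // empty array means the plan is conflict-free
}
```

For example, two streams that both claim `src/x.ts` would produce one conflict entry, while disjoint file sets would produce none.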
Server Config
{
  "mcpServers": {
    "orchex": {
      "command": "npx",
      "args": [
        "-y",
        "@wundam/orchex"
      ]
    }
  }
}
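With this config, the MCP client spawns the server over stdio by running the command encoded in `command` and `args`. You can run the same command manually to verify the package resolves (assumes Node.js and `npx` are installed; the server runs until the client disconnects):

```shell
# What the MCP client runs under the hood (stdio transport)
npx -y @wundam/orchex
```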