Copy the install, test the workflow, then decide if it earns a permanent slot.
The signal is softer here. Treat it like a pattern source unless it solves a very specific gap.
You can test this quickly and remove it cleanly if it misses.
GitHub health unknown: no security policy, 0 open issues. That makes it easy to test, but not something to trust blindly.
AI Agent: Universal
Model: Claude
Build Time: Instant
Fastest way to find out if dario belongs in your setup.
Copy the install command, run a real test, and back it out cleanly if it slows you down.
claude mcp add dario -- npx dario

Run this first. You will know quickly if the workflow earns a permanent slot.
claude mcp remove dario

No messy cleanup loop. If it misses, remove it and keep moving.
Install Location
~/
└─ .claude.json
   └─ mcp_servers/
      └─ dario   ← registers here
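The registration above can be sketched as a JSON entry like the one `claude mcp add` typically writes. This is an assumption based on common Claude Code configs — the exact key names (`mcpServers`) and shape in your `~/.claude.json` may differ, so verify against your own file:

```json
{
  "mcpServers": {
    "dario": {
      "command": "npx",
      "args": ["dario"]
    }
  }
}
```

Removing the server deletes this entry, which is why the cleanup is a one-liner.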
Local LLM router. One endpoint for Claude Max/Pro, OpenAI, OpenRouter, Groq, Ollama, LiteLLM, any OpenAI-compat URL — your tools don't need to change. OAuth for Claude subscriptions, multi-account pool, MCP server. Zero runtime deps.
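Because the router speaks the OpenAI wire format, any OpenAI-compatible client can target it by switching only the base URL, regardless of which backend (Claude, OpenAI, Groq, Ollama, ...) serves the model. A minimal stdlib sketch; the local host, port, and model name here are assumptions, not dario's documented defaults:

```python
import json
import urllib.request

# Assumed local endpoint -- the real host/port depend on your dario config.
BASE_URL = "http://localhost:8000/v1"

def build_chat_request(model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-compatible /v1/chat/completions request.

    The payload shape is identical for every backend; only the
    model string (and the router's own routing rules) change.
    """
    body = json.dumps({
        "model": model,  # hypothetical model name for illustration
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request("llama3", "ping")
print(req.full_url)
```

Swapping providers means changing the model string or the router's config — the calling code, and every tool built on it, stays untouched.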