traylinx/switchAILocal/switchailocal
Unified LLM proxy for AI agents. Routes all model requests through http://localhost:18080/v1 and provides free access to Gemini CLI, Claude CLI, Codex, and Vibe via your existing subscriptions. Use when: (1) making LLM calls using provider prefixes, (2) switching between CLI, local, and cloud providers, (3) attaching local files or folders to prompts via the CLI, (4) routing requests intelligently between models, or (5) monitoring provider health and analytics.
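Since the proxy exposes an OpenAI-compatible base URL, a request can be built like any other `/v1/chat/completions` call. A minimal sketch follows; the `provider/model` prefix form and the specific model name are assumptions for illustration, not confirmed by this listing:

```python
import json

# Base URL from the listing; the proxy must be running locally for real calls.
BASE_URL = "http://localhost:18080/v1"

def build_chat_request(model: str, prompt: str) -> dict:
    """Build an OpenAI-style chat completion payload.

    The "gemini/..." provider prefix below is a hypothetical example of the
    provider-prefix convention mentioned in the description.
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

payload = build_chat_request("gemini/gemini-pro", "Summarize this repository.")
url = f"{BASE_URL}/chat/completions"

# To actually send the request (requires the proxy to be up):
#   import urllib.request
#   req = urllib.request.Request(
#       url,
#       data=json.dumps(payload).encode(),
#       headers={"Content-Type": "application/json"},
#   )
#   print(urllib.request.urlopen(req).read().decode())
print(url)
print(json.dumps(payload))
```

The same payload shape should work for any provider the proxy fronts; only the `model` prefix changes when switching between CLI, local, and cloud backends.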
Popularity: 0 · Stars: 0 · Forks: 0 · Updated: Feb 5, 2026