## Overview
The LiteLLM MCP Server exposes LiteLLM's unified proxy for calling multiple LLM providers over the Model Context Protocol. It normalizes the completion API across OpenAI, Anthropic, Cohere, and many others, with automatic fallback and load balancing.
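To illustrate the normalization the server builds on, here is a minimal sketch of a direct LiteLLM call. The model names are examples, and it assumes the `litellm` package is installed and provider API keys are set in the environment.

```python
# A minimal sketch of the provider-agnostic call that LiteLLM normalizes.
# Assumes the `litellm` package is installed and that OPENAI_API_KEY /
# ANTHROPIC_API_KEY are set in the environment; model names are examples.
from litellm import completion

messages = [{"role": "user", "content": "Summarize MCP in one sentence."}]

# The same call shape works for every provider; only the model string changes.
openai_resp = completion(model="gpt-4o-mini", messages=messages)
claude_resp = completion(model="claude-3-5-sonnet-20240620", messages=messages)

# Both responses follow the OpenAI chat-completion schema.
print(openai_resp.choices[0].message.content)
print(claude_resp.choices[0].message.content)
```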
## Tools & Capabilities
- **`completion`**: Generate text via any supported provider.
- **`list_models`**: List the configured models.
- **`get_spend`**: Retrieve API spend tracking data.
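For illustration, calling the server's `completion` tool from the official MCP Python SDK might look like the sketch below. The tool's argument names (`model`, `messages`) are assumptions, not taken from the server's documentation.

```python
# Hypothetical client-side call to the `completion` tool via the MCP
# Python SDK; the argument names passed to the tool are assumptions.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    server = StdioServerParameters(command="python", args=["-m", "mcp_server_litellm"])
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Discover the tools the server advertises.
            tools = await session.list_tools()
            print([t.name for t in tools.tools])

            # Argument names ("model", "messages") are illustrative.
            result = await session.call_tool(
                "completion",
                arguments={
                    "model": "gpt-4o-mini",
                    "messages": [{"role": "user", "content": "Hello!"}],
                },
            )
            print(result.content)


asyncio.run(main())
```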
## Installation
```bash
pip install mcp-server-litellm
```
## Example Usage

```json
{
  "mcpServers": {
    "litellm": {
      "command": "python",
      "args": ["-m", "mcp_server_litellm"]
    }
  }
}
```
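Adding this entry to an MCP client's configuration (for example, Claude Desktop's `claude_desktop_config.json`) tells the client to launch the server over stdio with the given command.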
## Quick Info

- **Author:** berriai
- **Language:** Python
- **Status:** Stable
- **Stars:** ★ 120
- **Last Updated:** Feb 12, 2026