🤖
LiteLLM
⭐ Featured · by BerriAI
About
Universal LLM proxy and load balancer. Route requests across 100+ LLM providers (OpenAI, Anthropic, Gemini, Mistral) with a unified API and cost tracking.
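Cost tracking works by pricing each request from per-token rates. A minimal, self-contained sketch of the idea (the price table and helper below are hypothetical illustrations, not LiteLLM's actual API or real prices — the library maintains its own pricing data internally):

```python
# Illustrative per-request cost tracking, in the spirit of what LiteLLM
# provides. Prices are hypothetical, per 1K tokens.
PRICES_PER_1K = {
    "gpt-4o-mini": {"input": 0.00015, "output": 0.0006},
    "claude-3-haiku": {"input": 0.00025, "output": 0.00125},
}

def request_cost(model: str, input_tokens: int, output_tokens: int) -> float:
    """Return the dollar cost of one request under the table above."""
    p = PRICES_PER_1K[model]
    return (input_tokens / 1000) * p["input"] + (output_tokens / 1000) * p["output"]

print(round(request_cost("gpt-4o-mini", 1000, 1000), 6))  # 0.00075
```

Summing these per-request figures over a session is what gives a running spend total per provider or model.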
Installation
pip
pip install litellm-mcp-server
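Routing across many providers through one API rests on a provider-prefixed model string (e.g. `anthropic/claude-3-haiku`). A rough, self-contained sketch of that convention (the helper is hypothetical — the real library resolves providers internally inside its `completion()` call):

```python
# Illustrative parser for LiteLLM-style "provider/model" strings.
# This is a sketch of the naming convention, not LiteLLM's actual code.
def split_model_string(model: str) -> tuple[str, str]:
    """Split a model string into (provider, model name).

    A bare model name with no slash falls back to "openai" here,
    loosely mirroring how unprefixed names are commonly treated.
    """
    provider, sep, name = model.partition("/")
    if not sep:
        return "openai", model
    return provider, name

print(split_model_string("anthropic/claude-3-haiku"))
```

With the real library installed, the same string is simply passed as the `model` argument of the unified completion call, and LiteLLM dispatches to the matching provider.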
Frequently Asked Questions
What is the LiteLLM MCP server?
The LiteLLM MCP server is a universal LLM proxy and load balancer. It routes requests across 100+ LLM providers (OpenAI, Anthropic, Gemini, Mistral) through a unified API, with built-in cost tracking.
How do I install LiteLLM?
Install via pip:
pip install litellm-mcp-server
What AI clients work with LiteLLM?
Quick Info
- Install Type: pip
- Author: BerriAI
- Categories: 1
- Integrations: 5
Related Servers
🧠
Memory
Knowledge graph-based persistent memory system. Store and retrieve contextual information.
🤖
Sequential Thinking
Dynamic and reflective problem-solving through thought sequences.
🔍
Exa
Search Engine made for AIs. Neural search with understanding of content meaning.
🗄️
Milvus
Search, Query and interact with data in your Milvus Vector Database.
🗄️
Chroma
Embeddings, vector search, document storage, and full-text search with the open-source AI application database.