

Open-source proxy and SDK that provides a single unified, OpenAI-compatible API for calling and managing 100+ LLM providers and their models. Simplifies multi-provider LLM integration.
LiteLLM is an open-source project that provides a unified interface for working with over 100 LLM providers through a single OpenAI-compatible API. It eliminates the complexity of managing multiple LLM provider SDKs.
Supports major providers including OpenAI, Anthropic, Google (PaLM, Gemini), Cohere, Azure OpenAI, AWS Bedrock, Hugging Face, and many more.
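The unified-API idea can be sketched as follows: the same `completion()` call works for any supported provider, and only the model string changes (provider prefixes such as `anthropic/` select the backend). The `make_messages` helper and the model names here are illustrative, and the network call runs only if an API key is present in the environment.

```python
import os

# Illustrative helper: builds the OpenAI-format message list LiteLLM expects.
def make_messages(prompt: str) -> list[dict]:
    return [{"role": "user", "content": prompt}]

messages = make_messages("Summarize LiteLLM in one sentence.")

# One call signature for every provider; swap the model string to switch
# backends. Guarded so the example is a no-op without credentials.
if os.environ.get("OPENAI_API_KEY"):
    from litellm import completion
    for model in ["gpt-4o-mini", "anthropic/claude-3-5-sonnet-20240620"]:
        response = completion(model=model, messages=messages)
        print(response.choices[0].message.content)
```

Because responses come back in OpenAI's response shape regardless of provider, downstream code (parsing `response.choices`, token usage, etc.) does not need per-provider branches.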
Integrates with Langfuse for observability and works alongside vector databases in RAG architectures. Popular with developers building production LLM applications.
Free and open-source. The proxy server can be self-hosted or used through LiteLLM's managed service.
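For the self-hosted path, the proxy is typically driven by a YAML config mapping public model names to provider credentials. A minimal sketch, assuming environment variables hold the keys (model names and file name are illustrative):

```yaml
# config.yaml — routes two public model names to their providers
model_list:
  - model_name: gpt-4o
    litellm_params:
      model: openai/gpt-4o
      api_key: os.environ/OPENAI_API_KEY
  - model_name: claude-sonnet
    litellm_params:
      model: anthropic/claude-3-5-sonnet-20240620
      api_key: os.environ/ANTHROPIC_API_KEY
```

The proxy is then started with `litellm --config config.yaml`, after which clients hit it as if it were an OpenAI endpoint.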