beam
Discover: Pulse · Activity · Analytics · Best for · Map · Orgs
Niches: Agents · MCP · RAG · Coding Assistants · Inference & Serving · Vector DBs
Personal: Watchlist · Compare
Tool profile

BerriAI/litellm

stable

Python SDK and Proxy Server (AI Gateway) to call 100+ LLM APIs in the OpenAI (or native) format, with cost tracking, guardrails, load balancing, and logging. [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, VLLM, NVIDIA NIM]

ai-gateway · anthropic · azure-openai · bedrock · gateway · langchain · litellm · llm
Velocity score: 3.35 / 10
[STARS] 46k
[FORKS] 7.9k
[CONTRIBUTORS] 1.5k
[LAST_COMMIT] today
Velocity class: stable
[Chart: 30-day stars · 3.35 / 10 score · last 90d]
Score breakdown: 789 / 1000 (observability · BerriAI/litellm)
Weights: Velocity 50% · Adoption 30% · Maintenance 15% · Community 5%
[CODE_GROWTH] 977
[INSTALL_VEL] 500
[ACTIVITY] 669
[COMMUNITY_SIGNAL] 1000

Terminal score: 0–1000 raw, weighted across 4 dimensions. Public score: 0–10 normalized (shown in the 30-day stars chart above).
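The 789 / 1000 terminal score can be reproduced from the four component scores and dimension weights shown above. A minimal sketch in Python — note the dimension-to-component mapping (Velocity → CODE_GROWTH, Adoption → INSTALL_VEL, Maintenance → ACTIVITY, Community → COMMUNITY_SIGNAL) is an assumption based only on the ordering on this page:

```python
# Weighted terminal score (0-1000), assuming each component maps in
# order to the dimension weights listed in the score breakdown:
#   Velocity 50%, Adoption 30%, Maintenance 15%, Community 5%
components = {
    "CODE_GROWTH": (977, 0.50),
    "INSTALL_VEL": (500, 0.30),
    "ACTIVITY": (669, 0.15),
    "COMMUNITY_SIGNAL": (1000, 0.05),
}

terminal_score = sum(score * weight for score, weight in components.values())
print(round(terminal_score))  # -> 789, matching the breakdown above
```

The raw weighted sum is 788.85, which rounds to the displayed 789. How the 0–1000 terminal score is normalized down to the public 3.35 / 10 is not stated on the page, so no formula for that step is assumed here.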
