beam
Discover: Pulse · Activity · Analytics · Best for · Map · Orgs
Niches: Agents · MCP · RAG · Coding Assistants · Inference & Serving · Vector DBs
Personal: Watchlist · Compare
Hi, Adam
LIVE ──────── · ──:──:── UTC · about
Tool profile

vllm-project/vllm

stable

A high-throughput and memory-efficient inference and serving engine for LLMs

Tags: amd · blackwell · cuda · deepseek · deepseek-v3 · gpt · gpt-oss · inference
Velocity score: 3.93 / 10
Stars: 80k
Forks: 17k
Contributors: 2.6k
Last commit: today
Velocity class: stable
[30-day stars chart · signal trace, 89 points · score 3.93 / 10 · last 89 days]
Score breakdown: 955 / 1000
inference · vllm-project/vllm
Weights: Velocity 50% · Adoption 30% · Maintenance 15% · Community 5%
Code growth: 998
Install velocity: 998
Activity: 709
Community signal: 1000

Terminal score: the raw 0–1000 value, a weighted sum across the four dimensions above. Public score: a normalized 0–10 value (shown in the 30-day stars chart above).
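The weighted sum above can be sketched in a few lines. This is a minimal, unofficial sketch: it assumes the four raw subscores (code growth, install velocity, activity, community signal) map in order onto the four displayed weights (Velocity, Adoption, Maintenance, Community) — the page does not confirm that mapping, only that the displayed numbers reproduce the 955 total.

```python
# Hedged sketch of the terminal-score computation.
# Assumption (not confirmed by the page): subscores map onto the
# displayed weights in the order shown in the breakdown.
WEIGHTS = {
    "velocity": 0.50,     # assumed = code growth
    "adoption": 0.30,     # assumed = install velocity
    "maintenance": 0.15,  # assumed = activity
    "community": 0.05,    # assumed = community signal
}

subscores = {
    "velocity": 998,
    "adoption": 998,
    "maintenance": 709,
    "community": 1000,
}

def terminal_score(subscores: dict) -> int:
    """Weighted sum of 0-1000 subscores, rounded to the displayed integer."""
    return round(sum(WEIGHTS[k] * subscores[k] for k in WEIGHTS))

print(terminal_score(subscores))  # → 955, matching the displayed breakdown
```

Note that the public score is not a simple linear rescale of this total (955/1000 would give 9.55, not the displayed 3.93), so the 0–10 normalization evidently uses a different formula that the page does not expose.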
