🐼 llm-serving

👇 1 project

vllm

55.1k ★ · Python · Apache-2.0

A high-throughput and memory-efficient inference and serving engine for LLMs

Created 3 years ago · updated 6 months ago