jp6/cu126/: vllm versions
Because this project isn't in the mirror_whitelist, no releases from root/pypi are included.
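A minimal sketch, assuming this index is served by devpi: if upstream PyPI releases of vllm were also wanted on this index, the project could be added to the index's mirror_whitelist. The login credentials and index path below are illustrative, not taken from this page.

```shell
# Sketch (devpi assumed): whitelist vllm so releases from the root/pypi
# mirror are merged into this index's listing.
devpi login admin --password=ADMIN_PASSWORD   # hypothetical credentials
devpi use jp6/cu126                           # select the target index
devpi index jp6/cu126 mirror_whitelist+=vllm  # append vllm to the whitelist
```

With the whitelist entry in place, upstream releases from root/pypi would appear alongside the staged build listed below.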
Latest version on stage is: 0.6.4.dev0+gfd47e57f.d20241015.cu126
A high-throughput and memory-efficient inference and serving engine for LLMs