jp6/cu126/: flash-attn versions
Because this project isn't in the mirror_whitelist, no releases from root/pypi are included.
Latest version on stage is: 2.6.3
Flash Attention: Fast and Memory-Efficient Exact Attention
Index | Version | Documentation |
---|---|---|
jp6/cu126 | 2.6.3 | |
jp6/cu126 | 2.5.7 | |
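Since releases from root/pypi are excluded, pip must be pointed directly at this index to resolve flash-attn. A minimal sketch, assuming a devpi-style `+simple/` layout; the host below is a placeholder, not the real server URL:

```shell
# Install flash-attn from the jp6/cu126 index only.
# mirror.example.com is a placeholder -- substitute the actual server host.
pip install flash-attn==2.6.3 --index-url https://mirror.example.com/jp6/cu126/+simple/
```

Pinning the version (`==2.6.3`) avoids surprises if the index later stages a different release.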