Fast linear attention models and layers
No known vulnerabilities in the latest version