Provider-agnostic framework for high-throughput LLM processing with async workers, automatic retries, rate limiting, and intelligent validation recovery.
No known vulnerabilities in the latest version
Based on latest version 0.7.1. If you're running an older version, check OSV for your specific version.