Awesome-LLM-Inference v0.6
## What's Changed
- Add an ICLR paper on KV cache compression by @Janghyun1230 in #8
- Add GitHub link for the paper FP8-Quantization [2208.09225] by @Mr-Philo in #9
## New Contributors
- @Janghyun1230 made their first contribution in #8
- @Mr-Philo made their first contribution in #9
**Full Changelog**: v0.5...v0.6