New Mixture-of-Experts Plugin

@fabianlim released this 02 Jan 14:43

Release

  • framework v0.5.0: minor version. Updated to manage the new moe plugin.
  • moe v0.1.0: new plugin. Released the mixture-of-experts plugin with ScatterMoE kernels (see the sketch after this list).
  • peft v0.3.5: patch version. Fixed AutoCast deprecation warnings (#113).
  • foak v0.4.0: minor version. Added support for the Liger fused cross-entropy kernel (#93) and fixed the fused dropout and activation ops.
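
The moe plugin is wired in the same way as the existing peft and foak plugins: it is enabled through the acceleration framework configuration and applied when the framework loads and augments the model. The sketch below assumes the framework's documented AccelerationFramework loader flow; the configuration file name, model checkpoint, and ScatterMoE config keys are placeholders and should be taken from the plugin's sample configurations.

```python
import torch
from fms_acceleration import AccelerationFramework

# Hypothetical framework config that enables the moe plugin's ScatterMoE
# kernels; real sample configs ship alongside the accelerated-moe plugin.
framework = AccelerationFramework("sample-configurations/moe-scattermoe.yaml")

# The framework's documented flow hands model loading and augmentation to the
# active plugins so they can patch their kernels in before training. If the
# enabled plugins do not require custom loading, load the checkpoint directly
# with transformers and pass it to framework.augmentation instead.
model = framework.model_loader(
    "mistralai/Mixtral-8x7B-Instruct-v0.1",  # illustrative MoE checkpoint
    torch_dtype=torch.bfloat16,
)
```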

What's Changed

  • Fix Dropout in Fused LoRA Operations by @fabianlim in #102
  • Add ExpertParallel Mixture-of-Experts Plugin by @fabianlim in #99
  • Disable MLP Fused Ops if Not SwiGLU, Deprecate Fast Quantized Peft Plugin, Update Benchmarks by @fabianlim in #106
  • fix: requirements file path in error by @willmj in #111
  • fix: Deprecation Warnings in AutoCast API by @Abhishek-TAMU in #113
  • feat: add liger kernel with fused cross entropy loss by @anhuong in #93
  • feat: Checkpoint utils safetensors by @willmj in #116

Full Changelog: v0.4.0.4...v0.4.0.5