Update README.md

This commit is contained in:
chywang
2025-07-28 14:11:51 +08:00
committed by GitHub
parent 0ed2afac1a
commit 2f21aaae17


@@ -11,6 +11,7 @@ Introducing **EasyDistill**, a pioneering toolkit on knowledge distillation (KD)
# News
- July 28th: We have released the functionality for knowledge distillation from MLLMs (aka MMKD). Refer to [Here](./easydistill/mmkd). Evaluations of the quality of instruction-following and CoT datasets have also been updated. Refer to [Here](./easydistill/eval).
- June 25th: We have released a new series of DistilQwen models named DistilQwen-ThoughtY, together with OmniThought-0528 (CoTs distilled from DeepSeek-R1-0528).