From 2f21aaae1779009aeca613b55481e7ac70b66864 Mon Sep 17 00:00:00 2001
From: chywang
Date: Mon, 28 Jul 2025 14:11:51 +0800
Subject: [PATCH] Update README.md

---
 README.md | 1 +
 1 file changed, 1 insertion(+)

diff --git a/README.md b/README.md
index 6432def..08acfa0 100644
--- a/README.md
+++ b/README.md
@@ -11,6 +11,7 @@ Introducing **EasyDistill**, a pioneering toolkit on knowledge distillation (KD)
 
 # News
 
+- July 28th: We have released the functionality for knowledge distillation from MLLMs (a.k.a. MMKD). Refer to [here](./easydistill/mmkd). Evaluation of the quality of instruction-following and CoT datasets has also been updated. Refer to [here](./easydistill/eval).
 - June 25th: We have released a new series of DistilQwen models named DistilQwen-ThoughtY, together with OmniThought-0528 (CoTs distilled from DeepSeek-R1-0528).