docs: Add chinese readme

Author: KMnO4-zx
Date: 2025-05-28 21:16:40 +08:00
Parent: 25a74dd70b
Commit: 3d6844098b
2 changed files with 4 additions and 0 deletions


@@ -1,7 +1,9 @@
# EasyDistill: Easy Knowledge Distillation for Large Language Models
<div align="center">
[中文](./README_zh.md) | [English](./README.md)
</div>
Introducing **EasyDistill**, a pioneering toolkit for knowledge distillation (KD) of large language models (LLMs). With the growing complexity and size of LLMs, **EasyDistill** offers a versatile and user-friendly platform to streamline the KD process, supporting both black-box and white-box methodologies. It facilitates efficient model training, enabling smaller models to emulate the performance of larger ones without compromising accuracy. **EasyDistill** boasts an extensive range of features, including data synthesis, supervised fine-tuning, ranking optimization, and reinforcement learning, all tailored for various KD scenarios. Designed to accommodate both System 1 (fast, intuitive) and System 2 (slow, analytical) cognitive models, the toolkit is modular and easy to use, with a simple command-line interface guiding users. Beyond academic exploration, **EasyDistill** underpins practical industrial solutions, offering robust distilled models and open-source datasets, while also showcasing seamless integration with Alibaba Cloud's AI platform, PAI. Committed to bridging theoretical advancements with practical needs, **EasyDistill** empowers the NLP community, making state-of-the-art KD strategies accessible to researchers and industry practitioners alike.


@@ -1,7 +1,9 @@
# EasyDistill: Easy Knowledge Distillation for Large Language Models
<div align="center">
[中文](./README_zh.md) | [English](./README.md)
</div>
**EasyDistill** 是一个专为大语言模型（LLMs）知识蒸馏（KD）而设计的开创性工具包。随着大语言模型复杂性和规模的不断增长，**EasyDistill** 提供了一个多功能且用户友好的平台来简化知识蒸馏过程，支持黑盒和白盒两种方法。它促进高效的模型训练，使较小的模型能够在不损失准确性的情况下模拟较大模型的性能。**EasyDistill** 拥有广泛的功能特性，包括数据合成、监督微调、排序优化和强化学习，所有这些都针对各种知识蒸馏场景进行了定制。该工具包设计用于适应系统1（快速、直觉）和系统2（缓慢、分析）认知模型，具有模块化和易于使用的特点，配备简单的命令行界面来指导用户。除了学术探索之外，**EasyDistill** 还锚定实际的工业解决方案，提供强大的蒸馏模型和开源数据集，同时展示与阿里云AI平台PAI的无缝集成。致力于连接理论进步与实际需求，**EasyDistill** 赋能NLP社区，使最先进的知识蒸馏策略对研究人员和行业从业者都变得可及。
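
For readers new to the black-box/white-box distinction both READMEs mention: black-box KD trains the student only on the teacher's generated outputs (e.g., responses fetched through an API), while white-box KD additionally uses the teacher's internal signals such as output logits. The snippet below is a minimal, generic sketch of a white-box distillation loss in PyTorch; it is not EasyDistill's API, and every name in it (`distillation_loss`, the toy tensors, `temperature`, `alpha`) is an illustrative assumption.

```python
# Generic white-box KD loss sketch (illustration only; not EasyDistill code).
import torch
import torch.nn.functional as F

def distillation_loss(student_logits: torch.Tensor,
                      teacher_logits: torch.Tensor,
                      labels: torch.Tensor,
                      temperature: float = 2.0,
                      alpha: float = 0.5) -> torch.Tensor:
    """Blend a soft-target KL term (teacher -> student) with ordinary cross-entropy."""
    # Soften both distributions with a temperature, as in the classic
    # temperature-scaled distillation recipe.
    soft_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    log_student = F.log_softmax(student_logits / temperature, dim=-1)
    # kl_div expects log-probabilities as input and probabilities as target.
    kd_term = F.kl_div(log_student, soft_teacher, reduction="batchmean") * temperature ** 2
    ce_term = F.cross_entropy(student_logits, labels)
    return alpha * kd_term + (1.0 - alpha) * ce_term

# Toy usage: a batch of 4 "next-token" predictions over a 10-token vocabulary.
student_logits = torch.randn(4, 10, requires_grad=True)
teacher_logits = torch.randn(4, 10)
labels = torch.randint(0, 10, (4,))
loss = distillation_loss(student_logits, teacher_logits, labels)
loss.backward()
```

Scaling the KL term by `temperature ** 2` keeps its gradient magnitude comparable to the cross-entropy term as the temperature varies; a black-box pipeline would instead fine-tune the student directly on teacher-generated text, since logits are unavailable.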