docs: Add chinese readme
@@ -1,7 +1,9 @@
# EasyDistill: Easy Knowledge Distillation for Large Language Models
<div align="center">
[中文](./README_zh.md) | [English](./README.md)
</div>
Introducing **EasyDistill**, a pioneering toolkit for knowledge distillation (KD) of large language models (LLMs). As LLMs grow in size and complexity, **EasyDistill** offers a versatile and user-friendly platform that streamlines the KD process, supporting both black-box and white-box methodologies. It enables efficient model training, allowing smaller models to approach the performance of larger ones without compromising accuracy. **EasyDistill** provides an extensive range of features, including data synthesis, supervised fine-tuning, ranking optimization, and reinforcement learning, all tailored for various KD scenarios. Designed to accommodate both System 1 (fast, intuitive) and System 2 (slow, analytical) cognitive models, the toolkit is modular and easy to use, with a simple command-line interface guiding users. Beyond academic exploration, **EasyDistill** anchors practical industrial solutions, offering robust distilled models and open-source datasets, while also integrating seamlessly with Alibaba Cloud's AI platform, PAI. Committed to bridging theoretical advances and practical needs, **EasyDistill** empowers the NLP community, making state-of-the-art KD strategies accessible to researchers and industry practitioners alike.
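To make the black-box/white-box distinction concrete: white-box KD assumes access to the teacher's logits and typically minimizes a divergence between the softened teacher and student distributions. The sketch below is not EasyDistill's actual implementation (the toolkit's real APIs are driven by its command-line interface and configs); it is a minimal, dependency-free illustration of the classic temperature-scaled KL objective from Hinton-style distillation.

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a list of raw logits."""
    scaled = [x / temperature for x in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def kd_kl_loss(teacher_logits, student_logits, temperature=2.0):
    """Forward KL(teacher || student) on temperature-softened
    distributions, scaled by T^2 so gradients keep a comparable
    magnitude across temperatures (as in classic distillation)."""
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return kl * temperature ** 2

# A student that matches the teacher's logits incurs (near-)zero loss;
# a student with reversed preferences incurs a positive loss.
print(kd_kl_loss([2.0, 1.0, 0.1], [2.0, 1.0, 0.1]))      # ~0.0
print(kd_kl_loss([2.0, 1.0, 0.1], [0.1, 1.0, 2.0]) > 0)  # True
```

Black-box KD, by contrast, sees only the teacher's generated text, so it relies on the data-synthesis and supervised fine-tuning stages mentioned above rather than on a logit-level loss.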
@@ -1,7 +1,9 @@
# EasyDistill: Easy Knowledge Distillation for Large Language Models
<div align="center">
[中文](./README_zh.md) | [English](./README.md)
</div>
**EasyDistill** is a pioneering toolkit designed for knowledge distillation (KD) of large language models (LLMs). As LLMs grow in complexity and scale, **EasyDistill** provides a versatile and user-friendly platform that streamlines the KD process, supporting both black-box and white-box approaches. It enables efficient model training, allowing smaller models to emulate the performance of larger ones without compromising accuracy. **EasyDistill** offers an extensive set of features, including data synthesis, supervised fine-tuning, ranking optimization, and reinforcement learning, all tailored for various KD scenarios. Designed to accommodate both System 1 (fast, intuitive) and System 2 (slow, analytical) cognitive models, the toolkit is modular and easy to use, with a simple command-line interface to guide users. Beyond academic exploration, **EasyDistill** also anchors practical industrial solutions, providing robust distilled models and open-source datasets, and integrates seamlessly with Alibaba Cloud's AI platform, PAI. Committed to bridging theoretical advances and practical needs, **EasyDistill** empowers the NLP community, making state-of-the-art KD strategies accessible to researchers and industry practitioners alike.