Latest commit: 688ed894aa by i-robot — !2680 [1.1] update blip2 for openmind (4 weeks ago)
| Name | Last commit message | Updated |
| --- | --- | --- |
| `core` | Add a push_to_hub interface to Trainer | 1 month ago |
| `dataset` | Adapt Trainer to the parallel community, compatible with current usage | 2 months ago |
| `generation` | Switch the AutoModel and PretrianedModel base classes | 2 months ago |
| `inference` | !2240 [qwen] Support multi-card export and inference with mslite | 2 months ago |
| `models` | update blip2 for openmind | 1 month ago |
| `modules` | Support bf16 training and inference for wizardcoder | 1 month ago |
| `pet` | Switch the AutoModel and PretrianedModel base classes | 2 months ago |
| `pipeline` | !2563 Fix pipeline interface parameters for the parallel community | 1 month ago |
| `tools` | Fix duplicated rank in the strategy save path | 1 month ago |
| `trainer` | Fix issue where src_strategy was not merged | 1 month ago |
| `utils` | Convert mindspore weights to torch weights | 2 months ago |
| `wrapper` | fixed 2cf3fab from https://gitee.com/huanglei_Sorry/mindformers/pulls/2422 | 2 months ago |
| `__init__.py` | Switch the old auto classes to the new auto classes | 2 months ago |
| `auto_class.py` | Replace BaseProcessor with ProcessorMixin in ST | 2 months ago |
| `mindformer_book.py` | Adapt Trainer to the parallel community, compatible with current usage | 2 months ago |
| `version_control.py` | Fix a bug introduced by the slicing logic when batch size > 1 | 2 months ago |

The MindSpore Transformers suite aims to provide a full-pipeline development suite for large-model training, fine-tuning, evaluation, inference, and deployment. It offers the industry's mainstream Transformer pretrained models and SOTA downstream-task applications, and covers a rich set of parallelism features, with the goal of helping users easily achieve large-model training and innovative R&D.

Languages: Jupyter Notebook, Python, Markdown, Shell