nanluan
  • Joined on Oct 19, 2023

nanluan created new model bert-base-wikipedia-sections-mean-tokens

6 months ago

nanluan created repository nanluan/bert-base-wikipedia-sections-m...

6 months ago

nanluan created new model sentence-bert-base-italian-xxl-uncased

6 months ago

nanluan created repository nanluan/sentence-bert-base-italian-xxl...

6 months ago

nanluan created repository nanluan/sentence-bert-base-italian-unc...

6 months ago

nanluan created new model multilingual-e5-base

6 months ago

nanluan created repository nanluan/multilingual-e5-base

6 months ago

nanluan created new model gte-small

6 months ago

nanluan created repository nanluan/gte-small

6 months ago

nanluan created new model nomic-embed-text-v1.5

6 months ago

nanluan created repository nanluan/nomic-embed-text-v1.5

6 months ago

nanluan created new model paraphrase-multilingual-MiniLM-L12-v2

6 months ago

nanluan created repository nanluan/paraphrase-multilingual-MiniLM...

6 months ago

nanluan opened issue openMind/openMind_Library#316

Experience report: testing the nq-distilbert-base-v1 model with openmind

6 months ago

nanluan commented on issue openMind/openMind_Library#315

Running msmarco-distilbert-base-v2 with openmind

### 25.【msmarco-distilbert-base-v2】

### Original model name and its address in the OpenI community

Model name: msmarco-distilbert-base-v2

Model address: [FoundationModel/sentence-transformers - OpenI AI Open Source Community (pcl.ac.cn)](https://openi.pcl.ac.cn/FoundationModel/sentence-transformers/modelmanage/model_readme_tmpl?name=xlm-r-bert-base-nli-stsb-mean-tokens)

Source: uploaded by OpenI

### Cloud Brain debugging task name and screenshots of the run

![image-20240902195247195](https://zhengzizhi122921.oss-cn-beijing.aliyuncs.com/img/image-20240902195247195.png)

![image-20240915192752385](https://zhengzizhi122921.oss-cn-beijing.aliyuncs.com/img/image-20240915192752385.png)

### Model name and addresses in the OpenI and Modelers communities after adaptation with the openMind toolchain

Model name: msmarco-distilbert-base-v2

Model address: [nanluan/msmarco-distilbert-base-v2 - OpenI AI Open Source Community (pcl.ac.cn)](https://openi.pcl.ac.cn/nanluan/msmarco-distilbert-base-v2/modelmanage/model_readme_tmpl?name=msmarco-distilbert-base-v2)

![image-20240915193211274](https://zhengzizhi122921.oss-cn-beijing.aliyuncs.com/img/image-20240915193211274.png)

Modelers community: [nanluan/msmarco-distilbert-base-v2 | Modelers community (modelers.cn)](https://modelers.cn/models/nanluan/msmarco-distilbert-base-v2)

![image-20240915193317100](https://zhengzizhi122921.oss-cn-beijing.aliyuncs.com/img/image-20240915193317100.png)

### Lessons learned

### 1. **Model background**

`msmarco-distilbert-base-v2` is a DistilBERT-based pretrained model optimized for information retrieval (IR) tasks, trained specifically on the MS MARCO dataset. DistilBERT is a distilled version of BERT that reduces the parameter count and compute cost to improve efficiency while retaining as much performance as possible.

### 2. **Performance evaluation**

- **Speed and efficiency**: DistilBERT's main strength is speed and computational efficiency. For large-scale query workloads or latency-sensitive applications, it significantly outperforms the original BERT. During testing, the model's response times were good, making it suitable for real-time or near-real-time scenarios.
- **Accuracy**: although DistilBERT trims parameters and computation, it still performs very well on information retrieval tasks. Test results generally show that it understands query intent effectively and returns highly relevant documents.

### 3. **Testing experience**

- **Ease of use**: the model is straightforward to use; many off-the-shelf libraries and tools (such as Hugging Face Transformers) support it. Loading and running the model went smoothly, and it is quick to get started with.
- **Result quality**: on retrieval tasks, `msmarco-distilbert-base-v2` returns highly relevant documents, though accuracy can dip slightly on some complex queries or edge cases, since the distillation process may lose some fine-grained detail.
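The retrieval flow described above (embed a query, embed documents, rank by similarity) can be sketched as follows. This is a minimal illustration: the similarity ranking is shown with stub vectors, and the commented-out lines indicate where embeddings from `msmarco-distilbert-base-v2` (via the `sentence-transformers` library, which the comment mentions supports this model) would plug in.

```python
import numpy as np

def cosine_rank(query_vec, doc_vecs):
    """Rank documents by cosine similarity to the query embedding.

    Returns (indices of docs, best first; their similarity scores).
    """
    q = query_vec / np.linalg.norm(query_vec)
    d = doc_vecs / np.linalg.norm(doc_vecs, axis=1, keepdims=True)
    scores = d @ q
    order = np.argsort(-scores)          # descending similarity
    return order, scores[order]

# In a real pipeline the vectors would come from the model, e.g.:
#   from sentence_transformers import SentenceTransformer
#   model = SentenceTransformer("sentence-transformers/msmarco-distilbert-base-v2")
#   query_vec = model.encode("how does knowledge distillation work?")
#   doc_vecs  = model.encode(list_of_passages)
# Here we use fixed 3-d stub vectors for illustration only.
query_vec = np.array([1.0, 0.0, 0.0])
doc_vecs = np.array([
    [0.9, 0.1, 0.0],   # close to the query
    [0.0, 1.0, 0.0],   # orthogonal to the query
    [0.5, 0.5, 0.0],   # partially related
])

order, scores = cosine_rank(query_vec, doc_vecs)
print(order)  # → [0 2 1]
```

The ranking step is model-agnostic: swapping in real MS MARCO-trained embeddings changes only where the vectors come from, not how they are scored.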

6 months ago

nanluan opened issue openMind/openMind_Library#315

Running msmarco-distilbert-base-v2 with openmind

6 months ago

nanluan created new model bert-base-nli-cls-token

6 months ago

nanluan created repository nanluan/bert-base-nli-cls-token

6 months ago

nanluan created new model e5-base-unsupervised

6 months ago

nanluan created repository nanluan/e5-base-unsupervised

6 months ago