tangdezhi_123/MSAdapter_self:br_tdz_1 into master · 1 year ago
Added precision test cases for mindtorch.torch.nn.functional.prompt_flash_attention, covering both graph mode and PyNative mode; verified locally.
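A minimal sketch of what such a graph-mode vs. PyNative-mode precision comparison could look like is shown below. It is not the test added by this PR: the tensor shapes, the `num_heads` keyword, the BSH layout, and the tolerances are assumptions, and the op is expected to require an Ascend (910B) back end.

```python
# Hypothetical sketch, not the PR's actual test. Shapes, keyword arguments and
# tolerances are assumptions; prompt_flash_attention itself presumably needs
# an Ascend (910B) device, as noted later in this thread.
import numpy as np
import mindspore as ms
import mindtorch.torch as torch
import mindtorch.torch.nn.functional as F


def run_prompt_flash_attention():
    # Assumed BSH layout: batch=1, seq_len=128, hidden=256 (8 heads x head_dim 32).
    rng = np.random.RandomState(0)
    q = torch.tensor(rng.randn(1, 128, 256).astype(np.float16))
    k = torch.tensor(rng.randn(1, 128, 256).astype(np.float16))
    v = torch.tensor(rng.randn(1, 128, 256).astype(np.float16))
    # 4-D attention mask (batch, 1, seq, seq); the primitive requires rank 3 or 4
    # (see the CI log below).
    atten_mask = torch.tensor(np.zeros((1, 1, 128, 128), dtype=np.bool_))
    return F.prompt_flash_attention(q, k, v, atten_mask=atten_mask, num_heads=8)


def test_prompt_flash_attention_precision():
    # Run once in PyNative mode, once in graph mode, and compare element-wise.
    # Depending on the framework version, the graph-mode call may need to be
    # wrapped in a Cell / @ms.jit function per project conventions.
    ms.set_context(mode=ms.PYNATIVE_MODE)
    out_pynative = run_prompt_flash_attention().numpy()

    ms.set_context(mode=ms.GRAPH_MODE)
    out_graph = run_prompt_flash_attention().numpy()

    assert np.allclose(out_pynative, out_graph, rtol=1e-3, atol=1e-3)
```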
==================================== ERRORS ====================================
_ ERROR collecting testing/ut/pytorch/nn/functional/test_prompt_flash_attention.py _
ImportError while importing test module '/drone/src/testing/ut/pytorch/nn/functional/test_prompt_flash_attention.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
/usr/local/python-3.7.5/lib/python3.7/importlib/__init__.py:127: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
testing/ut/pytorch/nn/functional/test_prompt_flash_attention.py:12: in <module>
    from mindpsore import nn
E   ModuleNotFoundError: No module named 'mindpsore'
FAILED testing/ut/pytorch/nn/functional/test_prompt_flash_attention.py::test_prompt_flash_attention_no_padding - RuntimeError: For primitive[PromptFlashAttention], the rank of atten_mask should be 3 or 4, but got 0
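The FAILED case above reports that PromptFlashAttention expects atten_mask to have rank 3 or 4, while the test's mask collapsed to rank 0. A small NumPy-only illustration of the shapes involved (the batch and sequence sizes are placeholders):

```python
# Hedged illustration of the rank check reported above; the concrete
# batch/sequence sizes are assumptions, only the rank requirement matters.
import numpy as np

batch, seq = 1, 128

bad_mask = np.array(False)                                # rank 0 -> rejected
mask_3d = np.zeros((batch, seq, seq), dtype=np.bool_)     # rank 3 -> accepted
mask_4d = np.zeros((batch, 1, seq, seq), dtype=np.bool_)  # rank 4 -> accepted
print(bad_mask.ndim, mask_3d.ndim, mask_4d.ndim)          # prints: 0 3 4
```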
Requires the r2.3 MindSpore package from the 1129 build or later && the newly added precision test cases have passed local verification on 910B.
81d9ebb30e