MindSpore Lite is a high-performance, lightweight open-source inference framework that meets the needs of AI applications on mobile devices. MindSpore Lite focuses on how to deploy AI technology on devices more effectively. It has been integrated into HMS (Huawei Mobile Services) to provide inference for applications such as image classification, object detection, and OCR. MindSpore Lite will promote the development and enrichment of the AI software/hardware application ecosystem.
For more details, please check out our MindSpore Lite Architecture Guide.

Key features:

- Cooperative work with MindSpore training
- Lightweight
- High performance
- Versatility
Model selection and personalized training
Select a new model, or use an existing model for incremental training with labeled data. When designing a model for mobile devices, it is necessary to consider the model size, accuracy, and computation cost.

The MindSpore team provides a series of pre-trained models for image classification and object detection. You can use these pre-trained models in your application.

The pre-trained models provided by MindSpore: Image Classification. More models will be provided in the future.
MindSpore allows you to retrain pre-trained models to perform other tasks.
Model converter and optimization
If you use a MindSpore or third-party model, you need to use the MindSpore Lite Model Converter Tool to convert it into a MindSpore Lite model. The converter supports TensorFlow Lite, Caffe, and ONNX models, and fusion and quantization can be applied during conversion.

MindSpore also provides a tool for converting models to run on IoT devices.
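As a sketch, a typical conversion of a TensorFlow Lite model with the converter tool looks like the following. The flag names (`--fmk`, `--modelFile`, `--outputFile`) follow the converter_lite tool's documentation; the model file names are placeholders, and the binary must be built or installed first (e.g. via `build_lite.sh`).

```shell
# Convert a TensorFlow Lite model into MindSpore Lite's .ms format.
# The guard keeps this sketch runnable even where converter_lite is absent.
if command -v converter_lite >/dev/null 2>&1; then
  # --fmk selects the source framework: TFLITE, CAFFE, ONNX or MINDIR.
  converter_lite --fmk=TFLITE \
                 --modelFile=mobilenet_v2.tflite \
                 --outputFile=mobilenet_v2   # writes mobilenet_v2.ms
else
  echo "converter_lite not found on PATH; build it with build_lite.sh first"
fi
```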
Model deployment
This stage covers model deployment, including model management, deployment, and operations monitoring.
Inference
Load the model and perform inference. Inference is the process of running input data through the model to get output.
MindSpore provides pre-trained models, along with an example showing how to deploy them on mobile devices.
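A minimal sketch of loading a converted `.ms` model and running one forward pass with the MindSpore Lite Python API. The API names (`mindspore_lite.Context`, `Model.build_from_file`, `Model.predict`) follow the published Python API; the model path and input shape are placeholders, and exact signatures may differ between releases.

```python
import numpy as np


def run_inference(model_path="mobilenet_v2.ms"):
    """Load a .ms model and run one forward pass.

    Returns the first output as a NumPy array, or None when the
    mindspore_lite runtime is not installed on this machine.
    """
    try:
        import mindspore_lite as mslite
    except ImportError:
        return None  # runtime not available; nothing to run

    context = mslite.Context()
    context.target = ["cpu"]  # CPU backend; "gpu"/"ascend" are alternatives

    model = mslite.Model()
    model.build_from_file(model_path, mslite.ModelType.MINDIR_LITE, context)

    # Fill the first input with random data of the expected NHWC shape.
    inputs = model.get_inputs()
    inputs[0].set_data_from_numpy(
        np.random.rand(1, 224, 224, 3).astype(np.float32))

    outputs = model.predict(inputs)
    return outputs[0].get_data_to_numpy()


result = run_inference()
```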
We tested several networks on a HUAWEI Mate 40 (HiSilicon Kirin 9000E) mobile phone; the results below are provided for reference.
Network | Thread Number | Average Run Time (ms) |
---|---|---|
basic_squeezenet | 4 | 6.415 |
inception_v3 | 4 | 36.767 |
mobilenet_v1_10_224 | 4 | 4.936 |
mobilenet_v2_10_224 | 4 | 3.644 |
resnet_v2_50 | 4 | 25.071 |