!unzip -oq /home/aistudio/data/data128635/bottle.zip
!git clone https://gitee.com/paddlepaddle/PaddleDetection.git
!pip install paddlepaddle-gpu
!pip install pycocotools
!pip install lap
!pip install motmetrics
import cv2
import os
import numpy as np
from matplotlib import pyplot as plt

# Make only the first GPU visible to Paddle
os.environ['CUDA_VISIBLE_DEVICES'] = '0'
!pip install paddlex
!pip install paddle2onnx
!paddlex --split_dataset --format VOC --dataset_dir bottle --val_value 0.1 --test_value 0.1
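The flags above reserve 10% of the labeled images each for validation and test, leaving 80% for training. As a quick sanity check of the arithmetic (the image count of 100 is just an example, not the real dataset size):

```python
# Sketch of the 80/10/10 split implied by --val_value 0.1 --test_value 0.1
n_images = 100  # example count, not the actual bottle dataset size
n_val = int(n_images * 0.1)
n_test = int(n_images * 0.1)
n_train = n_images - n_val - n_test
print(n_train, n_val, n_test)  # 80 10 10
```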
%cd PaddleDetection/
!python tools/x2coco.py \
    --dataset_type voc \
    --voc_anno_dir ../bottle/ \
    --voc_anno_list ../bottle/train_list.txt \
    --voc_label_list ../bottle/labels.txt \
    --voc_out_name ../bottle/voc_train.json
!python tools/x2coco.py \
    --dataset_type voc \
    --voc_anno_dir ../bottle/ \
    --voc_anno_list ../bottle/val_list.txt \
    --voc_label_list ../bottle/labels.txt \
    --voc_out_name ../bottle/voc_val.json
!python tools/x2coco.py \
    --dataset_type voc \
    --voc_anno_dir ../bottle/ \
    --voc_anno_list ../bottle/test_list.txt \
    --voc_label_list ../bottle/labels.txt \
    --voc_out_name ../bottle/voc_test.json
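Each conversion writes a COCO-format JSON with three top-level sections. A minimal sketch of that layout (field names follow the COCO detection format; the values here are made up, not taken from the bottle dataset):

```python
import json

# Minimal sketch of the COCO-format layout that x2coco.py emits
coco = {
    "images": [{"id": 0, "file_name": "0001.jpg", "height": 480, "width": 640}],
    "annotations": [{"id": 0, "image_id": 0, "category_id": 1,
                     "bbox": [10, 20, 100, 50],  # [x, y, width, height]
                     "area": 5000.0, "iscrowd": 0}],
    "categories": [{"id": 1, "name": "bottle"}],
}
print(json.dumps(sorted(coco)))  # ["annotations", "categories", "images"]
```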
%cd
!python PaddleDetection/tools/train.py \
    -c PaddleDetection/configs/ppyolo/ppyolo_tiny_650e_coco.yml \
    --vdl_log_dir ~/log_crowdhuman/ppyolo_voc \
    --use_vdl True
!python PaddleDetection/tools/export_model.py -c PaddleDetection/configs/ppyolo/ppyolo_tiny_650e_coco.yml -o weights=output/ppyolo_tiny_650e_coco/65.pdparams
%cd PaddleDetection/
result=!python ./deploy/python/infer.py --model_dir=../output_inference/ppyolo_tiny_650e_coco --image_file=../1024.jpg
import numpy as np
index1='class_id'
index2='right_bottom'
# Find the first output line that contains a detection record
temp = None
for dt in result:
    if index1 in dt:
        temp = dt
        break
print(temp)

# Parse the box corners out of the log line by fixed slice offsets, e.g.
# class_id:0, confidence:..., left_top:[x1,y1],right_bottom:[x2,y2]
b = temp.split(',')
x1 = int(float(b[2][11:]))
y1 = int(float(b[3][:-1]))
x2 = int(float(b[4][14:]))
y2 = int(float(b[5][:-1]))

img = np.array(cv2.imread('/home/aistudio/1024.jpg'))
print(img.shape)
# Crop with a 20-pixel margin, clamped so the indices cannot go negative
img2 = img[max(y1 - 20, 0):y2 + 20, max(x1 - 20, 0):x2 + 20]
plt.imshow(img2)
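The fixed slice offsets above are brittle: they break if the log format shifts by even one character. A self-contained sketch of the same extraction with a regex instead (the sample line mimics the shape of `infer.py`'s console output, which may differ by PaddleDetection version):

```python
import re

# Example inference log line (format assumed; numbers are made up)
line = "class_id:0, confidence:0.9573, left_top:[12.0,34.0],right_bottom:[56.0,78.0]"

# Pull both corner coordinates out in one pass
m = re.search(r"left_top:\[([\d.]+),([\d.]+)\].*right_bottom:\[([\d.]+),([\d.]+)\]", line)
x1, y1, x2, y2 = (int(float(v)) for v in m.groups())
print(x1, y1, x2, y2)  # 12 34 56 78
```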
!pip install --upgrade paddlepaddle -i https://mirror.baidu.com/pypi/simple
!pip install --upgrade paddlehub -i https://mirror.baidu.com/pypi/simple
!pip install shapely -i https://pypi.tuna.tsinghua.edu.cn/simple
!pip install pyclipper -i https://pypi.tuna.tsinghua.edu.cn/simple
import paddlehub as hub
import cv2
import numpy as np
import matplotlib.pyplot as plt   # plt for displaying images
import matplotlib.image as mpimg  # mpimg for reading images

ocr = hub.Module(name="chinese_ocr_db_crnn_server")
np_images = [img2]
results = ocr.recognize_text(
    images=np_images,         # image data, ndarray.shape is [H, W, C], BGR order
    use_gpu=False,            # whether to use the GPU; set CUDA_VISIBLE_DEVICES first if so
    output_dir='ocr_result',  # where result images are saved; defaults to ocr_result
    visualization=True,       # whether to save the recognition result as an image
    box_thresh=0.5,           # confidence threshold for detected text boxes
    text_thresh=0.5)          # confidence threshold for recognized Chinese text
for result in results:
    data = result['data']
    save_path = result['save_path']
    for information in data:
        print('text: ', information['text'],
              '\nconfidence: ', information['confidence'],
              '\ntext_box_position: ', information['text_box_position'])
        if '物料编号' in information['text']:
            code = information['text']
            print(code)
            break

# The material number is the last five characters of the matched text
code2 = code[-5:]
print(code2)

# Look up the record file named after the material number
with open('/home/aistudio/{}.txt'.format(code2), encoding='utf-8') as f:
    for line in f:
        print(line)
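The filtering above can be sketched in isolation on sample OCR output (the dicts below mimic the shape of PaddleHub's `recognize_text` result; the text values are made up):

```python
# Self-contained sketch of the material-number lookup on sample OCR output
data = [
    {"text": "生产日期 2022-01-01", "confidence": 0.98},
    {"text": "物料编号 A1234", "confidence": 0.95},
]

# Take the first recognized line containing the material-number label
code = next(d["text"] for d in data if "物料编号" in d["text"])
code2 = code[-5:]  # last five characters are the material number
print(code2)  # A1234
```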
References:

Jetson Nano is Nvidia's entry-level GPU computing platform. It is a good way to get started with deploying deep-learning models, and it is very easy to pick up.

Installing the system takes three steps:
1. Download the required software and the system image
2. Format the SD card and write the image to it
3. Connect the power supply and boot

If the board boots into the desktop shown above after power-on, congratulations: the installation succeeded.

If you run into problems during installation, or want to configure the board further (fan, Wi-Fi, package mirrors, remote desktop, and so on), take a look at these articles:

Download the paddlepaddle build that matches your Nano's version.

Finally, install the downloaded whl package on the Nano to complete the setup.
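That last step is a single pip install on the Nano. A sketch of the commands, as a setup fragment (the wheel filename below is only an example; use the build that matches your Nano's JetPack and Python version):

```shell
# On the Nano: install the downloaded Paddle wheel (filename is an example)
pip3 install paddlepaddle_gpu-2.4.2-cp38-cp38-linux_aarch64.whl

# Quick sanity check of the install
python3 -c "import paddle; paddle.utils.run_check()"
```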