
Deployment

Dependencies

pip install -r requirement.txt

MindSpore Lite Environment Preparation

Reference: Lite environment configuration
Note: MindSpore Lite requires Python 3.7. Please prepare a Python 3.7 environment before installing Lite.

  1. Download the matching tar.gz and whl packages for your environment.

  2. Extract the tar.gz package and install the whl package of the corresponding version:

    tar -zxvf mindspore_lite-2.0.0a0-cp37-cp37m-{os}_{platform}_64.tar.gz
    pip install mindspore_lite-2.0.0a0-cp37-cp37m-{os}_{platform}_64.whl
    

  3. Configure the Lite environment variables. Set LITE_HOME to the folder extracted from the tar.gz package; an absolute path is recommended:
    export LITE_HOME=/path/to/mindspore-lite-{version}-{os}-{platform}
    export LD_LIBRARY_PATH=$LITE_HOME/runtime/lib:$LITE_HOME/tools/converter/lib:$LD_LIBRARY_PATH
    export PATH=$LITE_HOME/tools/converter/converter:$LITE_HOME/tools/benchmark:$PATH
    
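As an optional sanity check (not part of the official steps), you can confirm that the whl was installed into the Python 3.7 environment and that the runtime libraries are found:

import mindspore_lite as mslite  # fails if the whl is missing or LD_LIBRARY_PATH is not set correctly
print(mslite.Context())          # constructing a Context confirms the Lite runtime loads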

Quick Start

Model Conversion

Convert the ckpt model to a MindIR model. This step can be run on CPU or Ascend 910.

python ./deploy/export.py --config ./path_to_config/model.yaml --weight ./path_to_ckpt/weight.ckpt --per_batch_size 1 --file_format MINDIR --device_target [CPU/Ascend]
e.g.
# Run on CPU
python ./deploy/export.py --config ./configs/yolov5/yolov5n.yaml --weight yolov5n_300e_mAP273-9b16bd7b.ckpt --per_batch_size 1 --file_format MINDIR --device_target CPU
# Run on Ascend
python ./deploy/export.py --config ./configs/yolov5/yolov5n.yaml --weight yolov5n_300e_mAP273-9b16bd7b.ckpt --per_batch_size 1 --file_format MINDIR --device_target Ascend

Lite Test

python deploy/test.py --model_type Lite --model_path ./path_to_mindir/weight.mindir --config ./path_to_config/yolo.yaml
e.g.
python deploy/test.py --model_type Lite --model_path ./yolov5n.mindir --config ./configs/yolov5/yolov5n.yaml

Lite Predict

python ./deploy/predict.py --model_type Lite --model_path ./path_to_mindir/weight.mindir --config ./path_to_config/yolo.yaml --image_path ./path_to_image/image.jpg
e.g.
python deploy/predict.py --model_type Lite --model_path ./yolov5n.mindir --config ./configs/yolov5/yolov5n.yaml --image_path ./coco/image/val2017/image.jpg
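
For reference, the Lite backend loads and runs the exported MindIR roughly as in the following sketch. It is a simplified illustration, not the exact code in deploy/predict.py; the 640x640 NCHW input shape is an assumption for yolov5n, and preprocessing/NMS are omitted.

import numpy as np
import mindspore_lite as mslite

# build an inference context; Ascend 310 is the documented inference target
context = mslite.Context()
context.target = ["ascend"]
context.ascend.device_id = 0

# load the exported MindIR into a Lite model
model = mslite.Model()
model.build_from_file("./yolov5n.mindir", mslite.ModelType.MINDIR, context)

# feed a preprocessed image (NCHW float32) and run inference
inputs = model.get_inputs()
img = np.zeros((1, 3, 640, 640), np.float32)  # placeholder; use a real letterboxed image
inputs[0].set_data_from_numpy(img)
outputs = model.predict(inputs)               # raw predictions; NMS/post-processing still required
print([out.get_data_to_numpy().shape for out in outputs])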

Script Description

  • predict.py supports inference on a single image
  • test.py supports inference on the COCO dataset
  • Note: inference is currently supported only on Ascend 310

MindX Deployment

Environment Configuration

Reference: MindX environment preparation
Note: MindX currently supports Python 3.9. Please prepare a Python 3.9 environment before installing MindX.

  1. Obtain the installation package from the MindX official website. MindX 3.0.0 inference is currently supported.

  2. Go to the download page and download Ascend-mindxsdk-mxmanufacture_{version}_linux-{arch}.run.

  3. Place the installation package in a directory on the Ascend 310 machine and extract it.

  4. If you are not the root user, add executable permission to the package:

    chmod +x Ascend-mindxsdk-mxmanufacture_{version}_linux-{arch}.run
    

  5. Go to the directory where the development kit package was uploaded and install the mxManufacture development kit:
    ./Ascend-mindxsdk-mxmanufacture_{version}_linux-{arch}.run --install
    
    If the following message appears after the installation completes, the software was installed successfully:
    The installation is successfully
    
    After installation, the mxManufacture directory structure is as follows:
    .
    ├── bin
    ├── config
    ├── filelist.txt
    ├── include
    ├── lib
    ├── opensource
    ├── operators
    ├── python
    ├── samples
    ├── set_env.sh
    ├── toolkit
    └── version.info
    
  6. Go to the mxManufacture installation directory and run the following command to make the MindX SDK environment variables take effect:
    source set_env.sh
    
  7. Go to ./mxVision-3.0.0/python/ and install mindx-3.0.0-py3-none-any.whl:
    pip install mindx-3.0.0-py3-none-any.whl
    
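As an optional check (an assumption of this guide, not an official step), confirm that the wheel is importable from the Python 3.9 environment after sourcing set_env.sh; the exact module layout is defined by the MindX SDK itself:

import mindx  # provided by mindx-3.0.0-py3-none-any.whl
print("mindx import OK")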

Model Conversion

  1. Convert the ckpt model to an AIR model. This step must be performed on Ascend 910:

    python ./deploy/export.py --config ./path_to_config/model.yaml --weight ./path_to_ckpt/weight.ckpt --per_batch_size 1 --file_format AIR
    e.g.
    python ./deploy/export.py --config ./configs/yolov5/yolov5n.yaml --weight yolov5n_300e_mAP273-9b16bd7b.ckpt --per_batch_size 1 --file_format AIR
    
    For YOLOv7, export must be run on an Ascend 910 machine with version 2.0 or later.

  2. Convert the AIR model to an OM model using the atc conversion tool. This step requires the MindX environment to be installed and runs on Ascend 310 (--framework=1 in the command below indicates a MindSpore model):

    atc --model=./path_to_air/weight.air --framework=1 --output=yolo  --soc_version=Ascend310
    

MindX Test

Inference on the COCO dataset:

python ./deploy/test.py --model_type MindX --model_path ./path_to_om/weight.om --config ./path_to_config/yolo.yaml
e.g.
python ./deploy/test.py --model_type MindX --model_path ./yolov5n.om --config ./configs/yolov5/yolov5n.yaml

MindX Predict

Inference on a single image:

python ./deploy/predict.py --model_type MindX --model_path ./path_to_om/weight.om --config ./path_to_config/yolo.yaml --image_path ./path_to_image/image.jpg
e.g.
python ./deploy/predict.py --model_type MindX --model_path ./yolov5n.om --config ./configs/yolov5/yolov5n.yaml --image_path ./coco/image/val2017/image.jpg

MindIR Deployment

Environment Requirements

mindspore>=2.1

Notes

  1. Currently only Predict is supported.

  2. In theory this can also run on Ascend 910, but it has not been tested.

Model Conversion

Convert the ckpt model to a MindIR model. This step can be run on CPU.

python ./deploy/export.py --config ./path_to_config/model.yaml --weight ./path_to_ckpt/weight.ckpt --per_batch_size 1 --file_format MINDIR --device_target CPU
e.g.
# Run on CPU
python ./deploy/export.py --config ./configs/yolov5/yolov5n.yaml --weight yolov5n_300e_mAP273-9b16bd7b.ckpt --per_batch_size 1 --file_format MINDIR --device_target CPU

MindIR Test

Coming soon.

MindIR Predict

Inference on a single image:

python ./deploy/predict.py --model_type MindIR --model_path ./path_to_mindir/weight.mindir --config ./path_to_config/yolo.yaml --image_path ./path_to_image/image.jpg
e.g.
python deploy/predict.py --model_type MindIR --model_path ./yolov5n.mindir --config ./configs/yolov5/yolov5n.yaml --image_path ./coco/image/val2017/image.jpg
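
Under the hood, loading and calling an exported MindIR with the MindSpore (>=2.1) runtime looks roughly like the sketch below. It is a simplified illustration; deploy/predict.py additionally handles image preprocessing and NMS, and the 640x640 input shape is an assumption for yolov5n.

import numpy as np
import mindspore as ms
from mindspore import nn

graph = ms.load("./yolov5n.mindir")   # load the exported graph
net = nn.GraphCell(graph)             # wrap it as a callable cell
img = ms.Tensor(np.zeros((1, 3, 640, 640), np.float32))  # placeholder for a preprocessed image
pred = net(img)                       # raw network output; apply post-processing as in deploy/predict.py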

ONNX Deployment

Note: only some models support exporting to ONNX and deploying with ONNXRuntime.

Environment Configuration

pip install "onnx>=1.9.0"
pip install "onnxruntime>=1.8.0"

Notes

  1. Not all mindyolo models currently support ONNX export and inference (YOLOv3 is used as the example here).

  2. Currently only the Predict function is supported.

  3. Exporting to ONNX requires replacing the nn.SiLU operator with an implementation based on the underlying sigmoid operator.

For example, add the following custom layer and replace every nn.SiLU in mindyolo with it:

from mindspore import nn, ops


class EdgeSiLU(nn.Cell):
    """
    SiLU activation function: x * sigmoid(x). Used in place of nn.SiLU to support ONNX export.
    """

    def __init__(self):
        super().__init__()

    def construct(self, x):
        return x * ops.sigmoid(x)

Model Conversion

Convert the ckpt model to an ONNX model. This step, as well as the Test step, can only be run on CPU.

python ./deploy/export.py --config ./path_to_config/model.yaml --weight ./path_to_ckpt/weight.ckpt --per_batch_size 1 --file_format ONNX --device_target [CPU]
e.g.
# Run on CPU
python ./deploy/export.py --config ./configs/yolov3/yolov3.yaml --weight yolov3-darknet53_300e_mAP455-adfb27af.ckpt --per_batch_size 1 --file_format ONNX --device_target CPU
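
Optionally, the exported file can be checked with the onnx package before running inference (a quick sanity check, not part of the documented workflow):

import onnx

onnx_model = onnx.load("./yolov3.onnx")   # path produced by export.py above
onnx.checker.check_model(onnx_model)      # raises an exception if the graph is malformed
print(onnx_model.graph.input[0].name)     # name of the input tensor expected at inference time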

ONNX Test

Coming soon.

ONNXRuntime Predict

Inference on a single image:

python ./deploy/predict.py --model_type ONNX --model_path ./path_to_onnx_model/model.onnx --config ./path_to_config/yolo.yaml --image_path ./path_to_image/image.jpg
e.g.
python ./deploy/predict.py --model_type ONNX --model_path ./yolov3.onnx --config ./configs/yolov3/yolov3.yaml --image_path ./coco/image/val2017/image.jpg
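
For reference, a minimal ONNXRuntime inference pass looks roughly like the sketch below. deploy/predict.py adds YOLO preprocessing and NMS on top of this, and the 640x640 input shape is an assumption for yolov3.

import numpy as np
import onnxruntime as ort

session = ort.InferenceSession("./yolov3.onnx", providers=["CPUExecutionProvider"])
input_name = session.get_inputs()[0].name
img = np.zeros((1, 3, 640, 640), np.float32)     # placeholder; use a real preprocessed image
outputs = session.run(None, {input_name: img})   # list of raw outputs; NMS still required
print([out.shape for out in outputs])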

Benchmark and Supported Model Zoo

| Name | Scale | Context | ImageSize | Dataset | Box mAP (%) | Params | FLOPs | Recipe | Download |
|------|-------|---------|-----------|---------|-------------|--------|-------|--------|----------|
| YOLOv8 | N | D310x1-G | 640 | MS COCO 2017 | 37.2 | 3.2M | 8.7G | yaml | ckpt, mindir |
| YOLOv8 | S | D310x1-G | 640 | MS COCO 2017 | 44.6 | 11.2M | 28.6G | yaml | ckpt, mindir |
| YOLOv8 | M | D310x1-G | 640 | MS COCO 2017 | 50.5 | 25.9M | 78.9G | yaml | ckpt, mindir |
| YOLOv8 | L | D310x1-G | 640 | MS COCO 2017 | 52.8 | 43.7M | 165.2G | yaml | ckpt, mindir |
| YOLOv8 | X | D310x1-G | 640 | MS COCO 2017 | 53.7 | 68.2M | 257.8G | yaml | ckpt, mindir |
| YOLOv7 | Tiny | D310x1-G | 640 | MS COCO 2017 | 37.5 | 6.2M | 13.8G | yaml | ckpt, mindir |
| YOLOv7 | L | D310x1-G | 640 | MS COCO 2017 | 50.8 | 36.9M | 104.7G | yaml | ckpt, mindir |
| YOLOv7 | X | D310x1-G | 640 | MS COCO 2017 | 52.4 | 71.3M | 189.9G | yaml | ckpt, mindir |
| YOLOv5 | N | D310x1-G | 640 | MS COCO 2017 | 27.3 | 1.9M | 4.5G | yaml | ckpt, mindir |
| YOLOv5 | S | D310x1-G | 640 | MS COCO 2017 | 37.6 | 7.2M | 16.5G | yaml | ckpt, mindir |
| YOLOv5 | M | D310x1-G | 640 | MS COCO 2017 | 44.9 | 21.2M | 49.0G | yaml | ckpt, mindir |
| YOLOv5 | L | D310x1-G | 640 | MS COCO 2017 | 48.5 | 46.5M | 109.1G | yaml | ckpt, mindir |
| YOLOv5 | X | D310x1-G | 640 | MS COCO 2017 | 50.5 | 86.7M | 205.7G | yaml | ckpt, mindir |
| YOLOv4 | CSPDarknet53 | D310x1-G | 608 | MS COCO 2017 | 45.4 | 27.6M | 52G | yaml | ckpt, mindir |
| YOLOv4 | CSPDarknet53(silu) | D310x1-G | 640 | MS COCO 2017 | 45.8 | 27.6M | 52G | yaml | ckpt, mindir |
| YOLOv3 | Darknet53 | D310x1-G | 640 | MS COCO 2017 | 45.5 | 61.9M | 156.4G | yaml | ckpt, mindir |
| YOLOX | N | D310x1-G | 416 | MS COCO 2017 | 24.1 | 0.9M | 1.1G | yaml | ckpt, mindir |
| YOLOX | Tiny | D310x1-G | 416 | MS COCO 2017 | 33.3 | 5.1M | 6.5G | yaml | ckpt, mindir |
| YOLOX | S | D310x1-G | 640 | MS COCO 2017 | 40.7 | 9.0M | 26.8G | yaml | ckpt, mindir |
| YOLOX | M | D310x1-G | 640 | MS COCO 2017 | 46.7 | 25.3M | 73.8G | yaml | ckpt, mindir |
| YOLOX | L | D310x1-G | 640 | MS COCO 2017 | 49.2 | 54.2M | 155.6G | yaml | ckpt, mindir |
| YOLOX | X | D310x1-G | 640 | MS COCO 2017 | 51.6 | 99.1M | 281.9G | yaml | ckpt, mindir |
| YOLOX | Darknet53 | D310x1-G | 640 | MS COCO 2017 | 47.7 | 63.7M | 185.3G | yaml | ckpt, mindir |