
Deployment

Dependencies

pip install -r requirements.txt

MindSpore Lite environment preparation

Reference: Lite environment configuration
Note: MindSpore Lite supports Python 3.7. Prepare a Python 3.7 environment before installing Lite.

  1. Depending on the environment, download the matching tar.gz package and whl package.

  2. Extract the tar.gz package and install the matching whl package:

    tar -zxvf mindspore_lite-2.0.0a0-cp37-cp37m-{os}_{platform}_64.tar.gz
    pip install mindspore_lite-2.0.0a0-cp37-cp37m-{os}_{platform}_64.whl
    

  3. Configure the Lite environment variables. LITE_HOME is the directory extracted from the tar.gz package; using an absolute path is recommended.
    export LITE_HOME=/path/to/mindspore-lite-{version}-{os}-{platform}
    export LD_LIBRARY_PATH=$LITE_HOME/runtime/lib:$LITE_HOME/tools/converter/lib:$LD_LIBRARY_PATH
    export PATH=$LITE_HOME/tools/converter/converter:$LITE_HOME/tools/benchmark:$PATH
    
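The Lite wheel above is tagged cp37, so pip will refuse to install it on any other interpreter. A quick check such as the following (a generic helper, not part of mindyolo) fails fast before the install:

```python
import sys

def python_matches(required, current=None):
    """Return True when the (major, minor) of the interpreter matches
    the version a wheel tag such as cp37 was built for."""
    current = tuple((current or sys.version_info)[:2])
    return current == tuple(required)

# The cp37 Lite wheel installs only on Python 3.7:
print(python_matches((3, 7)))
```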

Quick Start

Model conversion

Convert the ckpt model to a MindIR model. This step can be run on CPU or Ascend 910:

python ./deploy/export.py --config ./path_to_config/model.yaml --weight ./path_to_ckpt/weight.ckpt --per_batch_size 1 --file_format MINDIR --device_target [CPU/Ascend]
e.g.
# Run on CPU
python ./deploy/export.py --config ./configs/yolov5/yolov5n.yaml --weight yolov5n_300e_mAP273-9b16bd7b.ckpt --per_batch_size 1 --file_format MINDIR --device_target CPU
# Run on Ascend
python ./deploy/export.py --config ./configs/yolov5/yolov5n.yaml --weight yolov5n_300e_mAP273-9b16bd7b.ckpt --per_batch_size 1 --file_format MINDIR --device_target Ascend

Lite Test

python deploy/test.py --model_type Lite --model_path ./path_to_mindir/weight.mindir --config ./path_to_config/yolo.yaml
e.g.
python deploy/test.py --model_type Lite --model_path ./yolov5n.mindir --config ./configs/yolov5/yolov5n.yaml

Lite Predict

python ./deploy/predict.py --model_type Lite --model_path ./path_to_mindir/weight.mindir --config ./path_to_config/yolo.yaml --image_path ./path_to_image/image.jpg
e.g.
python deploy/predict.py --model_type Lite --model_path ./yolov5n.mindir --config ./configs/yolov5/yolov5n.yaml --image_path ./coco/image/val2017/image.jpg

Script description

  • predict.py supports single-image inference
  • test.py supports COCO dataset inference
  • Note: inference is currently supported on Ascend 310 only
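Both scripts feed the network a fixed-size square input. A typical YOLO-style letterbox preprocessing is sketched below with a dependency-free nearest-neighbour resize; this illustrates the usual approach, not the exact code in deploy/ (real pipelines typically use cv2 with bilinear interpolation).

```python
import numpy as np

def letterbox(img, new_size=640, pad_value=114):
    """Resize an HWC image to (new_size, new_size), preserving aspect
    ratio and padding the borders with a constant value."""
    h, w = img.shape[:2]
    scale = new_size / max(h, w)
    nh, nw = int(round(h * scale)), int(round(w * scale))
    # Nearest-neighbour index maps for rows and columns.
    ys = np.clip((np.arange(nh) / scale).astype(int), 0, h - 1)
    xs = np.clip((np.arange(nw) / scale).astype(int), 0, w - 1)
    resized = img[ys][:, xs]
    # Centre the resized image on a constant-valued canvas.
    canvas = np.full((new_size, new_size, img.shape[2]), pad_value, dtype=img.dtype)
    top, left = (new_size - nh) // 2, (new_size - nw) // 2
    canvas[top:top + nh, left:left + nw] = resized
    return canvas, scale, (top, left)

img = np.zeros((480, 640, 3), dtype=np.uint8)
padded, scale, (top, left) = letterbox(img)
print(padded.shape, scale, top, left)  # (640, 640, 3) 1.0 80 0
```

The returned scale and offsets are what a predictor needs to map boxes from network coordinates back to the original image.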

MindX Deployment

Environment configuration

Reference: MindX environment preparation
Note: MindX currently supports Python 3.9. Prepare a Python 3.9 environment before installing MindX.

  1. Obtain the [Environment Installation Package](https://www.hiascend.com/software/mindx-sdk/commercial) from the MindX official website. Currently, MindX infer version 3.0.0 is supported.

  2. Go to the download page and download Ascend-mindxsdk-mxmanufacture_{version}_linux-{arch}.run.

  3. Place the installation package in a directory on the Ascend 310 machine and unpack it.

  4. If you are not a root user, you need to add executable permissions to the package:

    chmod +x Ascend-mindxsdk-mxmanufacture_{version}_linux-{arch}.run
    

  5. Change to the directory where the package was uploaded and install the mxManufacture development kit:
    ./Ascend-mindxsdk-mxmanufacture_{version}_linux-{arch}.run --install
    
    If the following message is echoed, the installation succeeded:
    The installation is successful
    
    After the installation is complete, the mxManufacture software directory structure is as follows:
    .
    ├── bin
    ├── config
    ├── filelist.txt
    ├── include
    ├── lib
    ├── opensource
    ├── operators
    ├── python
    ├── samples
    ├── set_env.sh
    ├── toolkit
    └── version.info
    
  6. Enter the installation directory of mxManufacture and run the following command to set the MindX SDK environment variables:
    source set_env.sh
    
  7. Enter ./mxVision-3.0.0/python/ and install mindx-3.0.0-py3-none-any.whl
    pip install mindx-3.0.0-py3-none-any.whl
    

Model conversion

  1. Convert the ckpt model to an AIR model. This step must be performed on Ascend 910.

    python ./deploy/export.py --config ./path_to_config/model.yaml --weight ./path_to_ckpt/weight.ckpt --per_batch_size 1 --file_format AIR
    e.g.
    python ./deploy/export.py --config ./configs/yolov5/yolov5n.yaml --weight yolov5n_300e_mAP273-9b16bd7b.ckpt --per_batch_size 1 --file_format AIR
    
    YOLOv7 export must be run on an Ascend 910 machine with MindSpore 2.0 or above.

  2. Convert the AIR model to an OM model with the atc conversion tool. This step requires the MindX environment and runs on Ascend 310:

    atc --model=./path_to_air/weight.air --framework=1 --output=yolo --soc_version=Ascend310
    
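The atc step fails obscurely when the environment is not sourced or the input file is missing. A small pre-flight wrapper like the one below (a hypothetical helper, not part of MindX or mindyolo) makes the failure mode explicit:

```shell
# Hypothetical pre-flight helper: verify the atc converter is on PATH
# and the input .air file exists before attempting the conversion.
check_atc_inputs() {
    model="$1"
    if ! command -v atc >/dev/null 2>&1; then
        echo "atc not found: source the CANN/MindX environment first"
        return 1
    fi
    if [ ! -f "$model" ]; then
        echo "missing input model: $model"
        return 1
    fi
    echo "ok"
}

check_atc_inputs ./path_to_air/weight.air || echo "pre-flight failed"
```

When the check prints ok, run the atc command above unchanged.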

MindX Test

Infer COCO data:

python ./deploy/test.py --model_type MindX --model_path ./path_to_om/weight.om --config ./path_to_config/yolo.yaml
e.g.
python ./deploy/test.py --model_type MindX --model_path ./yolov5n.om --config ./configs/yolov5/yolov5n.yaml

MindX Predict

Infer a single image:

python ./deploy/predict.py --model_type MindX --model_path ./path_to_om/weight.om --config ./path_to_config/yolo.yaml --image_path ./path_to_image/image.jpg
e.g.
python ./deploy/predict.py --model_type MindX --model_path ./yolov5n.om --config ./configs/yolov5/yolov5n.yaml --image_path ./coco/image/val2017/image.jpg

MindIR Deployment

Environment requirements

mindspore>=2.1

Precautions

  1. Currently only Predict is supported.

  2. In principle this can also run on Ascend 910, but it has not been tested.

Model conversion

Convert the ckpt model to a MindIR model. This step can be run on the CPU:

python ./deploy/export.py --config ./path_to_config/model.yaml --weight ./path_to_ckpt/weight.ckpt --per_batch_size 1 --file_format MINDIR --device_target CPU
e.g.
# Run on CPU
python ./deploy/export.py --config ./configs/yolov5/yolov5n.yaml --weight yolov5n_300e_mAP273-9b16bd7b.ckpt --per_batch_size 1 --file_format MINDIR --device_target CPU

MindIR Test

Coming soon

MindIR Predict

Infer a single image:

python ./deploy/predict.py --model_type MindIR --model_path ./path_to_mindir/weight.mindir --config ./path_to_config/yolo.yaml --image_path ./path_to_image/image.jpg
e.g.
python deploy/predict.py --model_type MindIR --model_path ./yolov5n.mindir --config ./configs/yolov5/yolov5n.yaml --image_path ./coco/image/val2017/image.jpg

ONNX deployment

Environment configuration

pip install "onnx>=1.9.0"
pip install "onnxruntime>=1.8.0"

Precautions

  1. Currently not all mindyolo models support ONNX export and inference (YOLOv3 is used as the example here).

  2. Currently only the Predict function is supported.

  3. Exporting to ONNX requires replacing the nn.SiLU operator with an implementation built on the underlying sigmoid operator.

For example: add the following custom layer and replace all occurrences of nn.SiLU in mindyolo:

import mindspore.nn as nn
import mindspore.ops as ops


class EdgeSiLU(nn.Cell):
    """
    SiLU activation function: x * sigmoid(x). Unlike nn.SiLU,
    this implementation supports ONNX export.
    """

    def construct(self, x):
        return x * ops.sigmoid(x)
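Swapping every nn.SiLU occurrence by hand is error-prone; the replacement can also be automated by walking the model tree. The sketch below uses plain-Python stand-in classes (MindSpore is deliberately not imported here) to illustrate the recursion; with a real model you would walk the nn.Cell hierarchy the same way.

```python
# Stand-in classes so the sketch runs without MindSpore installed;
# in mindyolo these would be nn.SiLU, the EdgeSiLU above, and nn.Cell.
class SiLU:
    pass

class EdgeSiLU:
    pass

class Cell:
    def __init__(self, **children):
        self.__dict__.update(children)

def replace_silu(cell):
    """Recursively swap every SiLU attribute for EdgeSiLU, in place."""
    for name, child in vars(cell).items():
        if isinstance(child, SiLU):
            setattr(cell, name, EdgeSiLU())
        elif isinstance(child, Cell):
            replace_silu(child)

model = Cell(act=SiLU(), block=Cell(act=SiLU()))
replace_silu(model)
print(type(model.act).__name__, type(model.block.act).__name__)  # EdgeSiLU EdgeSiLU
```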

Model conversion

Convert the ckpt model to an ONNX model. This step and the Test step can only be run on the CPU.

python ./deploy/export.py --config ./path_to_config/model.yaml --weight ./path_to_ckpt/weight.ckpt --per_batch_size 1 --file_format ONNX --device_target CPU
e.g.
# Run on CPU
python ./deploy/export.py --config ./configs/yolov3/yolov3.yaml --weight yolov3-darknet53_300e_mAP455-adfb27af.ckpt --per_batch_size 1 --file_format ONNX --device_target CPU

ONNX Test

Coming soon

ONNXRuntime Predict

Infer a single image:

python ./deploy/predict.py --model_type ONNX --model_path ./path_to_onnx_model/model.onnx --config ./path_to_config/yolo.yaml --image_path ./path_to_image/image.jpg
e.g.
python ./deploy/predict.py --model_type ONNX --model_path ./yolov3.onnx --config ./configs/yolov3/yolov3.yaml --image_path ./coco/image/val2017/image.jpg

Benchmark and supported models

| Name   | Scale              | Context  | ImageSize | Dataset      | Box mAP (%) | Params | FLOPs  | Recipe | Download     |
|--------|--------------------|----------|-----------|--------------|-------------|--------|--------|--------|--------------|
| YOLOv8 | N                  | D310x1-G | 640       | MS COCO 2017 | 37.2        | 3.2M   | 8.7G   | yaml   | ckpt, mindir |
| YOLOv8 | S                  | D310x1-G | 640       | MS COCO 2017 | 44.6        | 11.2M  | 28.6G  | yaml   | ckpt, mindir |
| YOLOv8 | M                  | D310x1-G | 640       | MS COCO 2017 | 50.5        | 25.9M  | 78.9G  | yaml   | ckpt, mindir |
| YOLOv8 | L                  | D310x1-G | 640       | MS COCO 2017 | 52.8        | 43.7M  | 165.2G | yaml   | ckpt, mindir |
| YOLOv8 | X                  | D310x1-G | 640       | MS COCO 2017 | 53.7        | 68.2M  | 257.8G | yaml   | ckpt, mindir |
| YOLOv7 | Tiny               | D310x1-G | 640       | MS COCO 2017 | 37.5        | 6.2M   | 13.8G  | yaml   | ckpt, mindir |
| YOLOv7 | L                  | D310x1-G | 640       | MS COCO 2017 | 50.8        | 36.9M  | 104.7G | yaml   | ckpt, mindir |
| YOLOv7 | X                  | D310x1-G | 640       | MS COCO 2017 | 52.4        | 71.3M  | 189.9G | yaml   | ckpt, mindir |
| YOLOv5 | N                  | D310x1-G | 640       | MS COCO 2017 | 27.3        | 1.9M   | 4.5G   | yaml   | ckpt, mindir |
| YOLOv5 | S                  | D310x1-G | 640       | MS COCO 2017 | 37.6        | 7.2M   | 16.5G  | yaml   | ckpt, mindir |
| YOLOv5 | M                  | D310x1-G | 640       | MS COCO 2017 | 44.9        | 21.2M  | 49.0G  | yaml   | ckpt, mindir |
| YOLOv5 | L                  | D310x1-G | 640       | MS COCO 2017 | 48.5        | 46.5M  | 109.1G | yaml   | ckpt, mindir |
| YOLOv5 | X                  | D310x1-G | 640       | MS COCO 2017 | 50.5        | 86.7M  | 205.7G | yaml   | ckpt, mindir |
| YOLOv4 | CSPDarknet53       | D310x1-G | 608       | MS COCO 2017 | 45.4        | 27.6M  | 52G    | yaml   | ckpt, mindir |
| YOLOv4 | CSPDarknet53(silu) | D310x1-G | 640       | MS COCO 2017 | 45.8        | 27.6M  | 52G    | yaml   | ckpt, mindir |
| YOLOv3 | Darknet53          | D310x1-G | 640       | MS COCO 2017 | 45.5        | 61.9M  | 156.4G | yaml   | ckpt, mindir |
| YOLOX  | N                  | D310x1-G | 416       | MS COCO 2017 | 24.1        | 0.9M   | 1.1G   | yaml   | ckpt, mindir |
| YOLOX  | Tiny               | D310x1-G | 416       | MS COCO 2017 | 33.3        | 5.1M   | 6.5G   | yaml   | ckpt, mindir |
| YOLOX  | S                  | D310x1-G | 640       | MS COCO 2017 | 40.7        | 9.0M   | 26.8G  | yaml   | ckpt, mindir |
| YOLOX  | M                  | D310x1-G | 640       | MS COCO 2017 | 46.7        | 25.3M  | 73.8G  | yaml   | ckpt, mindir |
| YOLOX  | L                  | D310x1-G | 640       | MS COCO 2017 | 49.2        | 54.2M  | 155.6G | yaml   | ckpt, mindir |
| YOLOX  | X                  | D310x1-G | 640       | MS COCO 2017 | 51.6        | 99.1M  | 281.9G | yaml   | ckpt, mindir |
| YOLOX  | Darknet53          | D310x1-G | 640       | MS COCO 2017 | 47.7        | 63.7M  | 185.3G | yaml   | ckpt, mindir |