Contents

- 0 Project overview
- 1 Environment setup
  - 1.1 Hardware environment
  - 1.2 Software environment
  - 1.3 Runtime environment
- 2 Quick start
- 3 About the models
  - 3.1 Updating the models
- 4 About the opt model conversion tool
- 5 FAQ
This project follows the layout of Paddle-Lite-Demo, ports the Android C++ mask-detection demo to the Raspberry Pi, and adds real-time recognition.

The source code is stored as a dataset at data/data24081/mask.zip.
Through this project you can learn:

1. How to run real-time recognition of **images** or a **camera** stream on a Raspberry Pi, or on a board like the RK3399
2. How to use the opt model conversion tool
3. How to download models from PaddleHub
This project uses armv8 models.

Notes on armv7hf and on 32-bit systems are at the very bottom.

First, a look at the final result: the face is still recognized even when partly covered by a hand, which is not bad.
Development board: Raspberry Pi 4B

CPU: Broadcom BCM2711, quad-core Cortex-A72, armv8 architecture
Memory: 4 GB LPDDR4

Camera: a 5 MP CSI camera with a 15 cm ribbon cable (under 20 RMB on Taobao). A driver-free UVC USB camera also works; just keep the resolution at 720p or below.

The operating system is Debian-Pi-Aarch64: https://gitee.com/openfans-community/Debian-Pi-Aarch64/

(An armv8 processor only delivers its full performance together with a 64-bit system.)

This system is 64-bit and very well put together: it is noticeably faster than the official 32-bit image and much easier to set up than 64-bit Ubuntu 18.04 Server. Recommended!
If you use this system, to enable a CSI camera you need to uncomment

start_x=1

in /boot/config.txt. After rebooting, run ls /dev/video* ; if /dev/video0 is listed, the CSI camera is ready to use (video10, video11 and video12 are not the CSI camera).
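If you want to double-check that the camera actually delivers frames before building the demo, a minimal sketch of my own is below. It assumes OpenCV's Python bindings (e.g. python3-opencv) are installed on the Pi; it is not part of the demo code.

```python
# Quick camera sanity check (assumption: python3-opencv is installed on the Pi).
import cv2

cap = cv2.VideoCapture(0)                     # /dev/video0 (CSI or UVC camera)
cap.set(cv2.CAP_PROP_FRAME_WIDTH, 1280)       # keep the stream at or below 720p
cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 720)
ok, frame = cap.read()
print("camera readable:", ok, "frame shape:", frame.shape if ok else None)
cap.release()
```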
Following the demo, we need to install gcc, g++, make, wget, unzip, libopencv-dev, pkg-config and CMake:
```
$ sudo apt-get update
$ sudo apt-get install gcc g++ make wget unzip libopencv-dev pkg-config
$ wget https://www.cmake.org/files/v3.10/cmake-3.10.3.tar.gz
$ tar -zxvf cmake-3.10.3.tar.gz
$ cd cmake-3.10.3
$ ./configure && make -j4 && sudo make install
```
Note: installing libopencv-dev or g++ may fail with dependency errors. In that case use aptitude to resolve the dependencies:

```
sudo apt install aptitude
sudo aptitude install libopencv-dev
```

The first solution aptitude proposes usually changes nothing, so answer N and it will suggest another one. The second proposal usually downgrades the conflicting newer packages to resolve the dependencies; answer Y and wait for the installation to finish.
The Paddle-Lite library used by the project is one I compiled myself, version v2.3.0. If you want to swap in your own build, replace include and libs under mask_detection_demo/Paddle-Lite.
If all the preparation above is done, download the source code data/data24129/mask.zip, copy it to the Raspberry Pi 4B and unzip it. Below is a short description of the project layout:
- mask_detection
  - Paddle-Lite:
    - include (headers of the prebuilt Paddle-Lite)
    - libs (libraries of the prebuilt Paddle-Lite)
  - code:
    - model (model download link: https://paddle-inference-dist.bj.bcebos.com/mask_detection.tar.gz)
    - images (test images)
    - CMakeLists.txt
    - mask_detection.cc
    - run.sh (uses the camera for real-time recognition when run without arguments, otherwise expects the path of a test image)
This project runs mask detection on images or on a live video stream. Enter the code/ folder and run run.sh in a terminal.

For real-time detection with the camera, run it without arguments (press "q" to quit):

```
./run.sh
```

To detect on an image, pass the image path (press "0" to quit):

```
./run.sh ../images/test_mask_detection.jpg
```
The results look like this:

Image recognition

Real-time video stream recognition
The models currently in use are still pre-v2.2.0 models. If you want to switch to the latest versions, download them from PaddleHub.
The project contains two models: one that recognizes masks and one that detects faces. The basic logic is to detect faces first and then, for each detected face, decide whether a mask is being worn.
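In Python terms, the flow looks roughly like the sketch below. detect_faces() and classify_mask() are hypothetical placeholders standing in for the two Paddle-Lite predictors used in mask_detection.cc; the sketch only illustrates the cascade, not the demo's actual implementation.

```python
# Sketch of the two-stage flow only; detect_faces() / classify_mask() are
# hypothetical stand-ins for the face and mask Paddle-Lite predictors.
import cv2

def detect_faces(image):
    """Hypothetical: run the face-detection model, return boxes as (x, y, w, h)."""
    raise NotImplementedError

def classify_mask(face_crop):
    """Hypothetical: run the mask model, return (label, confidence)."""
    raise NotImplementedError

def detect_masks(image_path):
    image = cv2.imread(image_path)
    results = []
    for (x, y, w, h) in detect_faces(image):        # stage 1: find faces
        face = image[y:y + h, x:x + w]              # crop each detected face
        label, confidence = classify_mask(face)     # stage 2: mask / no mask
        results.append(((x, y, w, h), label, confidence))
    return results
```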
The latest code is downloaded through PaddleHub.

The latest version of the mask-recognition model is currently 1.2.0.

Install PaddleHub with pip (my PC runs Ubuntu 18.04.4 LTS), then install the latest models:

```
pip3 install paddlehub
hub install pyramidbox_lite_mobile==1.1.0
hub install pyramidbox_lite_mobile_mask==1.2.0
```
Download the latest model:
```python
import paddlehub as hub

pyramidbox_lite_mobile_mask = hub.Module(name="pyramidbox_lite_mobile_mask")
# Save the model into the test_program folder
pyramidbox_lite_mobile_mask.processor.save_inference_model(dirname="test_program")
```
After downloading, convert the model with the opt model conversion tool. The conversion command:

```
opt --model_file=./__model__ --param_file=./__params__ --optimize_out_type=naive_buffer --optimize_out=model
```
opt is Paddle-Lite's model conversion tool. Here it is used to turn a model trained with PaddlePaddle into a model that Lite can use for inference.

The official documentation of the opt tool is here: opt

Note that opt is an x86 binary and cannot run on an ARM board.

So we need an x86 Linux environment; here we can use AI Studio to do the opt conversion.
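As a small convenience, the opt binary can also be driven from Python on the x86 host to convert both model folders in one go. This is my own sketch using only the flags shown in this article; it assumes opt has already been downloaded and made executable, and that each folder holds the Combined-form __model__ and __params__ files (with v2.3.0 opt, the face model conversion still aborts, as described further down).

```python
# Batch-convert the Combined-form models with the opt binary (assumed layout).
import os
import subprocess

OPT = os.path.abspath("opt")  # the prebuilt opt binary, already chmod +x
MODEL_DIRS = ["test_program/mask_detector", "test_program/pyramidbox_lite"]

for model_dir in MODEL_DIRS:
    subprocess.run(
        [OPT,
         "--model_file=./__model__",
         "--param_file=./__params__",
         "--optimize_out_type=naive_buffer",
         "--optimize_out=model"],
        cwd=model_dir,   # run inside each model folder
        check=True,
    )
    print(model_dir, "->", os.path.join(model_dir, "model.nb"))
```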
Here is a feel for how the opt tool is used:
```
# First download the opt conversion tool. We use the officially prebuilt binary here;
# to build it yourself, follow the official guide:
# https://paddle-lite.readthedocs.io/zh/latest/user_guides/model_optimize_tool.html
!wget https://github.com/PaddlePaddle/Paddle-Lite/releases/download/v2.3.0/opt
```
```
--2020-03-12 12:39:29--  https://github.com/PaddlePaddle/Paddle-Lite/releases/download/v2.3.0/opt
Resolving github.com (github.com)... 13.250.177.223
Connecting to github.com (github.com)|13.250.177.223|:443... connected.
HTTP request sent, awaiting response... 302 Found
...
HTTP request sent, awaiting response... 200 OK
Length: 13423408 (13M) [application/octet-stream]
Saving to: ‘opt’
opt    21%[===>          ]   2.77M  1.07KB/s    eta 28m 48s^C
```
Then download the models that need converting:
```
# Download the models with hub install
!hub install pyramidbox_lite_mobile==1.1.0
!hub install pyramidbox_lite_mobile_mask==1.2.0

# Convert the models to Combined form, i.e. __model__ and __params__
import paddlehub as hub
pyramidbox_lite_mobile_mask = hub.Module(name="pyramidbox_lite_mobile_mask")
# Save the model into the test_program folder
pyramidbox_lite_mobile_mask.processor.save_inference_model(dirname="test_program")

%cd ~
# Give the opt tool execute permission
!chmod +x opt
# Copy opt into both model folders
!cp opt test_program/mask_detector/ && cp opt test_program/pyramidbox_lite
# Enter the mask model folder
%cd test_program/mask_detector/
# Use opt to convert __model__ and __params__ into model.nb
!./opt --model_file=./__model__ --param_file=./__params__ --optimize_out_type=naive_buffer --optimize_out=model
!ls
```
```
Module pyramidbox_lite_mobile-1.1.0 already installed in /home/aistudio/.paddlehub/modules/pyramidbox_lite_mobile
Module pyramidbox_lite_mobile_mask-1.2.0 already installed in /home/aistudio/.paddlehub/modules/pyramidbox_lite_mobile_mask
/home/aistudio
/home/aistudio/test_program/mask_detector
[W  3/12 13:11: 2.441 ...dle/hongming/Paddle-Lite/lite/api/opt.cc:138 RunOptimize] Load combined-param model. Option model_dir will be ignored
[I  3/12 13:11: 2.441 ...hongming/Paddle-Lite/lite/api/cxx_api.cc:244 Build] Load model from file.
[I  3/12 13:11: 2.451 ...ngming/Paddle-Lite/lite/core/optimizer.h:164 RunPasses] == Running pass: lite_quant_dequant_fuse_pass
...(optimizer fusion passes and graph dumps omitted)...
[I  3/12 13:11: 2.492 ...te/lite/core/mir/generate_program_pass.h:37 GenProgram] insts.size 31
[I  3/12 13:11: 2.508 ...e-Lite/lite/model_parser/model_parser.cc:589 SaveModelNaive] Save naive buffer model in 'model' successfully
model  __model__  model.nb  opt  __params__
```
Afterwards you can see model.nb under test_program/mask_detector/. This is the model already converted by the opt tool; download it, put it on the Raspberry Pi and replace the old one, and the mask model replacement is done.

After converting the mask model you still need to convert the face model, in exactly the same way (the project uses separate models for faces and masks). The v2.3.0 opt tool currently throws an error when converting the face model; I have filed an issue and am waiting for a fix. How to update the models once it is fixed is covered in the FAQ.
```
%cd /home/aistudio/test_program/pyramidbox_lite
!ls && pwd
# Use opt to convert __model__ and __params__ into model.nb
# (this currently errors out; waiting for the official fix of the opt tool)
!./opt --model_file=./__model__ --param_file=./__params__ --optimize_out_type=naive_buffer --optimize_out=model
```
```
/home/aistudio/test_program/pyramidbox_lite
__model__  opt  __params__
/home/aistudio/test_program/pyramidbox_lite
[W  3/12 13:12:36.990 ...dle/hongming/Paddle-Lite/lite/api/opt.cc:138 RunOptimize] Load combined-param model. Option model_dir will be ignored
[I  3/12 13:12:36.990 ...hongming/Paddle-Lite/lite/api/cxx_api.cc:244 Build] Load model from file.
...(optimizer fusion passes omitted)...
[I  3/12 13:12:37. 54 ...ngming/Paddle-Lite/lite/core/optimizer.h:164 RunPasses] == Running pass: static_kernel_pick_pass
[F  3/12 13:12:37. 57 ...hongming/Paddle-Lite/lite/core/kernel.cc:44 GetOutputDeclType] Check failed: type: no type registered for kernel [multiclass_nms/def] output argument [Index]
Aborted (core dumped)
```
Why does the project ship the model as the two files __model__.nb and params.nb?

Because the project uses models from before Paddle-Lite v2.2.0. From v2.3.0 onward the two files are merged into one for convenience, and after v3.0.0 the API for loading old-format models will be removed as well.
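For reference, newer Paddle-Lite releases that ship a Python wheel load the combined model.nb through a single file path. Below is a minimal sketch, assuming a paddlelite pip package recent enough to expose the Python API (not the v2.3.0 used here); the demo itself still loads the old two-file format from C++.

```python
# Illustration only (assumes a recent `pip install paddlelite` wheel, not v2.3.0):
# the combined model.nb produced by opt is loaded from a single file.
from paddlelite.lite import MobileConfig, create_paddle_predictor

config = MobileConfig()
config.set_model_from_file("model.nb")      # one combined file (v2.3.0+ format)
predictor = create_paddle_predictor(config)
print("Paddle-Lite predictor created")
```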
Why not use the latest models in the project?

Because opt throws an error when I convert the pyramidbox_lite model. It is a problem of the opt tool; an issue has been filed and is waiting for the engineers to fix it.
How should the models be updated once they become usable again?

The model format and the model-loading API both changed between v2.2.0 and v2.3.0, so both need to be adjusted.

The concrete steps are as follows:

After that the usage is the same as before, and the current results are acceptable. The picture lags a little, because the Raspberry Pi CPU's ability to process the video stream is limited after all.
What if I am on armv7hf or a 32-bit system?

armv7hf / 32-bit systems need a matching Paddle-Lite library; the one shipped in the project is an armv8 build for 64-bit systems. For armv7hf / 32-bit you have to compile it yourself.

The build goes like this:
```
sudo apt update
# Install the tools needed to build Paddle-Lite from source
sudo apt-get install gcc g++ make wget python unzip
```

CMake 3.10 was already installed while setting up the software environment above, so it does not need to be installed again; if you have not installed it yet, follow section 1.3 Runtime environment above.
```
# Download the Paddle-Lite source code (two ways)

# Way 1:
git clone https://github.com/PaddlePaddle/Paddle-Lite.git
cd Paddle-Lite
# check the branch; the default is develop
git checkout <release-version-tag>

# Way 2 (recommended): download the 2.3.0 release
wget https://github.com/PaddlePaddle/Paddle-Lite/archive/v2.3.0.zip
# unzip it, then cd Paddle-Lite

# Finally, start the build:
./lite/tools/build.sh \
  --build_extra=OFF \
  --arm_os=armlinux \
  --arm_abi=armv7hf \
  --arm_lang=gcc \
  tiny_publish
```
The parameters are explained in the official documentation: link

The build output ends up in build.lite.armlinux.armv7hf.gcc. Take out include and lib and replace the corresponding files on the Raspberry Pi. Also remember to comment out line 4 of run.sh, TARGET_ARCH_ABI=armv8, and uncomment line 5, #TARGET_ARCH_ABI=armv7hf.
Finally, if this was helpful, don't forget to fork it!

A like, a favorite and a follow on top of that would be even better, haha~

Run this project with one click on AI Studio: https://aistudio.baidu.com/aistudio/projectdetail/315730