///
/// Copyright (c) 2017-2019 Arm Limited.
///
/// SPDX-License-Identifier: MIT
///
/// Permission is hereby granted, free of charge, to any person obtaining a copy
/// of this software and associated documentation files (the "Software"), to
/// deal in the Software without restriction, including without limitation the
/// rights to use, copy, modify, merge, publish, distribute, sublicense, and/or
/// sell copies of the Software, and to permit persons to whom the Software is
/// furnished to do so, subject to the following conditions:
///
/// The above copyright notice and this permission notice shall be included in all
/// copies or substantial portions of the Software.
///
/// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
/// IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
/// FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
/// AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
/// LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
/// OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
/// SOFTWARE.
///
namespace arm_compute
{
/**
@page data_import Importing data from existing models

@tableofcontents

@section caffe_data_extractor Extract data from a pre-trained Caffe model

Pre-trained Caffe models can be found in the <a href="https://github.com/BVLC/caffe/wiki/Model-Zoo">Model Zoo</a> on
Caffe's official GitHub repository.

The caffe_data_extractor.py script provided in the scripts folder shows how to
extract parameter values from a trained model.

@note Complex networks might require modifications to the script to work properly.

@subsection caffe_how_to How to use the script

Install Caffe following <a href="http://caffe.berkeleyvision.org/installation.html">Caffe's installation instructions</a>.
Make sure pycaffe has been added to your PYTHONPATH.

Download the pre-trained Caffe model.

Run the caffe_data_extractor.py script:

        python caffe_data_extractor.py -m <caffe model> -n <caffe netlist>

For example, to extract the data from the pre-trained Caffe AlexNet model to binary files:

        python caffe_data_extractor.py -m /path/to/bvlc_alexnet.caffemodel -n /path/to/caffe/models/bvlc_alexnet/deploy.prototxt

The script has been tested under Python 2.7.

@subsection caffe_result  What is the expected output from the script

If the script runs successfully, it prints the name and shape of each layer onto the standard
output and generates *.npy files containing the weights and biases of each layer.

The arm_compute::utils::load_trained_data function shows how to load
the weights and biases from the .npy files into a tensor with the help of an Accessor.

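As a sketch of what happens downstream, the generated .npy files can be read back with NumPy before being handed to the loader; the file name and layer shape below are purely illustrative, not produced by any particular model.

```python
import numpy as np

# Illustrative only: fake one layer's weights the way the extractor script
# would dump them, then read them back the way a loader would.
weights = np.random.rand(96, 3, 11, 11).astype(np.float32)  # conv1-like shape
np.save("conv1_w.npy", weights)

loaded = np.load("conv1_w.npy")
print(loaded.shape, loaded.dtype)  # (96, 3, 11, 11) float32
```
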
@section tensorflow_data_extractor Extract data from a pre-trained TensorFlow model

The tensorflow_data_extractor.py script extracts trainable parameters (e.g. the values of weights and biases) from a
trained TensorFlow model. A TensorFlow model consists of the following two files:

{model_name}.data-{step}-{global_step}: A binary file containing the values of each variable.

{model_name}.meta: A binary file containing a MetaGraph struct which defines the graph structure of the neural
network.

@note Since TensorFlow version 0.11 the binary checkpoint file which contains the values of each parameter has the format:
    {model_name}.data-{step}-of-{max_step}
instead of:
    {model_name}.ckpt
When dealing with binary files of version >= 0.11, only pass {model_name} to the -m option;
when dealing with binary files of version < 0.11, pass the whole file name {model_name}.ckpt to the -m option.

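The checkpoint naming rule in the note above can be captured in a few lines of Python; checkpoint_arg is a hypothetical helper written for illustration and is not part of the extractor script.

```python
def checkpoint_arg(model_name, tf_version):
    """Return the value to pass to the -m option.

    Hypothetical helper: with TensorFlow >= 0.11 only the bare model name
    is passed; with earlier versions the full {model_name}.ckpt name is.
    """
    major, minor = (int(x) for x in tf_version.split(".")[:2])
    if (major, minor) >= (0, 11):
        return model_name           # e.g. "bvlc_alexnet"
    return model_name + ".ckpt"     # e.g. "bvlc_alexnet.ckpt"

print(checkpoint_arg("bvlc_alexnet", "1.2"))   # bvlc_alexnet
print(checkpoint_arg("bvlc_alexnet", "0.10"))  # bvlc_alexnet.ckpt
```
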
@note This script relies on the parameters to be extracted being in the
'trainable_variables' tensor collection. By default all variables are automatically added to this collection unless
specified otherwise by the user. Thus, should a user alter this default behavior and/or want to extract parameters from other
collections, tf.GraphKeys.TRAINABLE_VARIABLES should be replaced accordingly.

@subsection tensorflow_how_to How to use the script

Install TensorFlow and NumPy.

Download the pre-trained TensorFlow model.

Run the tensorflow_data_extractor.py script:

        python tensorflow_data_extractor.py -m <path_to_binary_checkpoint_file> -n <path_to_metagraph_file>

For example, to extract the data from the pre-trained TensorFlow AlexNet model to binary files:

        python tensorflow_data_extractor.py -m /path/to/bvlc_alexnet -n /path/to/bvlc_alexnet.meta

Or, for binary checkpoint files from before TensorFlow 0.11:

        python tensorflow_data_extractor.py -m /path/to/bvlc_alexnet.ckpt -n /path/to/bvlc_alexnet.meta

@note With TensorFlow versions >= 0.11, only the model name is passed to the -m option.

The script has been tested with TensorFlow 1.2 and 1.3 on Python 2.7.6 and Python 3.4.3.

@subsection tensorflow_result What is the expected output from the script

If the script runs successfully, it prints the name and shape of each parameter onto the standard output and generates
 *.npy files containing the weights and biases of each layer.

The arm_compute::utils::load_trained_data function shows how to load
the weights and biases from the .npy files into a tensor with the help of an Accessor.

@section tf_frozen_model_extractor Extract data from a pre-trained frozen TensorFlow model

The tf_frozen_model_extractor.py script extracts trainable parameters (e.g. the values of weights and biases) from a
frozen trained TensorFlow model.

@subsection tensorflow_frozen_how_to How to use the script

Install TensorFlow and NumPy.

Download the pre-trained TensorFlow model and freeze it using the architecture and the checkpoint file.

Run the tf_frozen_model_extractor.py script:

        python tf_frozen_model_extractor.py -m <path_to_frozen_pb_model_file> -d <path_to_store_parameters>

For example, to extract the data from a pre-trained TensorFlow model to binary files:

        python tf_frozen_model_extractor.py -m /path/to/inceptionv3.pb -d ./data

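One .npy file per parameter is written under the directory given to the -d option. A plausible sketch of how a TensorFlow tensor name could be mapped to a file name is shown below; this naming scheme is an assumption for illustration and may differ from what tf_frozen_model_extractor.py actually does.

```python
import os

def tensor_to_filename(tensor_name, out_dir):
    # Hypothetical mapping: replace the '/' and ':' characters, which are
    # not filesystem-friendly, then append the .npy extension.
    base = tensor_name.replace("/", "_").replace(":", "_")
    return os.path.join(out_dir, base + ".npy")

print(tensor_to_filename("conv1/weights:0", "./data"))  # ./data/conv1_weights_0.npy
```
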
@subsection tensorflow_frozen_result What is the expected output from the script

If the script runs successfully, it prints the name and shape of each parameter onto the standard output and generates
 *.npy files containing the weights and biases of each layer.

The arm_compute::utils::load_trained_data function shows how to load
the weights and biases from the .npy files into a tensor with the help of an Accessor.

@section validate_examples Validating examples

Compute Library provides a list of graph examples that are used in the context of integration and performance testing.
The provenance of each model is part of its documentation and no structural or data alterations have been applied to any
of them unless explicitly specified otherwise in the documentation.

Using one of the provided scripts will generate files containing the trainable parameters.

You can validate a given graph example on a list of inputs by running:

    LD_LIBRARY_PATH=lib ./<graph_example> --validation-range='<validation_range>' --validation-file='<validation_file>' --validation-path='/path/to/test/images/' --data='/path/to/weights/'

e.g.:

    LD_LIBRARY_PATH=lib ./bin/graph_alexnet --target=CL --layout=NHWC --type=F32 --threads=4 --validation-range='16666,24998' --validation-file='val.txt' --validation-path='images/' --data='data/'

where:
    The validation file is a plain text file containing a list of images along with their expected label values,
    e.g.:

        val_00000001.JPEG 65
        val_00000002.JPEG 970
        val_00000003.JPEG 230
        val_00000004.JPEG 809
        val_00000005.JPEG 516

    --validation-range is the index range of the images within the validation file that you want to check,
    e.g.:

       --validation-range='100,200' will validate the images at indices 100 to 200 of the validation file.

    This can be useful when the validation process needs to be parallelized.
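
The interaction between the validation file and --validation-range can be sketched in plain Python; select_range and the inclusive-index reading of the option are an illustrative interpretation, not the library's implementation.

```python
def select_range(lines, validation_range):
    # Parse '<image> <label>' lines and keep the entries whose index falls
    # inside the given range, mirroring how --validation-range restricts
    # which images are checked. Illustrative sketch only.
    start, end = (int(x) for x in validation_range.split(","))
    entries = []
    for index, line in enumerate(lines):
        image, label = line.split()
        if start <= index <= end:
            entries.append((image, int(label)))
    return entries

lines = [
    "val_00000001.JPEG 65",
    "val_00000002.JPEG 970",
    "val_00000003.JPEG 230",
]
print(select_range(lines, "1,2"))  # entries for the 2nd and 3rd images
```
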
*/
}