# TensorFlow Lite Task Library

TensorFlow Lite Task Library contains a set of powerful and easy-to-use
task-specific libraries for app developers to create ML experiences with
TFLite. It provides optimized out-of-box model interfaces for popular machine
learning tasks, such as image classification and question answering. The model
interfaces are specifically designed for each task to achieve the best
performance and usability. Task Library works cross-platform and is supported
in Java, C++, and Swift.

## What to expect from the Task Library

*   **Clean and well-defined APIs usable by non-ML-experts** \
    Inference can be done within just 5 lines of code; see the sketch after
    this list. Use the powerful and easy-to-use APIs in the Task Library as
    building blocks to help you easily develop ML with TFLite on mobile
    devices.

*   **Complex but common data processing** \
    Supports common vision and natural language processing logic to convert
    between your data and the data format required by the model. Provides the
    same, shareable processing logic for training and inference.

*   **High performance gain** \
    Data processing takes no more than a few milliseconds, ensuring a fast
    inference experience with TensorFlow Lite.

*   **Extensibility and customization** \
    You can leverage all the benefits the Task Library infrastructure provides
    and easily build your own Android/iOS inference APIs.

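For instance, classifying an image with the Java `ImageClassifier` API takes
only a few lines. The snippet below is a minimal sketch, assuming a
classification model named `model.tflite` bundled in the app assets, an
Android `Context` called `context`, and a `Bitmap` called `bitmap` as input;
`createFromFile` may throw an `IOException`:

```java
import java.util.List;
import org.tensorflow.lite.support.image.TensorImage;
import org.tensorflow.lite.task.vision.classifier.Classifications;
import org.tensorflow.lite.task.vision.classifier.ImageClassifier;

// Create the classifier from the model file bundled in the app assets.
ImageClassifier classifier = ImageClassifier.createFromFile(context, "model.tflite");
// Wrap the input bitmap in the Task Library image format.
TensorImage image = TensorImage.fromBitmap(bitmap);
// Run inference; each Classifications holds the labeled results for one head.
List<Classifications> results = classifier.classify(image);
```
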
## Supported tasks

Below is the list of the supported task types. The list is expected to grow as
we continue enabling more and more use cases.

*   **Vision APIs**

    *   [ImageClassifier](image_classifier.md)
    *   [ObjectDetector](object_detector.md)
    *   [ImageSegmenter](image_segmenter.md)
    *   [ImageSearcher](image_searcher.md)
    *   [ImageEmbedder](image_embedder.md)

*   **Natural Language (NL) APIs**

    *   [NLClassifier](nl_classifier.md)
    *   [BertNLClassifier](bert_nl_classifier.md)
    *   [BertQuestionAnswerer](bert_question_answerer.md)
    *   [TextSearcher](text_searcher.md)
    *   [TextEmbedder](text_embedder.md)

*   **Audio APIs**

    *   [AudioClassifier](audio_classifier.md)

*   **Custom APIs**

    *   Extend the Task API infrastructure and build a
        [customized API](customized_task_api.md).

## Run Task Library with Delegates

[Delegates](https://www.tensorflow.org/lite/performance/delegates) enable
hardware acceleration of TensorFlow Lite models by leveraging on-device
accelerators such as the [GPU](https://www.tensorflow.org/lite/performance/gpu)
and [Coral Edge TPU](https://coral.ai/). Utilizing them for neural network
operations provides huge benefits in terms of latency and power efficiency. For
example, GPUs can provide up to a
[5x speedup](https://blog.tensorflow.org/2020/08/faster-mobile-gpu-inference-with-opencl.html)
in latency on mobile devices, and Coral Edge TPUs run inference
[10x faster](https://coral.ai/docs/edgetpu/benchmarks/) than desktop CPUs.

Task Library provides easy configuration and fallback options for you to set up
and use delegates. The following accelerators are now supported in the Task
API:

*   Android
    *   [GPU](https://www.tensorflow.org/lite/performance/gpu): Java / C++
    *   [NNAPI](https://www.tensorflow.org/lite/android/delegates/nnapi):
        Java / C++
    *   [Hexagon](https://www.tensorflow.org/lite/android/delegates/hexagon):
        C++
*   Linux / Mac
    *   [Coral Edge TPU](https://coral.ai/): C++
*   iOS
    *   [Core ML delegate](https://www.tensorflow.org/lite/performance/coreml_delegate):
        C++

Acceleration support in the Task Swift / Web APIs is coming soon.

### Example usage of GPU on Android in Java

Step 1. Add the GPU delegate plugin library to your module's `build.gradle`
file:

```java
dependencies {
    // Import Task Library dependency for vision, text, or audio.
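    // For example, for vision tasks (artifact name from the Task Library
    // releases; use the text or audio artifact for other tasks):
    // implementation 'org.tensorflow:tensorflow-lite-task-vision'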

    // Import the GPU delegate plugin library for GPU inference.
    implementation 'org.tensorflow:tensorflow-lite-gpu-delegate-plugin'
}
```

Note: NNAPI comes with the Task Library targets for vision, text, and audio by
default.
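
Since NNAPI is bundled by default, switching to it requires no extra Gradle
dependency. A minimal sketch, assuming the `useNnapi()` builder method on
`BaseOptions.Builder` (analogous to `useGpu()` below):

```java
// Select the NNAPI delegate; no delegate plugin dependency is needed.
BaseOptions baseOptions = BaseOptions.builder().useNnapi().build();
```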

Step 2. Configure the GPU delegate in the task options through
[BaseOptions](https://www.tensorflow.org/lite/api_docs/java/org/tensorflow/lite/task/core/BaseOptions.Builder).
For example, you can set up GPU in `ObjectDetector` as follows:

```java
// Turn on GPU delegation.
BaseOptions baseOptions = BaseOptions.builder().useGpu().build();
// Configure other options in ObjectDetector.
ObjectDetectorOptions options =
    ObjectDetectorOptions.builder()
        .setBaseOptions(baseOptions)
        .setMaxResults(1)
        .build();

// Create ObjectDetector from options.
ObjectDetector objectDetector =
    ObjectDetector.createFromFileAndOptions(context, modelFile, options);

// Run inference.
List<Detection> results = objectDetector.detect(image);
```

### Example usage of GPU on Android in C++

Step 1. Depend on the GPU delegate plugin in your Bazel build target, such as:

```
deps = [
  "//tensorflow_lite_support/acceleration/configuration:gpu_plugin", # for GPU
]
```

Note: the `gpu_plugin` target is separate from the
[GPU delegate target](https://github.com/tensorflow/tensorflow/tree/master/tensorflow/lite/delegates/gpu).
`gpu_plugin` wraps the GPU delegate target and can provide a safety guard,
i.e., fall back to the TFLite CPU path on delegation errors.

Other delegate options include:

```
"//tensorflow_lite_support/acceleration/configuration:nnapi_plugin", # for NNAPI
"//tensorflow_lite_support/acceleration/configuration:hexagon_plugin", # for Hexagon
```

Step 2. Configure the GPU delegate in the task options. For example, you can
set up GPU in `BertQuestionAnswerer` as follows:

```c++
// Initialization.
BertQuestionAnswererOptions options;
// Load the TFLite model.
auto base_options = options.mutable_base_options();
base_options->mutable_model_file()->set_file_name(model_file);
// Turn on GPU delegation.
auto tflite_settings = base_options->mutable_compute_settings()->mutable_tflite_settings();
tflite_settings->set_delegate(Delegate::GPU);
// (optional) Turn on automatic fallback to the TFLite CPU path on delegation errors.
tflite_settings->mutable_fallback_settings()->set_allow_automatic_fallback_on_execution_error(true);

// Create QuestionAnswerer from options.
std::unique_ptr<QuestionAnswerer> answerer = BertQuestionAnswerer::CreateFromOptions(options).value();

// Run inference on GPU.
std::vector<QaAnswer> results = answerer->Answer(context_of_question, question_to_ask);
```

Explore more advanced accelerator settings
[here](https://github.com/tensorflow/tensorflow/blob/1a8e885b864c818198a5b2c0cbbeca5a1e833bc8/tensorflow/lite/experimental/acceleration/configuration/configuration.proto).

### Example usage of Coral Edge TPU in Python

Configure Coral Edge TPU in the base options of the task. For example, you can
set up Coral Edge TPU in `ImageClassifier` as follows:

```python
# Imports.
from tflite_support.task import vision
from tflite_support.task import core

# Initialize options and turn on Coral Edge TPU delegation.
base_options = core.BaseOptions(file_name=model_path, use_coral=True)
options = vision.ImageClassifierOptions(base_options=base_options)

# Create ImageClassifier from options.
classifier = vision.ImageClassifier.create_from_options(options)

# Run inference on Coral Edge TPU.
image = vision.TensorImage.create_from_file(image_path)
classification_result = classifier.classify(image)
```

### Example usage of Coral Edge TPU in C++

Step 1. Depend on the Coral Edge TPU delegate plugin in your Bazel build
target, such as:

```
deps = [
  "//tensorflow_lite_support/acceleration/configuration:edgetpu_coral_plugin", # for Coral Edge TPU
]
```

Step 2. Configure Coral Edge TPU in the task options. For example, you can set
up Coral Edge TPU in `ImageClassifier` as follows:

```c++
// Initialization.
ImageClassifierOptions options;
// Load the TFLite model.
options.mutable_base_options()->mutable_model_file()->set_file_name(model_file);
// Turn on Coral Edge TPU delegation.
options.mutable_base_options()->mutable_compute_settings()->mutable_tflite_settings()->set_delegate(Delegate::EDGETPU_CORAL);
// Create ImageClassifier from options.
std::unique_ptr<ImageClassifier> image_classifier = ImageClassifier::CreateFromOptions(options).value();

// Run inference on Coral Edge TPU.
const ClassificationResult result = image_classifier->Classify(*frame_buffer).value();
```

Step 3. Install the `libusb` library as shown below. If it is already
installed, skip to the next step.

```bash
# On Linux
sudo apt-get install libusb-1.0-0-dev

# On macOS
port install libusb
# or
brew install libusb
```

Step 4. Compile with the following configurations in your Bazel command:

```bash
# On Linux
--define darwinn_portable=1 --linkopt=-lusb-1.0

# On macOS, add '--linkopt=-lusb-1.0 --linkopt=-L/opt/local/lib/' if you are
# using MacPorts or '--linkopt=-lusb-1.0 --linkopt=-L/opt/homebrew/lib' if you
# are using Homebrew.
--define darwinn_portable=1 --linkopt=-L/opt/local/lib/ --linkopt=-lusb-1.0

# Windows is not supported yet.
```

Try out the
[Task Library CLI demo tool](https://github.com/tensorflow/tflite-support/tree/master/tensorflow_lite_support/examples/task/vision/desktop)
with your Coral Edge TPU devices. Explore more of the
[pretrained Edge TPU models](https://coral.ai/models/) and
[advanced Edge TPU settings](https://github.com/tensorflow/tensorflow/blob/1a8e885b864c818198a5b2c0cbbeca5a1e833bc8/tensorflow/lite/experimental/acceleration/configuration/configuration.proto#L275).

### Example usage of Core ML Delegate in C++

A complete example can be found at
[Image Classifier Core ML Delegate Test](https://github.com/tensorflow/tflite-support/blob/master/tensorflow_lite_support/ios/test/task/vision/image_classifier/TFLImageClassifierCoreMLDelegateTest.mm).

Step 1. Depend on the Core ML delegate plugin in your Bazel build target, such
as:

```
deps = [
  "//tensorflow_lite_support/acceleration/configuration:coreml_plugin", # for Core ML Delegate
]
```

Step 2. Configure the Core ML delegate in the task options. For example, you
can set up the Core ML delegate in `ImageClassifier` as follows:

```c++
// Initialization.
ImageClassifierOptions options;
// Load the TFLite model.
options.mutable_base_options()->mutable_model_file()->set_file_name(model_file);
// Turn on Core ML delegation.
options.mutable_base_options()->mutable_compute_settings()->mutable_tflite_settings()->set_delegate(::tflite::proto::Delegate::CORE_ML);
// Set DEVICES_ALL to enable Core ML delegation on any device (in contrast to
// DEVICES_WITH_NEURAL_ENGINE, which creates the Core ML delegate only on
// devices with the Apple Neural Engine).
options.mutable_base_options()->mutable_compute_settings()->mutable_tflite_settings()->mutable_coreml_settings()->set_enabled_devices(::tflite::proto::CoreMLSettings::DEVICES_ALL);
// Create ImageClassifier from options.
std::unique_ptr<ImageClassifier> image_classifier = ImageClassifier::CreateFromOptions(options).value();

// Run inference on Core ML.
const ClassificationResult result = image_classifier->Classify(*frame_buffer).value();
```