# TensorFlow Lite for Android

TensorFlow Lite lets you run TensorFlow machine learning (ML) models in your
Android apps. The TensorFlow Lite system provides prebuilt and customizable
execution environments for running models on Android quickly and efficiently,
including options for hardware acceleration.

## Learning roadmap {:.hide-from-toc}

<section class="devsite-landing-row devsite-landing-row-3-up devsite-landing-row-100" header-position="top">
<div class="devsite-landing-row-inner">
<div class="devsite-landing-row-group">
  <div class="devsite-landing-row-item devsite-landing-row-item-no-media tfo-landing-page-card" description-position="bottom">
    <div class="devsite-landing-row-item-description">
    <div class="devsite-landing-row-item-body">
    <div class="devsite-landing-row-item-description-content">
      <a href="#machine_learning_models">
      <h3 class="no-link hide-from-toc" id="code-design" data-text="Code design">Code design</h3></a>
      Learn concepts and code design for building Android apps with TensorFlow
      Lite, just <a href="#machine_learning_models">keep reading</a>.
    </div>
    </div>
    </div>
  </div>
  <div class="devsite-landing-row-item devsite-landing-row-item-no-media tfo-landing-page-card" description-position="bottom">
    <div class="devsite-landing-row-item-description">
    <div class="devsite-landing-row-item-body">
    <div class="devsite-landing-row-item-description-content">
      <a href="./quickstart">
      <h3 class="no-link hide-from-toc" id="coding-quickstart" data-text="Coding Quickstart">Coding Quickstart</h3></a>
      Start coding an Android app with TensorFlow Lite right away with the
      <a href="./quickstart">Quickstart</a>.
    </div>
    </div>
    </div>
  </div>
  <div class="devsite-landing-row-item devsite-landing-row-item-no-media tfo-landing-page-card" description-position="bottom">
    <div class="devsite-landing-row-item-description">
    <div class="devsite-landing-row-item-body">
    <div class="devsite-landing-row-item-description-content">
      <a href="../models">
      <h3 class="no-link hide-from-toc" id="ml-models" data-text="ML models">ML models</h3></a>
      Learn about choosing and using ML models with TensorFlow Lite; see the
      <a href="../models">Models</a> docs.
    </div>
    </div>
    </div>
  </div>
</div>
</div>
</section>

## Machine learning models

TensorFlow Lite uses TensorFlow models that are converted into a smaller,
portable, more efficient machine learning model format. You can use pre-built
models with TensorFlow Lite on Android, or build your own TensorFlow models and
convert them to TensorFlow Lite format.

**Key Point:** TensorFlow Lite models and TensorFlow models have a *different
format and are not interchangeable.* TensorFlow models can be converted into
TensorFlow Lite models, but that process is not reversible.

This page discusses using already-built machine learning models and does not
cover building, training, testing, or converting models. Learn more about
picking, modifying, building, and converting machine learning models for
TensorFlow Lite in the [Models](../models) section.

## Run models on Android

A TensorFlow Lite model running inside an Android app takes in data, processes
the data, and generates a prediction based on the model's logic. A TensorFlow
Lite model requires a special runtime environment in order to execute, and the
data that is passed into the model must be in a specific data format, called a
[*tensor*](../../guide/tensor). When a model processes the data, known as running
an *inference*, it generates prediction results as new tensors, and passes them
to the Android app so it can take action, such as showing the result to a user
or executing additional business logic.

![Functional execution flow for TensorFlow Lite models in Android
apps](../../images/lite/android/tf_execution_flow_android.png)

**Figure 1.** Functional execution flow for TensorFlow Lite models in Android
apps.

At the functional design level, your Android app needs the following elements to
run a TensorFlow Lite model:

-   TensorFlow Lite **runtime environment** for executing the model
-   **Model input handler** to transform data into tensors
-   **Model output handler** to receive output result tensors and interpret them
    as prediction results

The following sections describe how the TensorFlow Lite libraries and tools
provide these functional elements.
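
To make these elements concrete, the following Kotlin sketch shows how they map
to code when using the Interpreter API directly. The model file name, tensor
shapes, and placeholder values are assumptions for illustration only; the
sections below describe the libraries that provide this functionality.

```kotlin
import android.content.Context
import org.tensorflow.lite.Interpreter
import org.tensorflow.lite.support.common.FileUtil

fun runModelSketch(context: Context) {
    // Runtime environment: an Interpreter created from a model bundled in assets.
    val model = FileUtil.loadMappedFile(context, "model.tflite")
    val interpreter = Interpreter(model)

    // Model input handler: transform app data into a tensor-shaped buffer.
    val input = arrayOf(floatArrayOf(0.1f, 0.2f, 0.3f, 0.4f))  // assumed shape [1, 4]

    // Model output handler: receive the result tensor and interpret it.
    val output = arrayOf(FloatArray(3))                         // assumed shape [1, 3]
    interpreter.run(input, output)
    val predictedIndex = output[0].indices.maxByOrNull { output[0][it] }
    interpreter.close()
}
```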

## Build apps with TensorFlow Lite

This section describes the recommended, most common path for implementing
TensorFlow Lite in your Android app. You should pay most attention to the
[runtime environment](#runtime) and [development
libraries](#apis) sections. If you have developed a custom
model, make sure to review the [Advanced development
paths](#adv_development) section.

### Runtime environment options {:#runtime}

There are several ways you can enable a runtime environment for executing models
in your Android app. These are the preferred options:

-   **Standard TensorFlow Lite runtime environment (recommended)**
-   [Google Play services runtime environment](./play_services)
    for TensorFlow Lite (Beta)

In general, you should use the standard TensorFlow Lite runtime environment,
since this is the most versatile environment for running models on Android. The
runtime environment provided by Google Play services is more convenient and
space-efficient than the standard environment, since it is loaded from Google
Play resources and not bundled into your app. Some advanced use cases require
customization of the model runtime environment, which is described in the
[Advanced runtime environments](#adv_runtime) section.

You access these runtime environments in your Android app by adding TensorFlow
Lite development libraries to your app development environment. For information
about how to use the standard runtime environment in your app, see the next
section. For information about other runtime environments, see
[Advanced runtime environments](#adv_runtime).
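
If you choose the Google Play services runtime, you typically initialize it
before creating an interpreter. The following sketch is based on the Play
services TensorFlow Lite API described on the linked page; treat the exact class
names and initialization flow as assumptions and confirm them against that
documentation.

```kotlin
import android.content.Context
import com.google.android.gms.tflite.java.TfLite
import org.tensorflow.lite.InterpreterApi
import java.nio.MappedByteBuffer

fun createPlayServicesInterpreter(context: Context, model: MappedByteBuffer) {
    // Initialize the TensorFlow Lite runtime provided by Google Play services.
    TfLite.initialize(context).addOnSuccessListener {
        // Ask the Interpreter API to use the system-provided (Play services) runtime.
        val options = InterpreterApi.Options()
            .setRuntime(InterpreterApi.Options.TfLiteRuntime.FROM_SYSTEM_ONLY)
        val interpreter = InterpreterApi.create(model, options)
        // ... run inferences with the interpreter ...
    }
}
```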

### Development APIs and libraries {:#apis}

There are two main APIs you can use to integrate TensorFlow Lite machine
learning models into your Android app:

*   **[TensorFlow Lite Task API](../api_docs/java/org/tensorflow/lite/task/core/package-summary) (recommended)**
*   [TensorFlow Lite Interpreter API](../api_docs/java/org/tensorflow/lite/InterpreterApi)

The
[Interpreter API](../api_docs/java/org/tensorflow/lite/InterpreterApi)
provides classes and methods for running inferences with existing TensorFlow
Lite models. The TensorFlow Lite
[Task API](../api_docs/java/org/tensorflow/lite/task/core/package-summary)
wraps the Interpreter API and provides a higher-level programming interface
for performing common machine learning tasks on visual, audio, and
text data. You should use the Task API unless you find that it does not support
your specific use case.
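
For example, image classification with the Task API can be reduced to a few
calls. This is a minimal sketch, assuming a classification model stored in the
app's assets folder; the file name is a placeholder.

```kotlin
import android.content.Context
import android.graphics.Bitmap
import org.tensorflow.lite.support.image.TensorImage
import org.tensorflow.lite.task.vision.classifier.ImageClassifier

fun classifyImage(context: Context, bitmap: Bitmap) {
    // Load a classification model from assets (file name is a placeholder).
    val classifier = ImageClassifier.createFromFile(context, "model.tflite")

    // The Task library converts the bitmap into the tensor format the model expects.
    val results = classifier.classify(TensorImage.fromBitmap(bitmap))

    // Each result contains categories with a label and a confidence score.
    val topCategory = results.firstOrNull()?.categories?.firstOrNull()
}
```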

#### Libraries

You can access the Task API by including the [TensorFlow Lite Task Library](../inference_with_metadata/task_library/overview)
in your Android app. The Task library also includes the Interpreter API classes
and methods if you need them.

If you just want to use the Interpreter API, you can include the [TensorFlow Lite
library](./development#lite_lib). Alternatively, you can include the [Google Play
services library](./play_services#1_add_project_dependencies)
for TensorFlow Lite, and access the Interpreter API through Play services,
without bundling a separate library into your app.

The [TensorFlow Lite Support library](./development#support_lib) is also
available to provide additional functionality for managing data for models,
model metadata, and model inference results.
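
As a rough illustration, these libraries are declared as dependencies in your
module's Gradle build file. The artifact names below are the commonly used Task
vision and Support library packages, and the version numbers are placeholders;
check [Development tools for Android](./development) for the current
coordinates.

```kotlin
// build.gradle.kts (module level); versions shown are placeholders.
dependencies {
    // Task library for vision tasks; also exposes the Interpreter API classes.
    implementation("org.tensorflow:tensorflow-lite-task-vision:0.4.0")
    // Optional Support library utilities for input and output handling.
    implementation("org.tensorflow:tensorflow-lite-support:0.4.0")
}
```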

For programming details about using TensorFlow Lite libraries and runtime
environments, see
[Development tools for Android](./development).

### Obtain models {:#models}

Running a model in an Android app requires a TensorFlow Lite-format model. You
can use prebuilt models or build one with TensorFlow and convert it to the Lite
format. For more information on obtaining models for your Android app, see the
TensorFlow Lite [Models](../models)
section.

### Handle input data {:#input_data}

Any data you pass into an ML model must be a tensor with a specific data
structure, often called the *shape* of the tensor. To process data with a model,
your app code must transform data from its native format, such as image, text,
or audio data, into a tensor in the required shape for your model.

**Note:** Many TensorFlow Lite models come with embedded
[metadata](../inference_with_metadata/overview)
that describes the required input data.

The
[TensorFlow Lite Task library](../inference_with_metadata/task_library/overview)
provides data handling logic for transforming visual, text, and audio data into
tensors with the correct shape to be processed by a TensorFlow Lite model.
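
For example, the Support library's image utilities can resize a bitmap into the
tensor shape a vision model expects. This is a minimal sketch; the 224x224
target size is an assumption and should match your model's input shape.

```kotlin
import android.graphics.Bitmap
import org.tensorflow.lite.support.image.ImageProcessor
import org.tensorflow.lite.support.image.TensorImage
import org.tensorflow.lite.support.image.ops.ResizeOp

fun toModelInput(bitmap: Bitmap): TensorImage {
    // Resize the image to the model's expected input size (224x224 is an assumption).
    val processor = ImageProcessor.Builder()
        .add(ResizeOp(224, 224, ResizeOp.ResizeMethod.BILINEAR))
        .build()
    return processor.process(TensorImage.fromBitmap(bitmap))
}
```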

### Run inferences {:#inferences}

Processing data through a model to generate a prediction result is known as
running an *inference*. Running an inference in an Android app requires a
TensorFlow Lite [runtime environment](#runtime), a
[model](#models) and [input data](#input_data).

The speed at which a model can generate an inference on a particular device
depends on the size of the data processed, the complexity of the model, and the
available computing resources such as memory and CPU, or specialized processors
called *accelerators*. Machine learning models can run faster on specialized
processors such as graphics processing units (GPUs) and tensor processing
units (TPUs) by using TensorFlow Lite hardware drivers called *delegates*. For
more information about delegates and hardware acceleration of model processing,
see the [Hardware acceleration overview](../performance/delegates).
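
As an illustration, the Interpreter API lets you attach a delegate when the
interpreter is created. This sketch assumes the separate GPU delegate library is
added as a dependency, and the tensor shapes are placeholders.

```kotlin
import org.tensorflow.lite.Interpreter
import org.tensorflow.lite.gpu.GpuDelegate
import java.nio.MappedByteBuffer

fun runWithGpuDelegate(model: MappedByteBuffer, input: Array<FloatArray>): Array<FloatArray> {
    // Attach the GPU delegate so supported operations run on the GPU.
    val options = Interpreter.Options().addDelegate(GpuDelegate())
    val interpreter = Interpreter(model, options)

    // Output shape [1, 1001] is a placeholder; match it to your model.
    val output = arrayOf(FloatArray(1001))
    interpreter.run(input, output)
    interpreter.close()
    return output
}
```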

### Handle output results {:#output_results}

Models generate prediction results as tensors, which must be handled by your
Android app by taking action or displaying a result to the user. Model output
results can range from a single number corresponding to one result (0 = dog,
1 = cat, 2 = bird) for an image classification, to much more complex results,
such as multiple bounding boxes for several classified objects in an image, with
prediction confidence ratings between 0 and 1.

**Note:** Many TensorFlow Lite models come with embedded
[metadata](../inference_with_metadata/overview)
that describes the output results of a model and how to interpret them.
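
For example, interpreting a raw classification output can be as simple as
finding the highest-scoring index and mapping it to a label. The label list
below is a placeholder; models with metadata usually provide their own labels.

```kotlin
fun interpretClassification(scores: FloatArray): Pair<String, Float>? {
    // Placeholder labels; real models typically ship a label list in their metadata.
    val labels = listOf("dog", "cat", "bird")
    val best = scores.indices.maxByOrNull { scores[it] } ?: return null
    return labels.getOrElse(best) { "unknown" } to scores[best]
}
```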

## Advanced development paths {:#adv_development}

When using more sophisticated and customized TensorFlow Lite models, you may
need to use more advanced development approaches than what is described above.
The following sections describe advanced techniques for executing models and
developing them for TensorFlow Lite in Android apps.

### Advanced runtime environments {:#adv_runtime}

In addition to the standard runtime and Google Play services runtime
environments for TensorFlow Lite, there are additional runtime environments you
can use with your Android app. The most likely use for these environments is if
you have a machine learning model that uses ML operations that are not supported
by the standard runtime environment for TensorFlow Lite.

-   [Flex runtime](../guide/ops_select) for TensorFlow Lite
-   Custom-built TensorFlow Lite runtime

The TensorFlow Lite [Flex runtime](../guide/ops_select) allows you to include
specific operators required for your model. As an advanced option for running
your model, you can build TensorFlow Lite for Android to include operators and
other functionality required for running your TensorFlow machine learning model.
For more information, see [Build TensorFlow Lite for Android](./lite_build).
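
As a sketch, enabling select TensorFlow operators is typically a matter of
adding the Flex delegate dependency alongside the standard runtime; the artifact
names and versions below are assumptions, so confirm them in the
[Flex runtime](../guide/ops_select) guide.

```kotlin
// build.gradle.kts (module level); versions shown are placeholders.
dependencies {
    implementation("org.tensorflow:tensorflow-lite:2.9.0")
    // Flex delegate that provides select TensorFlow operators at runtime.
    implementation("org.tensorflow:tensorflow-lite-select-tf-ops:2.9.0")
}
```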

### C and C++ APIs

TensorFlow Lite also provides an API for running models using C and C++. If your
app uses the [Android NDK](https://developer.android.com/ndk), you should
consider using this API. You may also want to consider using this API if you
want to be able to share code between multiple platforms. For more information
about this development option, see the
[Development tools](./development#tools_for_building_with_c_and_c) page.

### Server-based model execution

In general, you should run models in your app on an Android device to take
advantage of lower latency and improved data privacy for your users. However,
there are cases where running a model on a cloud server, off device, is a better
solution. For example, you may have a large model that does not easily compress
down to a size that fits on your users' Android devices, or that cannot be
executed with reasonable performance on those devices. This approach may also be
your preferred solution if consistent performance of the model across a wide
range of devices is your top priority.

Google Cloud offers a full suite of services for running TensorFlow machine
learning models. For more information, see Google Cloud's [AI and machine
learning products](https://cloud.google.com/products/ai) page.

### Custom model development and optimization

More advanced development paths are likely to include developing custom machine
learning models and optimizing those models for use on Android devices. If you
plan to build custom models, make sure you consider applying
[quantization techniques](../performance/post_training_quantization)
to models to reduce memory and processing costs. For more information on how to
build high-performance models for use with TensorFlow Lite, see
[Performance best practices](../performance/best_practices)
in the Models section.

## Next steps

-   Try the Android [Quickstart](./quickstart) or tutorials
-   Explore the TensorFlow Lite [examples](../examples)
-   Learn how to find or build [TensorFlow Lite models](../models)