# Quickstart for Android

This tutorial shows you how to build an Android app that uses TensorFlow Lite
to analyze a live camera feed and identify objects with a machine learning
model, using a minimal amount of code. If you are updating an existing project,
you can use the example app as a reference and skip ahead to the instructions
for [modifying your project](#add_dependencies).


## Object detection with machine learning

![Object detection animated demo](https://storage.googleapis.com/download.tensorflow.org/tflite/examples/qs-obj-detect.gif){: .attempt-right width="250px"}
The machine learning model in this tutorial performs object detection. An object
detection model takes image data in a specific format, analyzes it, and attempts
to categorize items in the image as one of a set of known classes it was trained
to recognize. The time a model takes to identify a known object (called an
object *prediction* or *inference*) is usually measured in milliseconds. In
practice, inference speed varies with the hardware hosting the model, the size
of the data being processed, and the size of the machine learning model.

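One way to observe this latency in your own app is to time an inference call
directly. The following is a minimal sketch, assuming a `detector` object and a
`tfImage` input like the ones created later in this tutorial:
```
import android.os.SystemClock
import android.util.Log

// A minimal sketch: time a single prediction. `detector` and `tfImage` are
// assumed to be set up as shown later in this tutorial.
val startMs = SystemClock.elapsedRealtime()
val predictions = detector.predict(tfImage)
val inferenceTimeMs = SystemClock.elapsedRealtime() - startMs
Log.d("ObjectDetection", "Inference took $inferenceTimeMs ms")
```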

## Setup and run example

For the first part of this tutorial, download the example app from GitHub and
run it using [Android Studio](https://developer.android.com/studio/). The
following sections of this tutorial explore the relevant sections of the code
example, so you can apply them to your own Android apps. You need the following
versions of these tools installed:

* Android Studio 4.2.2 or higher
* Android SDK version 31 or higher

Note: This example uses the camera, so you should run it on a physical Android
device.

### Get the example code

Create a local copy of the example code. You will use this code to create a
project in Android Studio and run the example application.

To clone and set up the example code:

1.  Clone the git repository:
    <pre class="devsite-click-to-copy">
    git clone https://github.com/android/camera-samples.git
    </pre>
2.  Configure your git instance to use sparse checkout, so you have only
    the files for the object detection example app:
    ```
    cd camera-samples
    git sparse-checkout init --cone
    git sparse-checkout set CameraXAdvanced
    ```

### Import and run the project

Create a project from the downloaded example code, build the project, and then
run it.

To import and build the example code project:

1.  Start [Android Studio](https://developer.android.com/studio).
1.  From the Android Studio **Welcome** page, choose **Import Project**, or
    select **File > New > Import Project**.
1.  Navigate to the example code directory containing the `build.gradle` file
    (`.../android/camera-samples/CameraXAdvanced/build.gradle`) and select that
    directory.

If you select the correct directory, Android Studio creates a new project and
builds it. This process can take a few minutes, depending on the speed of your
computer and on whether you have used Android Studio for other projects. When
the build completes, Android Studio displays a `BUILD SUCCESSFUL` message in the
**Build Output** status panel.

Note: The example code is built with Android Studio 4.2.2, but works with
earlier versions of Studio. If you are using an earlier version of Android
Studio, you can try adjusting the version number of the Android plugin so that
the build completes, instead of upgrading Studio.

**Optional:** To fix build errors by updating the Android plugin version:

1.  Open the `build.gradle` file in the project directory.
1.  Change the Android tools version as follows:
    ```
    // from:
    classpath 'com.android.tools.build:gradle:4.2.2'
    // to:
    classpath 'com.android.tools.build:gradle:4.1.2'
    ```
1.  Sync the project by selecting: **File > Sync Project with Gradle Files**.

To run the project:

1.  From Android Studio, run the project by selecting **Run > Run…** and
    **CameraActivity**.
1.  Select an attached Android device with a camera to test the app.

The next sections show you the modifications you need to make to your existing
project to add this functionality to your own app, using this example app as a
reference point.

## Add project dependencies {:#add_dependencies}

In your own application, you must add specific project dependencies to run
TensorFlow Lite machine learning models, and to access utility functions that
convert data, such as images, into a tensor data format that the model you are
using can process.

The example app uses several TensorFlow Lite libraries to enable the execution
of the object detection machine learning model:

-   *TensorFlow Lite main library* - Provides the required data input
    classes, executes the machine learning model, and outputs results from
    the model processing.
-   *TensorFlow Lite Support library* - This library provides a helper class
    to translate images from the camera into a
    [`TensorImage`](../api_docs/java/org/tensorflow/lite/support/image/TensorImage)
    data object that can be processed by the machine learning model.
-   *TensorFlow Lite GPU library* - This library provides support to
    accelerate model execution using GPU processors on the device, if they are
    available.

The following instructions explain how to add the required project and module
dependencies to your own Android app project.

To add module dependencies:

1.  In the module that uses TensorFlow Lite, update the module's
    `build.gradle` file to include the following dependencies. In the example
    code, this file is located here:
    `.../android/camera-samples/CameraXAdvanced/tflite/build.gradle`
    ([code reference](https://github.com/android/camera-samples/blob/b0f4ec3a81ec30e622bb1ccd55f30e54ddac223f/CameraXAdvanced/tflite/build.gradle#L69-L71))
    ```
    ...
    dependencies {
    ...
        // TensorFlow Lite dependencies
        implementation 'org.tensorflow:tensorflow-lite:2.8.0'
        implementation 'org.tensorflow:tensorflow-lite-gpu:2.8.0'
        implementation 'org.tensorflow:tensorflow-lite-support:2.8.0'
    ...
    }
    ```
1.  In Android Studio, sync the project dependencies by selecting: **File >
    Sync Project with Gradle Files**.

## Initialize the ML model interpreter

In your Android app, you must initialize the TensorFlow Lite machine learning
model interpreter with parameters before running predictions with the model.
These initialization parameters depend on the model you are using, and can
include settings such as minimum accuracy thresholds for predictions and
labels for identified object classes.

A TensorFlow Lite model includes a `.tflite` file containing the model itself,
and frequently includes a labels file containing the names of the classes
predicted by the model. In the case of object detection, classes are objects
such as a person, dog, cat, or car. Models are generally stored in the
`src/main/assets` directory of the primary module, as in the code example:

- `CameraXAdvanced/tflite/src/main/assets/coco_ssd_mobilenet_v1_1.0_quant.tflite`
- `CameraXAdvanced/tflite/src/main/assets/coco_ssd_mobilenet_v1_1.0_labels.txt`

For convenience and code readability, the example declares a companion object
that defines the settings for the model.

To initialize the model in your app:

1.  Create a companion object to define the settings for the model:
    ([code reference](https://github.com/android/camera-samples/blob/b0f4ec3a81ec30e622bb1ccd55f30e54ddac223f/CameraXAdvanced/tflite/src/main/java/com/example/android/camerax/tflite/CameraActivity.kt#L342-L347))
    ```
    companion object {
       private val TAG = CameraActivity::class.java.simpleName

       private const val ACCURACY_THRESHOLD = 0.5f
       private const val MODEL_PATH = "coco_ssd_mobilenet_v1_1.0_quant.tflite"
       private const val LABELS_PATH = "coco_ssd_mobilenet_v1_1.0_labels.txt"
    }
    ```
1.  Use the settings from this object to construct a TensorFlow Lite
    [Interpreter](https://www.tensorflow.org/lite/api_docs/java/org/tensorflow/lite/Interpreter)
    object that contains the model:
    ([code reference](https://github.com/android/camera-samples/blob/b0f4ec3a81ec30e622bb1ccd55f30e54ddac223f/CameraXAdvanced/tflite/src/main/java/com/example/android/camerax/tflite/CameraActivity.kt#L90-L94))
    ```
    private val tflite by lazy {
       Interpreter(
           FileUtil.loadMappedFile(this, MODEL_PATH),
           Interpreter.Options().addDelegate(nnApiDelegate))
    }
    ```

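Your app also needs the class labels that accompany the model. The following is
a minimal sketch of loading them with the Support Library's `FileUtil` class,
assuming the `LABELS_PATH` constant from the companion object above:
```
import org.tensorflow.lite.support.common.FileUtil

// Load the labels file from the module's assets; loadLabels() returns one
// label per line as a List<String>.
private val labels by lazy {
    FileUtil.loadLabels(this, LABELS_PATH)
}
```
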
### Configure hardware accelerator

When initializing a TensorFlow Lite model in your application, you can
use hardware acceleration features to speed up the prediction
calculations of the model. The code example above uses the NNAPI Delegate to
handle hardware acceleration of the model execution:
```
Interpreter.Options().addDelegate(nnApiDelegate)
```

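The `nnApiDelegate` referenced in the snippet is not declared there. The
following is a minimal sketch of how it can be created and released, based on
the pattern the example code uses:
```
import org.tensorflow.lite.nnapi.NnApiDelegate

// Create the NNAPI delegate lazily, and close both the interpreter and the
// delegate when the activity is destroyed to release their native resources.
private val nnApiDelegate by lazy { NnApiDelegate() }

override fun onDestroy() {
    tflite.close()
    nnApiDelegate.close()
    super.onDestroy()
}
```
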
TensorFlow Lite *delegates* are software modules that accelerate the execution
of machine learning models using specialized processing hardware on a mobile
device, such as GPUs, TPUs, or DSPs. Using delegates for running TensorFlow Lite
models is recommended, but not required.

For more information about using delegates with TensorFlow Lite, see
[TensorFlow Lite Delegates](../performance/delegates).


## Provide data to the model

In your Android app, your code provides data to the model for interpretation by
transforming existing data such as images into a
[Tensor](../api_docs/java/org/tensorflow/lite/Tensor)
data format that can be processed by your model. The data in a Tensor must have
specific dimensions, or shape, that match the format of the data used to train
the model.

To determine the required tensor shape for a model:

-   Use the initialized
    [Interpreter](https://www.tensorflow.org/lite/api_docs/java/org/tensorflow/lite/Interpreter)
    object to determine the shape of the tensor used by your model, as shown in
    the code snippet below:
    ([code reference](https://github.com/android/camera-samples/blob/b0f4ec3a81ec30e622bb1ccd55f30e54ddac223f/CameraXAdvanced/tflite/src/main/java/com/example/android/camerax/tflite/CameraActivity.kt#L102-L106))
    ```
    private val tfInputSize by lazy {
       val inputIndex = 0
       val inputShape = tflite.getInputTensor(inputIndex).shape()
       Size(inputShape[2], inputShape[1]) // Order of axes is: {1, height, width, 3}
    }
    ```

The object detection model used in the example code expects square images with a
size of 300 by 300 pixels.

Before you can provide images from the camera, your app must take the image,
make it conform to the expected size, adjust its rotation, and normalize the
image data. When processing images with a TensorFlow Lite model, you can use the
TensorFlow Lite Support Library
[ImageProcessor](../api_docs/java/org/tensorflow/lite/support/image/ImageProcessor)
class to handle this data pre-processing, as shown below.

To transform image data for a model:

1.  Use the Support Library
    [ImageProcessor](https://www.tensorflow.org/lite/api_docs/java/org/tensorflow/lite/support/image/ImageProcessor)
    to create an object for transforming image data into a format that your
    model can use to run predictions:
    ([code reference](https://github.com/android/camera-samples/blob/b0f4ec3a81ec30e622bb1ccd55f30e54ddac223f/CameraXAdvanced/tflite/src/main/java/com/example/android/camerax/tflite/CameraActivity.kt#L75-L84))
    ```
    private val tfImageProcessor by lazy {
       val cropSize = minOf(bitmapBuffer.width, bitmapBuffer.height)
       ImageProcessor.Builder()
           .add(ResizeWithCropOrPadOp(cropSize, cropSize))
           .add(ResizeOp(
               tfInputSize.height, tfInputSize.width, ResizeOp.ResizeMethod.NEAREST_NEIGHBOR))
           .add(Rot90Op(-imageRotationDegrees / 90))
           .add(NormalizeOp(0f, 1f))
           .build()
    }
    ```
1.  Copy the image data from the Android camera system and prepare it for
    analysis with your
    [ImageProcessor](https://www.tensorflow.org/lite/api_docs/java/org/tensorflow/lite/support/image/ImageProcessor)
    object:
    ([code reference](https://github.com/android/camera-samples/blob/b0f4ec3a81ec30e622bb1ccd55f30e54ddac223f/CameraXAdvanced/tflite/src/main/java/com/example/android/camerax/tflite/CameraActivity.kt#L198-L202))
    ```
    // Copy out RGB bits to the shared buffer
    image.use { bitmapBuffer.copyPixelsFromBuffer(image.planes[0].buffer) }

    // Process the image in TensorFlow
    val tfImage = tfImageProcessor.process(tfImageBuffer.apply { load(bitmapBuffer) })
    ```

Note: When extracting image information from the Android camera subsystem, make
sure to get an image in RGB format. This format is required by the TensorFlow
Lite
[ImageProcessor](../api_docs/java/org/tensorflow/lite/support/image/ImageProcessor)
class which you use to prepare the image for analysis by a model. If the
RGB-format image contains an alpha channel, that transparency data is ignored.

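How you obtain RGB frames depends on your camera pipeline. As one hedged
example, with CameraX 1.1 or later you can request RGBA frames directly when
building the `ImageAnalysis` use case, instead of converting the default YUV
output yourself:
```
import androidx.camera.core.ImageAnalysis

// Request RGBA_8888 frames so the pixel buffer can be copied straight into
// an RGB bitmap for the ImageProcessor; the alpha channel is ignored.
val imageAnalysis = ImageAnalysis.Builder()
    .setOutputImageFormat(ImageAnalysis.OUTPUT_IMAGE_FORMAT_RGBA_8888)
    .setBackpressureStrategy(ImageAnalysis.STRATEGY_KEEP_ONLY_LATEST)
    .build()
```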

## Run predictions

In your Android app, once you create a
[TensorImage](../api_docs/java/org/tensorflow/lite/support/image/TensorImage)
object with image data in the correct format, you can run the model against that
data to produce a prediction, or *inference*. The example code for this tutorial
uses an
[ObjectDetectionHelper](https://github.com/android/camera-samples/blob/main/CameraXAdvanced/tflite/src/main/java/com/example/android/camerax/tflite/ObjectDetectionHelper.kt)
class that encapsulates this code in a `predict()` method.

To run a prediction on a set of image data:

1.  Run the prediction by passing the image data to your predict function:
    ([code reference](https://github.com/android/camera-samples/blob/b0f4ec3a81ec30e622bb1ccd55f30e54ddac223f/CameraXAdvanced/tflite/src/main/java/com/example/android/camerax/tflite/CameraActivity.kt#L204-L205))
    ```
    // Perform the object detection for the current frame
    val predictions = detector.predict(tfImage)
    ```
1.  Inside the `predict()` method, call `runForMultipleInputsOutputs()` on your
    `tflite` Interpreter instance with the image data to generate predictions:
    ([code reference](https://github.com/android/camera-samples/blob/b0f4ec3a81ec30e622bb1ccd55f30e54ddac223f/CameraXAdvanced/tflite/src/main/java/com/example/android/camerax/tflite/ObjectDetectionHelper.kt#L60-L63))
    ```
    fun predict(image: TensorImage): List<ObjectPrediction> {
       tflite.runForMultipleInputsOutputs(arrayOf(image.buffer), outputBuffer)
       return predictions
    }
    ```

The TensorFlow Lite Interpreter object receives this data, runs it against the
model, and produces a list of predictions. For continuous processing of data by
the model, use the `runForMultipleInputsOutputs()` method so that Interpreter
objects are not created and then removed by the system for each prediction run.

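The `outputBuffer` passed to `runForMultipleInputsOutputs()` is a map from
output tensor index to a pre-allocated array that the interpreter fills in. The
following sketch shows one way to lay it out for the SSD-style model in this
tutorial; the exact shapes are assumptions based on the COCO SSD MobileNet v1
outputs:
```
// Pre-allocated output arrays: tensor 0 holds bounding boxes, tensor 1 class
// indices, tensor 2 scores, and tensor 3 the number of detections.
// OBJECT_COUNT is assumed to be the maximum number of detections the model
// reports per frame.
private val locations = arrayOf(Array(OBJECT_COUNT) { FloatArray(4) })
private val labelIndices = arrayOf(FloatArray(OBJECT_COUNT))
private val scores = arrayOf(FloatArray(OBJECT_COUNT))
private val outputBuffer = mapOf(
    0 to locations,
    1 to labelIndices,
    2 to scores,
    3 to FloatArray(1)
)
```
The `locations`, `labelIndices`, and `scores` arrays declared here are the same
ones read back in the next section.
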
## Handle model output

In your Android app, after you run image data against the object detection
model, it produces a list of predictions that your app code must handle by
executing additional business logic, displaying results to the user, or taking
other actions.

The output of any given TensorFlow Lite model varies in terms of the number of
predictions it produces (one or many), and the descriptive information for each
prediction. In the case of an object detection model, predictions typically
include data for a bounding box that indicates where an object is detected in
the image. In the example code, the returned data is formatted as a list of
[ObjectPrediction](https://github.com/android/camera-samples/blob/b0f4ec3a81ec30e622bb1ccd55f30e54ddac223f/CameraXAdvanced/tflite/src/main/java/com/example/android/camerax/tflite/ObjectDetectionHelper.kt#L42-L58)
objects, as shown below:
([code reference](https://github.com/android/camera-samples/blob/b0f4ec3a81ec30e622bb1ccd55f30e54ddac223f/CameraXAdvanced/tflite/src/main/java/com/example/android/camerax/tflite/ObjectDetectionHelper.kt#L42-L58))

```
val predictions get() = (0 until OBJECT_COUNT).map {
   ObjectPrediction(

       // The locations are an array of [0, 1] floats for [top, left, bottom, right]
       location = locations[0][it].let {
           RectF(it[1], it[0], it[3], it[2])
       },

       // The SSD MobileNet V1 model assumes class 0 is the background class
       // in the label file, so class labels run from 1 to number_of_classes + 1,
       // while outputClasses holds class indices from 0 to number_of_classes
       label = labels[1 + labelIndices[0][it].toInt()],

       // Score is a single value of [0, 1]
       score = scores[0][it]
   )
}
```

![Object detection screenshot](../../images/lite/android/qs-obj-detect.jpeg){: .attempt-right width="250px"}
For the model used in this example, each prediction includes a bounding box
location for the object, a label for the object, and a prediction score between
0 and 1 as a Float representing the confidence of the prediction, with 1 being
the highest confidence rating. In general, predictions with a score below 50%
(0.5) are considered inconclusive. However, how you handle low-scoring
predictions is up to you and the needs of your application.

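For example, the following is a minimal sketch of dropping inconclusive
results, reusing the `ACCURACY_THRESHOLD` constant defined in the companion
object earlier:
```
// Keep only the predictions the model is reasonably confident about.
val confidentPredictions = predictions.filter { it.score >= ACCURACY_THRESHOLD }
```
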
Once the model has returned a prediction result, your application can act on
that prediction by presenting the result to your user or executing additional
logic. In the case of the example code, the application draws a bounding box
around the identified object and displays the class name on the screen. Review
the
[`CameraActivity.reportPrediction()`](https://github.com/android/camera-samples/blob/b0f4ec3a81ec30e622bb1ccd55f30e54ddac223f/CameraXAdvanced/tflite/src/main/java/com/example/android/camerax/tflite/CameraActivity.kt#L236-L262)
function in the example code for details.

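As an illustration of one step in that drawing code, the following hedged
sketch maps a prediction's normalized `[0, 1]` bounding box to pixel
coordinates in the view that displays it; the helper name is hypothetical:
```
import android.graphics.RectF

// Scale a normalized bounding box to the pixel dimensions of the target view.
// Note that the example code also compensates for the cropping and rotation
// applied during pre-processing, which this sketch omits.
fun mapToViewCoordinates(location: RectF, viewWidth: Int, viewHeight: Int) = RectF(
    location.left * viewWidth,
    location.top * viewHeight,
    location.right * viewWidth,
    location.bottom * viewHeight
)
```
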
## Next steps

*   Explore various uses of TensorFlow Lite in the [examples](../examples).
*   Learn more about using machine learning models with TensorFlow Lite
    in the [Models](../models) section.
*   Learn more about implementing machine learning in your mobile
    application in the [TensorFlow Lite Developer Guide](../guide).