# Integrate natural language classifier

The Task Library's `NLClassifier` API classifies input text into different
categories. It is a versatile and configurable API that can handle most text
classification models.

## Key features of the NLClassifier API

*   Takes a single string as input, performs classification on the string, and
    outputs `<Label, Score>` pairs as classification results.

*   Optional regex tokenization is available for input text.

*   Configurable to adapt to different classification models.

## Supported NLClassifier models

The following models are guaranteed to be compatible with the `NLClassifier`
API.

*   The <a href="../../examples/text_classification/overview">movie review
    sentiment classification</a> model.

*   Models with `average_word_vec` spec created by
    [TensorFlow Lite Model Maker for text classification](https://www.tensorflow.org/lite/models/modify/model_maker/text_classification).

*   Custom models that meet the
    [model compatibility requirements](#model-compatibility-requirements).

## Run inference in Java

See the
[Text Classification reference app](https://github.com/tensorflow/examples/blob/master/lite/examples/text_classification/android/lib_task_api/src/main/java/org/tensorflow/lite/examples/textclassification/client/TextClassificationClient.java)
for an example of how to use `NLClassifier` in an Android app.

### Step 1: Import Gradle dependency and other settings

Copy the `.tflite` model file to the assets directory of the Android module
where the model will be run. Specify that the file should not be compressed, and
add the TensorFlow Lite library to the module’s `build.gradle` file:

```java
android {
    // Other settings

    // Specify that the tflite file should not be compressed for the app apk
    aaptOptions {
        noCompress "tflite"
    }
}

dependencies {
    // Other dependencies

    // Import the Task Text Library dependency (NNAPI is included)
    implementation 'org.tensorflow:tensorflow-lite-task-text:0.3.0'
    // Import the GPU delegate plugin Library for GPU inference
    implementation 'org.tensorflow:tensorflow-lite-gpu-delegate-plugin:0.3.0'
}
```

Note: Starting from version 4.1 of the Android Gradle plugin, `.tflite` will be
added to the `noCompress` list by default, and the `aaptOptions` block above is
no longer needed.

### Step 2: Run inference using the API

```java
// Initialization: use NLClassifierOptions to configure input and output tensors
NLClassifierOptions options =
    NLClassifierOptions.builder()
        .setBaseOptions(BaseOptions.builder().useGpu().build())
        .setInputTensorName(INPUT_TENSOR_NAME)
        .setOutputScoreTensorName(OUTPUT_SCORE_TENSOR_NAME)
        .build();
NLClassifier classifier =
    NLClassifier.createFromFileAndOptions(context, modelFile, options);

// Run inference
List<Category> results = classifier.classify(input);
```
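If you only need the top-scoring category, you can take the maximum over the
returned scores in plain Java. A minimal sketch, using a simplified stand-in
`Category` record instead of the Task Library's own `Category` class:

```java
import java.util.Comparator;
import java.util.List;

public class TopCategory {
    // Simplified stand-in for the Task Library's Category (label + score).
    record Category(String label, float score) {}

    // Return the category with the highest score.
    static Category top(List<Category> results) {
        return results.stream()
                .max(Comparator.comparingDouble(Category::score))
                .orElseThrow();
    }

    public static void main(String[] args) {
        List<Category> results = List.of(
                new Category("Negative", 0.81313f),
                new Category("Positive", 0.18687f));
        System.out.println(top(results).label()); // prints "Negative"
    }
}
```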

See the
[source code](https://github.com/tensorflow/tflite-support/blob/master/tensorflow_lite_support/java/src/java/org/tensorflow/lite/task/text/nlclassifier/NLClassifier.java)
for more options to configure `NLClassifier`.

## Run inference in Swift

### Step 1: Import CocoaPods

Add the `TensorFlowLiteTaskText` pod to your Podfile:

```
target 'MySwiftAppWithTaskAPI' do
  use_frameworks!
  pod 'TensorFlowLiteTaskText', '~> 0.2.0'
end
```

### Step 2: Run inference using the API

```swift
// Initialization
var modelOptions: TFLNLClassifierOptions = TFLNLClassifierOptions()
modelOptions.inputTensorName = inputTensorName
modelOptions.outputScoreTensorName = outputScoreTensorName
let nlClassifier = TFLNLClassifier.nlClassifier(
      modelPath: modelPath,
      options: modelOptions)

// Run inference
let categories = nlClassifier.classify(text: input)
```

See the
[source code](https://github.com/tensorflow/tflite-support/blob/master/tensorflow_lite_support/ios/task/text/nlclassifier/Sources/TFLNLClassifier.h)
for more details.

## Run inference in C++

```c++
// Initialization
NLClassifierOptions options;
options.mutable_base_options()->mutable_model_file()->set_file_name(model_path);
std::unique_ptr<NLClassifier> classifier =
    NLClassifier::CreateFromOptions(options).value();

// Run inference with your input, `input_text`.
std::vector<core::Category> categories = classifier->Classify(input_text);
```

See the
[source code](https://github.com/tensorflow/tflite-support/blob/master/tensorflow_lite_support/cc/task/text/nlclassifier/nl_classifier.h)
for more details.

## Example results

Here is an example of the classification results of the
[movie review model](https://www.tensorflow.org/lite/examples/text_classification/overview).

Input: "What a waste of my time."

Output:

```
category[0]: 'Negative' : '0.81313'
category[1]: 'Positive' : '0.18687'
```
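The output above follows a simple `category[i]: 'label' : 'score'` pattern. A
small plain-Java sketch of how such lines can be produced from label/score
pairs (the exact formatting of the demo tools may differ):

```java
public class FormatResults {
    // Render classification results in the category[i]: 'label' : 'score'
    // style shown above, one line per category.
    static String format(String[] labels, float[] scores) {
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < labels.length; i++) {
            sb.append(String.format(
                    "category[%d]: '%s' : '%.5f'%n", i, labels[i], scores[i]));
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        System.out.print(format(new String[] {"Negative", "Positive"},
                                new float[] {0.81313f, 0.18687f}));
    }
}
```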

Try out the simple
[CLI demo tool for NLClassifier](https://github.com/tensorflow/tflite-support/blob/master/tensorflow_lite_support/examples/task/text/desktop/README.md#nlclassifier)
with your own model and test data.

## Model compatibility requirements

Depending on the use case, the `NLClassifier` API can load a TFLite model with
or without [TFLite Model Metadata](../../models/convert/metadata). See examples
of creating metadata for natural language classifiers using the
[TensorFlow Lite Metadata Writer API](../../models/convert/metadata_writer_tutorial.ipynb#nl_classifiers).

The compatible models should meet the following requirements:

*   Input tensor: (kTfLiteString/kTfLiteInt32)

    -   Input of the model should be either a kTfLiteString tensor (the raw
        input string) or a kTfLiteInt32 tensor (the regex-tokenized indices of
        the raw input string).
    -   If the input type is kTfLiteString, no
        [Metadata](../../models/convert/metadata) is required for the model.
    -   If the input type is kTfLiteInt32, a `RegexTokenizer` needs to be set
        up in the input tensor's
        [Metadata](https://www.tensorflow.org/lite/models/convert/metadata_writer_tutorial#natural_language_classifiers).

*   Output score tensor:
    (kTfLiteUInt8/kTfLiteInt8/kTfLiteInt16/kTfLiteFloat32/kTfLiteFloat64)

    -   Mandatory output tensor for the score of each classified category.

    -   If the type is one of the Int types, it is dequantized to double/float
        for the corresponding platforms.

    -   Can have an optional associated file in the output tensor's
        corresponding [Metadata](../../models/convert/metadata) for category
        labels. The file should be a plain text file with one label per line,
        and the number of labels should match the number of categories that the
        model outputs. See the
        [example label file](https://github.com/tensorflow/tflite-support/blob/master/tensorflow_lite_support/metadata/python/tests/testdata/nl_classifier/labels.txt).

*   Output label tensor: (kTfLiteString/kTfLiteInt32)

    -   Optional output tensor for the label of each category; it should be of
        the same length as the output score tensor. If this tensor is not
        present, the API uses score indices as class names.

    -   Ignored if the associated label file is present in the output score
        tensor's Metadata.
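To make the kTfLiteInt32 input and quantized-output requirements above more
concrete, here is a rough plain-Java sketch of the two transformations
involved: regex tokenization into vocabulary index ids, and int8 score
dequantization. The vocabulary, delimiter regex, and quantization parameters
below are made up for illustration; in a real model they come from the
`RegexTokenizer` metadata and the tensor's quantization parameters, and the
Task Library performs these steps for you.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

public class CompatSketch {
    // Hypothetical vocabulary mapping tokens to index ids; a real model ships
    // its own vocab file in the RegexTokenizer metadata.
    static final Map<String, Integer> VOCAB =
            Map.of("<UNKNOWN>", 0, "what", 1, "a", 2, "waste", 3);

    // Regex-tokenize the input string and look up each token's index id,
    // falling back to the unknown-token id.
    static List<Integer> tokenize(String text, String delimiterRegex) {
        List<Integer> ids = new ArrayList<>();
        for (String token : text.toLowerCase().split(delimiterRegex)) {
            if (!token.isEmpty()) {
                ids.add(VOCAB.getOrDefault(token, VOCAB.get("<UNKNOWN>")));
            }
        }
        return ids;
    }

    // Dequantize a kTfLiteInt8 score using the tensor's scale and zero point:
    // real_value = scale * (quantized_value - zero_point).
    static float dequantize(byte q, float scale, int zeroPoint) {
        return scale * (q - zeroPoint);
    }

    public static void main(String[] args) {
        System.out.println(tokenize("What a waste", "\\W+")); // [1, 2, 3]
        System.out.println(dequantize((byte) 52, 0.015625f, 0)); // 0.8125
    }
}
```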
197