page.title=Optimizing Content for the Assistant
page.metaDescription=Support contextually relevant actions through the Assist API.
page.tags=assist, accessibility, now, now on tap
meta.tags="assistant", "marshmallow", "now"
page.image=images/cards/card-assist_16-9_2x.png

page.article=true
@jd:body

<div id="tb-wrapper">
<div id="tb">
    <h2>In this document</h2>
    <ol>
      <li><a href="#assist_api">Using the Assistant</a>
      <ol>
        <li><a href="#source_app">Source app</a></li>
        <li><a href="#destination_app">Destination app</a></li>
      </ol>
      </li>
      <li><a href="#implementing_your_own_assistant">Implementing Your
      Own Assistant</a></li>
    </ol>
  </div>
</div>

<p>
  Android 6.0 Marshmallow introduces a new way for users to engage with apps
  through the assistant. The assistant is a top-level window that users can view to obtain
  contextually relevant actions for the current activity. These actions might include deep links
  to other apps on the device.</p>

<p>
  Users activate the assistant with a long press on the Home button or by saying a
  <a href="{@docRoot}reference/android/service/voice/AlwaysOnHotwordDetector.html">keyphrase</a>.
  In response, the system opens a top-level window that displays contextually
  relevant actions.
</p>

<p>
  The Google App implements the assistant overlay window through a feature called
  Now on Tap, which works with Android platform-level functionality. The system allows
  the user to select an assistant app, which obtains contextual information from your app
  using Android’s Assist API.
</p>

<p>
  This guide explains how Android apps use Android’s Assist API to improve the assistant
  user experience.
</p>


<h2 id="assist_api">Using the Assistant</h2>

<p>
  Figure 1 illustrates a typical user interaction with the assistant. When the user long-presses
  the Home button, the Assist API callbacks are invoked
  in the <em>source</em> app (step 1). The assistant renders the overlay window (steps 2 and 3),
  and then the user selects the action to perform. The assistant executes the selected action,
  such as firing an intent with a deep link to the (<em>destination</em>) restaurant app (step 4).
</p>

<div>
  <img src="{@docRoot}images/training/assistant/image01.png">
  <p class="img-caption" style="text-align:center;">
    Figure 1. Assistant interaction example with the Now on Tap feature of
    the Google App
  </p>
</div>

<p>
  Users can configure the assistant by selecting <strong>Settings &gt; Apps &gt; Default Apps &gt;
  Assist &amp; voice input</strong>. Users can change system options such as accessing
  the screen contents as text and accessing a screenshot, as shown in Figure 2.
</p>

<div id="assist-input-settings" style="float:right;margin:1em;max-width:300px">
  <img src="{@docRoot}images/training/assistant/image02.png">
  <p class="img-caption" style="text-align:center;">
    Figure 2. Assist &amp; voice input settings
  </p>
</div>
<h3 id="source_app">Source app</h3>

<p>
  To ensure that your app works with the assistant as a source of information for the user,
  you need only follow <a href=
  "{@docRoot}guide/topics/ui/accessibility/apps.html">accessibility best
  practices</a>. This section describes how to provide additional information
  to help improve the assistant user experience, as well as scenarios
  that need special handling, such as custom views.
</p>

<h4 id="share_additional_information_with_the_assistant">Share additional information
with the assistant</h4>

<p>
  In addition to the text and the screenshot, your app can share
  other information with the assistant. For example, your music
  app can choose to pass current album information so that the assistant can
  suggest smarter actions tailored to the current activity.
</p>

<p>
  To provide additional information to the assistant, your app provides
  <em>global application context</em> by registering an app listener and
  supplies activity-specific information with activity callbacks, as shown in
  Figure 3:
</p>
109
110<div>
111  <img src="{@docRoot}images/training/assistant/image03.png">
112  <p class="img-caption" style="text-align:center;">
113    Figure 3. Assist API lifecycle sequence diagram
114  </p>
115</div>
116
117<p>
118  To provide global application context, the app creates an implementation of
119  {@link android.app.Application.OnProvideAssistDataListener} and registers it
120  using {@link
121  android.app.Application#registerOnProvideAssistDataListener(android.app.Application.OnProvideAssistDataListener) registerOnProvideAssistDataListener()}.
122  To provide activity-specific contextual information, the activity
123  overrides {@link android.app.Activity#onProvideAssistData(android.os.Bundle) onProvideAssistData()}
124  and {@link
125  android.app.Activity#onProvideAssistContent(android.app.assist.AssistContent) onProvideAssistContent()}.
126  The two activity methods are called <em>after</em> the optional global
127  callback is invoked. Because the callbacks execute on the main thread, they should
128  complete <a href="{@docRoot}training/articles/perf-anr.html">promptly</a>.
129  The callbacks are invoked only when the activity is <a href=
130  "{@docRoot}reference/android/app/Activity.html#ActivityLifecycle">running</a>.
131</p>
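
<p>
  The following is a minimal sketch of registering the global listener from an
  <code>Application</code> subclass; the <code>MyApplication</code> class name
  and the extra key are hypothetical, not part of the Assist API:
</p>

<pre class="prettyprint">
public class MyApplication extends Application {
  &commat;Override
  public void onCreate() {
    super.onCreate();
    // Called for every activity in the app when the assistant is activated.
    registerOnProvideAssistDataListener(
        new OnProvideAssistDataListener() {
          &commat;Override
          public void onProvideAssistData(Activity activity, Bundle data) {
            // Add app-wide context; the key below is hypothetical.
            data.putString("com.example.myapp.GLOBAL_STATE", "signed_in");
          }
        });
  }
}
</pre>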

<h5 id="providing_context">Providing context</h5>

<p>
  When the user activates the assistant,
  {@link android.app.Activity#onProvideAssistData(android.os.Bundle) onProvideAssistData()} is called to build a full
  {@link
  android.content.Intent#ACTION_ASSIST} Intent with all of the context of the
  current application represented as an instance of {@link
  android.app.assist.AssistStructure}. You can override this method to place
  anything you like into the bundle to appear in the
  {@link android.content.Intent#EXTRA_ASSIST_CONTEXT} part of the assist intent.
</p>
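
<p>
  A brief sketch of such an override follows; the key and value are
  hypothetical:
</p>

<pre class="prettyprint">
&commat;Override
public void onProvideAssistData(Bundle data) {
  super.onProvideAssistData(data);
  // Anything placed in this bundle surfaces in the
  // EXTRA_ASSIST_CONTEXT extra of the assist intent.
  data.putString("com.example.myapp.CURRENT_ALBUM", "Album Title");
}
</pre>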

<h5 id="describing_content">Describing content</h5>

<p>
  Your app can implement {@link
  android.app.Activity#onProvideAssistContent(android.app.assist.AssistContent) onProvideAssistContent()}
  to improve the assistant user experience by providing content references
  related to the current activity. You can describe the app content using the
  common vocabulary defined by <a href="https://schema.org" class="external-link">Schema.org</a>
  through a JSON-LD object. In the example below, a music app provides
  structured data to describe the music album that the user is currently
  viewing:
</p>

<pre class="prettyprint">
&commat;Override
public void onProvideAssistContent(AssistContent assistContent) {
  super.onProvideAssistContent(assistContent);

  try {
    String structuredJson = new JSONObject()
        .put("@type", "MusicRecording")
        .put("@id", "https://example.com/music/recording")
        .put("name", "Album Title")
        .toString();
    assistContent.setStructuredData(structuredJson);
  } catch (JSONException e) {
    // JSONObject.put() declares JSONException; handle or log it as needed.
  }
}
</pre>

<p>
 You can also improve the assistant user experience with custom implementations of
 {@link
 android.app.Activity#onProvideAssistContent(android.app.assist.AssistContent) onProvideAssistContent()},
 which can provide the following benefits (a brief sketch follows the list):
</p>
<ul>
  <li><a href="{@docRoot}reference/android/app/assist/AssistContent.html#setIntent(android.content.Intent)">
  Adjusting the provided content
  intent</a> to
  better reflect the top-level context of the activity.</li>
  <li><a href="{@docRoot}reference/android/app/assist/AssistContent.html#setWebUri(android.net.Uri)">
  Supplying the URI</a>
  of the displayed content.</li>
  <li>Filling in {@link
  android.app.assist.AssistContent#setClipData(android.content.ClipData) setClipData()} with additional
  content of interest that the user is currently viewing.</li>
</ul>
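
<p>
  A minimal sketch combining these calls appears below; the URIs, the
  deep-link scheme, and the <code>selectedText</code> variable are
  hypothetical:
</p>

<pre class="prettyprint">
&commat;Override
public void onProvideAssistContent(AssistContent assistContent) {
  super.onProvideAssistContent(assistContent);
  // Public web URI for the content the user is viewing (hypothetical URL).
  assistContent.setWebUri(
      Uri.parse("https://example.com/music/album/123"));
  // Intent that restores this exact screen (hypothetical scheme).
  assistContent.setIntent(new Intent(Intent.ACTION_VIEW,
      Uri.parse("example://music/album/123")));
  // Text the user currently has selected (hypothetical variable).
  assistContent.setClipData(
      ClipData.newPlainText("selection", selectedText));
}
</pre>
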
<p class="note">
  <strong>Note: </strong>Apps that use a custom text selection implementation likely need
  to implement {@link
  android.app.Activity#onProvideAssistContent(android.app.assist.AssistContent) onProvideAssistContent()}
  and call {@link android.app.assist.AssistContent#setClipData(android.content.ClipData) setClipData()}.
</p>

<h4 id="default_implementation">Default implementation</h4>

<p>
  If neither the {@link
  android.app.Activity#onProvideAssistData(android.os.Bundle) onProvideAssistData()} nor the {@link
  android.app.Activity#onProvideAssistContent(android.app.assist.AssistContent) onProvideAssistContent()}
  callback is implemented, the system still proceeds and passes the
  automatically collected information to the assistant unless the current
  window is flagged as <a href="#excluding_views">secure</a>.
  As shown in Figure 3, the system uses the default implementations of {@link
  android.view.View#onProvideStructure(android.view.ViewStructure) onProvideStructure()} and {@link
  android.view.View#onProvideVirtualStructure(android.view.ViewStructure) onProvideVirtualStructure()} to
  collect text and view hierarchy information. If your view implements custom
  text drawing, override {@link
  android.view.View#onProvideStructure(android.view.ViewStructure) onProvideStructure()} to provide
  the assistant with the text shown to the user by calling {@link
  android.view.ViewStructure#setText(java.lang.CharSequence) setText(CharSequence)}.
</p>
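
<p>
  For example, a custom view that draws its own text might expose that text as
  follows; the class and field names are hypothetical:
</p>

<pre class="prettyprint">
public class CustomTextView extends View {
  // Text this view renders itself via Canvas drawing (hypothetical field).
  private CharSequence displayedText;

  public CustomTextView(Context context, AttributeSet attrs) {
    super(context, attrs);
  }

  &commat;Override
  public void onProvideStructure(ViewStructure structure) {
    super.onProvideStructure(structure);
    // Expose the custom-drawn text so the assistant can read it.
    structure.setText(displayedText);
  }
}
</pre>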

<p>
  <em>In most cases, implementing accessibility support enables the
  assistant to obtain the information it needs.</em> To implement accessibility support,
  observe the best practices described in <a href=
  "{@docRoot}guide/topics/ui/accessibility/apps.html">Making Applications
  Accessible</a>, including the following (a brief sketch follows the list):</p>

<ul>
  <li>Provide {@link android.R.attr#contentDescription
  android:contentDescription} attributes.</li>
  <li>Populate {@link
  android.view.accessibility.AccessibilityNodeInfo} for custom views.</li>
  <li>Make
  sure that custom {@link android.view.ViewGroup ViewGroup} objects correctly
  <a href="{@docRoot}reference/android/view/ViewGroup.html#getChildAt(int)">expose</a>
  their children.</li>
</ul>
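
<p>
  For instance, an image-only control in an activity can be labeled in code so
  that both accessibility services and the assistant can identify it; the
  resource IDs below are hypothetical:
</p>

<pre class="prettyprint">
ImageButton shareButton = (ImageButton) findViewById(R.id.share_button);
// Describe the control for accessibility services and the assistant.
shareButton.setContentDescription(getString(R.string.share_album));
</pre>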

<h4 id="excluding_views">Excluding views from the assistant</h4>

<p>
  To handle sensitive information, your app can exclude the current view from the assistant
  by setting the {@link android.view.WindowManager.LayoutParams#FLAG_SECURE
  FLAG_SECURE} layout parameter of the {@link android.view.WindowManager}. You must set {@link
  android.view.WindowManager.LayoutParams#FLAG_SECURE
  FLAG_SECURE} explicitly for
  every window created by the activity, including dialogs. Your app can also use
  {@link android.view.SurfaceView#setSecure(boolean) setSecure()} to exclude
  a surface from the assistant. There is no
  global (app-level) mechanism to exclude all views from the assistant. Note
  that {@link android.view.WindowManager.LayoutParams#FLAG_SECURE
  FLAG_SECURE} does not cause the Assist API callbacks to stop
  firing. The activity that uses {@link android.view.WindowManager.LayoutParams#FLAG_SECURE
  FLAG_SECURE} can still explicitly
  provide information to the assistant using the callbacks described earlier in
  this guide.
</p>
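
<p>
  A minimal sketch, set in the activity's <code>onCreate()</code> before
  <code>setContentView()</code>; the <code>videoSurface</code> field is
  hypothetical:
</p>

<pre class="prettyprint">
// Exclude every view in this window from the assistant (and from screenshots).
getWindow().setFlags(WindowManager.LayoutParams.FLAG_SECURE,
    WindowManager.LayoutParams.FLAG_SECURE);

// Alternatively, exclude a single surface, such as a video overlay.
videoSurface.setSecure(true);
</pre>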

<p class="note"><strong>Note: </strong>For enterprise accounts (Android for Work),
 the administrator can disable
 the collection of assistant data for the work profile by using the {@link
 android.app.admin.DevicePolicyManager#setScreenCaptureDisabled(android.content.ComponentName, boolean)
 setScreenCaptureDisabled()} method of the {@link android.app.admin.DevicePolicyManager} API.</p>
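
<p>
  A profile owner's admin app might invoke it as follows; the
  <code>adminComponent</code> value (the app's
  <code>DeviceAdminReceiver</code> component) and the <code>context</code>
  variable are assumed:
</p>

<pre class="prettyprint">
DevicePolicyManager dpm = (DevicePolicyManager)
    context.getSystemService(Context.DEVICE_POLICY_SERVICE);
// Disabling screen capture also blocks assistant data collection
// for the work profile.
dpm.setScreenCaptureDisabled(adminComponent, true);
</pre>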

<h4 id="voice_interactions">Voice interactions</h4>

<p>
  Assist API callbacks are also invoked upon
  <a href="{@docRoot}reference/android/service/voice/AlwaysOnHotwordDetector.html">keyphrase
  detection</a>. For more information, see the
  <a href="https://developers.google.com/voice-actions/" class="external-link">Voice
  Actions</a> documentation.
</p>

<h4 id="z-order_considerations">Z-order considerations</h4>

<p>
  The assistant uses a lightweight overlay window displayed on top of the
  current activity. Because the user can activate the assistant at any time,
  don't create permanent <a
  href="{@docRoot}reference/android/Manifest.permission.html#SYSTEM_ALERT_WINDOW">
  system alert</a> windows that interfere with the overlay window, as shown in
  Figure 4.
</p>

<div>
  <img src="{@docRoot}images/training/assistant/image04.png">
  <p class="img-caption" style="text-align:center;">
    Figure 4. Assist layer Z-order
  </p>
</div>

<p>
  If your app uses <a
  href="{@docRoot}reference/android/Manifest.permission.html#SYSTEM_ALERT_WINDOW">
  system alert</a> windows, remove them promptly because leaving them on the
  screen degrades the user experience.
</p>

<h3 id="destination_app">Destination app</h3>

<p>
  The assistant typically takes advantage of deep linking to find destination apps. To make your
  app a potential destination app, consider adding <a href=
  "{@docRoot}training/app-indexing/deep-linking.html">deep linking</a> support. The matching
  between the current user context and deep links or other potential actions displayed in the
  overlay window (shown in step 3 in Figure 1) is specific to the assistant’s implementation.
  For example, the Google App uses deep linking and <a href=
  "https://developers.google.com/app-indexing/" class="external-link">Firebase App Indexing</a> to
  drive traffic to destination apps.
</p>
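
<p>
  On the receiving side, a destination activity can read the deep link that the
  assistant fired; the URI layout, layout resource, and
  <code>showRestaurant()</code> helper below are hypothetical:
</p>

<pre class="prettyprint">
&commat;Override
protected void onCreate(Bundle savedInstanceState) {
  super.onCreate(savedInstanceState);
  setContentView(R.layout.activity_restaurant);
  Uri deepLink = getIntent().getData();
  if (deepLink != null) {
    // For example, https://example.com/restaurants/123 yields "123".
    showRestaurant(deepLink.getLastPathSegment());
  }
}
</pre>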

<h2 id="implementing_your_own_assistant">Implementing Your Own Assistant</h2>

<p>
  You may wish to implement your own assistant. As shown in <a href="#assist-input-settings">Figure
  2</a>, the user can select the active assistant app. The
  assistant app must provide an implementation of {@link
  android.service.voice.VoiceInteractionSessionService} and {@link
  android.service.voice.VoiceInteractionSession}, as shown in <a href=
  "https://android.googlesource.com/platform/frameworks/base/+/marshmallow-release/tests/VoiceInteraction/" class="external-link">
  this <code>VoiceInteraction</code> example</a>. It also requires the {@link
  android.Manifest.permission#BIND_VOICE_INTERACTION} permission. The assistant can then
  receive the text and view hierarchy represented as an instance of {@link
  android.app.assist.AssistStructure} in {@link
  android.service.voice.VoiceInteractionSession#onHandleAssist(android.os.Bundle,
  android.app.assist.AssistStructure,android.app.assist.AssistContent) onHandleAssist()}.
  It receives the screenshot through {@link
  android.service.voice.VoiceInteractionSession#onHandleScreenshot(android.graphics.Bitmap)
  onHandleScreenshot()}.
</p>
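
<p>
  A skeletal sketch of the two classes follows; the class names are
  hypothetical, and a real assistant must also declare the service in its
  manifest with the <code>BIND_VOICE_INTERACTION</code> permission:
</p>

<pre class="prettyprint">
public class MyInteractionSessionService extends VoiceInteractionSessionService {
  &commat;Override
  public VoiceInteractionSession onNewSession(Bundle args) {
    return new MyInteractionSession(this);
  }
}

public class MyInteractionSession extends VoiceInteractionSession {
  public MyInteractionSession(Context context) {
    super(context);
  }

  &commat;Override
  public void onHandleAssist(Bundle data, AssistStructure structure,
      AssistContent content) {
    if (structure == null) {
      return;
    }
    // Walk the window/view hierarchy captured from the source app.
    for (int i = 0; i &lt; structure.getWindowNodeCount(); i++) {
      AssistStructure.WindowNode window = structure.getWindowNodeAt(i);
      // Inspect window.getRootViewNode() for text and structure.
    }
  }

  &commat;Override
  public void onHandleScreenshot(Bitmap screenshot) {
    // Analyze the screenshot of the source activity, if one was provided.
  }
}
</pre>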
330