page.title=Android 4.3 APIs
excludeFromSuggestions=true
sdk.platform.version=4.3
sdk.platform.apiLevel=18
@jd:body


<div id="qv-wrapper">
<div id="qv">

<h2>In this document
    <a href="#" onclick="hideNestedItems('#toc43',this);return false;" class="header-toggle">
        <span class="more">show more</span>
        <span class="less" style="display:none">show less</span></a></h2>

<ol id="toc43" class="hide-nested">
  <li><a href="#ApiLevel">Update your target API level</a></li>
  <li><a href="#Behaviors">Important Behavior Changes</a>
    <ol>
      <li><a href="#BehaviorsIntents">If your app uses implicit intents...</a></li>
      <li><a href="#BehaviorsAccounts">If your app depends on accounts...</a></li>
      <li><a href="#BehaviorsVideoView">If your app uses VideoView...</a></li>
    </ol>
  </li>
  <li><a href="#RestrictedProfiles">Restricted Profiles</a>
    <ol>
      <li><a href="#AccountsInProfile">Supporting accounts in a restricted profile</a></li>
    </ol>
  </li>
  <li><a href="#Wireless">Wireless and Connectivity</a>
    <ol>
      <li><a href="#BTLE">Bluetooth Low Energy (Smart Ready)</a></li>
      <li><a href="#WiFiScan">Wi-Fi scan-only mode</a></li>
      <li><a href="#WiFiConfig">Wi-Fi configuration</a></li>
      <li><a href="#QuickResponse">Quick response for incoming calls</a></li>
    </ol>
  </li>
  <li><a href="#Multimedia">Multimedia</a>
    <ol>
      <li><a href="#MediaExtractor">MediaExtractor and MediaCodec enhancements</a></li>
      <li><a href="#DRM">Media DRM</a></li>
      <li><a href="#EncodingSurface">Video encoding from a Surface</a></li>
      <li><a href="#MediaMuxing">Media muxing</a></li>
      <li><a href="#ProgressAndScrubbing">Playback progress and scrubbing for RemoteControlClient</a></li>
    </ol>
  </li>
  <li><a href="#Graphics">Graphics</a>
    <ol>
      <li><a href="#OpenGL">Support for OpenGL ES 3.0</a></li>
      <li><a href="#MipMap">Mipmapping for drawables</a></li>
    </ol>
  </li>
  <li><a href="#UI">User Interface</a>
    <ol>
      <li><a href="#ViewOverlay">View overlays</a></li>
      <li><a href="#OpticalBounds">Optical bounds layout</a></li>
      <li><a href="#AnimationRect">Animation for Rect values</a></li>
      <li><a href="#AttachFocus">Window attach and focus listener</a></li>
      <li><a href="#Overscan">TV overscan support</a></li>
      <li><a href="#Orientation">Screen orientation</a></li>
      <li><a href="#RotationAnimation">Rotation animations</a></li>
    </ol>
  </li>
  <li><a href="#UserInput">User Input</a>
    <ol>
      <li><a href="#Sensors">New sensor types</a></li>
    </ol>
  </li>
  <li><a href="#NotificationListener">Notification Listener</a></li>
  <li><a href="#Contacts">Contacts Provider</a>
    <ol>
      <li><a href="#Contactables">Query for "contactables"</a></li>
      <li><a href="#ContactsDelta">Query for contacts deltas</a></li>
    </ol>
  </li>
  <li><a href="#Localization">Localization</a>
    <ol>
      <li><a href="#BiDi">Improved support for bi-directional text</a></li>
    </ol>
  </li>
  <li><a href="#A11yService">Accessibility Services</a>
    <ol>
      <li><a href="#A11yKeyEvents">Handle key events</a></li>
      <li><a href="#A11yText">Select text and copy/paste</a></li>
      <li><a href="#A11yFeatures">Declare accessibility features</a></li>
    </ol>
  </li>
  <li><a href="#Testing">Testing and Debugging</a>
    <ol>
      <li><a href="#UiAutomation">Automated UI testing</a></li>
      <li><a href="#Systrace">Systrace events for apps</a></li>
    </ol>
  </li>
  <li><a href="#Security">Security</a>
    <ol>
      <li><a href="#KeyStore">Android key store for app-private keys</a></li>
      <li><a href="#HardwareKeyChain">Hardware credential storage</a></li>
    </ol>
  </li>
  <li><a href="#Manifest">Manifest Declarations</a>
    <ol>
      <li><a href="#ManifestFeatures">Declarable required features</a></li>
      <li><a href="#ManifestPermissions">User permissions</a></li>
    </ol>
  </li>
</ol>

<h2>See also</h2>
<ol>
<li><a href="{@docRoot}sdk/api_diff/18/changes.html">API
Differences Report &raquo;</a></li>
<li><a
href="{@docRoot}tools/support-library/index.html">Support Library</a></li>
</ol>

</div>
</div>


<p>API Level: {@sdkPlatformApiLevel}</p>

<p>Android {@sdkPlatformVersion} ({@link android.os.Build.VERSION_CODES#JELLY_BEAN_MR2})
is an update to the Jelly Bean release that offers new features for users and app
developers. This document provides an introduction to the most notable
new APIs.</p>

<p>As an app developer, you should download the Android {@sdkPlatformVersion} system image
and SDK platform from the <a href="{@docRoot}tools/help/sdk-manager.html">SDK Manager</a> as
soon as possible. If you don't have a device running Android {@sdkPlatformVersion} on which to
test your app, use the Android {@sdkPlatformVersion} system
image to test your app on the <a href="{@docRoot}tools/devices/emulator.html">Android emulator</a>.
Then build your apps against the Android {@sdkPlatformVersion} platform to begin using the
latest APIs.</p>

<h3 id="ApiLevel">Update your target API level</h3>

<p>To better optimize your app for devices running Android {@sdkPlatformVersion},
you should set your <a
href="{@docRoot}guide/topics/manifest/uses-sdk-element.html#target">{@code targetSdkVersion}</a> to
<code>"{@sdkPlatformApiLevel}"</code>, install it on an Android {@sdkPlatformVersion} system image,
test it, then publish an update with this change.</p>

<p>You can use APIs in Android {@sdkPlatformVersion} while also supporting older versions by adding
conditions to your code that check for the system API level before executing
APIs not supported by your <a
href="{@docRoot}guide/topics/manifest/uses-sdk-element.html#min">{@code minSdkVersion}</a>.
To learn more about maintaining backward compatibility, read <a
href="{@docRoot}training/basics/supporting-devices/platforms.html">Supporting Different
Platform Versions</a>.</p>
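
<p>For example, a runtime check of this kind might look like the following sketch (which APIs
you guard and what the fallback does are up to your app):</p>
<pre>
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.JELLY_BEAN_MR2) {
    // Running on Android 4.3 or higher; safe to call APIs added in API level 18
} else {
    // Fall back to an implementation that works on older platform versions
}
</pre>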

<p>Various APIs are also available in the Android <a
href="{@docRoot}tools/support-library/index.html">Support Library</a> that allow you to implement
new features on older versions of the platform.</p>

<p>For more information about how API levels work, read <a
href="{@docRoot}guide/topics/manifest/uses-sdk-element.html#ApiLevels">What is API
Level?</a></p>




<h2 id="Behaviors">Important Behavior Changes</h2>

<p>If you have previously published an app for Android, be aware that your app might
be affected by changes in Android {@sdkPlatformVersion}.</p>


<h3 id="BehaviorsIntents">If your app uses implicit intents...</h3>

<p>Your app might misbehave in a restricted profile environment.</p>

<p>Users in a <a href="#RestrictedProfiles">restricted profile</a> environment might not
have all the standard Android apps available. For example, a restricted profile might have the
web browser and camera app disabled. So your app should not make assumptions about which apps are
available, because if you call {@link android.app.Activity#startActivity startActivity()} without
verifying whether an app is available to handle the {@link android.content.Intent},
your app might crash in a restricted profile.</p>

<p>When using an implicit intent, you should always verify that an app is available to handle the
intent by calling {@link android.content.Intent#resolveActivity resolveActivity()} or {@link
android.content.pm.PackageManager#queryIntentActivities queryIntentActivities()}. For example:</p>

<pre>
Intent intent = new Intent(Intent.ACTION_SEND);
...
if (intent.resolveActivity(getPackageManager()) != null) {
    startActivity(intent);
} else {
    Toast.makeText(context, R.string.app_not_available, Toast.LENGTH_LONG).show();
}
</pre>

<h3 id="BehaviorsAccounts">If your app depends on accounts...</h3>

<p>Your app might misbehave in a restricted profile environment.</p>

<p>Users within a restricted profile environment do not have access to user accounts by default.
If your app depends on an {@link android.accounts.Account}, then your app might crash or behave
unexpectedly when used in a restricted profile.</p>

<p>If you'd like to prevent restricted profiles from using your app entirely because your
app depends on account information that's sensitive, specify the <a
href="{@docRoot}guide/topics/manifest/application-element.html#requiredAccountType">{@code
android:requiredAccountType}</a> attribute in your manifest's <a
href="{@docRoot}guide/topics/manifest/application-element.html">{@code &lt;application>}</a>
element.</p>

<p>If you'd like to allow restricted profiles to continue using your app even though they can't
create their own accounts, then you can either disable your app features that require an account
or allow restricted profiles to access the accounts created by the primary user. For more
information, see the section
below about <a href="#AccountsInProfile">Supporting accounts in a restricted profile</a>.</p>


<h3 id="BehaviorsVideoView">If your app uses VideoView...</h3>

<p>Your video might appear smaller on Android 4.3.</p>

<p>On previous versions of Android, the {@link android.widget.VideoView} widget incorrectly
calculated the {@code "wrap_content"} value for {@link android.R.attr#layout_height} and {@link
android.R.attr#layout_width} to be the same as {@code "match_parent"}. So while using {@code
"wrap_content"} for the height or width may have previously provided your desired video layout,
doing so may result in a much smaller video on Android 4.3 and higher. To fix the issue, replace
{@code "wrap_content"} with {@code "match_parent"} and verify your video appears as expected on
Android 4.3 as well as on older versions.</p>




<h2 id="RestrictedProfiles">Restricted Profiles</h2>

<p>On Android tablets, users can now create restricted profiles based on the primary user.
When users create a restricted profile, they can enable restrictions such as which apps are
available to the profile. A new set of APIs in Android 4.3 also allows you to build fine-grained
restriction settings for the apps you develop. For example, by using the new APIs, you can
allow users to control what type of content is available within your app when running in a
restricted profile environment.</p>

<p>The UI for users to control the restrictions you've built is managed by the system's
Settings application. To make your app's restriction settings appear to the user,
you must declare the restrictions your app provides by creating a {@link
android.content.BroadcastReceiver} that receives the {@link
android.content.Intent#ACTION_GET_RESTRICTION_ENTRIES} intent. The system sends this intent to
query all apps for available restrictions, then builds the UI to allow the primary user to
manage restrictions for each restricted profile.</p>

<p>In the {@link android.content.BroadcastReceiver#onReceive onReceive()} method of
your {@link android.content.BroadcastReceiver}, you must create a {@link
android.content.RestrictionEntry} for each restriction your app provides. Each {@link
android.content.RestrictionEntry} defines a restriction title, description, and one of the
following data types:</p>

<ul>
  <li>{@link android.content.RestrictionEntry#TYPE_BOOLEAN} for a restriction that is
  either true or false.</li>
  <li>{@link android.content.RestrictionEntry#TYPE_CHOICE} for a restriction that has
  multiple choices that are mutually exclusive (radio button choices).</li>
  <li>{@link android.content.RestrictionEntry#TYPE_MULTI_SELECT} for a restriction that
  has multiple choices that are <em>not</em> mutually exclusive (checkbox choices).</li>
</ul>

<p>You then put all the {@link android.content.RestrictionEntry} objects into an {@link
java.util.ArrayList} and put it into the broadcast receiver's result as the value for the
{@link android.content.Intent#EXTRA_RESTRICTIONS_LIST} extra.</p>
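
<p>For example, a receiver that offers a single boolean restriction might look like the
following sketch (the restriction key and title here are hypothetical; a {@link
android.content.RestrictionEntry} constructed with a key and a boolean default value is a
{@link android.content.RestrictionEntry#TYPE_BOOLEAN} restriction):</p>
<pre>
public class GetRestrictionsReceiver extends BroadcastReceiver {
    &#64;Override
    public void onReceive(Context context, Intent intent) {
        // Define one boolean restriction with a hypothetical key
        RestrictionEntry downloads = new RestrictionEntry("downloads_enabled", true);
        downloads.setTitle("Allow downloads");

        ArrayList&lt;RestrictionEntry> entries = new ArrayList&lt;RestrictionEntry>();
        entries.add(downloads);

        // Return the restrictions as the result of this ordered broadcast
        Bundle extras = getResultExtras(true);
        extras.putParcelableArrayList(Intent.EXTRA_RESTRICTIONS_LIST, entries);
        setResultExtras(extras);
    }
}
</pre>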

<p>The system creates the UI for your app's restrictions in the Settings app and saves each
restriction with the unique key you provided for each {@link android.content.RestrictionEntry}
object. When the user opens your app, you can query for any current restrictions by
calling {@link android.os.UserManager#getApplicationRestrictions getApplicationRestrictions()}.
This returns a {@link android.os.Bundle} containing the key-value pairs for each restriction
you defined with the {@link android.content.RestrictionEntry} objects.</p>
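
<p>For example, querying a boolean restriction might look like this (the {@code
"downloads_enabled"} key is hypothetical and must match the key you used in the corresponding
{@link android.content.RestrictionEntry}):</p>
<pre>
UserManager um = (UserManager) getSystemService(Context.USER_SERVICE);
Bundle restrictions = um.getApplicationRestrictions(getPackageName());
// Read the current value the primary user chose, defaulting to true
boolean downloadsEnabled = restrictions.getBoolean("downloads_enabled", true);
</pre>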

<p>If you want to provide more specific restrictions that can't be handled by boolean, single
choice, and multi-choice values, then you can create an activity where the user can specify the
restrictions and allow users to open that activity from the restriction settings. In your
broadcast receiver, include the {@link android.content.Intent#EXTRA_RESTRICTIONS_INTENT} extra
in the result {@link android.os.Bundle}. This extra must specify an {@link android.content.Intent}
indicating the {@link android.app.Activity} class to launch (use the
{@link android.os.Bundle#putParcelable putParcelable()} method to pass {@link
android.content.Intent#EXTRA_RESTRICTIONS_INTENT} with the intent).
When the primary user enters your activity to set custom restrictions, your
activity must then return a result containing the restriction values in an extra using either
the {@link android.content.Intent#EXTRA_RESTRICTIONS_LIST} or {@link
android.content.Intent#EXTRA_RESTRICTIONS_BUNDLE} key, depending on whether you specify
{@link android.content.RestrictionEntry} objects or key-value pairs, respectively.</p>
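
<p>For instance, inside your receiver's {@link android.content.BroadcastReceiver#onReceive
onReceive()} method, adding such an intent to the result might look like this sketch (the
{@code CustomRestrictionsActivity} class name is hypothetical):</p>
<pre>
Bundle extras = getResultExtras(true);
// Point the restriction settings at an activity that collects custom values
Intent customIntent = new Intent(context, CustomRestrictionsActivity.class);
extras.putParcelable(Intent.EXTRA_RESTRICTIONS_INTENT, customIntent);
setResultExtras(extras);
</pre>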


<h3 id="AccountsInProfile">Supporting accounts in a restricted profile</h3>

<p>Any accounts added to the primary user are available to a restricted profile, but the
accounts are not accessible from the {@link android.accounts.AccountManager} APIs by default.
If you attempt to add an account with {@link android.accounts.AccountManager} while in a restricted
profile, you will get a failure result. Due to these restrictions, you have the following
three options:</p>

<ul>
<li><strong>Allow access to the owner's accounts from a restricted profile.</strong>
<p>To get access to an account from a restricted profile, you must add the <a
href="{@docRoot}guide/topics/manifest/application-element.html#restrictedAccountType">{@code
android:restrictedAccountType}</a> attribute to the <a
href="{@docRoot}guide/topics/manifest/application-element.html">&lt;application></a> tag:</p>
<pre>
&lt;application ...
    android:restrictedAccountType="com.example.account.type" >
</pre>

<p class="caution"><strong>Caution:</strong> Enabling this attribute provides your
app access to the primary user's accounts from restricted profiles. So you should allow this
only if the information displayed by your app does not reveal personally identifiable
information (PII) that's considered sensitive. The system settings will inform the primary
user that your app grants restricted profiles access to their accounts, so it should be clear
to the user that account access is important for your app's functionality. If possible, you
should also provide adequate restriction controls for the primary user that define how much
account access is allowed in your app.</p>
</li>

<li><strong>Disable certain functionality when unable to modify accounts.</strong>
<p>If you want to use accounts, but don't actually require them for your app's primary
functionality, you can check for account availability and disable features when not available.
You should first check if there is an existing account available. If not, then query whether
it's possible to create a new account by calling {@link
android.os.UserManager#getUserRestrictions()} and checking the {@link
android.os.UserManager#DISALLOW_MODIFY_ACCOUNTS} extra in the result. If it is {@code true},
then you should disable whatever functionality of your app requires access to accounts.
For example:</p>
<pre>
UserManager um = (UserManager) context.getSystemService(Context.USER_SERVICE);
Bundle restrictions = um.getUserRestrictions();
if (restrictions.getBoolean(UserManager.DISALLOW_MODIFY_ACCOUNTS, false)) {
   // cannot add accounts, disable some functionality
}
</pre>
<p class="note"><strong>Note:</strong> In this scenario, you should <em>not</em> declare
any new attributes in your manifest file.</p>
</li>

<li><strong>Disable your app when unable to access private accounts.</strong>
<p>If it's instead important that your app not be available to restricted profiles because
your app depends on sensitive personal information in an account (and because restricted profiles
currently cannot add new accounts), add
the <a href="{@docRoot}guide/topics/manifest/application-element.html#requiredAccountType">{@code
android:requiredAccountType}</a> attribute to the <a
href="{@docRoot}guide/topics/manifest/application-element.html">&lt;application></a> tag:</p>
<pre>
&lt;application ...
    android:requiredAccountType="com.example.account.type" >
</pre>
<p>For example, the Gmail app uses this attribute to disable itself for restricted profiles,
because the owner's personal email should not be available to restricted profiles.</p>
</li>
</ul>



<h2 id="Wireless">Wireless and Connectivity</h2>


<h3 id="BTLE">Bluetooth Low Energy (Smart Ready)</h3>

<p>Android now supports Bluetooth Low Energy (LE) with new APIs in {@link android.bluetooth}.
With the new APIs, you can build Android apps that communicate with Bluetooth Low Energy
peripherals such as heart rate monitors and pedometers.</p>

<p>Because Bluetooth LE is a hardware feature that is not available on all
Android-powered devices, you must declare in your manifest file a <a
href="{@docRoot}guide/topics/manifest/uses-feature-element.html">{@code &lt;uses-feature>}</a>
element for {@code "android.hardware.bluetooth_le"}:</p>
<pre>
&lt;uses-feature android:name="android.hardware.bluetooth_le" android:required="true" />
</pre>

<p>If you're already familiar with Android's Classic Bluetooth APIs, notice that using the
Bluetooth LE APIs has some differences. Most important is that there's now a {@link
android.bluetooth.BluetoothManager} class that you should use for some high level operations
such as acquiring a {@link android.bluetooth.BluetoothAdapter}, getting a list of connected
devices, and checking the state of a device. For example, here's how you should now get the
{@link android.bluetooth.BluetoothAdapter}:</p>
<pre>
final BluetoothManager bluetoothManager =
        (BluetoothManager) getSystemService(Context.BLUETOOTH_SERVICE);
mBluetoothAdapter = bluetoothManager.getAdapter();
</pre>

<p>To discover Bluetooth LE peripherals, call {@link android.bluetooth.BluetoothAdapter#startLeScan
startLeScan()} on the {@link android.bluetooth.BluetoothAdapter}, passing it an implementation
of the {@link android.bluetooth.BluetoothAdapter.LeScanCallback} interface. When the Bluetooth
adapter detects a Bluetooth LE peripheral, your {@link
android.bluetooth.BluetoothAdapter.LeScanCallback} implementation receives a call to the
{@link android.bluetooth.BluetoothAdapter.LeScanCallback#onLeScan onLeScan()} method. This
method provides you with a {@link android.bluetooth.BluetoothDevice} object representing the
detected device, the RSSI value for the device, and a byte array containing the device's
advertisement record.</p>
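
<p>For example, a minimal scan callback might look like the following sketch (what your app does
with each discovered device is up to you; the {@code TAG} constant is assumed):</p>
<pre>
private BluetoothAdapter.LeScanCallback mLeScanCallback =
        new BluetoothAdapter.LeScanCallback() {
    &#64;Override
    public void onLeScan(BluetoothDevice device, int rssi, byte[] scanRecord) {
        // Called for each advertising LE peripheral the adapter detects
        Log.d(TAG, "Found device: " + device.getAddress() + " RSSI: " + rssi);
    }
};

mBluetoothAdapter.startLeScan(mLeScanCallback);
</pre>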

<p>If you want to scan for only specific types of peripherals, you can instead call {@link
android.bluetooth.BluetoothAdapter#startLeScan startLeScan()} and include an array of {@link
java.util.UUID} objects that specify the GATT services your app supports.</p>

<p class="note"><strong>Note:</strong> You can only scan for Bluetooth LE devices <em>or</em>
scan for Classic Bluetooth devices using previous APIs. You cannot scan for both LE and Classic
Bluetooth devices at once.</p>

<p>To then connect to a Bluetooth LE peripheral, call {@link
android.bluetooth.BluetoothDevice#connectGatt connectGatt()} on the corresponding
{@link android.bluetooth.BluetoothDevice} object, passing it an implementation of
{@link android.bluetooth.BluetoothGattCallback}. Your implementation of {@link
android.bluetooth.BluetoothGattCallback} receives callbacks regarding the connectivity
state with the device and other events. It's during the {@link
android.bluetooth.BluetoothGattCallback#onConnectionStateChange onConnectionStateChange()}
callback that you can begin communicating with the device if the method passes {@link
android.bluetooth.BluetoothProfile#STATE_CONNECTED} as the new state.</p>
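
<p>For example, connecting and reacting to the connected state might look like this sketch
(assuming {@code device} is a {@link android.bluetooth.BluetoothDevice} from a scan result):</p>
<pre>
BluetoothGatt gatt = device.connectGatt(this, false, new BluetoothGattCallback() {
    &#64;Override
    public void onConnectionStateChange(BluetoothGatt gatt, int status, int newState) {
        if (newState == BluetoothProfile.STATE_CONNECTED) {
            // Connected; discover services before communicating with the device
            gatt.discoverServices();
        }
    }
});
</pre>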

<p>Accessing Bluetooth features on a device also requires that your app request certain
Bluetooth user permissions. For more information, see the <a
href="{@docRoot}guide/topics/connectivity/bluetooth-le.html">Bluetooth Low Energy</a> API guide.</p>


<h3 id="WiFiScan">Wi-Fi scan-only mode</h3>

<p>When attempting to identify the user's location, Android may use Wi-Fi to help determine
the location by scanning nearby access points. However, users often keep Wi-Fi turned off to
conserve battery, resulting in location data that's less accurate. Android now includes a
scan-only mode that allows the device Wi-Fi to scan access points to help obtain the location
without connecting to an access point, thus greatly reducing battery usage.</p>

<p>If you want to acquire the user's location but Wi-Fi is currently off, you can request the
user to enable Wi-Fi scan-only mode by calling {@link android.content.Context#startActivity
startActivity()} with the action {@link
android.net.wifi.WifiManager#ACTION_REQUEST_SCAN_ALWAYS_AVAILABLE}.</p>
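
<p>For example, launching that request from an activity might look like this:</p>
<pre>
// Ask the user to enable Wi-Fi scan-only mode; the system shows a dialog
Intent intent = new Intent(WifiManager.ACTION_REQUEST_SCAN_ALWAYS_AVAILABLE);
startActivity(intent);
</pre>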


<h3 id="WiFiConfig">Wi-Fi configuration</h3>

<p>New {@link android.net.wifi.WifiEnterpriseConfig} APIs allow enterprise-oriented services to
automate Wi-Fi configuration for managed devices.</p>


<h3 id="QuickResponse">Quick response for incoming calls</h3>

<p>Since Android 4.0, a feature called "Quick response" allows users to respond to incoming
calls with an immediate text message without needing to pick up the call or unlock the device.
Until now, these quick messages were always handled by the default Messaging app. Now any app
can declare its capability to handle these messages by creating a {@link android.app.Service}
with an intent filter for {@link android.telephony.TelephonyManager#ACTION_RESPOND_VIA_MESSAGE}.</p>

<p>When the user responds to an incoming call with a quick response, the Phone app sends
the {@link android.telephony.TelephonyManager#ACTION_RESPOND_VIA_MESSAGE} intent with a URI
describing the recipient (the caller) and the {@link android.content.Intent#EXTRA_TEXT} extra
with the message the user wants to send. When your service receives the intent, it should deliver
the message and immediately stop itself (your app should not show an activity).</p>

<p>In order to receive this intent, you must declare the {@link
android.Manifest.permission#SEND_RESPOND_VIA_MESSAGE} permission.</p>
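
<p>For example, a manifest declaration for such a service might look like the following sketch
(the service class name is hypothetical):</p>
<pre>
&lt;service android:name=".HeadlessSmsSendService"
         android:permission="android.permission.SEND_RESPOND_VIA_MESSAGE"
         android:exported="true" >
    &lt;intent-filter>
        &lt;action android:name="android.intent.action.RESPOND_VIA_MESSAGE" />
        &lt;category android:name="android.intent.category.DEFAULT" />
        &lt;data android:scheme="sms" />
        &lt;data android:scheme="smsto" />
    &lt;/intent-filter>
&lt;/service>
</pre>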



<h2 id="Multimedia">Multimedia</h2>

<h3 id="MediaExtractor">MediaExtractor and MediaCodec enhancements</h3>

<p>Android now makes it easier for you to write your own Dynamic Adaptive
Streaming over HTTP (DASH) players in accordance with the ISO/IEC 23009-1 standard,
using existing APIs in {@link android.media.MediaCodec} and {@link
android.media.MediaExtractor}. The framework underlying these APIs has been updated to support
parsing of fragmented MP4 files, but your app is still responsible for parsing the MPD metadata
and passing the individual streams to {@link android.media.MediaExtractor}.</p>

<p>If you want to use DASH with encrypted content, notice that the {@link
android.media.MediaExtractor#getSampleCryptoInfo getSampleCryptoInfo()} method returns the {@link
android.media.MediaCodec.CryptoInfo} metadata describing the structure of each encrypted media
sample. Also, the {@link android.media.MediaExtractor#getPsshInfo()} method has been added to
{@link android.media.MediaExtractor} so you can access the PSSH metadata for your DASH media.
This method returns a map of {@link java.util.UUID} objects to bytes, with the
{@link java.util.UUID} specifying the crypto scheme, and the bytes being the data specific
to that scheme.</p>


<h3 id="DRM">Media DRM</h3>

<p>The new {@link android.media.MediaDrm} class provides a modular solution for digital rights
management (DRM) with your media content by separating DRM concerns from media playback. For
instance, this API separation allows you to play back Widevine-encrypted content without having
to use the Widevine media format. This DRM solution also supports DASH Common Encryption so you
can use a variety of DRM schemes with your streaming content.</p>

<p>You can use {@link android.media.MediaDrm} to obtain opaque key-request messages and process
key-response messages from the server for license acquisition and provisioning. Your app is
responsible for handling the network communication with the servers; the {@link
android.media.MediaDrm} class provides only the ability to generate and process the messages.</p>

<p>The {@link android.media.MediaDrm} APIs are intended to be used in conjunction with the
{@link android.media.MediaCodec} APIs that were introduced in Android 4.1 (API level 16),
including {@link android.media.MediaCodec} for encoding and decoding your content, {@link
android.media.MediaCrypto} for handling encrypted content, and {@link android.media.MediaExtractor}
for extracting and demuxing your content.</p>

<p>You must first construct {@link android.media.MediaExtractor} and
{@link android.media.MediaCodec} objects. You can then access the DRM-scheme-identifying
{@link java.util.UUID}, typically from metadata in the content, and use it to construct an
instance of a {@link android.media.MediaDrm} object with its constructor.</p>
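
<p>For example, constructing a {@link android.media.MediaDrm} instance for a given scheme might
look like this sketch (the UUID value is a placeholder, not a real scheme identifier, and error
handling is omitted):</p>
<pre>
UUID schemeUuid = new UUID(0x0123456789ABCDEFL, 0xFEDCBA9876543210L);
if (MediaDrm.isCryptoSchemeSupported(schemeUuid)) {
    MediaDrm drm = new MediaDrm(schemeUuid);
    // Use drm.getKeyRequest() and drm.provideKeyResponse()
    // to exchange license messages with your server
}
</pre>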


<h3 id="EncodingSurface">Video encoding from a Surface</h3>

<p>Android 4.1 (API level 16) added the {@link android.media.MediaCodec} class for low-level
encoding and decoding of media content. When encoding video, Android 4.1 required that you provide
the media with a {@link java.nio.ByteBuffer} array, but Android 4.3 now allows you to use a {@link
android.view.Surface} as the input to an encoder. For instance, this allows you to encode input
from an existing video file or from frames generated by OpenGL ES.</p>

<p>To use a {@link android.view.Surface} as the input to your encoder, first call {@link
android.media.MediaCodec#configure configure()} for your {@link android.media.MediaCodec}.
Then call {@link android.media.MediaCodec#createInputSurface()} to receive the {@link
android.view.Surface} upon which you can stream your media.</p>

<p>For example, you can use the given {@link android.view.Surface} as the window for an OpenGL
context by passing it to {@link android.opengl.EGL14#eglCreateWindowSurface
eglCreateWindowSurface()}. Then while rendering the surface, call {@link
android.opengl.EGL14#eglSwapBuffers eglSwapBuffers()} to pass the frame to the {@link
android.media.MediaCodec}.</p>

<p>To begin encoding, call {@link android.media.MediaCodec#start()} on the {@link
android.media.MediaCodec}. When done, call {@link android.media.MediaCodec#signalEndOfInputStream}
to terminate encoding, and call {@link android.view.Surface#release()} on the
{@link android.view.Surface}.</p>
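
<p>The steps above might be sketched as follows (the format parameters are illustrative; configure
the encoder for your actual content):</p>
<pre>
MediaFormat format = MediaFormat.createVideoFormat("video/avc", 640, 480);
format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
        MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
format.setInteger(MediaFormat.KEY_BIT_RATE, 2000000);
format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 5);

MediaCodec encoder = MediaCodec.createEncoderByType("video/avc");
encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
// createInputSurface() must be called after configure() and before start()
Surface inputSurface = encoder.createInputSurface();
encoder.start();
// ... render frames to inputSurface, then:
encoder.signalEndOfInputStream();
inputSurface.release();
</pre>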


<h3 id="MediaMuxing">Media muxing</h3>

<p>The new {@link android.media.MediaMuxer} class enables multiplexing between one audio stream
and one video stream. These APIs serve as a counterpart to the {@link android.media.MediaExtractor}
class added in Android 4.1 (API level 16) for de-multiplexing (demuxing) media.</p>

<p>Supported output formats are defined in {@link android.media.MediaMuxer.OutputFormat}.
Currently, MP4 is the only supported output format, and {@link android.media.MediaMuxer} supports
only one audio stream and/or one video stream at a time.</p>

<p>{@link android.media.MediaMuxer} is mostly designed to work with {@link android.media.MediaCodec}
so you can perform video processing through {@link android.media.MediaCodec} then save the
output to an MP4 file through {@link android.media.MediaMuxer}. You can also use {@link
android.media.MediaMuxer} in combination with {@link android.media.MediaExtractor} to perform
media editing without the need to encode or decode.</p>
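
<p>A typical muxing flow might be sketched like this (the output path is an example, and {@code
videoFormat}, {@code sampleBuffer}, and {@code bufferInfo} are assumed to come from your encoder
or extractor):</p>
<pre>
MediaMuxer muxer = new MediaMuxer("/sdcard/output.mp4",
        MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
int videoTrack = muxer.addTrack(videoFormat);  // a MediaFormat from your codec
muxer.start();
// For each encoded sample:
muxer.writeSampleData(videoTrack, sampleBuffer, bufferInfo);
// When finished:
muxer.stop();
muxer.release();
</pre>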


<h3 id="ProgressAndScrubbing">Playback progress and scrubbing for RemoteControlClient</h3>

<p>In Android 4.0 (API level 14), the {@link android.media.RemoteControlClient} was added to
enable media playback controls from remote control clients such as the controls available on the
lock screen. Android 4.3 now provides the ability for such controllers to display the playback
position and controls for scrubbing the playback. If you've enabled remote control for your
media app with the {@link android.media.RemoteControlClient} APIs, then you can allow playback
scrubbing by implementing two new interfaces.</p>

<p>First, you must enable the {@link
android.media.RemoteControlClient#FLAG_KEY_MEDIA_POSITION_UPDATE} flag by passing it to
{@link android.media.RemoteControlClient#setTransportControlFlags setTransportControlFlags()}.</p>

<p>Then implement the following two new interfaces:</p>
<dl>
  <dt>{@link android.media.RemoteControlClient.OnGetPlaybackPositionListener}</dt>
  <dd>This includes the callback {@link
  android.media.RemoteControlClient.OnGetPlaybackPositionListener#onGetPlaybackPosition
  onGetPlaybackPosition()}, which requests the current position
  of your media when the remote control needs to update the progress in its UI.</dd>

  <dt>{@link android.media.RemoteControlClient.OnPlaybackPositionUpdateListener}</dt>
  <dd>This includes the callback {@link
  android.media.RemoteControlClient.OnPlaybackPositionUpdateListener#onPlaybackPositionUpdate
  onPlaybackPositionUpdate()}, which
  tells your app the new time code for your media when the user scrubs the playback with the
  remote control UI.
    <p>Once you update your playback with the new position, call {@link
    android.media.RemoteControlClient#setPlaybackState setPlaybackState()} to indicate the
    new playback state, position, and speed.</p>
  </dd>
</dl>

<p>With these interfaces defined, you can set them for your {@link
android.media.RemoteControlClient} by calling {@link
android.media.RemoteControlClient#setOnGetPlaybackPositionListener
setOnGetPlaybackPositionListener()} and
{@link android.media.RemoteControlClient#setPlaybackPositionUpdateListener
setPlaybackPositionUpdateListener()}, respectively.</p>
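
<p>For example, wiring these up might look like this sketch (assuming {@code mRemoteControlClient}
is your existing {@link android.media.RemoteControlClient} and the enclosing class implements both
listener interfaces):</p>
<pre>
mRemoteControlClient.setTransportControlFlags(
        RemoteControlClient.FLAG_KEY_MEDIA_PLAY_PAUSE
        | RemoteControlClient.FLAG_KEY_MEDIA_POSITION_UPDATE);
mRemoteControlClient.setOnGetPlaybackPositionListener(this);
mRemoteControlClient.setPlaybackPositionUpdateListener(this);
</pre>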



<h2 id="Graphics">Graphics</h2>

<h3 id="OpenGL">Support for OpenGL ES 3.0</h3>

<p>Android 4.3 adds Java interfaces and native support for OpenGL ES 3.0. Key new functionality
provided in OpenGL ES 3.0 includes:</p>
<ul>
  <li>Acceleration of advanced visual effects</li>
  <li>High quality ETC2/EAC texture compression as a standard feature</li>
  <li>A new version of the GLSL ES shading language with integer and 32-bit floating point support</li>
  <li>Advanced texture rendering</li>
  <li>Broader standardization of texture size and render-buffer formats</li>
</ul>

<p>The Java interface for OpenGL ES 3.0 on Android is provided with {@link android.opengl.GLES30}.
When using OpenGL ES 3.0, be sure that you declare it in your manifest file with the
<a href="{@docRoot}guide/topics/manifest/uses-feature-element.html">&lt;uses-feature></a>
tag and the {@code android:glEsVersion} attribute. For example:</p>
<pre>
&lt;manifest>
    &lt;uses-feature android:glEsVersion="0x00030000" />
    ...
&lt;/manifest>
</pre>

<p>And remember to specify the OpenGL ES context by calling {@link
android.opengl.GLSurfaceView#setEGLContextClientVersion setEGLContextClientVersion()},
passing {@code 3} as the version.</p>
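<p>For example, requesting an OpenGL ES 3.0 context for a {@link android.opengl.GLSurfaceView}
might look like the following sketch, where {@code MyRenderer} is a hypothetical
{@link android.opengl.GLSurfaceView.Renderer} implementation of your own:</p>

```java
// Inside your Activity's onCreate() method
GLSurfaceView glView = new GLSurfaceView(this);
glView.setEGLContextClientVersion(3); // request an OpenGL ES 3.0 context
glView.setRenderer(new MyRenderer()); // MyRenderer is your Renderer implementation
setContentView(glView);
```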

<p>For more information about using OpenGL ES, including how to check the device's supported
OpenGL ES version at runtime, see the <a href="{@docRoot}guide/topics/graphics/opengl.html"
>OpenGL ES</a> API guide.</p>


<h3 id="MipMap">Mipmapping for drawables</h3>

<p>Using a mipmap as the source for your bitmap or drawable is a simple way to provide a
quality image at various image scales, which can be particularly useful if you expect your
image to be scaled during an animation.</p>

<p>Android 4.2 (API level 17) added support for mipmaps in the {@link android.graphics.Bitmap}
class&mdash;Android swaps the mip images in your {@link android.graphics.Bitmap} when you've
supplied a mipmap source and have enabled mipmaps with {@link android.graphics.Bitmap#setHasMipMap
setHasMipMap()}. Now in Android 4.3, you can enable mipmaps for a {@link
android.graphics.drawable.BitmapDrawable} object as well, by providing a mipmap asset and
setting the {@code android:mipMap} attribute in a bitmap resource file or by calling {@link
android.graphics.drawable.BitmapDrawable#setMipMap setMipMap()}.
</p>
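<p>For example, the following sketch enables mipmaps for a drawable loaded from a hypothetical
{@code R.drawable.photo} resource:</p>

```java
BitmapDrawable drawable =
        (BitmapDrawable) getResources().getDrawable(R.drawable.photo);
// Equivalent to android:mipMap="true" in the bitmap resource file
drawable.setMipMap(true);
```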



<h2 id="UI">User Interface</h2>

<h3 id="ViewOverlay">View overlays</h3>

<p>The new {@link android.view.ViewOverlay} class provides a transparent layer on top of
a {@link android.view.View} on which you can add visual content and which does not affect
the layout hierarchy. You can get a {@link android.view.ViewOverlay} for any {@link
android.view.View} by calling {@link android.view.View#getOverlay}. The overlay
always has the same size and position as its host view (the view from which it was created),
allowing you to add content that appears in front of the host view, but which cannot extend
beyond the bounds of that host view.
</p>

<p>Using a {@link android.view.ViewOverlay} is particularly useful when you want to create
animations such as sliding a view outside of its container or moving items around the screen
without affecting the view hierarchy. However, because the usable area of an overlay is
restricted to the same area as its host view, if you want to animate a view moving outside
its position in the layout, you must use an overlay from a parent view that has the desired
layout bounds.</p>

<p>When you create an overlay for a widget view such as a {@link android.widget.Button}, you
can add {@link android.graphics.drawable.Drawable} objects to the overlay by calling
{@link android.view.ViewOverlay#add(Drawable)}. If you call {@link
android.view.ViewGroup#getOverlay} for a layout view, such as {@link android.widget.RelativeLayout},
the object returned is a {@link android.view.ViewGroupOverlay}. The
{@link android.view.ViewGroupOverlay} class is a subclass
of {@link android.view.ViewOverlay} that also allows you to add {@link android.view.View}
objects by calling {@link android.view.ViewGroupOverlay#add(View)}.
</p>

<p class="note"><strong>Note:</strong> All drawables and views that you add to an overlay
are visual only. They cannot receive focus or input events.</p>

<p>For example, the following code animates a view sliding to the right by placing the view
in the parent view's overlay, then performing a translation animation on that view:</p>
<pre>
View view = findViewById(R.id.view_to_remove);
ViewGroup container = (ViewGroup) view.getParent();
// Adding the view to the parent's overlay removes it from the layout
container.getOverlay().add(view);
ObjectAnimator anim = ObjectAnimator.ofFloat(view, "translationX", container.getRight());
anim.start();
</pre>


<h3 id="OpticalBounds">Optical bounds layout</h3>

<p>For views that contain nine-patch background images, you can now specify that they should
be aligned with neighboring views based on the "optical" bounds of the background image rather
than the "clip" bounds of the view.</p>

<p>For example, figures 1 and 2 each show the same layout, but the version in figure 1 is
using clip bounds (the default behavior), while figure 2 is using optical bounds. Because the
nine-patch images used for the button and the photo frame include padding around the edges,
they don’t appear to align with each other or the text when using clip bounds.</p>

<p class="note"><strong>Note:</strong> The screenshots in figures 1 and 2 have the "Show
layout bounds" developer setting enabled. For each view, red lines indicate the optical
bounds, blue lines indicate the clip bounds, and pink indicates margins.</p>

<script type="text/javascript">
function toggleOpticalImages(mouseover) {

  $("img.optical-img").each(function() {
    $img = $(this);
    var index = $img.attr('src').lastIndexOf("/") + 1;
    var path = $img.attr('src').substr(0,index);
    var name = $img.attr('src').substr(index);
    var splitname;
    var highres = false;
    if (name.indexOf("@2x") != -1) {
      splitname = name.split("@2x.");
      highres = true;
    } else {
      splitname = name.split(".");
    }

    var newname;
    if (mouseover) {
      if (highres) {
        newname = splitname[0] + "-normal@2x.png";
      } else {
        newname = splitname[0] + "-normal.png";
      }
    } else {
      if (highres) {
        newname = splitname[0].split("-normal")[0] + "@2x.png";
      } else {
        newname = splitname[0].split("-normal")[0] + ".png";
      }
    }

    $img.attr('src', path + newname);

  });
}
</script>

<p class="table-caption"><em>Mouse over to hide the layout bounds.</em></p>
<div style="float:left;width:296px">
<img src="{@docRoot}images/tools/clipbounds@2x.png" width="296" alt="" class="optical-img"
    onmouseover="toggleOpticalImages(true)" onmouseout="toggleOpticalImages(false)" />
<p class="img-caption"><strong>Figure 1.</strong> Layout using clip bounds (default).</p>
</div>
<div style="float:left;width:296px;margin-left:60px">
<img src="{@docRoot}images/tools/opticalbounds@2x.png" width="296" alt="" class="optical-img"
    onmouseover="toggleOpticalImages(true)" onmouseout="toggleOpticalImages(false)" />
<p class="img-caption"><strong>Figure 2.</strong> Layout using optical bounds.</p>
</div>


<p style="clear:left">To align the views based on their optical bounds, set the {@code android:layoutMode} attribute to {@code "opticalBounds"} in one of the parent layouts. For example:</p>

<pre>
&lt;LinearLayout android:layoutMode="opticalBounds" ... >
</pre>


<div class="figure" style="width:155px">
<img src="{@docRoot}images/tools/ninepatch_opticalbounds@2x.png" width="121" alt="" />
<p class="img-caption"><strong>Figure 3.</strong> Zoomed view of the Holo button nine-patch with
optical bounds.
</p>
</div>

<p>For this to work, the nine-patch images applied to the background of your views must specify
the optical bounds using red lines along the bottom and right side of the nine-patch file (as
shown in figure 3). The red lines indicate the region that should be subtracted from
the clip bounds, leaving the optical bounds of the image.</p>

<p>When you enable optical bounds for a {@link android.view.ViewGroup} in your layout, all
descendant views inherit the optical bounds layout mode unless you override it for a group by
setting {@code android:layoutMode} to {@code "clipBounds"}. All layout elements also honor the
optical bounds of their child views, adapting their own bounds based on the optical bounds of
the views within them. However, layout elements (subclasses of {@link android.view.ViewGroup})
currently do not support optical bounds for nine-patch images applied to their own background.</p>

<p>If you create a custom view by subclassing {@link android.view.View}, {@link android.view.ViewGroup}, or any subclasses thereof, your view will inherit these optical bounds behaviors.</p>

<p class="note"><strong>Note:</strong> All widgets supported by the Holo theme have been updated
with optical bounds, including {@link android.widget.Button}, {@link android.widget.Spinner},
{@link android.widget.EditText}, and others. So you can immediately benefit by setting the
{@code android:layoutMode} attribute to {@code "opticalBounds"} if your app applies a Holo theme
({@link android.R.style#Theme_Holo Theme.Holo}, {@link android.R.style#Theme_Holo_Light
Theme.Holo.Light}, etc.).
</p>

<p>To specify optical bounds for your own nine-patch images with the <a
href="{@docRoot}tools/help/draw9patch.html">Draw 9-patch</a> tool, hold CTRL when clicking on
the border pixels.</p>




<h3 id="AnimationRect">Animation for Rect values</h3>

<p>You can now animate between two {@link android.graphics.Rect} values with the new {@link
android.animation.RectEvaluator}. This new class is an implementation of {@link
android.animation.TypeEvaluator} that you can pass to {@link
android.animation.ValueAnimator#setEvaluator ValueAnimator.setEvaluator()}.
</p>
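<p>For example, because {@link android.graphics.drawable.Drawable} exposes {@code getBounds()}
and {@code setBounds(Rect)}, you can animate a drawable's bounds roughly as in the following
sketch (where {@code drawable} is assumed to be a {@link android.graphics.drawable.Drawable}
you are rendering yourself):</p>

```java
Rect startBounds = new Rect(0, 0, 100, 100);
Rect endBounds = new Rect(0, 0, 400, 300);
// RectEvaluator interpolates each edge of the Rect between the two values
ObjectAnimator anim = ObjectAnimator.ofObject(drawable, "bounds",
        new RectEvaluator(), startBounds, endBounds);
anim.setDuration(300);
anim.start();
```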

<h3 id="AttachFocus">Window attach and focus listener</h3>

<p>Previously, if you wanted to listen for when your view was attached to or detached from the
window, or when its focus changed, you needed to subclass the {@link android.view.View} class to
implement {@link android.view.View#onAttachedToWindow onAttachedToWindow()} and {@link
android.view.View#onDetachedFromWindow onDetachedFromWindow()}, or {@link
android.view.View#onWindowFocusChanged onWindowFocusChanged()}, respectively.
</p>

<p>Now, to receive attach and detach events you can instead implement {@link
android.view.ViewTreeObserver.OnWindowAttachListener} and set it on a view with
{@link android.view.ViewTreeObserver#addOnWindowAttachListener addOnWindowAttachListener()}.
And to receive focus events, you can implement {@link
android.view.ViewTreeObserver.OnWindowFocusChangeListener} and set it on a view with
{@link android.view.ViewTreeObserver#addOnWindowFocusChangeListener
addOnWindowFocusChangeListener()}.
</p>
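<p>For example, assuming {@code myView} is any view already in your hierarchy:</p>

```java
ViewTreeObserver observer = myView.getViewTreeObserver();

observer.addOnWindowAttachListener(new ViewTreeObserver.OnWindowAttachListener() {
    @Override
    public void onWindowAttached() {
        // The view's window has been attached
    }
    @Override
    public void onWindowDetached() {
        // The view's window has been detached
    }
});

observer.addOnWindowFocusChangeListener(
        new ViewTreeObserver.OnWindowFocusChangeListener() {
    @Override
    public void onWindowFocusChanged(boolean hasFocus) {
        // The window containing the view gained or lost focus
    }
});
```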


<h3 id="Overscan">TV overscan support</h3>

<p>To be sure your app fills the entire screen on every television, you can now enable overscan
for your app layout. Overscan mode is determined by the {@link android.view.WindowManager.LayoutParams#FLAG_LAYOUT_IN_OVERSCAN} flag, which you can enable with platform themes such as
{@link android.R.style#Theme_DeviceDefault_NoActionBar_Overscan} or by enabling the
{@link android.R.attr#windowOverscan} style in a custom theme.</p>


<h3 id="Orientation">Screen orientation</h3>

<p>The <a
href="{@docRoot}guide/topics/manifest/activity-element.html">{@code &lt;activity>}</a>
tag's <a
href="{@docRoot}guide/topics/manifest/activity-element.html#screen">{@code screenOrientation}</a>
attribute now supports additional values to honor the user's preference for auto-rotation:</p>
<dl>
<dt>{@code "userLandscape"}</dt>
<dd>Behaves the same as {@code "sensorLandscape"}, except that if the user disables auto-rotate,
it locks in the normal landscape orientation and does not flip.
</dd>

<dt>{@code "userPortrait"}</dt>
<dd>Behaves the same as {@code "sensorPortrait"}, except that if the user disables auto-rotate,
it locks in the normal portrait orientation and does not flip.
</dd>

<dt>{@code "fullUser"}</dt>
<dd>Behaves the same as {@code "fullSensor"} and allows rotation in all four directions, except
that if the user disables auto-rotate, it locks in the user's preferred orientation.
</dd></dl>

<p>Additionally, you can now also declare {@code "locked"} to lock your app's orientation into
the screen's current orientation.</p>


<h3 id="RotationAnimation">Rotation animations</h3>

<p>The new {@link android.view.WindowManager.LayoutParams#rotationAnimation} field in
{@link android.view.WindowManager} allows you to select one of three animations you
want to use when the system switches screen orientations. The three animations are:</p>
<ul>
  <li>{@link android.view.WindowManager.LayoutParams#ROTATION_ANIMATION_CROSSFADE}</li>
  <li>{@link android.view.WindowManager.LayoutParams#ROTATION_ANIMATION_JUMPCUT}</li>
  <li>{@link android.view.WindowManager.LayoutParams#ROTATION_ANIMATION_ROTATE}</li>
</ul>

<p class="note"><strong>Note:</strong> These animations are available only if you've set your activity to use "fullscreen" mode, which you can enable with themes such as {@link android.R.style#Theme_Holo_NoActionBar_Fullscreen Theme.Holo.NoActionBar.Fullscreen}.</p>

<p>For example, here's how you can enable the "crossfade" animation:</p>
<pre>
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);

    WindowManager.LayoutParams params = getWindow().getAttributes();
    params.rotationAnimation = WindowManager.LayoutParams.ROTATION_ANIMATION_CROSSFADE;
    getWindow().setAttributes(params);
    ...
}
</pre>


<h2 id="UserInput">User Input</h2>

<h3 id="Sensors">New sensor types</h3>
<p>The new {@link android.hardware.Sensor#TYPE_GAME_ROTATION_VECTOR} sensor allows you to detect the device's rotations without worrying about magnetic interference. Unlike the {@link android.hardware.Sensor#TYPE_ROTATION_VECTOR} sensor, the {@link android.hardware.Sensor#TYPE_GAME_ROTATION_VECTOR} is not based on magnetic north.</p>

<p>The new {@link android.hardware.Sensor#TYPE_GYROSCOPE_UNCALIBRATED} and {@link
android.hardware.Sensor#TYPE_MAGNETIC_FIELD_UNCALIBRATED} sensors provide raw sensor data without
consideration for bias estimations. That is, the existing {@link
android.hardware.Sensor#TYPE_GYROSCOPE} and {@link android.hardware.Sensor#TYPE_MAGNETIC_FIELD}
sensors provide sensor data that takes into account estimated bias from gyro-drift and hard iron
in the device, respectively, whereas the new "uncalibrated" versions of these sensors instead provide
the raw sensor data and offer the estimated bias values separately. These sensors allow you to
provide your own custom calibration for the sensor data by enhancing the estimated bias with
external data.</p>
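<p>For example, here is a sketch of reading the uncalibrated gyroscope, which reports the raw
rotation rates in {@code values[0..2]} and the estimated drift around each axis in
{@code values[3..5]}:</p>

```java
SensorManager sm = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
Sensor gyro = sm.getDefaultSensor(Sensor.TYPE_GYROSCOPE_UNCALIBRATED);
sm.registerListener(new SensorEventListener() {
    @Override
    public void onSensorChanged(SensorEvent event) {
        // values[0..2]: angular speed (rad/s) without drift compensation
        // values[3..5]: estimated drift around each axis
        float calibratedX = event.values[0] - event.values[3];
        // ... apply your own calibration as needed
    }
    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}, gyro, SensorManager.SENSOR_DELAY_GAME);
```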



<h2 id="NotificationListener">Notification Listener</h2>

<p>Android 4.3 adds a new service class, {@link android.service.notification.NotificationListenerService}, that allows your app to receive information about new notifications as they are posted by the system.</p>

<p>If your app currently uses the accessibility service APIs to access system notifications, you should update your app to use these APIs instead.</p>
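<p>A minimal listener might look like the following sketch; the service must also be declared
in your manifest with the {@link android.Manifest.permission#BIND_NOTIFICATION_LISTENER_SERVICE}
permission, and the user must enable it in the system security settings:</p>

```java
import android.service.notification.NotificationListenerService;
import android.service.notification.StatusBarNotification;
import android.util.Log;

public class MyNotificationListener extends NotificationListenerService {
    @Override
    public void onNotificationPosted(StatusBarNotification sbn) {
        Log.i("MyNotificationListener", "Posted: " + sbn.getPackageName());
    }

    @Override
    public void onNotificationRemoved(StatusBarNotification sbn) {
        Log.i("MyNotificationListener", "Removed: " + sbn.getPackageName());
    }
}
```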




<h2 id="Contacts">Contacts Provider</h2>

<h3 id="Contactables">Query for "contactables"</h3>

<p>The new Contacts Provider query, {@link android.provider.ContactsContract.CommonDataKinds.Contactables#CONTENT_URI Contactables.CONTENT_URI}, provides an efficient way to get one {@link android.database.Cursor} that contains all email addresses and phone numbers belonging to all contacts matching the specified query.</p>


<h3 id="ContactsDelta">Query for contacts deltas</h3>

<p>New APIs have been added to Contacts Provider that allow you to efficiently query recent changes to the contacts data. Previously, your app could be notified when something in the contacts data changed, but you would not know exactly what changed and would need to retrieve all contacts, then iterate through them to discover the change.</p>

<p>To track changes to inserts and updates, you can now include the {@link android.provider.ContactsContract.ContactsColumns#CONTACT_LAST_UPDATED_TIMESTAMP} parameter with your selection to query only the contacts that have changed since the last time you queried the provider.</p>
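<p>For example, a query for contacts changed since a timestamp you saved from your previous
sync (a hypothetical {@code lastSyncTimestamp} value) might look like this:</p>

```java
String selection =
        ContactsContract.Contacts.CONTACT_LAST_UPDATED_TIMESTAMP + " > ?";
Cursor cursor = getContentResolver().query(
        ContactsContract.Contacts.CONTENT_URI,
        null,  // projection: all columns
        selection,
        new String[] { String.valueOf(lastSyncTimestamp) },
        null); // default sort order
```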

<p>To track which contacts have been deleted, the new table {@link android.provider.ContactsContract.DeletedContacts} provides a log of contacts that have been deleted (but each contact deleted is held in this table for a limited time). Similar to {@link android.provider.ContactsContract.ContactsColumns#CONTACT_LAST_UPDATED_TIMESTAMP}, you can use the new selection parameter, {@link android.provider.ContactsContract.DeletedContacts#CONTACT_DELETED_TIMESTAMP}, to check which contacts have been deleted since the last time you queried the provider. The table also contains the constant {@link android.provider.ContactsContract.DeletedContacts#DAYS_KEPT_MILLISECONDS} containing the number of days (in milliseconds) that the log will be kept.</p>

<p>Additionally, the Contacts Provider now broadcasts the {@link
android.provider.ContactsContract.Intents#CONTACTS_DATABASE_CREATED} action when the user
clears the contacts storage through the system settings menu, effectively recreating the
Contacts Provider database. It’s intended to signal apps that they need to drop all the contact
information they’ve stored and reload it with a new query.</p>

<p>For sample code using these APIs to check for changes to the contacts, look in the ApiDemos
sample available in the <a href="{@docRoot}tools/samples/index.html">SDK Samples</a> download.</p>


<h2 id="Localization">Localization</h2>

<h3 id="BiDi">Improved support for bi-directional text</h3>

<p>Previous versions of Android support right-to-left (RTL) languages and layout,
but sometimes don't properly handle mixed-direction text. So Android 4.3 adds the {@link
android.text.BidiFormatter} APIs that help you properly format text with opposite-direction
content without garbling any parts of it.</p>

<p>For example, when you want to create a sentence with a string variable, such as "Did you mean
15 Bay Street, Laurel, CA?", you normally pass a localized string resource and the variable to
{@link java.lang.String#format String.format()}:</p>
<pre>
Resources res = getResources();
String suggestion = String.format(res.getString(R.string.did_you_mean), address);
</pre>

<p>However, if the locale is Hebrew, then the formatted string comes out like this:</p>

<p dir="rtl">האם התכוונת ל 15 Bay Street, Laurel, CA?</p>

<p>That's wrong because the "15" should be to the left of "Bay Street." The solution is to use {@link
android.text.BidiFormatter} and its {@link android.text.BidiFormatter#unicodeWrap(String)
unicodeWrap()} method. For example, the code above becomes:</p>
<pre>
Resources res = getResources();
BidiFormatter bidiFormatter = BidiFormatter.getInstance();
String suggestion = String.format(res.getString(R.string.did_you_mean),
        bidiFormatter.unicodeWrap(address));
</pre>

<p>
By default, {@link android.text.BidiFormatter#unicodeWrap(String) unicodeWrap()} uses the
first-strong directionality estimation heuristic, which can get things wrong if the first
signal for text direction does not represent the appropriate direction for the content as a whole.
If necessary, you can specify a different heuristic by passing one of the {@link
android.text.TextDirectionHeuristic} constants from {@link android.text.TextDirectionHeuristics}
to {@link android.text.BidiFormatter#unicodeWrap(String,TextDirectionHeuristic) unicodeWrap()}.</p>

<p class="note"><strong>Note:</strong> These new APIs are also available for previous versions
of Android through the Android <a href="{@docRoot}tools/support-library/index.html">Support
Library</a>, with the {@link android.support.v4.text.BidiFormatter} class and related APIs.</p>



<h2 id="A11yService">Accessibility Services</h2>

<h3 id="A11yKeyEvents">Handle key events</h3>

<p>An {@link android.accessibilityservice.AccessibilityService} can now receive a callback for
key input events with the {@link android.accessibilityservice.AccessibilityService#onKeyEvent
onKeyEvent()} callback method. This allows your accessibility service to handle input for
key-based input devices such as a keyboard and translate those events to special actions that
previously may have been possible only with touch input or the device's directional pad.</p>


<h3 id="A11yText">Select text and copy/paste</h3>

<p>The {@link android.view.accessibility.AccessibilityNodeInfo} class now provides APIs that allow
an {@link android.accessibilityservice.AccessibilityService} to select, cut, copy, and paste
text in a node.</p>

<p>To specify the selection of text to cut or copy, your accessibility service can use the new
action, {@link android.view.accessibility.AccessibilityNodeInfo#ACTION_SET_SELECTION}, passing
with it the selection start and end position with {@link
android.view.accessibility.AccessibilityNodeInfo#ACTION_ARGUMENT_SELECTION_START_INT} and {@link
android.view.accessibility.AccessibilityNodeInfo#ACTION_ARGUMENT_SELECTION_END_INT}.
Alternatively, you can select text by manipulating the cursor position using the existing
action, {@link android.view.accessibility.AccessibilityNodeInfo#ACTION_NEXT_AT_MOVEMENT_GRANULARITY}
(previously only for moving the cursor position), and adding the argument {@link
android.view.accessibility.AccessibilityNodeInfo#ACTION_ARGUMENT_EXTEND_SELECTION_BOOLEAN}.</p>

<p>You can then cut or copy with {@link android.view.accessibility.AccessibilityNodeInfo#ACTION_CUT},
{@link android.view.accessibility.AccessibilityNodeInfo#ACTION_COPY}, then later paste with
{@link android.view.accessibility.AccessibilityNodeInfo#ACTION_PASTE}.</p>
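<p>For example, the following sketch selects the first five characters in a node and copies
them, assuming {@code node} is an {@link android.view.accessibility.AccessibilityNodeInfo}
your service has retrieved:</p>

```java
Bundle args = new Bundle();
args.putInt(AccessibilityNodeInfo.ACTION_ARGUMENT_SELECTION_START_INT, 0);
args.putInt(AccessibilityNodeInfo.ACTION_ARGUMENT_SELECTION_END_INT, 5);
node.performAction(AccessibilityNodeInfo.ACTION_SET_SELECTION, args);
// Copy the selected text to the clipboard
node.performAction(AccessibilityNodeInfo.ACTION_COPY);
```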


<p class="note"><strong>Note:</strong> These new APIs are also available for previous versions
of Android through the Android <a href="{@docRoot}tools/support-library/index.html">Support
Library</a>, with the {@link android.support.v4.view.accessibility.AccessibilityNodeInfoCompat}
class.</p>



<h3 id="A11yFeatures">Declare accessibility features</h3>

<p>Beginning with Android 4.3, an accessibility service must declare accessibility capabilities
in its metadata file in order to use certain accessibility features. If the capability is not
requested in the metadata file, then the feature will be a no-op. To declare your service's
accessibility capabilities, you must use XML attributes that correspond to the various
"capability" constants in the {@link android.accessibilityservice.AccessibilityServiceInfo}
class.</p>

<p>For example, if a service does not request the {@link android.R.styleable#AccessibilityService_canRequestFilterKeyEvents flagRequestFilterKeyEvents} capability,
then it will not receive key events.</p>


<h2 id="Testing">Testing and Debugging</h2>

<h3 id="UiAutomation">Automated UI testing</h3>

<p>The new {@link android.app.UiAutomation} class provides APIs that allow you to simulate user
actions for test automation. By using the platform's {@link
android.accessibilityservice.AccessibilityService} APIs, the {@link android.app.UiAutomation}
APIs allow you to inspect the screen content and inject arbitrary keyboard and touch events.</p>

<p>To get an instance of {@link android.app.UiAutomation}, call {@link
android.app.Instrumentation#getUiAutomation Instrumentation.getUiAutomation()}. In order
for this to work, you must supply the {@code -w} option with the {@code instrument} command
when running your {@link android.test.InstrumentationTestCase} from <a
href="{@docRoot}tools/help/adb.html#am">{@code adb shell}</a>.</p>

<p>With the {@link android.app.UiAutomation} instance, you can execute arbitrary events to test
your app by calling {@link android.app.UiAutomation#executeAndWaitForEvent
executeAndWaitForEvent()}, passing it a {@link java.lang.Runnable} to perform, a timeout
period for the operation, and an implementation of the {@link
android.app.UiAutomation.AccessibilityEventFilter} interface. It's within your {@link
android.app.UiAutomation.AccessibilityEventFilter} implementation that you'll receive a call
that allows you to filter the events that you're interested in and determine the success or
failure of a given test case.</p>
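<p>For example, a test might perform a click and wait for the corresponding event roughly as
in the following sketch ({@code executeAndWaitForEvent()} throws
{@link java.util.concurrent.TimeoutException} if no matching event arrives in time):</p>

```java
UiAutomation automation = getInstrumentation().getUiAutomation();
AccessibilityEvent result = automation.executeAndWaitForEvent(
        new Runnable() {
            @Override
            public void run() {
                // Perform the action under test, e.g. click a button
            }
        },
        new UiAutomation.AccessibilityEventFilter() {
            @Override
            public boolean accept(AccessibilityEvent event) {
                // Succeed when the click event is observed
                return event.getEventType()
                        == AccessibilityEvent.TYPE_VIEW_CLICKED;
            }
        },
        5000 /* timeout in milliseconds */);
```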

<p>To observe all the events during a test, create an implementation of {@link
android.app.UiAutomation.OnAccessibilityEventListener} and pass it to {@link
android.app.UiAutomation#setOnAccessibilityEventListener setOnAccessibilityEventListener()}.
Your listener interface then receives a call to {@link
android.app.UiAutomation.OnAccessibilityEventListener#onAccessibilityEvent onAccessibilityEvent()}
each time an event occurs, receiving an {@link android.view.accessibility.AccessibilityEvent} object
that describes the event.</p>

<p>There is a variety of other operations that the {@link android.app.UiAutomation} APIs expose
at a very low level to encourage the development of UI test tools such as <a href="{@docRoot}tools/help/uiautomator/index.html">uiautomator</a>. For instance,
{@link android.app.UiAutomation} can also:</p>
<ul>
  <li>Inject input events</li>
  <li>Change the orientation of the screen</li>
  <li>Take screenshots</li>
</ul>

<p>And most importantly for UI test tools, the {@link android.app.UiAutomation} APIs work
across application boundaries, unlike those in {@link android.app.Instrumentation}.</p>


<h3 id="Systrace">Systrace events for apps</h3>

<p>Android 4.3 adds the {@link android.os.Trace} class with two static methods,
{@link android.os.Trace#beginSection beginSection()} and {@link android.os.Trace#endSection()},
which allow you to define blocks of code to include with the systrace report. By creating
sections of traceable code in your app, the systrace logs provide you a much more detailed
analysis of where slowdown occurs within your app.</p>
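<p>For example, you can bracket a method's work so it appears as a named section in the trace:</p>

```java
public void processFrame() {
    Trace.beginSection("processFrame");
    try {
        // Work attributed to the "processFrame" section in the systrace output
    } finally {
        // Always end the most recently begun section, even on an exception
        Trace.endSection();
    }
}
```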

<p>For information about using the Systrace tool, read <a href="{@docRoot}tools/debugging/systrace.html">Analyzing Display and Performance with Systrace</a>.</p>


<h2 id="Security">Security</h2>

<h3 id="KeyStore">Android key store for app-private keys</h3>

<p>Android now offers a custom Java Security Provider in the {@link java.security.KeyStore}
facility, called Android Key Store, which allows you to generate and save private keys that
may be seen and used by only your app. To load the Android Key Store, pass
{@code "AndroidKeyStore"} to {@link java.security.KeyStore#getInstance(String)
KeyStore.getInstance()}.</p>

<p>To manage your app's private credentials in the Android Key Store, generate a new key with
{@link java.security.KeyPairGenerator} using {@link android.security.KeyPairGeneratorSpec}. First
get an instance of {@link java.security.KeyPairGenerator} by calling {@link
java.security.KeyPairGenerator#getInstance getInstance()}. Then call
{@link java.security.KeyPairGenerator#initialize initialize()}, passing it an instance of
{@link android.security.KeyPairGeneratorSpec}, which you can get using
{@link android.security.KeyPairGeneratorSpec.Builder KeyPairGeneratorSpec.Builder}.
Finally, get your {@link java.security.KeyPair} by calling {@link
java.security.KeyPairGenerator#generateKeyPair generateKeyPair()}.</p>
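<p>Putting that together, a sketch of generating an RSA key pair in the Android Key Store might
look like the following (the alias {@code "myKey"} and the one-year validity window are arbitrary
choices; in API level 18 the subject, serial number, and validity dates are required by the
builder):</p>

```java
// Checked exception handling omitted for brevity
Calendar start = Calendar.getInstance();
Calendar end = Calendar.getInstance();
end.add(Calendar.YEAR, 1);

KeyPairGeneratorSpec spec = new KeyPairGeneratorSpec.Builder(context)
        .setAlias("myKey")
        .setSubject(new X500Principal("CN=myKey"))
        .setSerialNumber(BigInteger.ONE)
        .setStartDate(start.getTime())
        .setEndDate(end.getTime())
        .build();

KeyPairGenerator generator =
        KeyPairGenerator.getInstance("RSA", "AndroidKeyStore");
generator.initialize(spec);
// The private key is generated inside, and never leaves, the key store
KeyPair keyPair = generator.generateKeyPair();
```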


<h3 id="HardwareKeyChain">Hardware credential storage</h3>

<p>Android also now supports hardware-backed storage for your {@link android.security.KeyChain}
credentials, providing more security by making the keys unavailable for extraction. That is, once
keys are in a hardware-backed key store (Secure Element, TPM, or TrustZone), they can be used for
cryptographic operations but the private key material cannot be exported. Even the OS kernel
cannot access this key material. While not all Android-powered devices support storage on
hardware, you can check at runtime if hardware-backed storage is available by calling
{@link android.security.KeyChain#isBoundKeyAlgorithm KeyChain.isBoundKeyAlgorithm()}.</p>
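<p>For example:</p>

```java
// True only if RSA keys on this device are stored in hardware-backed storage
boolean hardwareBacked = KeyChain.isBoundKeyAlgorithm("RSA");
```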



<h2 id="Manifest">Manifest Declarations</h2>

<h3 id="ManifestFeatures">Declarable required features</h3>

<p>The following values are now supported in the <a
href="{@docRoot}guide/topics/manifest/uses-feature-element.html">{@code &lt;uses-feature>}</a>
element so you can ensure that your app is installed only on devices that provide the features
your app needs.</p>

<dl>
<dt>{@link android.content.pm.PackageManager#FEATURE_APP_WIDGETS}</dt>
<dd>Declares that your app provides an app widget and should be installed only on devices that
include a Home screen or similar location where users can embed app widgets.
Example:
<pre>
&lt;uses-feature android:name="android.software.app_widgets" android:required="true" />
</pre>
</dd>

<dt>{@link android.content.pm.PackageManager#FEATURE_HOME_SCREEN}</dt>
<dd>Declares that your app behaves as a Home screen replacement and should be installed only on
devices that support third-party Home screen apps.
Example:
<pre>
&lt;uses-feature android:name="android.software.home_screen" android:required="true" />
</pre>
</dd>

<dt>{@link android.content.pm.PackageManager#FEATURE_INPUT_METHODS}</dt>
<dd>Declares that your app provides a custom input method (a keyboard built with {@link
android.inputmethodservice.InputMethodService}) and should be installed only on devices that
support third-party input methods.
Example:
<pre>
&lt;uses-feature android:name="android.software.input_methods" android:required="true" />
</pre>
</dd>

<dt>{@link android.content.pm.PackageManager#FEATURE_BLUETOOTH_LE}</dt>
<dd>Declares that your app uses Bluetooth Low Energy APIs and should be installed only on devices
that are capable of communicating with other devices via Bluetooth Low Energy.
Example:
<pre>
&lt;uses-feature android:name="android.hardware.bluetooth_le" android:required="true" />
</pre>
</dd>
</dl>


<h3 id="ManifestPermissions">User permissions</h3>
<p>The following values are now supported in the <a
href="{@docRoot}guide/topics/manifest/uses-permission-element.html">{@code &lt;uses-permission>}</a>
element to declare the
permissions your app requires in order to access certain APIs.</p>

<dl>
<dt>{@link android.Manifest.permission#BIND_NOTIFICATION_LISTENER_SERVICE}
</dt>
<dd>Required to use the new {@link android.service.notification.NotificationListenerService} APIs.
</dd>

<dt>{@link android.Manifest.permission#SEND_RESPOND_VIA_MESSAGE}</dt>
<dd>Required to receive the {@link android.telephony.TelephonyManager#ACTION_RESPOND_VIA_MESSAGE}
intent.</dd>
</dl>




<p class="note">For a detailed view of all API changes in Android 4.3, see the
<a href="{@docRoot}sdk/api_diff/18/changes.html">API Differences Report</a>.</p>