page.title=Android 3.1 APIs excludeFromSuggestions=true sdk.platform.version=3.1 sdk.platform.apiLevel=12 @jd:body
API Level: {@sdkPlatformApiLevel}
For developers, the Android {@sdkPlatformVersion} platform ({@link android.os.Build.VERSION_CODES#HONEYCOMB_MR1}) is available as a downloadable component for the Android SDK. The downloadable platform includes an Android library and system image, as well as a set of emulator skins and more. It includes no external libraries. To get started developing or testing against Android {@sdkPlatformVersion}, use the Android SDK Manager to download the platform into your SDK.
The sections below provide a technical overview of what's new for developers in Android 3.1, including new features and changes in the framework API since the previous version.
Android 3.1 introduces powerful new APIs for integrating connected peripherals with applications running on the platform. The APIs are based on a USB (Universal Serial Bus) stack and services that are built into the platform, including support for both USB host and device interactions. Using the APIs, developers can create applications that are able to discover, communicate with, and manage a variety of device types connected over USB.
The stack and APIs distinguish two basic types of USB hardware, based on whether the Android-powered device is acting as host or the external hardware is acting as host:
For both types — USB devices and USB accessories — the platform's USB APIs support discovery by intent broadcast when attached or detached, as well as standard interfaces, endpoints, and transfer modes (control, bulk, and interrupt).
The USB APIs are available in the package {@link android.hardware.usb}. The central class is {@link android.hardware.usb.UsbManager}, which provides helper methods for identifying and communicating with both USB devices and USB accessories. Applications can acquire an instance of {@link android.hardware.usb.UsbManager} and then query for the list of attached devices or accessories and then communicate with or manage them. {@link android.hardware.usb.UsbManager} also declares intent actions that the system broadcasts, to announce when a USB device or accessory is attached or detached.
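As a rough sketch of the discovery flow (the classes and constants are the real API; the activity and log tag are illustrative), an application might enumerate currently attached USB devices like this:

```java
import java.util.HashMap;

import android.app.Activity;
import android.content.Context;
import android.hardware.usb.UsbDevice;
import android.hardware.usb.UsbManager;
import android.os.Bundle;
import android.util.Log;

public class UsbDeviceListActivity extends Activity {
    private static final String TAG = "UsbDeviceList"; // illustrative tag

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        // Acquire the platform's USB service.
        UsbManager manager = (UsbManager) getSystemService(Context.USB_SERVICE);

        // Query all currently attached USB devices, keyed by device name.
        HashMap<String, UsbDevice> devices = manager.getDeviceList();
        for (UsbDevice device : devices.values()) {
            Log.d(TAG, "Found device: vendorId=" + device.getVendorId()
                    + " productId=" + device.getProductId());
        }
    }
}
```

An application can also register a receiver for {@link android.hardware.usb.UsbManager#ACTION_USB_DEVICE_ATTACHED} to be notified when new devices arrive, rather than polling the device list.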
Other classes include:
Note that although the USB stack is built into the platform, actual support for USB host and open accessory modes on specific devices is determined by their manufacturers. In particular, host mode relies on appropriate USB controller hardware in the Android-powered device.
Additionally, developers can request filtering on Google Play, such that their applications are not available to users whose devices do not provide the appropriate USB support. To request filtering, add one or both of the elements below to the application manifest, as appropriate:
<uses-feature
  android:name="android.hardware.usb.host"
  android:required="true" />
<uses-feature
  android:name="android.hardware.usb.accessory"
  android:required="true" />
For complete information about how to develop applications that interact with USB accessories, please see the developer documentation.
To look at sample applications that use the USB host API, see ADB Test and Missile Launcher.
Android 3.1 exposes a new MTP API that lets applications interact directly with connected cameras and other PTP devices. The new API makes it easy for an application to receive notifications when devices are attached and removed, manage files and storage on those devices, and transfer files and metadata to and from them. The MTP API implements the PTP (Picture Transfer Protocol) subset of the MTP (Media Transfer Protocol) specification.
The MTP API is available in the {@link android.mtp} package and provides these classes:
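As a hedged sketch of a typical session (the MTP classes and methods are the real API; the helper class and the assumption that a device has already been discovered via the USB host APIs are illustrative), an application might open a connected camera and read its device information:

```java
import android.hardware.usb.UsbDevice;
import android.hardware.usb.UsbDeviceConnection;
import android.hardware.usb.UsbManager;
import android.mtp.MtpDevice;
import android.mtp.MtpDeviceInfo;

// Sketch: open an MTP session over a UsbDevice already discovered
// through the USB host APIs described above.
public final class MtpExample {
    public static void dumpDeviceInfo(UsbManager manager, UsbDevice usbDevice) {
        UsbDeviceConnection connection = manager.openDevice(usbDevice);
        MtpDevice mtpDevice = new MtpDevice(usbDevice);
        if (connection != null && mtpDevice.open(connection)) {
            MtpDeviceInfo info = mtpDevice.getDeviceInfo();
            System.out.println("Manufacturer: " + info.getManufacturer());
            System.out.println("Model: " + info.getModel());
            mtpDevice.close(); // release the session when done
        }
    }
}
```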
Android 3.1 extends the input subsystem to support new input devices and new types of motion events, across all views and windows. Developers can build on these capabilities to let users interact with their applications using mice, trackballs, joysticks, gamepads, and other devices, in addition to keyboards and touchscreens.
For handling mouse, scrollwheel, and trackball input, the platform supports two new motion event actions: {@link android.view.MotionEvent#ACTION_SCROLL} and {@link android.view.MotionEvent#ACTION_HOVER_MOVE}. The position of the mouse pointer while no buttons are pressed is reported as an ACTION_HOVER_MOVE event. Hover enter and exit notifications are not yet supported.
To support joysticks and gamepads, the {@link android.view.InputDevice} class includes these new input device sources:
To describe motion events from these new sources, as well as those from mice and trackballs, the platform now defines axis codes on {@link android.view.MotionEvent}, similar to how it defines key codes on {@link android.view.KeyEvent}. New axis codes for joysticks and game controllers include {@link android.view.MotionEvent#AXIS_HAT_X}, {@link android.view.MotionEvent#AXIS_HAT_Y}, {@link android.view.MotionEvent#AXIS_RTRIGGER}, {@link android.view.MotionEvent#AXIS_ORIENTATION}, {@link android.view.MotionEvent#AXIS_THROTTLE}, and many others. Existing {@link android.view.MotionEvent} axes are represented by {@link android.view.MotionEvent#AXIS_X}, {@link android.view.MotionEvent#AXIS_Y}, {@link android.view.MotionEvent#AXIS_PRESSURE}, {@link android.view.MotionEvent#AXIS_SIZE}, {@link android.view.MotionEvent#AXIS_TOUCH_MAJOR}, {@link android.view.MotionEvent#AXIS_TOUCH_MINOR}, {@link android.view.MotionEvent#AXIS_TOOL_MAJOR}, {@link android.view.MotionEvent#AXIS_TOOL_MINOR}, and {@link android.view.MotionEvent#AXIS_ORIENTATION}.
Additionally, {@link android.view.MotionEvent} defines a number of generic axis codes that are used when the framework does not know how to map a particular axis. Specific devices can use the generic axis codes to pass custom motion data to applications. For a full list of axes and their intended interpretations, see the {@link android.view.MotionEvent} class documentation.
The platform provides motion events to applications in batches, so a single event may contain a current position and multiple so-called historical movements. Applications should use {@link android.view.MotionEvent#getHistorySize()} to get the number of historical samples, then retrieve and process all historical samples in order using {@link android.view.MotionEvent#getHistoricalAxisValue(int, int, int) getHistoricalAxisValue()}. After that, applications should process the current sample using {@link android.view.MotionEvent#getAxisValue(int) getAxisValue()}.
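The pattern above can be sketched as follows, for a single axis (the helper class is illustrative; the MotionEvent methods are the real API):

```java
import android.view.MotionEvent;

// Process a batched motion event for one axis: historical samples first
// (oldest to newest), then the current sample.
public final class AxisBatchProcessor {
    public static void process(MotionEvent event, int axis) {
        int historySize = event.getHistorySize();
        for (int i = 0; i < historySize; i++) {
            handleSample(event.getHistoricalEventTime(i),
                    event.getHistoricalAxisValue(axis, i));
        }
        // Finally, process the current (most recent) sample.
        handleSample(event.getEventTime(), event.getAxisValue(axis));
    }

    private static void handleSample(long timeMillis, float value) {
        // Application-specific handling would go here.
    }
}
```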
Some axes can be retrieved using special accessor methods. For example, instead of calling {@link android.view.MotionEvent#getAxisValue(int) getAxisValue()}, applications can call {@link android.view.MotionEvent#getX(int) getX()}. Axes that have built-in accessors include {@link android.view.MotionEvent#AXIS_X}, {@link android.view.MotionEvent#AXIS_Y}, {@link android.view.MotionEvent#AXIS_PRESSURE}, {@link android.view.MotionEvent#AXIS_SIZE}, {@link android.view.MotionEvent#AXIS_TOUCH_MAJOR}, {@link android.view.MotionEvent#AXIS_TOUCH_MINOR}, {@link android.view.MotionEvent#AXIS_TOOL_MAJOR}, {@link android.view.MotionEvent#AXIS_TOOL_MINOR}, and {@link android.view.MotionEvent#AXIS_ORIENTATION}.
Each input device has a unique, system-assigned ID and may also provide multiple sources. When a device provides multiple sources, more than one source can provide axis data using the same axis. For example, a touch event coming from the touch source uses the X axis for screen position data, while a joystick event coming from the joystick source will use the X axis for the stick position instead. For this reason, it's important for applications to interpret axis values according to the source from which they originate. When handling a motion event, applications should use methods on the {@link android.view.InputDevice} class to determine the axes supported by a device or source. Specifically, applications can use {@link android.view.InputDevice#getMotionRanges() getMotionRanges()} to query for all axes of a device or all axes of a given source of the device. In both cases, the range information for axes returned in the {@link android.view.InputDevice.MotionRange} object specifies the source for each axis value.
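To see which axes a device actually reports, and from which source each axis originates, an application might walk the device's motion ranges (a sketch using the real API; the class name and log tag are illustrative):

```java
import android.util.Log;
import android.view.InputDevice;
import android.view.MotionEvent;

public final class DeviceAxisLister {
    private static final String TAG = "DeviceAxes"; // illustrative tag

    public static void listAxes(int deviceId) {
        InputDevice device = InputDevice.getDevice(deviceId);
        if (device == null) return;
        // Each MotionRange reports its axis, source, and value range.
        for (InputDevice.MotionRange range : device.getMotionRanges()) {
            Log.d(TAG, "axis=" + MotionEvent.axisToString(range.getAxis())
                    + " source=" + range.getSource()
                    + " min=" + range.getMin() + " max=" + range.getMax());
        }
    }
}
```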
Finally, since the motion events from joysticks, gamepads, mice, and trackballs are not touch events, the platform adds a new callback method for passing them to a {@link android.view.View} as "generic" motion events. Specifically, it reports the non-touch motion events to {@link android.view.View}s through a call to {@link android.view.View#onGenericMotionEvent(android.view.MotionEvent) onGenericMotionEvent()}, rather than to {@link android.view.View#onTouchEvent(android.view.MotionEvent) onTouchEvent()}.
The platform dispatches generic motion events differently, depending on the event source class. {@link android.view.InputDevice#SOURCE_CLASS_POINTER} events go to the {@link android.view.View} under the pointer, similar to how touch events work. All others go to the currently focused {@link android.view.View}. For example, this means a {@link android.view.View} must take focus in order to receive joystick events. If needed, applications can handle these events at the level of Activity or Dialog by implementing {@link android.view.View#onGenericMotionEvent(android.view.MotionEvent) onGenericMotionEvent()} there instead.
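A minimal custom view that reads joystick position from generic motion events might look like this (a sketch; note the view takes focus so it can receive the events):

```java
import android.content.Context;
import android.view.InputDevice;
import android.view.MotionEvent;
import android.view.View;

// A View subclass that reads joystick position from generic motion events.
public class JoystickView extends View {
    private float stickX;
    private float stickY;

    public JoystickView(Context context) {
        super(context);
        setFocusable(true); // joystick events go to the focused view
    }

    @Override
    public boolean onGenericMotionEvent(MotionEvent event) {
        if ((event.getSource() & InputDevice.SOURCE_CLASS_JOYSTICK) != 0
                && event.getAction() == MotionEvent.ACTION_MOVE) {
            stickX = event.getAxisValue(MotionEvent.AXIS_X);
            stickY = event.getAxisValue(MotionEvent.AXIS_Y);
            invalidate(); // redraw with the new stick position
            return true;
        }
        return super.onGenericMotionEvent(event);
    }
}
```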
To look at a sample application that uses joystick motion events, see GameControllerInput and GameView.
Android 3.1 exposes an API to its built-in RTP (Real-time Transport Protocol) stack, which applications can use to manage on-demand or interactive data streaming. In particular, apps that provide VOIP, push-to-talk, conferencing, and audio streaming can use the API to initiate sessions and transmit or receive data streams over any available network.
The RTP API is available in the {@link android.net.rtp} package. Classes include:
To support audio conferencing and similar usages, an application instantiates two classes as endpoints for the stream:
The simplest usage involves a single remote endpoint and local endpoint. For more complex usages, please refer to the limitations described for {@link android.net.rtp.AudioGroup}.
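The simple one-remote-endpoint case might be set up as follows (a sketch; the remote address and port are placeholders that a real application would learn through its own signaling, such as SIP):

```java
import java.net.InetAddress;

import android.net.rtp.AudioCodec;
import android.net.rtp.AudioGroup;
import android.net.rtp.AudioStream;
import android.net.rtp.RtpStream;

// Sketch: set up a single two-way audio stream between two endpoints.
public final class SimpleRtpCall {
    public static AudioGroup startCall(InetAddress localAddress,
            InetAddress remoteAddress, int remotePort) throws Exception {
        AudioStream stream = new AudioStream(localAddress); // local endpoint
        stream.setCodec(AudioCodec.PCMU);
        stream.setMode(RtpStream.MODE_NORMAL);
        stream.associate(remoteAddress, remotePort);        // remote endpoint

        AudioGroup group = new AudioGroup();
        group.setMode(AudioGroup.MODE_NORMAL);
        stream.join(group); // begins sending and receiving audio
        return group;
    }
}
```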
To use the RTP API, applications must request permission from the user by declaring <uses-permission android:name="android.permission.INTERNET" /> in their manifest files. To acquire the device microphone, the <uses-permission android:name="android.permission.RECORD_AUDIO" /> permission is also required.
Starting in Android 3.1, developers can make their homescreen widgets resizeable — horizontally, vertically, or on both axes. Users touch-hold a widget to show its resize handles, then drag the horizontal and/or vertical handles to change the size on the layout grid.
Developers can make any Home screen widget resizeable by defining a resizeMode attribute in the widget's {@link android.appwidget.AppWidgetProviderInfo} metadata. Values for the resizeMode attribute include "horizontal", "vertical", and "none". To declare a widget as resizeable horizontally and vertically, supply the value "horizontal|vertical". Here's an example:
<appwidget-provider xmlns:android="http://schemas.android.com/apk/res/android"
    android:minWidth="294dp"
    android:minHeight="72dp"
    android:updatePeriodMillis="86400000"
    android:previewImage="@drawable/preview"
    android:initialLayout="@layout/example_appwidget"
    android:configure="com.example.android.ExampleAppWidgetConfigure"
    android:resizeMode="horizontal|vertical" >
</appwidget-provider>
For more information about Home screen widgets, see the App Widgets documentation.
Using the {@link android.view.ViewPropertyAnimator} is straightforward. To animate properties for a {@link android.view.View}, call {@link android.view.View#animate()} to construct a {@link android.view.ViewPropertyAnimator} object for that {@link android.view.View}. Use the methods on the {@link android.view.ViewPropertyAnimator} to specify what property to animate and how to animate it. For example, to fade the {@link android.view.View} to transparent, call alpha(0). The {@link android.view.ViewPropertyAnimator} object handles the details of configuring the underlying {@link android.animation.Animator} class and starting it, then rendering the animation.
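For example, a fade-out combined with a horizontal slide can be expressed as a single chained call (a sketch; the helper class is illustrative, the animator methods are the real API):

```java
import android.view.View;

// Fade a view out and slide it 100 pixels to the right in one chained call.
public final class FadeHelper {
    public static void fadeAndSlide(View view) {
        view.animate()          // returns the ViewPropertyAnimator for the view
            .alpha(0f)          // animate to fully transparent
            .xBy(100f)          // and shift right by 100 pixels
            .setDuration(300);  // over 300 milliseconds
    }
}
```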
To create a high-performance lock, pass {@link android.net.wifi.WifiManager#WIFI_MODE_FULL_HIGH_PERF} as the lock mode in a call to {@link android.net.wifi.WifiManager#createWifiLock(int, java.lang.String) createWifiLock()}.
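A minimal sketch (the lock tag is illustrative; the app must also hold the WAKE_LOCK permission to use Wi-Fi locks):

```java
import android.content.Context;
import android.net.wifi.WifiManager;

// Acquire a high-performance Wi-Fi lock, e.g. for the duration of a transfer.
public final class HighPerfWifi {
    public static WifiManager.WifiLock acquire(Context context) {
        WifiManager wifiManager =
                (WifiManager) context.getSystemService(Context.WIFI_SERVICE);
        WifiManager.WifiLock lock = wifiManager.createWifiLock(
                WifiManager.WIFI_MODE_FULL_HIGH_PERF, "MyApp:transfer");
        lock.acquire();
        return lock; // caller must call lock.release() when done
    }
}
```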
Starting in Android 3.1, the system's package manager keeps track of applications that are in a stopped state and provides a means of controlling their launch from background processes and other applications.
Note that an application's stopped state is not the same as an Activity's stopped state. The system manages those two stopped states separately.
The platform defines two new intent flags, {@link android.content.Intent#FLAG_INCLUDE_STOPPED_PACKAGES} and {@link android.content.Intent#FLAG_EXCLUDE_STOPPED_PACKAGES}, that let a sender specify whether the Intent should be allowed to activate components in stopped applications.
When an intent specifies neither or both of these flags, the default behavior is to include filters of stopped applications in the list of potential targets.
Note that the system adds {@link android.content.Intent#FLAG_EXCLUDE_STOPPED_PACKAGES} to all broadcast intents. It does this to prevent broadcasts from background services from inadvertently or unnecessarily launching components of stopped applications. A background service or application can override this behavior by adding the {@link android.content.Intent#FLAG_INCLUDE_STOPPED_PACKAGES} flag to broadcast intents that should be allowed to activate stopped applications.
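Overriding the default might look like this (a sketch; the action string is a hypothetical app-defined broadcast):

```java
import android.content.Context;
import android.content.Intent;

public final class StoppedAppBroadcast {
    public static void send(Context context) {
        // Hypothetical app-defined broadcast action.
        Intent intent = new Intent("com.example.ACTION_SYNC_COMPLETE");
        // Allow the broadcast to reach applications in the stopped state,
        // overriding the system default for broadcasts.
        intent.addFlags(Intent.FLAG_INCLUDE_STOPPED_PACKAGES);
        context.sendBroadcast(intent);
    }
}
```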
Applications are in a stopped state when they are first installed but are not yet launched and when they are manually stopped by the user (in Manage Applications).
The platform adds improved notification of application first launch and upgrades through two new intent actions: {@link android.content.Intent#ACTION_PACKAGE_FIRST_LAUNCH} and {@link android.content.Intent#ACTION_MY_PACKAGE_REPLACED}.
The {@link android.content.Intent#ACTION_MY_PACKAGE_REPLACED} intent is sent directly to the application, but only if the application was upgraded while it was in started state (not in a stopped state).
The platform adds support in {@link android.webkit.CookieManager} for cookies that use the file: URI scheme. You can use {@link android.webkit.CookieManager#setAcceptFileSchemeCookies(boolean) setAcceptFileSchemeCookies()} to enable/disable support for file scheme cookies, before constructing an instance of WebView or CookieManager. In a CookieManager instance, you can check whether file scheme cookies are enabled by calling {@link android.webkit.CookieManager#allowFileSchemeCookies()}.
The Browser application adds the following features to support web applications:
Support for inline playback of video embedded in the <video> tag. Playback is hardware-accelerated where possible.
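The file: scheme cookie support in CookieManager might be enabled and queried like this (a sketch; the helper class is illustrative, the CookieManager methods are the real API):

```java
import android.webkit.CookieManager;

public final class FileCookieSetup {
    public static void enableFileSchemeCookies() {
        // This static call must happen before the application creates
        // any WebView or CookieManager instance.
        CookieManager.setAcceptFileSchemeCookies(true);
    }

    public static boolean fileSchemeCookiesEnabled() {
        // Query the current setting on the CookieManager instance.
        return CookieManager.getInstance().allowFileSchemeCookies();
    }
}
```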
The platform adds new hardware feature constants that developers can declare in their application manifests, to inform external entities such as Google Play of the application's requirement for new hardware capabilities supported in this version of the platform. Developers declare these and other feature constants in {@code <uses-feature>} manifest elements.
Google Play filters applications based on features declared in {@code <uses-feature>} manifest elements. For more information about declaring features in an application manifest, read Google Play Filters.
For a detailed view of all API changes in Android {@sdkPlatformVersion} (API Level {@sdkPlatformApiLevel}), see the API Differences Report.
The Android {@sdkPlatformVersion} platform delivers an updated version of the framework API. The Android {@sdkPlatformVersion} API is assigned an integer identifier — {@sdkPlatformApiLevel} — that is stored in the system itself. This identifier, called the "API Level", allows the system to correctly determine whether an application is compatible with the system, prior to installing the application.
To use APIs introduced in Android {@sdkPlatformVersion} in your application, you need to compile the application against the Android library that is provided in the Android {@sdkPlatformVersion} SDK platform. Depending on your needs, you might also need to add an android:minSdkVersion="{@sdkPlatformApiLevel}" attribute to the <uses-sdk> element in the application's manifest.
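For example, a manifest requiring this platform's API Level (12 for Android 3.1) might declare:

```xml
<uses-sdk android:minSdkVersion="12" />
```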
For more information, read What is API Level?