Lines Matching refs:link
140 <p>The contact APIs defined by the {@link android.provider.ContactsContract} provider have been
149 {@link android.provider.ContactsContract.Profile} table. Social apps that maintain a user identity
150 can contribute to the user's profile data by creating a new {@link
151 android.provider.ContactsContract.RawContacts} entry within the {@link
153 not belong in the traditional raw contacts table defined by the {@link
155 the table at {@link android.provider.ContactsContract.Profile#CONTENT_RAW_CONTACTS_URI}. Raw
158 <p>Adding a new raw contact for the profile requires the {@link
160 table, you must request the {@link android.Manifest.permission#READ_PROFILE} permission. However,
168 <p>The {@link android.provider.ContactsContract.Intents#INVITE_CONTACT} intent action allows an app
177 app responds to the {@link android.provider.ContactsContract.Intents#INVITE_CONTACT} intent by
196 photo in the usual {@link android.provider.ContactsContract.CommonDataKinds.Photo#PHOTO} column of a
203 <p>The new {@link android.provider.ContactsContract.DataUsageFeedback} APIs allow you to help track
225 defined by {@link android.provider.CalendarContract}. All the user’s calendar data is stored in a
226 number of tables defined by various subclasses of {@link android.provider.CalendarContract}:</p>
229 <li>The {@link android.provider.CalendarContract.Calendars} table holds the calendar-specific
233 <li>The {@link android.provider.CalendarContract.Events} table holds event-specific information.
237 use the event’s {@code _ID} to link them with the event.</li>
239 <li>The {@link android.provider.CalendarContract.Instances} table holds the start and end time for
244 <li>The {@link android.provider.CalendarContract.Attendees} table holds the event attendee or guest
248 <li>The {@link android.provider.CalendarContract.Reminders} table holds the alert/notification data.
255 <li>The {@link android.provider.CalendarContract.ExtendedProperties} table holds opaque data fields
261 the {@link android.Manifest.permission#READ_CALENDAR} permission (for read access) and
262 {@link android.Manifest.permission#WRITE_CALENDAR} (for write access).</p>
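<p>For illustration, here is a minimal sketch of querying the Calendars table; the projection and error handling are assumptions for this example:</p>
<pre>
// Requires READ_CALENDAR in the manifest.
Cursor cursor = getContentResolver().query(
        CalendarContract.Calendars.CONTENT_URI,
        new String[] { CalendarContract.Calendars._ID,
                       CalendarContract.Calendars.CALENDAR_DISPLAY_NAME },
        null, null, null);
if (cursor != null) {
    while (cursor.moveToNext()) {
        long id = cursor.getLong(0);        // calendar ID
        String name = cursor.getString(1);  // display name
    }
    cursor.close();
}
</pre>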
267 <p>If all you want to do is add an event to the user’s calendar, you can use an {@link
268 android.content.Intent#ACTION_INSERT} intent with the data defined by {@link
274 <li>{@link android.provider.CalendarContract.EventsColumns#TITLE Events.TITLE}: Name for the
276 <li>{@link
280 <li>{@link
283 <li>{@link android.provider.CalendarContract.EventsColumns#EVENT_LOCATION Events.EVENT_LOCATION}:
285 <li>{@link android.provider.CalendarContract.EventsColumns#DESCRIPTION Events.DESCRIPTION}: Event
287 <li>{@link android.content.Intent#EXTRA_EMAIL Intent.EXTRA_EMAIL}: Email addresses of those to
289 <li>{@link android.provider.CalendarContract.EventsColumns#RRULE Events.RRULE}: The recurrence
291 <li>{@link android.provider.CalendarContract.EventsColumns#ACCESS_LEVEL Events.ACCESS_LEVEL}:
293 <li>{@link android.provider.CalendarContract.EventsColumns#AVAILABILITY Events.AVAILABILITY}:
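<p>For illustration, a minimal sketch of such an intent (the {@code startMillis} and {@code endMillis} values and the extras chosen are assumptions):</p>
<pre>
Intent intent = new Intent(Intent.ACTION_INSERT)
        .setData(CalendarContract.Events.CONTENT_URI)
        .putExtra(CalendarContract.EXTRA_EVENT_BEGIN_TIME, startMillis)
        .putExtra(CalendarContract.EXTRA_EVENT_END_TIME, endMillis)
        .putExtra(CalendarContract.Events.TITLE, "Team meeting")
        .putExtra(CalendarContract.Events.EVENT_LOCATION, "Conference room 2")
        .putExtra(Intent.EXTRA_EMAIL, "alice@example.com,bob@example.com");
startActivity(intent);  // hands the event off to the user's calendar app
</pre>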
316 <p>The {@link android.provider.VoicemailContract} class defines the content provider for the
317 Voicemail Provider. The subclasses {@link android.provider.VoicemailContract.Voicemails} and {@link
345 <li>They must be bound to a {@link android.opengl.GLES20#GL_TEXTURE_2D} texture image</li>
349 <p>An {@link android.media.effect.Effect} object defines a single media effect that you can apply to
350 an image frame. The basic workflow to create an {@link android.media.effect.Effect} is:</p>
353 <li>Call {@link android.media.effect.EffectContext#createWithCurrentGlContext
355 <li>Use the returned {@link android.media.effect.EffectContext} to call {@link
357 of {@link android.media.effect.EffectFactory}.</li>
358 <li>Call {@link android.media.effect.EffectFactory#createEffect createEffect()}, passing it an
359 effect name from {@link android.media.effect.EffectFactory}, such as {@link
360 android.media.effect.EffectFactory#EFFECT_FISHEYE} or {@link
364 <p>You can adjust an effect’s parameters by calling {@link android.media.effect.Effect#setParameter
366 different parameters, which are documented with the effect name. For example, {@link
370 <p>To apply an effect on a texture, call {@link android.media.effect.Effect#apply apply()} on the
371 {@link
373 texture. The input texture must be bound to a {@link android.opengl.GLES20#GL_TEXTURE_2D} texture
374 image (usually done by calling the {@link android.opengl.GLES20#glTexImage2D glTexImage2D()}
376 texture image, it will be automatically bound by the effect as a {@link
380 <p>All effects listed in {@link android.media.effect.EffectFactory} are guaranteed to be supported.
383 {@link android.media.effect.EffectFactory#isEffectSupported isEffectSupported()}.</p>
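<p>A minimal sketch of that workflow, assuming it runs on a thread with a current GLES 2.0 context and that the input texture is already populated:</p>
<pre>
EffectContext effectContext = EffectContext.createWithCurrentGlContext();
EffectFactory factory = effectContext.getFactory();
Effect effect = factory.createEffect(EffectFactory.EFFECT_FISHEYE);
effect.setParameter("scale", 0.5f);  // parameter documented with the effect
// inputTexId must already be bound to a GL_TEXTURE_2D image of width x height.
effect.apply(inputTexId, width, height, outputTexId);
</pre>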
388 <p>The new {@link android.media.RemoteControlClient} allows media players to enable playback
393 <p>To enable remote control clients for your media player, instantiate a {@link
394 android.media.RemoteControlClient} with its constructor, passing it a {@link
395 android.app.PendingIntent} that broadcasts {@link
396 android.content.Intent#ACTION_MEDIA_BUTTON}. The intent must also declare the explicit {@link
397 android.content.BroadcastReceiver} component in your app that handles the {@link
400 <p>To declare which media control inputs your player can handle, you must call {@link
402 {@link android.media.RemoteControlClient}, passing a set of {@code FLAG_KEY_MEDIA_*} flags, such as
403 {@link android.media.RemoteControlClient#FLAG_KEY_MEDIA_PREVIOUS} and {@link
406 <p>You must then register your {@link android.media.RemoteControlClient} by passing it to {@link
408 Once registered, the broadcast receiver you declared when you instantiated the {@link
409 android.media.RemoteControlClient} will receive {@link android.content.Intent#ACTION_MEDIA_BUTTON}
410 events when a button is pressed from a remote control. The intent you receive includes the {@link
411 android.view.KeyEvent} for the media key pressed, which you can retrieve from the intent with {@link
414 <p>To display information on the remote control about the media playing, call {@link
416 {@link android.media.RemoteControlClient.MetadataEditor}. You can supply a bitmap for media artwork,
418 information on available keys see the {@code METADATA_KEY_*} flags in {@link
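<p>Pulling the steps above together, a sketch of the registration flow (the receiver class name and metadata values are assumptions):</p>
<pre>
// MediaButtonReceiver is a BroadcastReceiver declared in the manifest
// with an intent filter for ACTION_MEDIA_BUTTON.
ComponentName receiver = new ComponentName(this, MediaButtonReceiver.class);
Intent mediaButtonIntent = new Intent(Intent.ACTION_MEDIA_BUTTON);
mediaButtonIntent.setComponent(receiver);
PendingIntent pi = PendingIntent.getBroadcast(this, 0, mediaButtonIntent, 0);

RemoteControlClient rcc = new RemoteControlClient(pi);
rcc.setTransportControlFlags(RemoteControlClient.FLAG_KEY_MEDIA_PLAY_PAUSE
        | RemoteControlClient.FLAG_KEY_MEDIA_PREVIOUS
        | RemoteControlClient.FLAG_KEY_MEDIA_NEXT);

AudioManager am = (AudioManager) getSystemService(AUDIO_SERVICE);
am.registerMediaButtonEventReceiver(receiver);
am.registerRemoteControlClient(rcc);

rcc.editMetadata(true)
        .putString(MediaMetadataRetriever.METADATA_KEY_TITLE, "Track title")
        .apply();
</pre>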
430 <li>Streaming online media from {@link android.media.MediaPlayer} now requires the {@link
431 android.Manifest.permission#INTERNET} permission. If you use {@link android.media.MediaPlayer} to
432 play content from the Internet, be sure to add the {@link android.Manifest.permission#INTERNET}
436 <li>{@link android.media.MediaPlayer#setSurface(Surface) setSurface()} allows you to define a {@link
439 <li>{@link android.media.MediaPlayer#setDataSource(Context,Uri,Map) setDataSource()} allows you to
464 <p>The {@link android.hardware.Camera} class now includes APIs for detecting faces and controlling
474 <p>To detect faces in your camera application, you must register a {@link
475 android.hardware.Camera.FaceDetectionListener} by calling {@link
477 your camera surface and start detecting faces by calling {@link
480 <p>When the system detects one or more faces in the camera scene, it calls the {@link
482 implementation of {@link android.hardware.Camera.FaceDetectionListener}, including an array of
483 {@link android.hardware.Camera.Face} objects.</p>
485 <p>An instance of the {@link android.hardware.Camera.Face} class provides various information about
488 <li>A {@link android.graphics.Rect} that specifies the bounds of the face, relative to the camera's
493 <li>Several {@link android.graphics.Point} objects that indicate where the eyes and mouth are
498 devices, so you should check by calling {@link
501 in which case, those fields in the {@link android.hardware.Camera.Face} object will be null.</p>
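<p>A minimal face-detection sketch, assuming {@code mCamera} is an open camera whose preview is already configured:</p>
<pre>
mCamera.setFaceDetectionListener(new Camera.FaceDetectionListener() {
    public void onFaceDetection(Camera.Face[] faces, Camera camera) {
        for (Camera.Face face : faces) {
            Rect bounds = face.rect;  // face bounds in camera coordinates
            // face.leftEye, face.rightEye, and face.mouth may be null on
            // devices that report only the bounding rectangle.
        }
    }
});
mCamera.startPreview();
mCamera.startFaceDetection();  // must be called after startPreview()
</pre>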
508 and auto-exposure. Both features use the new {@link android.hardware.Camera.Area} class to specify
509 the region of the camera’s current view that should be focused or metered. An instance of the {@link
510 android.hardware.Camera.Area} class defines the bounds of the area with a {@link
514 <p>Before setting either a focus area or metering area, you should first call {@link
515 android.hardware.Camera.Parameters#getMaxNumFocusAreas} or {@link
519 <p>To specify the focus or metering areas to use, simply call {@link
520 android.hardware.Camera.Parameters#setFocusAreas setFocusAreas()} or {@link
521 android.hardware.Camera.Parameters#setMeteringAreas setMeteringAreas()}. Each takes a {@link
522 java.util.List} of {@link android.hardware.Camera.Area} objects that indicate the areas to consider
524 focus area by touching an area of the preview, which you then translate to an {@link
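<p>For example, a sketch that sets a single centered focus area after checking device support (the rectangle and weight are arbitrary):</p>
<pre>
Camera.Parameters params = mCamera.getParameters();
if (params.getMaxNumFocusAreas() &gt; 0) {
    // Area coordinates range from -1000 to 1000; weights from 1 to 1000.
    List&lt;Camera.Area&gt; areas = new ArrayList&lt;Camera.Area&gt;();
    areas.add(new Camera.Area(new Rect(-100, -100, 100, 100), 600));
    params.setFocusAreas(areas);
    mCamera.setParameters(params);
}
</pre>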
532 camera app, pass {@link android.hardware.Camera.Parameters#FOCUS_MODE_CONTINUOUS_PICTURE}
533 to {@link android.hardware.Camera.Parameters#setFocusMode setFocusMode()}. When ready to capture
534 a photo, call {@link android.hardware.Camera#autoFocus autoFocus()}. Your {@link
536 focus was achieved. To resume CAF after receiving the callback, you must call {@link
540 video, using {@link android.hardware.Camera.Parameters#FOCUS_MODE_CONTINUOUS_VIDEO}, which was
547 <li>While recording video, you can now call {@link android.hardware.Camera#takePicture
549 call {@link android.hardware.Camera.Parameters#isVideoSnapshotSupported} to be sure the hardware
552 <li>You can now lock auto exposure and white balance with {@link
553 android.hardware.Camera.Parameters#setAutoExposureLock setAutoExposureLock()} and {@link
557 <li>You can now call {@link android.hardware.Camera#setDisplayOrientation
566 <li>{@link android.hardware.Camera#ACTION_NEW_PICTURE Camera.ACTION_NEW_PICTURE}:
570 <li>{@link android.hardware.Camera#ACTION_NEW_VIDEO Camera.ACTION_NEW_VIDEO}:
589 <p>To transmit data between devices using Android Beam, you need to create an {@link
591 the foreground. You must then pass the {@link android.nfc.NdefMessage} to the system in one of two
595 <li>Define a single {@link android.nfc.NdefMessage} to push while in the activity:
596 <p>Call {@link android.nfc.NfcAdapter#setNdefPushMessage setNdefPushMessage()} at any time to set
597 the message you want to send. For instance, you might call this method and pass it your {@link
598 android.nfc.NdefMessage} during your activity’s {@link android.app.Activity#onCreate onCreate()}
600 foreground, the system sends the {@link android.nfc.NdefMessage} to the other device.</p></li>
602 <li>Define the {@link android.nfc.NdefMessage} to push at the time that Android Beam is initiated:
603 <p>Implement {@link android.nfc.NfcAdapter.CreateNdefMessageCallback}, in which your
604 implementation of the {@link
606 method returns the {@link android.nfc.NdefMessage} you want to send. Then pass the {@link
607 android.nfc.NfcAdapter.CreateNdefMessageCallback} implementation to {@link
610 foreground, the system calls {@link
612 the {@link android.nfc.NdefMessage} you want to send. This allows you to define the {@link
618 message to the other device, you can implement {@link
619 android.nfc.NfcAdapter.OnNdefPushCompleteCallback} and set it with {@link
621 then call {@link android.nfc.NfcAdapter.OnNdefPushCompleteCallback#onNdefPushComplete
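<p>A minimal sketch of the first approach, pushing a static message set in {@link android.app.Activity#onCreate onCreate()} (the URI payload is an assumption):</p>
<pre>
NfcAdapter adapter = NfcAdapter.getDefaultAdapter(this);
if (adapter != null) {  // device may not have NFC hardware
    NdefMessage message = new NdefMessage(new NdefRecord[] {
            NdefRecord.createUri("http://www.example.com") });
    adapter.setNdefPushMessage(message, this);
}
</pre>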
625 tags. The system invokes an intent with the {@link android.nfc.NfcAdapter#ACTION_NDEF_DISCOVERED}
626 action to start an activity, with either a URL or a MIME type set according to the first {@link
627 android.nfc.NdefRecord} in the {@link android.nfc.NdefMessage}. For the activity you want to
632 <p>If you want your {@link android.nfc.NdefMessage} to carry a URI, you can now use the convenience
633 method {@link android.nfc.NdefRecord#createUri createUri()} to construct a new {@link
634 android.nfc.NdefRecord} based on either a string or a {@link android.net.Uri} object. If the URI is
639 <p>You should also pass an “Android application record” with your {@link android.nfc.NdefMessage} in
642 calling {@link android.nfc.NdefRecord#createApplicationRecord createApplicationRecord()}, passing it
675 <p>A new package, {@link android.net.wifi.p2p}, contains all the APIs for performing peer-to-peer
676 connections with Wi-Fi. The primary class you need to work with is {@link
677 android.net.wifi.p2p.WifiP2pManager}, which you can acquire by calling {@link
678 android.app.Activity#getSystemService getSystemService(WIFI_P2P_SERVICE)}. The {@link
681 <li>Initialize your application for P2P connections by calling {@link
684 <li>Discover nearby devices by calling {@link android.net.wifi.p2p.WifiP2pManager#discoverPeers
687 <li>Start a P2P connection by calling {@link android.net.wifi.p2p.WifiP2pManager#connect
694 <li>The {@link android.net.wifi.p2p.WifiP2pManager.ActionListener} interface allows you to receive
697 <li>The {@link android.net.wifi.p2p.WifiP2pManager.PeerListListener} interface allows you to receive
698 information about discovered peers. The callback provides a {@link
699 android.net.wifi.p2p.WifiP2pDeviceList}, from which you can retrieve a {@link
703 <li>The {@link android.net.wifi.p2p.WifiP2pManager.GroupInfoListener} interface allows you to
704 receive information about a P2P group. The callback provides a {@link
708 <li>The {@link android.net.wifi.p2p.WifiP2pManager.ConnectionInfoListener} interface allows you to
709 receive information about the current connection. The callback provides a {@link
716 <li>{@link android.Manifest.permission#ACCESS_WIFI_STATE}</li>
717 <li>{@link android.Manifest.permission#CHANGE_WIFI_STATE}</li>
718 <li>{@link android.Manifest.permission#INTERNET} (although your app doesn’t technically connect
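<p>With the permissions declared, a sketch of initializing Wi-Fi P2P and starting discovery (error handling is omitted):</p>
<pre>
WifiP2pManager manager =
        (WifiP2pManager) getSystemService(WIFI_P2P_SERVICE);
WifiP2pManager.Channel channel =
        manager.initialize(this, getMainLooper(), null);
manager.discoverPeers(channel, new WifiP2pManager.ActionListener() {
    public void onSuccess() {
        // Wait for WIFI_P2P_PEERS_CHANGED_ACTION, then request the peer list.
    }
    public void onFailure(int reason) { /* discovery did not start */ }
});
</pre>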
725 <li>{@link android.net.wifi.p2p.WifiP2pManager#WIFI_P2P_CONNECTION_CHANGED_ACTION}: The P2P
726 connection state has changed. This carries {@link
727 android.net.wifi.p2p.WifiP2pManager#EXTRA_WIFI_P2P_INFO} with a {@link
728 android.net.wifi.p2p.WifiP2pInfo} object and {@link
729 android.net.wifi.p2p.WifiP2pManager#EXTRA_NETWORK_INFO} with a {@link android.net.NetworkInfo}
732 <li>{@link android.net.wifi.p2p.WifiP2pManager#WIFI_P2P_STATE_CHANGED_ACTION}: The P2P state has
733 changed between enabled and disabled. It carries {@link
734 android.net.wifi.p2p.WifiP2pManager#EXTRA_WIFI_STATE} with either {@link
735 android.net.wifi.p2p.WifiP2pManager#WIFI_P2P_STATE_DISABLED} or {@link
738 <li>{@link android.net.wifi.p2p.WifiP2pManager#WIFI_P2P_PEERS_CHANGED_ACTION}: The list of peer
741 <li>{@link android.net.wifi.p2p.WifiP2pManager#WIFI_P2P_THIS_DEVICE_CHANGED_ACTION}: The details for
745 <p>See the {@link android.net.wifi.p2p.WifiP2pManager} documentation for more information. Also
759 <p>As with regular headset and A2DP profile devices, you must call {@link
760 android.bluetooth.BluetoothAdapter#getProfileProxy getProfileProxy()} with a {@link
761 android.bluetooth.BluetoothProfile.ServiceListener} and the {@link
765 <p>Once you’ve acquired the Health Profile proxy (the {@link android.bluetooth.BluetoothHealth}
769 <li>{@link android.bluetooth.BluetoothHealthCallback}: You must extend this class and implement the
772 <li>{@link android.bluetooth.BluetoothHealthAppConfiguration}: During callbacks to your {@link
775 to perform various operations such as initiating and terminating connections with the {@link
779 <p>For more information about using the Bluetooth Health Profile, see the documentation for {@link
799 by {@link android.R.attr#contentDescription android:contentDescription} and {@link
802 application, especially for {@link android.widget.ImageButton}, {@link android.widget.EditText},
803 {@link android.widget.ImageView} and other widgets that might not naturally contain descriptive
810 implement new callback methods for accessibility events in your custom {@link
813 <p>It's important to first note that the behavior of the {@link
817 {@link android.view.View#sendAccessibilityEvent sendAccessibilityEvent()}. Previously, the
818 implementation of {@link android.view.View#sendAccessibilityEvent sendAccessibilityEvent()} would
819 initialize an {@link android.view.accessibility.AccessibilityEvent} and send it to {@link
823 <li>When invoked, the {@link
824 android.view.View#sendAccessibilityEvent sendAccessibilityEvent()} and {@link
826 to {@link android.view.View#onInitializeAccessibilityEvent onInitializeAccessibilityEvent()}.
827 <p>Custom implementations of {@link android.view.View} might want to implement {@link
829 attach additional accessibility information to the {@link
835 information, the view then receives a call to {@link
837 defers to the {@link android.view.View#onPopulateAccessibilityEvent onPopulateAccessibilityEvent()}
839 <p>Custom implementations of {@link android.view.View} should usually implement {@link
841 text content to the {@link android.view.accessibility.AccessibilityEvent} if the {@link
844 {@link android.view.accessibility.AccessibilityEvent}, call {@link
845 android.view.accessibility.AccessibilityEvent#getText()}.{@link java.util.List#add add()}.</p>
847 <li>At this point, the {@link android.view.View} passes the event up the view hierarchy by calling
848 {@link android.view.ViewGroup#requestSendAccessibilityEvent requestSendAccessibilityEvent()} on the
850 adding an {@link android.view.accessibility.AccessibilityRecord}, until it
851 ultimately reaches the root view, which sends the event to the {@link
852 android.view.accessibility.AccessibilityManager} with {@link
857 <p>In addition to the new methods above, which are useful when extending the {@link
858 android.view.View} class, you can also intercept these event callbacks on any {@link
859 android.view.View} by extending {@link
861 {@link android.view.View#setAccessibilityDelegate setAccessibilityDelegate()}.
863 the delegate. For example, when the view receives a call to {@link
865 same method in the {@link android.view.View.AccessibilityDelegate}. Any methods not handled by
867 the methods necessary for any given view without extending the {@link android.view.View} class.</p>
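<p>A short sketch of the delegate approach (the view ID is hypothetical):</p>
<pre>
View view = findViewById(R.id.my_view);  // hypothetical ID
view.setAccessibilityDelegate(new View.AccessibilityDelegate() {
    @Override
    public void onPopulateAccessibilityEvent(View host, AccessibilityEvent event) {
        super.onPopulateAccessibilityEvent(host, event);
        event.getText().add("Custom description");
    }
});
</pre>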
890 <li>Upon receiving an {@link android.view.accessibility.AccessibilityEvent} from an application,
891 call {@link android.view.accessibility.AccessibilityEvent#getRecord(int)
892 AccessibilityEvent.getRecord()} to retrieve a specific {@link
896 <li>From either {@link android.view.accessibility.AccessibilityEvent} or an individual {@link
897 android.view.accessibility.AccessibilityRecord}, you can call {@link
898 android.view.accessibility.AccessibilityRecord#getSource() getSource()} to retrieve a {@link
900 <p>An {@link android.view.accessibility.AccessibilityNodeInfo} represents a single node
902 node. The {@link android.view.accessibility.AccessibilityNodeInfo} object returned from {@link
904 an {@link android.view.accessibility.AccessibilityRecord} describes the predecessor of the event
907 <li>With the {@link android.view.accessibility.AccessibilityNodeInfo}, you can query information
908 about it, call {@link
909 android.view.accessibility.AccessibilityNodeInfo#getParent getParent()} or {@link
915 must declare an XML configuration file that corresponds to {@link
917 accessibility service, see {@link
918 android.accessibilityservice.AccessibilityService} and {@link
925 <p>If you're interested in the device's accessibility state, the {@link
928 <li>{@link android.view.accessibility.AccessibilityManager.AccessibilityStateChangeListener}
931 <li>{@link android.view.accessibility.AccessibilityManager#getEnabledAccessibilityServiceList
934 <li>{@link android.view.accessibility.AccessibilityManager#isTouchExplorationEnabled()} tells
948 {@link android.service.textservice.SpellCheckerService} and extend the {@link
950 on text provided by the interface's callback methods. In the {@link
952 spelling suggestions as {@link android.view.textservice.SuggestionsInfo} objects. </p>
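<p>A skeletal spell checker service, with the dictionary lookup left as a placeholder:</p>
<pre>
public class MySpellCheckerService extends SpellCheckerService {
    @Override
    public Session createSession() {
        return new MySession();
    }

    private static class MySession extends Session {
        @Override
        public void onCreate() {}

        @Override
        public SuggestionsInfo onGetSuggestions(TextInfo textInfo, int limit) {
            // Look up textInfo.getText() in a dictionary (not shown) and
            // return candidate corrections.
            return new SuggestionsInfo(
                    SuggestionsInfo.RESULT_ATTR_LOOKS_LIKE_TYPO,
                    new String[] { "example" });
        }
    }
}
</pre>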
954 <p>Applications with a spell checker service must declare the {@link
978 <p>In previous versions of Android, you could use the {@link android.speech.tts.TextToSpeech} class
980 custom engine using {@link android.speech.tts.TextToSpeech#setEngineByPackageName
981 setEngineByPackageName()}. In Android 4.0, the {@link
983 deprecated and you can now specify the engine to use with a new {@link
986 <p>You can also query the available TTS engines with {@link
987 android.speech.tts.TextToSpeech#getEngines()}. This method returns a list of {@link
997 <p>The basic setup requires an implementation of {@link android.speech.tts.TextToSpeechService} that
998 responds to the {@link android.speech.tts.TextToSpeech.Engine#INTENT_ACTION_TTS_SERVICE} intent. The
999 primary work for a TTS engine happens during the {@link
1001 that extends {@link android.speech.tts.TextToSpeechService}. The system delivers this method two
1004 <li>{@link android.speech.tts.SynthesisRequest}: This contains various data including the text to
1006 <li>{@link android.speech.tts.SynthesisCallback}: This is the interface by which your TTS engine
1007 delivers the resulting speech data as streaming audio. First the engine must call {@link
1009 the audio, then call {@link android.speech.tts.SynthesisCallback#audioAvailable audioAvailable()},
1011 buffer, call {@link android.speech.tts.SynthesisCallback#done()}.</li>
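<p>Sketched as code, with the actual sample generation left to a hypothetical {@code synthesize()} engine call:</p>
<pre>
@Override
protected void onSynthesizeText(SynthesisRequest request,
        SynthesisCallback callback) {
    callback.start(16000, AudioFormat.ENCODING_PCM_16BIT, 1 /* mono */);
    byte[] audio = synthesize(request.getText());  // hypothetical engine call
    callback.audioAvailable(audio, 0, audio.length);
    callback.done();
}
</pre>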
1040 declaration an intent filter for the {@link android.content.Intent#ACTION_MANAGE_NETWORK_USAGE}
1057 <p>Also beware that {@link android.net.ConnectivityManager#getBackgroundDataSetting()} is now
1058 deprecated and always returns true—use {@link
1060 transactions, you should always call {@link android.net.ConnectivityManager#getActiveNetworkInfo()}
1061 to get the {@link android.net.NetworkInfo} that represents the current network and query {@link
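<p>For example, a minimal connectivity check along those lines:</p>
<pre>
ConnectivityManager cm =
        (ConnectivityManager) getSystemService(CONNECTIVITY_SERVICE);
NetworkInfo network = cm.getActiveNetworkInfo();
if (network != null &amp;&amp; network.isConnected()) {
    // Safe to attempt the network transaction.
}
</pre>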
1083 <p>The {@link android.renderscript.Allocation} class now supports a {@link
1085 render things directly into the {@link android.renderscript.Allocation} and use it as a framebuffer
1088 <p>{@link android.renderscript.RSTextureView} provides a means to display RenderScript graphics
1089 inside of a {@link android.view.View}, unlike {@link android.renderscript.RSSurfaceView}, which
1091 animate an {@link android.renderscript.RSTextureView} as well as draw RenderScript graphics inside
1094 <p>The {@link android.renderscript.Script#forEach Script.forEach()} method allows you to call
1097 write will have a {@link android.renderscript.Script#forEach forEach()} method that you can call in
1098 the reflected RenderScript class. You can call the reflected {@link
1099 android.renderscript.Script#forEach forEach()} method by passing in an input {@link
1100 android.renderscript.Allocation} to process, an output {@link android.renderscript.Allocation} to
1101 write the result to, and a {@link android.renderscript.FieldPacker} data structure in case the
1102 RenderScript needs more information. Only one of the {@link android.renderscript.Allocation}s is
1119 <p>The new {@link android.net.VpnService} allows applications to build their own VPN (Virtual
1120 Private Network), running as a {@link android.app.Service}. A VPN service creates an interface for a
1124 <p>To create a VPN service, use {@link android.net.VpnService.Builder}, which allows you to specify
1126 interface by calling {@link android.net.VpnService.Builder#establish()}, which returns a {@link
1130 implement {@link android.net.VpnService}, then your service must require the {@link
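<p>A sketch of establishing the interface from inside a {@link android.net.VpnService} subclass (the address and route values are placeholders):</p>
<pre>
public class MyVpnService extends VpnService {
    private ParcelFileDescriptor mTun;

    private void startTunnel() {
        mTun = new Builder()
                .setSession("MyVpn")
                .addAddress("10.0.0.2", 32)
                .addRoute("0.0.0.0", 0)
                .establish();
        // Read outgoing packets from mTun and write incoming packets back.
    }
}
</pre>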
1138 <p>Applications that manage the device restrictions can now disable the camera using {@link
1139 android.app.admin.DevicePolicyManager#setCameraDisabled setCameraDisabled()} and the {@link
1146 <p>The new {@link android.security.KeyChain} class provides APIs that allow you to import and access
1150 certificates to authenticate users to servers. See the {@link android.security.KeyChain}
1164 <li>{@link android.hardware.Sensor#TYPE_AMBIENT_TEMPERATURE}: A temperature sensor that provides
1166 <li>{@link android.hardware.Sensor#TYPE_RELATIVE_HUMIDITY}: A humidity sensor that provides the
1170 <p>If a device has both {@link android.hardware.Sensor#TYPE_AMBIENT_TEMPERATURE} and {@link
1174 <p>The previous temperature sensor, {@link android.hardware.Sensor#TYPE_TEMPERATURE}, has been
1175 deprecated. You should use the {@link android.hardware.Sensor#TYPE_AMBIENT_TEMPERATURE} sensor
1179 latency and smoother output. These sensors include the gravity sensor ({@link
1180 android.hardware.Sensor#TYPE_GRAVITY}), rotation vector sensor ({@link
1181 android.hardware.Sensor#TYPE_ROTATION_VECTOR}), and linear acceleration sensor ({@link
1191 <p>The {@link android.app.ActionBar} has been updated to support several new behaviors. Most
1205 separate bar at the bottom of the screen. To enable split action bar, add {@link
1215 <p>If you want to use the navigation tabs provided by the {@link android.app.ActionBar.Tab} APIs,
1217 the split action bar as described above and also call {@link
1226 <p>If you want to apply custom styling to the action bar, you can use new style properties {@link
1227 android.R.attr#backgroundStacked} and {@link android.R.attr#backgroundSplit} to apply a background
1229 runtime with {@link android.app.ActionBar#setStackedBackgroundDrawable
1230 setStackedBackgroundDrawable()} and {@link android.app.ActionBar#setSplitBackgroundDrawable
1236 <p>The new {@link android.view.ActionProvider} class allows you to create a specialized handler for
1239 dynamic behaviors (such as a variable action view, default action, or submenu), extending {@link
1243 <p>For example, the {@link android.widget.ShareActionProvider} is an extension of {@link
1245 traditional action item that invokes the {@link android.content.Intent#ACTION_SEND} intent, you can
1247 the {@link android.content.Intent#ACTION_SEND} intent. When the user selects an application to use
1248 for the action, {@link android.widget.ShareActionProvider} remembers that selection and provides it
1263 <p>In your activity’s {@link android.app.Activity#onCreateOptionsMenu onCreateOptionsMenu()}
1279 <p>For an example using the {@link android.widget.ShareActionProvider}, see the <a
1287 traditional action item state. Previously only the {@link android.widget.SearchView} supported
1298 instance of {@link android.view.MenuItem.OnActionExpandListener} with the respective {@link
1299 android.view.MenuItem} by calling {@link android.view.MenuItem#setOnActionExpandListener
1300 setOnActionExpandListener()}. Typically, you should do so during the {@link
1303 <p>To control a collapsible action view, you can call {@link
1304 android.view.MenuItem#collapseActionView()} and {@link android.view.MenuItem#expandActionView()} on
1305 the respective {@link android.view.MenuItem}.</p>
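<p>For example (the menu item ID is hypothetical):</p>
<pre>
MenuItem item = menu.findItem(R.id.menu_search);  // hypothetical item
item.setOnActionExpandListener(new MenuItem.OnActionExpandListener() {
    public boolean onMenuItemActionExpand(MenuItem item) {
        return true;  // allow the action view to expand
    }
    public boolean onMenuItemActionCollapse(MenuItem item) {
        return true;  // allow it to collapse
    }
});
</pre>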
1307 <p>When creating a custom action view, you can also implement the new {@link
1314 <li>{@link android.app.ActionBar#setHomeButtonEnabled setHomeButtonEnabled()} allows you to specify
1318 <li>{@link android.app.ActionBar#setIcon setIcon()} and {@link android.app.ActionBar#setLogo
1321 <li>{@link android.app.Fragment#setMenuVisibility Fragment.setMenuVisibility()} allows you to enable
1326 <li>{@link android.app.FragmentManager#invalidateOptionsMenu
1329 in which the equivalent method from {@link android.app.Activity} might not be available.</li>
1346 <p>{@link android.widget.GridLayout} is a new view group that places child views in a rectangular
1347 grid. Unlike {@link android.widget.TableLayout}, {@link android.widget.GridLayout} relies on a flat
1351 The {@link android.widget.GridLayout} orientation determines whether sequential children are by
1353 instances of the new {@link android.widget.Space} view or by setting the relevant margin parameters
1359 for samples using {@link android.widget.GridLayout}.</p>
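<p>As a minimal sketch, a two-column grid built in code (attributes would normally be set in XML):</p>
<pre>
GridLayout grid = new GridLayout(context);
grid.setColumnCount(2);
grid.addView(new TextView(context));  // children fill cells in order
grid.addView(new Space(context));     // Space can occupy an empty cell
</pre>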
1365 <p>{@link android.view.TextureView} is a new view that allows you to display a content stream, such
1366 as a video or an OpenGL scene. Although similar to {@link android.view.SurfaceView}, {@link
1368 separate window, so you can treat it like any other {@link android.view.View} object. For example,
1369 you can apply transforms, animate it using {@link android.view.ViewPropertyAnimator}, or
1370 adjust its opacity with {@link android.view.View#setAlpha setAlpha()}.</p>
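<p>For example, a sketch that routes video into a {@link android.view.TextureView} once its {@link android.graphics.SurfaceTexture} is ready (assumes a prepared {@code mMediaPlayer} field):</p>
<pre>
TextureView textureView = (TextureView) findViewById(R.id.texture);  // hypothetical ID
textureView.setSurfaceTextureListener(new TextureView.SurfaceTextureListener() {
    public void onSurfaceTextureAvailable(SurfaceTexture st, int w, int h) {
        mMediaPlayer.setSurface(new Surface(st));
        mMediaPlayer.start();
    }
    public void onSurfaceTextureSizeChanged(SurfaceTexture st, int w, int h) {}
    public boolean onSurfaceTextureDestroyed(SurfaceTexture st) { return true; }
    public void onSurfaceTextureUpdated(SurfaceTexture st) {}
});
</pre>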
1372 <p>Beware that {@link android.view.TextureView} works only within a hardware accelerated window.</p>
1374 <p>For more information, see the {@link android.view.TextureView} documentation.</p>
1379 <p>The new {@link android.widget.Switch} widget is a two-state toggle that users can drag to one
1395 <p>Android 3.0 introduced {@link android.widget.PopupMenu} to create short contextual menus that pop
1397 the {@link android.widget.PopupMenu} with a couple of useful features:</p>
1400 href="{@docRoot}guide/topics/resources/menu-resource.html">menu resource</a> with {@link
1402 <li>You can also now create a {@link android.widget.PopupMenu.OnDismissListener} that receives a
1409 <p>A new {@link android.preference.TwoStatePreference} abstract class serves as the basis for
1410 preferences that provide a two-state selection option. The new {@link
1411 android.preference.SwitchPreference} is an extension of {@link
1412 android.preference.TwoStatePreference} that provides a {@link android.widget.Switch} widget in the
1414 preference screen or dialog. For example, the Settings application uses a {@link
1425 "device default" theme: {@link android.R.style#Theme_DeviceDefault Theme.DeviceDefault}. This may be
1428 <p>The {@link android.R.style#Theme_Holo Theme.Holo} family of themes is guaranteed not to change
1430 apply any of the {@link android.R.style#Theme_Holo Theme.Holo} themes to your activities, you can
1435 provide different default themes for the system), you should explicitly apply themes from the {@link
1447 <p>For the best user experience, new and updated apps should instead use the {@link
1468 <p>You can still hide the status bar on handsets using the {@link
1473 <li>The {@link android.view.View#SYSTEM_UI_FLAG_LOW_PROFILE} flag replaces the {@code
1479 <li>The {@link android.view.View#SYSTEM_UI_FLAG_VISIBLE} flag replaces the {@code
1482 <li>The {@link android.view.View#SYSTEM_UI_FLAG_HIDE_NAVIGATION} is a new flag that requests
1490 <p>You can set each of these flags for the system bar and navigation bar by calling {@link
1499 {@link android.view.View.OnSystemUiVisibilityChangeListener} to be notified when the visibility
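<p>A short sketch of both calls (how you react to visibility changes is up to the app):</p>
<pre>
View decor = getWindow().getDecorView();
decor.setSystemUiVisibility(View.SYSTEM_UI_FLAG_LOW_PROFILE);
decor.setOnSystemUiVisibilityChangeListener(
        new View.OnSystemUiVisibilityChangeListener() {
            public void onSystemUiVisibilityChange(int visibility) {
                // For example, reveal your own controls when the bars return.
            }
        });
</pre>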
1516 <p>The {@link android.view.View} class now supports “hover” events to enable richer interactions
1520 <p>To receive hover events on a view, implement the {@link android.view.View.OnHoverListener} and
1521 register it with {@link android.view.View#setOnHoverListener setOnHoverListener()}. When a hover
1522 event occurs on the view, your listener receives a call to {@link
1523 android.view.View.OnHoverListener#onHover onHover()}, providing the {@link android.view.View} that
1524 received the event and a {@link android.view.MotionEvent} that describes the type of hover event
1527 <li>{@link android.view.MotionEvent#ACTION_HOVER_ENTER}</li>
1528 <li>{@link android.view.MotionEvent#ACTION_HOVER_EXIT}</li>
1529 <li>{@link android.view.MotionEvent#ACTION_HOVER_MOVE}</li>
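<p>A minimal hover listener sketch (the highlight shown is illustrative only):</p>
<pre>
view.setOnHoverListener(new View.OnHoverListener() {
    public boolean onHover(View v, MotionEvent event) {
        switch (event.getAction()) {
            case MotionEvent.ACTION_HOVER_ENTER:
                v.setPressed(true);   // illustrative highlight
                break;
            case MotionEvent.ACTION_HOVER_EXIT:
                v.setPressed(false);
                break;
        }
        return true;  // consume the hover event
    }
});
</pre>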
1532 <p>Your {@link android.view.View.OnHoverListener} should return true from {@link
1558 “tool type” associated with each pointer in a {@link android.view.MotionEvent} using {@link
1559 android.view.MotionEvent#getToolType getToolType()}. The currently defined tool types are: {@link
1560 android.view.MotionEvent#TOOL_TYPE_UNKNOWN}, {@link android.view.MotionEvent#TOOL_TYPE_FINGER},
1561 {@link android.view.MotionEvent#TOOL_TYPE_MOUSE}, {@link android.view.MotionEvent#TOOL_TYPE_STYLUS},
1562 and {@link android.view.MotionEvent#TOOL_TYPE_ERASER}. By querying the tool type, your application
1566 state" of a {@link android.view.MotionEvent} using {@link android.view.MotionEvent#getButtonState
1567 getButtonState()}. The currently defined button states are: {@link
1568 android.view.MotionEvent#BUTTON_PRIMARY}, {@link android.view.MotionEvent#BUTTON_SECONDARY}, {@link
1569 android.view.MotionEvent#BUTTON_TERTIARY}, {@link android.view.MotionEvent#BUTTON_BACK}, and {@link
1571 automatically mapped to the {@link android.view.KeyEvent#KEYCODE_BACK} and {@link
1577 and the stylus orientation angle. Your application can query this information using {@link
1578 android.view.MotionEvent#getAxisValue getAxisValue()} with the axis codes {@link
1579 android.view.MotionEvent#AXIS_DISTANCE}, {@link android.view.MotionEvent#AXIS_TILT}, and {@link
1593 <p>The new {@link android.util.Property} class provides a fast, efficient, and easy way to specify a
1613 <p>Using the {@link android.util.Property} class, you can declare a {@link android.util.Property}
1620 <p>The {@link android.view.View} class now leverages the {@link android.util.Property} class to
1621 allow you to set various fields, such as transform properties that were added in Android 3.0 ({@link
1622 android.view.View#ROTATION}, {@link android.view.View#ROTATION_X}, {@link
1625 <p>The {@link android.animation.ObjectAnimator} class also uses the {@link android.util.Property}
1626 class, so you can create an {@link android.animation.ObjectAnimator} with a {@link
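<p>For example, a one-line rotation animation using one of those properties (the target view is assumed):</p>
<pre>
ObjectAnimator.ofFloat(myView, View.ROTATION, 0f, 360f)
        .setDuration(500)
        .start();
</pre>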
1649 element. You can alternatively disable hardware acceleration for individual views by calling {@link
1686 needed for combining glyphs) in {@link android.webkit.WebView} and the built-in Browser</li>
1687 <li>Support for Ethiopic, Georgian, and Armenian fonts in {@link android.webkit.WebView} and the
1691 it easier for you to test apps that use {@link android.webkit.WebView}</li>
1719 <li>{@link android.Manifest.permission#ADD_VOICEMAIL}: Allows a voicemail service to add voicemail
1721 <li>{@link android.Manifest.permission#BIND_TEXT_SERVICE}: A service that implements {@link
1723 <li>{@link android.Manifest.permission#BIND_VPN_SERVICE}: A service that implements {@link
1725 <li>{@link android.Manifest.permission#READ_PROFILE}: Provides read access to the {@link
1727 <li>{@link android.Manifest.permission#WRITE_PROFILE}: Provides write access to the {@link
1737 <li>{@link android.content.pm.PackageManager#FEATURE_WIFI_DIRECT}: Declares that the application
1764 <li>{@link android.app.Fragment}: A framework component that allows you to separate distinct
1767 <li>{@link android.app.ActionBar}: A replacement for the traditional title bar at the top of
1771 <li>{@link android.content.Loader}: A framework component that facilitates asynchronous
1821 {@link android.mtp} documentation.</li>
1825 sessions and transmit or receive data streams over any available network. See the {@link