
Lines Matching refs:link

98 <p>Android 4.1 ({@link android.os.Build.VERSION_CODES#JELLY_BEAN})
152 …opics/manifest/service-element.html">{@code &lt;service>}</a> tag, your {@link android.app.Service…
158 <p>New {@link android.content.ComponentCallbacks2} constants such as {@link
159 android.content.ComponentCallbacks2#TRIM_MEMORY_RUNNING_LOW} and {@link
162 memory state before the system calls {@link android.app.Activity#onLowMemory()}.</p>
164 <p>New {@link android.app.ActivityManager#getMyMemoryState} method allows you to
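The memory-state check described above can be sketched as follows; this is a minimal illustration, assuming the usual pattern of comparing the process's last trim level against a `ComponentCallbacks2` threshold:

```java
import android.app.ActivityManager;
import android.content.ComponentCallbacks2;

public class MemoryCheck {
    /** Returns true when the app should proactively release caches. */
    public static boolean shouldTrimCaches() {
        ActivityManager.RunningAppProcessInfo state =
                new ActivityManager.RunningAppProcessInfo();
        // Static method new in API level 16: fills `state` with the
        // calling process's current importance and trim level.
        ActivityManager.getMyMemoryState(state);
        return state.lastTrimLevel
                >= ComponentCallbacks2.TRIM_MEMORY_RUNNING_LOW;
    }
}
```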
170 <p>A new method, {@link android.content.ContentResolver#acquireUnstableContentProviderClient
171 acquireUnstableContentProviderClient()}, allows you to access a {@link
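A short sketch of the unstable-client pattern, assuming you already hold a `ContentResolver` and the content `Uri` of the other app's provider:

```java
import android.content.ContentProviderClient;
import android.content.ContentResolver;
import android.net.Uri;

public class UnstableQuery {
    /**
     * "Unstable" means your process is not killed if the remote
     * provider's process dies; the call simply fails instead.
     */
    public static void query(ContentResolver resolver, Uri uri) {
        ContentProviderClient client =
                resolver.acquireUnstableContentProviderClient(uri);
        if (client == null) return; // no provider for this authority
        try {
            // client.query(...), client.insert(...), etc. as usual
        } finally {
            client.release();
        }
    }
}
```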
184 <p>To launch the live wallpaper picker, call {@link android.content.Context#startActivity
185 startActivity()} with an {@link android.content.Intent} using
186 {@link android.app.WallpaperManager#ACTION_CHANGE_LIVE_WALLPAPER} and an extra
187 that specifies your live wallpaper {@link android.content.ComponentName} as a string in {@link
203 android:parentActivityName}</a> for each activity, you don't need the {@link
224 <p>When the system creates a synthetic back stack for your app, it builds a basic {@link
236 activity, you must override the {@link
238 provides you with a {@link android.app.TaskStackBuilder} object that the system created in order to
239 synthesize the parent activities. The {@link android.app.TaskStackBuilder} contains {@link
241 implementation of {@link android.app.Activity#onPrepareNavigateUpTaskStack
242 onPrepareNavigateUpTaskStack()}, you can modify the appropriate {@link android.content.Intent} to
246 <p>When the system creates the {@link android.app.TaskStackBuilder}, it adds the {@link
248 order beginning from the top of the activity tree. So, the last {@link
250 you want to modify the {@link android.content.Intent} for the activity's parent, first determine
251 the length of the array with {@link android.app.TaskStackBuilder#getIntentCount()} and pass that
252 value to {@link android.app.TaskStackBuilder#editIntentAt editIntentAt()}.</p>
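The edit-the-parent-intent step above can be sketched like this; the extra name is hypothetical, and the index arithmetic assumes the documented ordering in which the direct parent is the last intent added:

```java
import android.app.Activity;
import android.app.TaskStackBuilder;
import android.content.Intent;

public class DetailActivity extends Activity {
    @Override
    public void onPrepareNavigateUpTaskStack(TaskStackBuilder builder) {
        super.onPrepareNavigateUpTaskStack(builder);
        // Intents are added root-first, so the highest index is the
        // direct parent of this activity.
        int parentIndex = builder.getIntentCount() - 1;
        Intent parentIntent = builder.editIntentAt(parentIndex);
        // Hypothetical extra the parent needs to restore its state.
        parentIntent.putExtra("com.example.EXTRA_ITEM_ID", 42L);
    }
}
```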
259 <dt>{@link android.app.Activity#onNavigateUp}</dt>
261 <dt>{@link android.app.Activity#navigateUpTo}</dt>
263 supplied {@link android.content.Intent}. If the activity exists in the back stack, but
266 <dt>{@link android.app.Activity#getParentActivityIntent}</dt>
267 <dd>Call this to get the {@link android.content.Intent} that will start the logical
269 <dt>{@link android.app.Activity#shouldUpRecreateTask}</dt>
273 <dt>{@link android.app.Activity#finishAffinity}</dt>
277 {@link android.app.Activity#onNavigateUp}, you should call this method when you
279 <dt>{@link android.app.Activity#onCreateNavigateUpTaskStack onCreateNavigateUpTaskStack}</dt>
280 …ra data to the intents for your back stack, you should instead override {@link android.app.Activit…
284 <p>However, most apps don't need to use these APIs or implement {@link
306 <p>The {@link android.media.MediaCodec} class provides access to low-level media codecs for encoding
307 and decoding your media. You can instantiate a {@link android.media.MediaCodec} by calling {@link
308 android.media.MediaCodec#createEncoderByType createEncoderByType()} to encode media or call {@link
313 <p>With an instance of {@link android.media.MediaCodec} created, you can then call {@link
318 create the {@link android.media.MediaCodec}. First call {@link
319 android.media.MediaCodec#getInputBuffers()} to get an array of input {@link java.nio.ByteBuffer}
320 objects and {@link android.media.MediaCodec#getOutputBuffers()} to get an array of output {@link
323 <p>When you're ready to encode or decode, call {@link android.media.MediaCodec#dequeueInputBuffer
324 dequeueInputBuffer()} to get the index position of the {@link
326 media. After you fill the {@link java.nio.ByteBuffer} with your source media, release ownership
327 of the buffer by calling {@link android.media.MediaCodec#queueInputBuffer queueInputBuffer()}.</p>
329 <p>Likewise for the output buffer, call {@link android.media.MediaCodec#dequeueOutputBuffer
330 dequeueOutputBuffer()} to get the index position of the {@link java.nio.ByteBuffer}
331 where you'll receive the results. After you read the output from the {@link java.nio.ByteBuffer},
332 release ownership by calling {@link android.media.MediaCodec#releaseOutputBuffer
335 <p>You can handle encrypted media data in the codecs by calling {@link
337 the {@link android.media.MediaCrypto} APIs, instead of the normal {@link
340 <p>For more information about how to use codecs, see the {@link android.media.MediaCodec} documenta…
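The dequeue/queue/release cycle described above can be sketched as a decode loop; this assumes a `MediaExtractor` already positioned on the track to decode:

```java
import java.nio.ByteBuffer;
import android.media.MediaCodec;
import android.media.MediaExtractor;
import android.media.MediaFormat;

public class DecodeLoop {
    public static void decode(MediaExtractor extractor, MediaFormat format)
            throws Exception {
        MediaCodec codec = MediaCodec.createDecoderByType(
                format.getString(MediaFormat.KEY_MIME));
        codec.configure(format, null /* surface */, null /* crypto */, 0);
        codec.start();

        ByteBuffer[] inputBuffers = codec.getInputBuffers();
        ByteBuffer[] outputBuffers = codec.getOutputBuffers();
        MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
        boolean inputDone = false;

        while (true) {
            if (!inputDone) {
                int in = codec.dequeueInputBuffer(10000 /* µs */);
                if (in >= 0) {
                    int size = extractor.readSampleData(inputBuffers[in], 0);
                    if (size < 0) {
                        // No more samples: signal end of stream.
                        codec.queueInputBuffer(in, 0, 0, 0,
                                MediaCodec.BUFFER_FLAG_END_OF_STREAM);
                        inputDone = true;
                    } else {
                        codec.queueInputBuffer(in, 0, size,
                                extractor.getSampleTime(), 0);
                        extractor.advance();
                    }
                }
            }
            int out = codec.dequeueOutputBuffer(info, 10000);
            if (out >= 0) {
                ByteBuffer decoded = outputBuffers[out];
                // ... consume `decoded` here ...
                codec.releaseOutputBuffer(out, false /* don't render */);
                if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
                    break;
                }
            } else if (out == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
                outputBuffers = codec.getOutputBuffers();
            }
        }
        codec.stop();
        codec.release();
    }
}
```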
345 <p>The new {@link android.media.MediaRouter} class allows you to route media channels and
347 can acquire an instance of {@link android.media.MediaRouter} by calling {@link
348 android.content.Context#getSystemService getSystemService(}{@link
355 <p>New method {@link android.media.AudioRecord#startRecording startRecording()} allows
356 you to begin audio recording based on a cue defined by a {@link android.media.MediaSyncEvent}.
357 The {@link android.media.MediaSyncEvent} specifies an audio session
358 (such as one defined by {@link android.media.MediaPlayer}), which, when complete, triggers
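The cued-recording mechanism above can be sketched as follows; sample rate and format are illustrative:

```java
import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaPlayer;
import android.media.MediaRecorder;
import android.media.MediaSyncEvent;

public class CuedRecording {
    /** Starts capturing the instant `player`'s audio session completes. */
    public static AudioRecord recordAfter(MediaPlayer player) {
        int minBuf = AudioRecord.getMinBufferSize(44100,
                AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
        AudioRecord recorder = new AudioRecord(
                MediaRecorder.AudioSource.MIC, 44100,
                AudioFormat.CHANNEL_IN_MONO,
                AudioFormat.ENCODING_PCM_16BIT, minBuf);

        MediaSyncEvent sync = MediaSyncEvent.createEvent(
                MediaSyncEvent.SYNC_EVENT_PRESENTATION_COMPLETE);
        sync.setAudioSessionId(player.getAudioSessionId());

        recorder.startRecording(sync); // waits for the cue, then records
        return recorder;
    }
}
```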
367 <p>The {@link android.media.MediaPlayer} now handles both in-band and out-of-band text tracks.
369 tracks can be added as an external text source via {@link
371 track sources are added, {@link android.media.MediaPlayer#getTrackInfo()} should be called to get
374 <p>To set the track to use with the {@link android.media.MediaPlayer}, you must
375 call {@link android.media.MediaPlayer#selectTrack selectTrack()}, using the index
379 {@link android.media.MediaPlayer.OnTimedTextListener} interface and pass
380 it to {@link android.media.MediaPlayer#setOnTimedTextListener setOnTimedTextListener()}.</p>
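The add-source/select-track/listen sequence above might look like this for an out-of-band SRT file (the file path is an assumption, and the player is assumed to be prepared):

```java
import android.media.MediaPlayer;
import android.media.TimedText;

public class Subtitles {
    public static void enable(MediaPlayer player, String srtPath)
            throws Exception {
        // Add an external (out-of-band) subtitle track.
        player.addTimedTextSource(srtPath,
                MediaPlayer.MEDIA_MIMETYPE_TEXT_SUBRIP);

        // Re-query the track list and select the first timed-text track.
        MediaPlayer.TrackInfo[] tracks = player.getTrackInfo();
        for (int i = 0; i < tracks.length; i++) {
            if (tracks[i].getTrackType()
                    == MediaPlayer.TrackInfo.MEDIA_TRACK_TYPE_TIMED_TEXT) {
                player.selectTrack(i);
                break;
            }
        }

        player.setOnTimedTextListener(new MediaPlayer.OnTimedTextListener() {
            @Override
            public void onTimedText(MediaPlayer mp, TimedText text) {
                // text.getText() holds the subtitle line to render.
            }
        });
    }
}
```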
385 <p>The {@link android.media.audiofx.AudioEffect} class now supports additional audio
388 <li>Acoustic Echo Canceler (AEC) with {@link android.media.audiofx.AcousticEchoCanceler}
390 <li>Automatic Gain Control (AGC) with {@link android.media.audiofx.AutomaticGainControl}
392 <li>Noise Suppressor (NS) with {@link android.media.audiofx.NoiseSuppressor}
396 <p>You can apply these pre-processor effects on audio captured with an {@link
397 android.media.AudioRecord} using one of the {@link android.media.audiofx.AudioEffect}
401 effects, so you should always first check availability by calling {@link
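The availability check described above can be sketched for two of the pre-processors; the effects attach to the `AudioRecord`'s audio session:

```java
import android.media.AudioRecord;
import android.media.audiofx.AcousticEchoCanceler;
import android.media.audiofx.NoiseSuppressor;

public class Preprocessing {
    public static void attach(AudioRecord recorder) {
        int session = recorder.getAudioSessionId();
        // Support is device-dependent, so always check first.
        if (AcousticEchoCanceler.isAvailable()) {
            AcousticEchoCanceler aec = AcousticEchoCanceler.create(session);
            if (aec != null) aec.setEnabled(true);
        }
        if (NoiseSuppressor.isAvailable()) {
            NoiseSuppressor ns = NoiseSuppressor.create(session);
            if (ns != null) ns.setEnabled(true);
        }
    }
}
```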
409 {@link android.media.MediaPlayer} objects. At any time before your first {@link android.media.Media…
410 call {@link android.media.MediaPlayer#setNextMediaPlayer setNextMediaPlayer()} and Android
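Gapless chaining as described above, sketched with two content URIs (the URIs themselves are whatever your app plays):

```java
import android.content.Context;
import android.media.MediaPlayer;
import android.net.Uri;

public class Gapless {
    public static void play(Context context, Uri first, Uri second) {
        MediaPlayer current = MediaPlayer.create(context, first);
        MediaPlayer next = MediaPlayer.create(context, second);
        // `next` begins the instant `current` finishes, with no gap.
        current.setNextMediaPlayer(next);
        current.start();
    }
}
```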
422 <p>The new interface {@link android.hardware.Camera.AutoFocusMoveCallback} allows you to listen
423 for changes to the auto focus movement. You can register your interface with {@link
425 is in a continuous autofocus mode ({@link
427 {@link android.hardware.Camera.Parameters#FOCUS_MODE_CONTINUOUS_PICTURE}), you'll receive a call
428 to {@link android.hardware.Camera.AutoFocusMoveCallback#onAutoFocusMoving onAutoFocusMoving()},
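A minimal sketch of registering the callback, assuming the camera is already open and supports continuous focus:

```java
import android.hardware.Camera;

public class FocusWatcher {
    public static void watch(Camera camera) {
        Camera.Parameters params = camera.getParameters();
        params.setFocusMode(Camera.Parameters.FOCUS_MODE_CONTINUOUS_PICTURE);
        camera.setParameters(params);

        camera.setAutoFocusMoveCallback(new Camera.AutoFocusMoveCallback() {
            @Override
            public void onAutoFocusMoving(boolean start, Camera camera) {
                // start == true when focusing begins, false when it settles;
                // useful for showing/hiding a focus indicator in the UI.
            }
        });
    }
}
```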
433 <p>The {@link android.media.MediaActionSound} class provides a simple set of APIs to produce
437 <p>To play a sound, simply instantiate a {@link android.media.MediaActionSound} object, call
438 {@link android.media.MediaActionSound#load load()} to pre-load the desired sound, then at the
439 appropriate time, call {@link android.media.MediaActionSound#play play()}.</p>
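The load-then-play pattern above in full:

```java
import android.media.MediaActionSound;

public class ShutterSound {
    private final MediaActionSound sound = new MediaActionSound();

    public ShutterSound() {
        // Pre-load so play() has no first-use latency.
        sound.load(MediaActionSound.SHUTTER_CLICK);
    }

    /** Call at the moment the still picture is captured. */
    public void onPictureTaken() {
        sound.play(MediaActionSound.SHUTTER_CLICK);
    }
}
```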
449 to transfer with either the new {@link android.nfc.NfcAdapter#setBeamPushUris setBeamPushUris()}
450 method or the new callback interface {@link android.nfc.NfcAdapter.CreateBeamUrisCallback}, Android
456 <p>The {@link android.nfc.NfcAdapter#setBeamPushUris setBeamPushUris()} method takes an array of
457 {@link android.net.Uri} objects that specify the data you want to transfer from your app.
458 Alternatively, you can implement the {@link android.nfc.NfcAdapter.CreateBeamUrisCallback}
459 interface, which you can specify for your activity by calling {@link
463 callback interface, the system calls the interface's {@link
467 activity, whereas calling {@link android.nfc.NfcAdapter#setBeamPushUris setBeamPushUris()} is
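Both variants described above, side by side; a real app would use one or the other, and `fileUri` stands in for whatever content your app shares:

```java
import android.app.Activity;
import android.net.Uri;
import android.nfc.NfcAdapter;
import android.nfc.NfcEvent;

public class BeamSetup {
    public static void configure(final Activity activity, final Uri fileUri) {
        NfcAdapter adapter = NfcAdapter.getDefaultAdapter(activity);
        if (adapter == null) return; // device has no NFC

        // Static variant: the same URIs are sent on every beam.
        adapter.setBeamPushUris(new Uri[] { fileUri }, activity);

        // Dynamic variant: URIs are supplied only when the beam starts.
        adapter.setBeamPushUrisCallback(
                new NfcAdapter.CreateBeamUrisCallback() {
                    @Override
                    public Uri[] createBeamUris(NfcEvent event) {
                        return new Uri[] { fileUri };
                    }
                }, activity);
    }
}
```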
480 <p>The new package {@link android.net.nsd} contains the new APIs that allow you to
484 <p>To register your service, you must first create an {@link android.net.nsd.NsdServiceInfo}
486 {@link android.net.nsd.NsdServiceInfo#setServiceName setServiceName()},
487 {@link android.net.nsd.NsdServiceInfo#setServiceType setServiceType()}, and
488 {@link android.net.nsd.NsdServiceInfo#setPort setPort()}.
491 <p>Then you need to implement {@link android.net.nsd.NsdManager.RegistrationListener}
492 and pass it to {@link android.net.nsd.NsdManager#registerService registerService()}
493 with your {@link android.net.nsd.NsdServiceInfo}.</p>
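The registration steps above, sketched end to end; the service name and type are illustrative:

```java
import android.content.Context;
import android.net.nsd.NsdManager;
import android.net.nsd.NsdServiceInfo;

public class NsdRegistration {
    public static void register(Context context, int port) {
        NsdServiceInfo info = new NsdServiceInfo();
        info.setServiceName("MyHttpServer");  // hypothetical name
        info.setServiceType("_http._tcp.");
        info.setPort(port);

        NsdManager nsd = (NsdManager)
                context.getSystemService(Context.NSD_SERVICE);
        nsd.registerService(info, NsdManager.PROTOCOL_DNS_SD,
                new NsdManager.RegistrationListener() {
            public void onServiceRegistered(NsdServiceInfo i) { }
            public void onRegistrationFailed(NsdServiceInfo i, int err) { }
            public void onServiceUnregistered(NsdServiceInfo i) { }
            public void onUnregistrationFailed(NsdServiceInfo i, int err) { }
        });
    }
}
```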
495 <p>To discover services on the network, implement {@link
496 android.net.nsd.NsdManager.DiscoveryListener} and pass it to {@link
499 <p>When your {@link
502 {@link android.net.nsd.NsdManager#resolveService resolveService()}, passing it an
503 implementation of {@link android.net.nsd.NsdManager.ResolveListener} that receives
504 an {@link android.net.nsd.NsdServiceInfo} object that contains information about the
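Discovery plus resolution as described above, in one sketch; the service type mirrors the registration example:

```java
import android.net.nsd.NsdManager;
import android.net.nsd.NsdServiceInfo;

public class NsdDiscovery {
    public static void discover(final NsdManager nsd) {
        nsd.discoverServices("_http._tcp.", NsdManager.PROTOCOL_DNS_SD,
                new NsdManager.DiscoveryListener() {
            public void onDiscoveryStarted(String type) { }
            public void onServiceFound(NsdServiceInfo service) {
                // Resolve to learn the peer's host address and port.
                nsd.resolveService(service,
                        new NsdManager.ResolveListener() {
                    public void onServiceResolved(NsdServiceInfo resolved) {
                        // resolved.getHost() / resolved.getPort()
                        // identify the device to connect to.
                    }
                    public void onResolveFailed(NsdServiceInfo s, int err) { }
                });
            }
            public void onServiceLost(NsdServiceInfo service) { }
            public void onDiscoveryStopped(String type) { }
            public void onStartDiscoveryFailed(String type, int err) {
                nsd.stopServiceDiscovery(this);
            }
            public void onStopDiscoveryFailed(String type, int err) { }
        });
    }
}
```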
512 the {@link android.net.wifi.p2p.WifiP2pManager}. This allows you to discover and filter nearby
518 your app and connect to it, call {@link
520 {@link android.net.wifi.p2p.nsd.WifiP2pServiceInfo} object that describes your app services.</p>
524 …{@link android.net.wifi.p2p.WifiP2pManager#setDnsSdResponseListeners setDnsSdResponseListeners()}, w…
525 …{@link android.net.wifi.p2p.WifiP2pManager#setUpnpServiceResponseListener setUpnpServiceResponseLi…
527 …es, you also need to call {@link android.net.wifi.p2p.WifiP2pManager#addServiceRequest addServiceR…
528 successful callback, you can then begin discovering services on local devices by calling {@link
531 <p>When local services are discovered, you'll receive a callback to either the {@link
532 android.net.wifi.p2p.WifiP2pManager.DnsSdServiceResponseListener} or {@link
535 {@link android.net.wifi.p2p.WifiP2pDevice} object representing the peer device.</p>
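The advertise/listen/discover sequence above can be sketched as follows; the service name, type, and TXT record are illustrative, and the `ActionListener` arguments are left null for brevity:

```java
import java.util.HashMap;
import java.util.Map;
import android.net.wifi.p2p.WifiP2pDevice;
import android.net.wifi.p2p.WifiP2pManager;
import android.net.wifi.p2p.nsd.WifiP2pDnsSdServiceInfo;
import android.net.wifi.p2p.nsd.WifiP2pDnsSdServiceRequest;

public class P2pServiceDiscovery {
    public static void run(WifiP2pManager manager,
            WifiP2pManager.Channel channel) {
        // 1. Advertise a local service.
        Map<String, String> txt = new HashMap<String, String>();
        txt.put("port", "8888"); // illustrative TXT record
        manager.addLocalService(channel,
                WifiP2pDnsSdServiceInfo.newInstance(
                        "MyGame", "_mygame._tcp", txt),
                null /* ActionListener */);

        // 2. Listen for peers advertising the same service.
        manager.setDnsSdResponseListeners(channel,
                new WifiP2pManager.DnsSdServiceResponseListener() {
                    @Override
                    public void onDnsSdServiceAvailable(String instance,
                            String registrationType, WifiP2pDevice device) {
                        // `device` represents the peer to connect to.
                    }
                },
                null /* DnsSdTxtRecordListener */);

        // 3. Add a service request, then start discovery.
        manager.addServiceRequest(channel,
                WifiP2pDnsSdServiceRequest.newInstance(), null);
        manager.discoverServices(channel, null);
    }
}
```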
542 <p>The new method {@link android.net.ConnectivityManager#isActiveNetworkMetered} allows you to
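A one-method sketch of gating large transfers on the metered check:

```java
import android.content.Context;
import android.net.ConnectivityManager;

public class NetworkPolicy {
    /** True when it's reasonable to prefetch or sync large data. */
    public static boolean okToTransferLargeData(Context context) {
        ConnectivityManager cm = (ConnectivityManager)
                context.getSystemService(Context.CONNECTIVITY_SERVICE);
        // Defer bulk transfers when the user may be charged per byte.
        return !cm.isActiveNetworkMetered();
    }
}
```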
558 using {@link android.accessibilityservice.AccessibilityService#onGesture onGesture()} and other
559 input events through additions to the {@link android.view.accessibility.AccessibilityEvent}, {@link
560 android.view.accessibility.AccessibilityNodeInfo} and {@link
564 scrolling and stepping through text using {@link
565 android.view.accessibility.AccessibilityNodeInfo#performAction performAction} and {@link
567 setMovementGranularities}. The {@link
576 elements and input widgets using {@link
577 android.view.accessibility.AccessibilityNodeInfo#findFocus findFocus()} and {@link
579 using {@link android.view.accessibility.AccessibilityNodeInfo#setAccessibilityFocused
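Granular text traversal as described above might look like this from an accessibility service, stepping the cursor forward one word at a time:

```java
import android.os.Bundle;
import android.view.accessibility.AccessibilityNodeInfo;

public class TextStepper {
    /** Moves the traversal cursor forward by one word in `node`. */
    public static void stepForwardByWord(AccessibilityNodeInfo node) {
        Bundle args = new Bundle();
        args.putInt(
                AccessibilityNodeInfo.ACTION_ARGUMENT_MOVEMENT_GRANULARITY_INT,
                AccessibilityNodeInfo.MOVEMENT_GRANULARITY_WORD);
        node.performAction(
                AccessibilityNodeInfo.ACTION_NEXT_AT_MOVEMENT_GRANULARITY,
                args);
    }
}
```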
599 <p>You can now associate a {@link android.content.ClipData} object with an {@link
600 android.content.Intent} using the {@link android.content.Intent#setClipData setClipData()} method.
604 access to multiple URIs in the intent. When starting an {@link
605 android.content.Intent#ACTION_SEND} or {@link
607 automatically propagated to the {@link android.content.ClipData} so that the receiver can have
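For intents other than `ACTION_SEND`/`ACTION_SEND_MULTIPLE` (where the propagation is not automatic), the clip can be built by hand; the label, MIME type, and URIs here are illustrative:

```java
import android.content.ClipData;
import android.content.Intent;
import android.net.Uri;

public class ShareIntent {
    public static Intent build(Uri firstUri, Uri secondUri) {
        Intent send = new Intent(Intent.ACTION_SEND_MULTIPLE);
        ClipData clip = new ClipData(
                "attachments",                  // user-visible label
                new String[] { "image/jpeg" },  // MIME types in the clip
                new ClipData.Item(firstUri));
        clip.addItem(new ClipData.Item(secondUri));
        send.setClipData(clip);
        // Grant the receiver read access to every URI in the clip.
        send.addFlags(Intent.FLAG_GRANT_READ_URI_PERMISSION);
        return send;
    }
}
```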
613 <p>The {@link android.content.ClipData} class now supports styled text (either as HTML or
616 strings</a>). You can add HTML styled text to the {@link android.content.ClipData} with {@link
646 <p>You can now launch an {@link android.app.Activity} using zoom animations or
647 your own custom animations. To specify the animation you want, use the {@link
648 android.app.ActivityOptions} APIs to build a {@link android.os.Bundle} that you can
650 methods that start an activity, such as {@link
653 <p>The {@link android.app.ActivityOptions} class includes a different method for each
656 <dt>{@link android.app.ActivityOptions#makeScaleUpAnimation makeScaleUpAnimation()}</dt>
660 <dt>{@link android.app.ActivityOptions#makeThumbnailScaleUpAnimation
665 <dt>{@link android.app.ActivityOptions#makeCustomAnimation
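Using one of the factory methods above with the new two-argument `startActivity()`, sketched for the scale-up case; `sourceView` is whatever view the user touched:

```java
import android.app.Activity;
import android.app.ActivityOptions;
import android.content.Intent;
import android.os.Bundle;
import android.view.View;

public class AnimatedLaunch {
    public static void launch(Activity activity, View sourceView,
            Intent intent) {
        // Zoom the new activity up from the bounds of `sourceView`.
        Bundle options = ActivityOptions.makeScaleUpAnimation(
                sourceView, 0, 0,  // start offset within the view
                sourceView.getWidth(), sourceView.getHeight()).toBundle();
        activity.startActivity(intent, options);
    }
}
```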
675 <p>The new {@link android.animation.TimeAnimator} provides a simple callback
676 mechanism with the {@link android.animation.TimeAnimator.TimeListener} that notifies
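The callback mechanism above in miniature:

```java
import android.animation.TimeAnimator;

public class FrameTicker {
    public static TimeAnimator start() {
        TimeAnimator animator = new TimeAnimator();
        animator.setTimeListener(new TimeAnimator.TimeListener() {
            @Override
            public void onTimeUpdate(TimeAnimator animation,
                    long totalTime, long deltaTime) {
                // Fired every frame with total elapsed time and the
                // delta since the previous frame (both in millis).
            }
        });
        animator.start();
        return animator;
    }
}
```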
693 <p>The new method {@link android.app.Notification.Builder#setStyle setStyle()} allows you to specify
695 specify the style for your large content region, pass {@link
698 <dt>{@link android.app.Notification.BigPictureStyle}</dt>
700 <dt>{@link android.app.Notification.BigTextStyle}</dt>
702 <dt>{@link android.app.Notification.InboxStyle}</dt>
711 <p>To add an action button, call {@link android.app.Notification.Builder#addAction
713 text for the button, and a {@link android.app.PendingIntent} that defines the action
721 the priority with {@link android.app.Notification.Builder#setPriority setPriority()}. You
723 in the {@link android.app.Notification} class. The default is {@link
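Style, action button, and priority combined in one builder chain; the icons and strings are illustrative, and `replyIntent` is assumed to be built elsewhere:

```java
import android.app.Notification;
import android.app.PendingIntent;
import android.content.Context;

public class BigNotification {
    public static Notification build(Context context,
            PendingIntent replyIntent) {
        return new Notification.Builder(context)
                .setSmallIcon(android.R.drawable.stat_notify_chat)
                .setContentTitle("New message")
                .setContentText("Tap to open")
                // Large-format content shown when the notification expands.
                .setStyle(new Notification.BigTextStyle()
                        .bigText("The full message body appears here "
                                + "when the notification is expanded."))
                .addAction(android.R.drawable.ic_menu_send,
                        "Reply", replyIntent)
                .setPriority(Notification.PRIORITY_HIGH)
                .build();
    }
}
```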
735 UI elements and your activity layout in relation to them by calling {@link
740 <dt>{@link android.view.View#SYSTEM_UI_FLAG_FULLSCREEN}</dt>
747 <dt>{@link android.view.View#SYSTEM_UI_FLAG_LAYOUT_FULLSCREEN}</dt>
749 enabled {@link android.view.View#SYSTEM_UI_FLAG_FULLSCREEN} even if the system UI elements
752 {@link android.view.View#SYSTEM_UI_FLAG_FULLSCREEN}, because it avoids your layout from
755 <dt>{@link android.view.View#SYSTEM_UI_FLAG_LAYOUT_HIDE_NAVIGATION}</dt>
757 enabled {@link android.view.View#SYSTEM_UI_FLAG_HIDE_NAVIGATION} (added in Android 4.0)
761 with {@link android.view.View#SYSTEM_UI_FLAG_HIDE_NAVIGATION}, because it prevents your layout from
764 <dt>{@link android.view.View#SYSTEM_UI_FLAG_LAYOUT_STABLE}</dt>
765 <dd>You might want to add this flag if you're using {@link
766 android.view.View#SYSTEM_UI_FLAG_LAYOUT_FULLSCREEN} and/or {@link
768 {@link android.view.View#fitSystemWindows fitSystemWindows()} on a view that the
770 That is, with this flag set, {@link android.view.View#fitSystemWindows
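Combining the flags above — the `LAYOUT_*` flags keep the layout stable while the matching hide flags toggle the bars:

```java
import android.view.View;

public class LeanBack {
    /** Typically called on the activity's decor or content view. */
    public static void enter(View decorView) {
        decorView.setSystemUiVisibility(
                View.SYSTEM_UI_FLAG_LAYOUT_STABLE
                | View.SYSTEM_UI_FLAG_LAYOUT_FULLSCREEN
                | View.SYSTEM_UI_FLAG_LAYOUT_HIDE_NAVIGATION
                | View.SYSTEM_UI_FLAG_FULLSCREEN
                | View.SYSTEM_UI_FLAG_HIDE_NAVIGATION);
    }
}
```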
782 <p>{@link android.widget.GridLayout} and {@link android.view.ViewStub}
809 <p>You can apply any one of these with the new {@link android.R.attr#fontFamily}
810 attribute in combination with the {@link android.R.attr#textStyle} attribute.</p>
812 <p>Supported values for {@link android.R.attr#fontFamily} are:</p>
819 <p>You can then apply bold and/or italic with {@link android.R.attr#textStyle} values
823 <p>You can also use {@link android.graphics.Typeface#create Typeface.create()}.
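The programmatic equivalent of the XML attributes above, for one of the supported family names:

```java
import android.graphics.Typeface;
import android.widget.TextView;

public class FontStyling {
    /**
     * Equivalent to android:fontFamily="sans-serif-light"
     * with android:textStyle="italic".
     */
    public static void applyLightItalic(TextView textView) {
        textView.setTypeface(
                Typeface.create("sans-serif-light", Typeface.ITALIC));
    }
}
```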
836 <p>The new {@link android.hardware.input.InputManager} class allows you to query the
843 {@link android.hardware.input.InputManager#getInputDeviceIds()}. This returns
845 {@link android.hardware.input.InputManager#getInputDevice getInputDevice()} to acquire
846 an {@link android.view.InputDevice} for a specified input device ID.</p>
849 implement the {@link android.hardware.input.InputManager.InputDeviceListener} interface and
850 register it with {@link android.hardware.input.InputManager#registerInputDeviceListener
856 the vibration of those devices using the existing {@link android.os.Vibrator} APIs simply
857 by calling {@link android.view.InputDevice#getVibrator()} on the {@link android.view.InputDevice}.<…
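Enumerating devices and driving a controller's own vibrator, as described above; the 100 ms pulse is illustrative:

```java
import android.content.Context;
import android.hardware.input.InputManager;
import android.os.Vibrator;
import android.view.InputDevice;

public class ControllerBuzz {
    public static void buzzAll(Context context) {
        InputManager im = (InputManager)
                context.getSystemService(Context.INPUT_SERVICE);
        for (int id : im.getInputDeviceIds()) {
            InputDevice device = im.getInputDevice(id);
            if (device == null) continue;
            Vibrator vibrator = device.getVibrator();
            // Vibrates the input device itself, not the handset.
            if (vibrator.hasVibrator()) {
                vibrator.vibrate(100);
            }
        }
    }
}
```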
866 <dt>{@link android.Manifest.permission#READ_EXTERNAL_STORAGE}</dt>
874 <dt>{@link android.Manifest.permission#READ_USER_DICTIONARY}</dt>
877 <dt>{@link android.Manifest.permission#READ_CALL_LOG}</dt>
880 <dt>{@link android.Manifest.permission#WRITE_CALL_LOG}</dt>
882 <dt>{@link android.Manifest.permission#WRITE_USER_DICTIONARY}</dt>
890 to displaying the user interface on a television screen: {@link