page.title=Android 4.1 APIs
sdk.platform.version=4.1
sdk.platform.apiLevel=16
@jd:body

<div id="qv-wrapper">
<div id="qv">
  
  <h2>In this document</h2>
<ol>
  <li><a href="#AppComponents">App Components</a></li>
  <li><a href="#Multimedia">Multimedia</a></li>
  <li><a href="#Camera">Camera</a></li>
  <li><a href="#Connectivity">Connectivity</a></li>
  <li><a href="#A11y">Accessibility</a></li>
  <li><a href="#CopyPaste">Copy and Paste</a></li>
  <li><a href="#Renderscript">Renderscript</a></li>
  <li><a href="#Animation">Animation</a></li>
  <li><a href="#UI">User Interface</a></li>
  <li><a href="#Input">Input Framework</a></li>
  <li><a href="#Permissions">Permissions</a></li>
  <li><a href="#DeviceFeatures">Device Features</a></li>
</ol>

<h2>See also</h2>
<ol>
<li><a
href="{@docRoot}sdk/api_diff/16/changes.html">API
Differences Report &raquo;</a> </li>
</ol>

</div>
</div>


<p>API Level: 16</p>

<p>Android 4.1 (Jelly Bean) is a progression of the platform that offers improved
performance and enhanced user experience. It adds new features for users and app
developers. This document provides an introduction to the most notable and
useful new APIs for app developers.</p>


<div class="sidebox-wrapper">
<div class="sidebox">  
  
<h3 id="ApiLevel">Declare your app API Level</h3>

<p>To better optimize your app for devices running Android {@sdkPlatformVersion},
  you should set your <a
href="{@docRoot}guide/topics/manifest/uses-sdk-element.html#target">{@code targetSdkVersion}</a> to
<code>"{@sdkPlatformApiLevel}"</code>, install it on an Android {@sdkPlatformVersion} system image,
test it, then publish an update with this change.</p>

<p>You
can use APIs in Android {@sdkPlatformVersion} while also supporting older versions by adding
conditions to your code that check for the system API level before executing
APIs not supported by your <a
href="{@docRoot}guide/topics/manifest/uses-sdk-element.html#min">{@code minSdkVersion}</a>. 
To learn more about
maintaining backward-compatibility, read <a
href="{@docRoot}training/backward-compatible-ui/index.html">Creating Backward-Compatible
UIs</a>.</p>

<p>More information about how API levels work is available in <a
href="{@docRoot}guide/topics/manifest/uses-sdk-element.html#ApiLevels">What is API
Level?</a></p>

</div>
</div>


<p>As an app developer, Android 4.1 is available to you with an SDK platform you can use to build
your app against Android 4.1 and with a system image you can run in the Android emulator. You
should download download the platform and system image as soon as possible to build and test your
app on Android 4.1. To get started developing and testing against Android
4.1, use the Android SDK Manager to download the platform into your SDK.</p>





<h2 id="AppComponents">App Components</h2>



<h3 id="Isolated">Isolated services</h3>

<p>By specifying <a href="{@docRoot}guide/topics/manifest/service-element.html#isolated">{@code android:isolatedProcess="true"}</a> in the 
<a href="{@docRoot}guide/topics/manifest/service-element.html">{@code &lt;service>}</a> tag, your {@link android.app.Service} runs in a
separate process that is isolated from the rest of the system, with its own user ID and no permissions of its own.</p>


<h3 id="Memory">Memory management</h3>

<p>New {@link android.content.ComponentCallbacks2} constants such as {@link 
android.content.ComponentCallbacks2#TRIM_MEMORY_RUNNING_LOW} and {@link 
android.content.ComponentCallbacks2#TRIM_MEMORY_RUNNING_CRITICAL} provide foreground
processes more information about
memory state before the system calls {@link android.app.Activity#onLowMemory()}.</p>

<p>The new {@link android.app.ActivityManager#getMyMemoryState getMyMemoryState()} method allows
you to retrieve the general memory state of your process.</p>
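<p>For example, an activity can respond to these levels in its {@link
android.content.ComponentCallbacks2#onTrimMemory onTrimMemory()} callback. This is a sketch; the
cache-trimming helpers are hypothetical:</p>

```java
@Override
public void onTrimMemory(int level) {
    super.onTrimMemory(level);
    switch (level) {
        case ComponentCallbacks2.TRIM_MEMORY_RUNNING_CRITICAL:
            // The system will soon begin killing background processes;
            // release everything that can be recomputed later.
            releaseAllCaches();  // hypothetical helper
            break;
        case ComponentCallbacks2.TRIM_MEMORY_RUNNING_LOW:
            // Memory is getting low while your process is still running.
            trimCaches();  // hypothetical helper
            break;
    }
}
```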


<h3 id="ContentProvider">Content providers</h3>

<p>A new method, {@link android.content.ContentResolver#acquireUnstableContentProviderClient
acquireUnstableContentProviderClient()}, allows you to access a {@link
android.content.ContentProviderClient} that may be "unstable" such that your app will not crash if
the content provider does. It's useful when you are interacting with content providers in a separate
app.</p>



<h3 id="LiveWallpapers">Live Wallpapers</h3>

<p>A new intent protocol directly launches the live wallpaper preview activity, so you can help
users easily select your live wallpaper without forcing them to leave 
your app and navigate through the Home wallpaper picker.</p>

<p>To launch the live wallpaper picker, call {@link android.content.Context#startActivity
  startActivity()} with an {@link android.content.Intent} using
{@link android.app.WallpaperManager#ACTION_CHANGE_LIVE_WALLPAPER} and an extra
that specifies your live wallpaper {@link android.content.ComponentName} in {@link
android.app.WallpaperManager#EXTRA_LIVE_WALLPAPER_COMPONENT}.</p>
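<p>For example, assuming a hypothetical {@code MyWallpaperService} in your app, an activity can
launch the preview like this:</p>

```java
// Open the live wallpaper preview directly for this app's wallpaper service.
Intent intent = new Intent(WallpaperManager.ACTION_CHANGE_LIVE_WALLPAPER);
intent.putExtra(WallpaperManager.EXTRA_LIVE_WALLPAPER_COMPONENT,
        new ComponentName(this, MyWallpaperService.class));
startActivity(intent);
```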




<h3 id="StackNav">App stack navigation</h3>

<p>Android 4.1 makes it much easier to implement the proper design patterns for Up navigation.
All you need to do is add the <a
href="{@docRoot}guide/topics/manifest/activity-element.html#parent">{@code
android:parentActivityName}</a> attribute to each <a
href="{@docRoot}guide/topics/manifest/activity-element.html">{@code &lt;activity>}</a> element in
your manifest file. The system uses this information to open the appropriate activity when the user
presses the Up button in the action bar (while also finishing the current activity). So if you
declare the <a href="{@docRoot}guide/topics/manifest/activity-element.html#parent">{@code
android:parentActivityName}</a> for each activity, you don't need the {@link
android.app.Activity#onOptionsItemSelected onOptionsItemSelected()} method to handle click
events on the action bar's app icon&mdash;the system now handles that event and resumes or
creates the appropriate activity.</p>

<p>This is particularly powerful for scenarios in which the user enters one of your app's activities
through a "deep dive" intent, such as from a notification or an intent from a
different app (as described in the design guide for <a 
href="{@docRoot}design/patterns/navigation.html#between-apps">Navigating Between Apps</a>). When
the user enters your activity this way, your app may not naturally have a back stack of
activities that can be resumed as the user navigates up. However, when you supply the <a
href="{@docRoot}guide/topics/manifest/activity-element.html#parent">{@code
android:parentActivityName}</a> attribute for your activities, the system recognizes
 whether or not your app already contains a back stack of parent activities and, if not, constructs
a synthetic back stack that contains all parent activities.</p>

<p class="note"><strong>Note:</strong> When the user enters a deep activity in your app and 
  it creates a new task for your app, the system actually inserts the stack of parent activities 
  into the task. As such, pressing the Back button also navigates back through the stack of parent
  activities.</p>

<p>When the system creates a synthetic back stack for your app, it builds a basic {@link 
  android.content.Intent} to create a new instance of each parent activity. So there's no
  saved state for the parent activities the way you'd expect had the user naturally navigated
through
  each activity. If any of the parent activities normally show a UI that's dependent on
  the user's context, that context information will be missing and you should deliver it when the
user
  navigates back through the stack. For example, if the user is viewing an album
in a music app, navigating up might bring them to an activity that lists all albums in a chosen
music genre. In this case, if the stack must be created, it's necessary that you inform the parent
activity what genre the current album belongs to so that the parent can display the proper list as
if the user actually came from that activity. To deliver such information to a synthetic parent
activity, you must override the {@link
android.app.Activity#onPrepareNavigateUpTaskStack onPrepareNavigateUpTaskStack()} method. This
provides you with a {@link android.app.TaskStackBuilder} object that the system created in order to
synthesize the parent activities. The {@link android.app.TaskStackBuilder} contains {@link
android.content.Intent} objects that the system uses to create each parent activity. In your
implementation of {@link android.app.Activity#onPrepareNavigateUpTaskStack
onPrepareNavigateUpTaskStack()}, you can modify the appropriate {@link android.content.Intent} to
add extra data that the parent activity can use to determine the appropriate context and display
the appropriate UI.</p>

<p>When the system creates the {@link android.app.TaskStackBuilder}, it adds the {@link
android.content.Intent} objects that are used to create the parent activities in their logical
order beginning from the top of the activity tree. So, the last {@link
android.content.Intent} added to the internal array is the direct parent of the current activity. If
you want to modify the {@link android.content.Intent} for the activity's parent, first determine
the length of the array with {@link android.app.TaskStackBuilder#getIntentCount()} and pass that
value to {@link android.app.TaskStackBuilder#editIntentAt editIntentAt()}.</p>
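<p>Putting this together, a sketch of {@code onPrepareNavigateUpTaskStack()} might look like the
following (the {@code "extra_genre"} key and {@code mCurrentGenre} field are hypothetical):</p>

```java
@Override
public void onPrepareNavigateUpTaskStack(TaskStackBuilder builder) {
    // The last intent in the builder corresponds to this activity's direct parent.
    Intent parentIntent = builder.editIntentAt(builder.getIntentCount() - 1);
    // Give the synthetic parent the context it needs to show the right list.
    parentIntent.putExtra("extra_genre", mCurrentGenre);
}
```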

<p>If your app structure is more complex, there are several other APIs 
  available that allow you to handle the behavior of Up navigation and
  fully customize the synthetic back stack. Some of the APIs that give you additional 
  control include:</p>
<dl>
  <dt>{@link android.app.Activity#onNavigateUp}</dt>
    <dd>Override this to perform a custom action when the user presses the Up button.</dd>
  <dt>{@link android.app.Activity#navigateUpTo}</dt>
    <dd>Call this to finish the current activity and go to the activity indicated by the
    supplied {@link android.content.Intent}. If the activity exists in the back stack, but
    is not the closest parent, then all other activities between the current activity and the
    activity specified with the intent are finished as well.</dd>
  <dt>{@link android.app.Activity#getParentActivityIntent}</dt>
    <dd>Call this to get the {@link android.content.Intent} that will start the logical
    parent for the current activity.</dd>
  <dt>{@link android.app.Activity#shouldUpRecreateTask}</dt>
    <dd>Call this to query whether a synthetic back stack must be created in order to navigate
    up. Returns true if a synthetic stack must be created, false if the appropriate stack
  already exists.</dd>
  <dt>{@link android.app.Activity#finishAffinity}</dt>
    <dd>Call this to finish the current activity and all parent activities with the same
      task affinity that are chained to the current activity.
      If you override the default behaviors such as
  {@link android.app.Activity#onNavigateUp}, you should call this method when you
  create a synthetic back stack upon Up navigation.</dd>
  <dt>{@link android.app.Activity#onCreateNavigateUpTaskStack onCreateNavigateUpTaskStack()}</dt>
    <dd>Override this if you need to fully control how the synthetic task stack is created. If you
    want to simply add some extra data to the intents for your back stack, you should instead
    override {@link android.app.Activity#onPrepareNavigateUpTaskStack
    onPrepareNavigateUpTaskStack()}.</dd>
</dl>

<p>However, most apps don't need to use these APIs or implement {@link
android.app.Activity#onPrepareNavigateUpTaskStack
onPrepareNavigateUpTaskStack()}, but can achieve the correct behavior simply by
adding <a
href="{@docRoot}guide/topics/manifest/activity-element.html#parent">{@code
android:parentActivityName}</a> to each <a
href="{@docRoot}guide/topics/manifest/activity-element.html">{@code &lt;activity>}</a> element.</p>











<h2 id="Multimedia">Multimedia</h2>

<h3 id="Codecs">Media codecs</h3>

<p>The {@link android.media.MediaCodec} class provides access to low-level media codecs for encoding
and decoding your media. You can instantiate a {@link android.media.MediaCodec} by calling {@link
android.media.MediaCodec#createEncoderByType createEncoderByType()} to encode media or call {@link
android.media.MediaCodec#createDecoderByType createDecoderByType()} to decode media. Each of these
methods takes a MIME type for the type of media you want to encode or decode, such as {@code
"video/3gpp"} or {@code "audio/vorbis"}.</p>

<p>With an instance of {@link android.media.MediaCodec} created, you can then call {@link
android.media.MediaCodec#configure configure()} to specify properties such as the media format or
whether or not the content is encrypted.</p>

<p>Whether you're encoding or decoding your media, the rest of the process is the same after you
create the {@link android.media.MediaCodec}. First call {@link
android.media.MediaCodec#getInputBuffers()} to get an array of input {@link java.nio.ByteBuffer}
objects and {@link android.media.MediaCodec#getOutputBuffers()} to get an array of output {@link
java.nio.ByteBuffer} objects.</p>

<p>When you're ready to encode or decode, call {@link android.media.MediaCodec#dequeueInputBuffer
dequeueInputBuffer()} to get the index position of the {@link
java.nio.ByteBuffer} (from the array of input buffers) that you should use to feed in your source
media. After you fill the {@link java.nio.ByteBuffer} with your source media, release ownership
of the buffer by calling {@link android.media.MediaCodec#queueInputBuffer queueInputBuffer()}.</p>

<p>Likewise for the output buffer, call {@link android.media.MediaCodec#dequeueOutputBuffer
dequeueOutputBuffer()} to get the index position of the {@link java.nio.ByteBuffer}
where you'll receive the results. After you read the output from the {@link java.nio.ByteBuffer},
release ownership by calling {@link android.media.MediaCodec#releaseOutputBuffer
releaseOutputBuffer()}.</p>
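<p>As a rough sketch, one pass through a decoder looks like the following (the {@code format},
{@code surface}, and {@code extractor} objects are assumed to be set up elsewhere, for instance
with a {@link android.media.MediaExtractor}):</p>

```java
MediaCodec codec = MediaCodec.createDecoderByType("video/avc");
codec.configure(format, surface, null, 0);
codec.start();

ByteBuffer[] inputBuffers = codec.getInputBuffers();
ByteBuffer[] outputBuffers = codec.getOutputBuffers();

// Feed one chunk of source media into an input buffer.
int inIndex = codec.dequeueInputBuffer(10000 /* timeout in microseconds */);
if (inIndex >= 0) {
    int size = extractor.readSampleData(inputBuffers[inIndex], 0);
    codec.queueInputBuffer(inIndex, 0, size, extractor.getSampleTime(), 0);
    extractor.advance();
}

// Drain one chunk of decoded output.
MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
int outIndex = codec.dequeueOutputBuffer(info, 10000);
if (outIndex >= 0) {
    // Read the decoded data from outputBuffers[outIndex], then release it.
    codec.releaseOutputBuffer(outIndex, false);
}
```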

<p>You can handle encrypted media data in the codecs by calling {@link
android.media.MediaCodec#queueSecureInputBuffer queueSecureInputBuffer()} in conjunction with
  the {@link android.media.MediaCrypto} APIs, instead of the normal {@link
android.media.MediaCodec#queueInputBuffer queueInputBuffer()}.</p>

<p>For more information about how to use codecs, see the {@link android.media.MediaCodec} documentation.</p>

<!--
<h3 id="Routing">Media routing</h3>

<p>The new {@link android.media.MediaRouter} class allows you to route media channels and 
  streams from the current device to external speakers and other devices. You
can acquire an instance of {@link android.media.MediaRouter} by calling {@link
android.content.Context#getSystemService getSystemService(}{@link
android.content.Context#MEDIA_ROUTER_SERVICE MEDIA_ROUTER_SERVICE)}.</p>
-->


<h3 id="AudioCue">Record audio on cue</h3>

<p>New method {@link android.media.AudioRecord#startRecording startRecording()} allows
you to begin audio recording based on a cue defined by a {@link android.media.MediaSyncEvent}.
The {@link android.media.MediaSyncEvent} specifies an audio session 
(such as one defined by {@link android.media.MediaPlayer}), which when complete, triggers
the audio recorder to begin recording. For example, you can use this functionality to
play an audio tone that indicates the beginning of a recording session and recording
automatically begins so you don't have to manually synchronize the tone and the beginning
of recording.</p>
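<p>A minimal sketch, assuming {@code cuePlayer} is a prepared {@link android.media.MediaPlayer}
and {@code recorder} is a configured {@link android.media.AudioRecord}:</p>

```java
// Recording starts automatically when the cue tone finishes playing.
MediaSyncEvent sync = MediaSyncEvent.createEvent(
        MediaSyncEvent.SYNC_EVENT_PRESENTATION_COMPLETE);
sync.setAudioSessionId(cuePlayer.getAudioSessionId());
recorder.startRecording(sync);
cuePlayer.start();
```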


<h3 id="TextTracks">Timed text tracks</h3>

<p>The {@link android.media.MediaPlayer} now handles both in-band and out-of-band text tracks.
In-band text tracks come as a text track within an MP4 or 3GPP media source. Out-of-band text
tracks can be added as an external text source via the {@link
android.media.MediaPlayer#addTimedTextSource addTimedTextSource()} method. After all external text
track sources are added, call {@link android.media.MediaPlayer#getTrackInfo()} to get
the refreshed list of all available tracks in a data source.</p>

<p>To set the track to use with the {@link android.media.MediaPlayer}, you must
call {@link android.media.MediaPlayer#selectTrack selectTrack()}, using the index
position for the track you want to use.</p>

<p>To be notified when the text track is ready to play, implement the
{@link android.media.MediaPlayer.OnTimedTextListener} interface and pass
it to {@link android.media.MediaPlayer#setOnTimedTextListener setOnTimedTextListener()}.</p>
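<p>For example, assuming an SRT file on external storage and a hypothetical {@code subtitleView}
for display:</p>

```java
try {
    player.addTimedTextSource("/sdcard/subtitles.srt",
            MediaPlayer.MEDIA_MIMETYPE_TEXT_SUBRIP);
} catch (IOException e) {
    // The subtitle file could not be opened.
}

// Select the first timed-text track from the refreshed track list.
MediaPlayer.TrackInfo[] tracks = player.getTrackInfo();
for (int i = 0; i < tracks.length; i++) {
    if (tracks[i].getTrackType()
            == MediaPlayer.TrackInfo.MEDIA_TRACK_TYPE_TIMEDTEXT) {
        player.selectTrack(i);
        break;
    }
}

player.setOnTimedTextListener(new MediaPlayer.OnTimedTextListener() {
    @Override
    public void onTimedText(MediaPlayer mp, TimedText text) {
        subtitleView.setText(text.getText());
    }
});
```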


<h3 id="AudioEffects">Audio effects</h3>

<p>The {@link android.media.audiofx.AudioEffect} class now supports additional audio
  pre-processing types when capturing audio:</p>
<ul>
  <li>Acoustic Echo Canceler (AEC) with {@link android.media.audiofx.AcousticEchoCanceler}
    removes the contribution of the signal received from the remote party from the captured audio signal.</li>
  <li>Automatic Gain Control (AGC) with {@link android.media.audiofx.AutomaticGainControl}
    automatically normalizes the output of the captured signal.</li>
  <li>Noise Suppressor (NS) with {@link android.media.audiofx.NoiseSuppressor}
    removes background noise from the captured signal.</li>
</ul>

<p>You can apply these pre-processor effects on audio captured with an {@link
android.media.AudioRecord} using one of the {@link android.media.audiofx.AudioEffect}
subclasses.</p>

<p class="note"><strong>Note:</strong> It's not guaranteed that all devices support these
effects, so you should always first check availability by calling {@link
android.media.audiofx.AcousticEchoCanceler#isAvailable isAvailable()} on the corresponding
audio effect class.</p>
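<p>For example, to attach echo cancellation to an audio session, assuming {@code recorder} is your
{@link android.media.AudioRecord}:</p>

```java
if (AcousticEchoCanceler.isAvailable()) {
    AcousticEchoCanceler aec =
            AcousticEchoCanceler.create(recorder.getAudioSessionId());
    if (aec != null) {
        aec.setEnabled(true);
    }
}
```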


<h3 id="Gapless">Gapless playback</h3>

<p>You can now perform gapless playback between two separate
{@link android.media.MediaPlayer} objects. At any time before your first {@link android.media.MediaPlayer} finishes,
call {@link android.media.MediaPlayer#setNextMediaPlayer setNextMediaPlayer()} and Android
attempts to start the second player the moment that the first one stops.</p>
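<p>In the simplest case, with both players already prepared:</p>

```java
// The second player starts the moment the first one completes.
firstPlayer.setNextMediaPlayer(secondPlayer);
firstPlayer.start();
```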


<h3 id="MediaRouter">Media router</h3>

<p>The new APIs {@link android.media.MediaRouter}, {@link android.app.MediaRouteActionProvider},
and {@link android.app.MediaRouteButton} provide standard mechanisms and UI for choosing where to
play media.</p>


<h2 id="Camera">Camera</h2>

<h3 id="AutoFocus">Auto focus movement</h3>

<p>The new interface {@link android.hardware.Camera.AutoFocusMoveCallback} allows you to listen
for changes to the auto focus movement. You can register your interface with {@link
android.hardware.Camera#setAutoFocusMoveCallback setAutoFocusMoveCallback()}. Then when the camera
is in a continuous autofocus mode ({@link
android.hardware.Camera.Parameters#FOCUS_MODE_CONTINUOUS_VIDEO} or 
{@link android.hardware.Camera.Parameters#FOCUS_MODE_CONTINUOUS_PICTURE}), you'll receive a call
to {@link android.hardware.Camera.AutoFocusMoveCallback#onAutoFocusMoving onAutoFocusMoving()},
which tells you whether auto focus has started moving or has stopped moving.</p>
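<p>A sketch of the callback, assuming a hypothetical {@code focusIndicator} view:</p>

```java
camera.setAutoFocusMoveCallback(new Camera.AutoFocusMoveCallback() {
    @Override
    public void onAutoFocusMoving(boolean start, Camera camera) {
        // start is true when focus begins moving, false when it stops.
        focusIndicator.setVisibility(start ? View.VISIBLE : View.GONE);
    }
});
```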

<h3 id="CameraSounds">Camera sounds</h3>

<p>The {@link android.media.MediaActionSound} class provides a simple set of APIs to produce
standard sounds made by the camera or other media actions. You should use these APIs to play
the appropriate sound when building a custom still or video camera.</p>

<p>To play a sound, simply instantiate a {@link android.media.MediaActionSound} object, call 
{@link android.media.MediaActionSound#load load()} to pre-load the desired sound, then at the
appropriate time, call {@link android.media.MediaActionSound#play play()}.</p>
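<p>For example, to play the standard shutter sound in a custom camera app:</p>

```java
// At setup time, pre-load the sound you'll need.
MediaActionSound sound = new MediaActionSound();
sound.load(MediaActionSound.SHUTTER_CLICK);

// Later, at the moment the photo is captured:
sound.play(MediaActionSound.SHUTTER_CLICK);
```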



<h2 id="Connectivity">Connectivity</h2>


<h3 id="AndroidBeam">Android Beam</h3>

<p>Android Beam&trade; now supports large payload transfers over Bluetooth. When you define the data
to transfer with either the new {@link android.nfc.NfcAdapter#setBeamPushUris setBeamPushUris()}
method or the new callback interface {@link android.nfc.NfcAdapter.CreateBeamUrisCallback}, Android
hands off the data transfer to an alternate transport, such as Bluetooth, to
achieve faster transfer speeds. This is especially useful for large payloads such as image and
audio files and requires no visible pairing between the devices. No additional work is required by
your app to take advantage of transfers over Bluetooth.</p>

<p>The {@link android.nfc.NfcAdapter#setBeamPushUris setBeamPushUris()} method takes an array of
{@link android.net.Uri} objects that specify the data you want to transfer from your app.
Alternatively, you can implement the {@link android.nfc.NfcAdapter.CreateBeamUrisCallback}
interface, which you can specify for your activity by calling {@link
android.nfc.NfcAdapter#setBeamPushUrisCallback setBeamPushUrisCallback()}.</p>

<p>When using the
callback interface, the system calls the interface's {@link
android.nfc.NfcAdapter.CreateBeamUrisCallback#createBeamUris createBeamUris()} method when the
user executes a share with Android Beam so that you can define the URIs to share at share-time.
This is useful if the URIs to share might vary depending on the user context within the
activity, whereas calling {@link android.nfc.NfcAdapter#setBeamPushUris setBeamPushUris()} is
useful when the URIs to share are unchanging and you can safely define them ahead of time.</p>
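<p>A sketch of both approaches from within an activity (the {@code photoUri} value and the
{@code getCurrentPhotoUri()} helper are hypothetical):</p>

```java
NfcAdapter nfcAdapter = NfcAdapter.getDefaultAdapter(this);

// If the URIs are known ahead of time:
nfcAdapter.setBeamPushUris(new Uri[] { photoUri }, this);

// Or, compute the URIs at share-time with a callback:
nfcAdapter.setBeamPushUrisCallback(new NfcAdapter.CreateBeamUrisCallback() {
    @Override
    public Uri[] createBeamUris(NfcEvent event) {
        return new Uri[] { getCurrentPhotoUri() };
    }
}, this);
```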





<h3 id="LocalNsd">Network service discovery</h3>

<p>Android 4.1 adds support for multicast DNS-based service discovery, which allows you to
find and connect to services offered by peer devices over Wi-Fi, such as mobile devices,
printers, cameras, media players, and others that are registered on the local network.</p>

<p>The new package {@link android.net.nsd} contains the new APIs that allow you to
broadcast your services on the local network, discover local devices on the network, and
connect to devices.</p>

<p>To register your service, you must first create an {@link android.net.nsd.NsdServiceInfo}
  object and define the various properties of your service with methods such as
  {@link android.net.nsd.NsdServiceInfo#setServiceName setServiceName()}, 
  {@link android.net.nsd.NsdServiceInfo#setServiceType setServiceType()}, and
  {@link android.net.nsd.NsdServiceInfo#setPort setPort()}.
</p>

<p>Then you need to implement {@link android.net.nsd.NsdManager.RegistrationListener}
and pass it to {@link android.net.nsd.NsdManager#registerService registerService()}
with your {@link android.net.nsd.NsdServiceInfo}.</p>

<p>To discover services on the network, implement {@link
  android.net.nsd.NsdManager.DiscoveryListener} and pass it to {@link
  android.net.nsd.NsdManager#discoverServices discoverServices()}.</p>

<p>When your {@link
  android.net.nsd.NsdManager.DiscoveryListener} receives callbacks about services
found, you need to resolve the service by calling 
{@link android.net.nsd.NsdManager#resolveService resolveService()}, passing it an
implementation of {@link android.net.nsd.NsdManager.ResolveListener} that receives
an {@link android.net.nsd.NsdServiceInfo} object that contains information about the
discovered service, allowing you to initiate the connection.</p>
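<p>A sketch of registration and discovery from within an activity; the service name, {@code
localPort}, and the listener fields ({@code mRegistrationListener}, {@code mDiscoveryListener})
are assumptions:</p>

```java
// Describe and register this app's service on the local network.
NsdServiceInfo serviceInfo = new NsdServiceInfo();
serviceInfo.setServiceName("MyApp");
serviceInfo.setServiceType("_http._tcp.");
serviceInfo.setPort(localPort);

NsdManager nsdManager = (NsdManager) getSystemService(Context.NSD_SERVICE);
nsdManager.registerService(serviceInfo,
        NsdManager.PROTOCOL_DNS_SD, mRegistrationListener);

// Look for matching services elsewhere on the network.
nsdManager.discoverServices("_http._tcp.",
        NsdManager.PROTOCOL_DNS_SD, mDiscoveryListener);
```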



<h3 id="WiFiNsd">Wi-Fi Direct service discovery</h3>

<p>The Wi-Fi Direct APIs are enhanced in Android 4.1 to support pre-association service discovery in
the {@link android.net.wifi.p2p.WifiP2pManager}. This allows you to discover and filter nearby
devices by services using Wi-Fi Direct before connecting to one, while Network Service
Discovery allows you to discover a service on an existing connected network (such as a local Wi-Fi
network).</p>

<p>To broadcast your app as a service over Wi-Fi so that other devices can discover
  your app and connect to it, call {@link 
  android.net.wifi.p2p.WifiP2pManager#addLocalService addLocalService()} with a
  {@link android.net.wifi.p2p.nsd.WifiP2pServiceInfo} object that describes your app services.</p>

<p>To initiate discovery of nearby devices over Wi-Fi, you need to first decide whether you'll
  communicate using Bonjour or Upnp. To use Bonjour, first set up some callback listeners with
  {@link android.net.wifi.p2p.WifiP2pManager#setDnsSdResponseListeners setDnsSdResponseListeners()},
  which takes both a {@link android.net.wifi.p2p.WifiP2pManager.DnsSdServiceResponseListener} and a
  {@link android.net.wifi.p2p.WifiP2pManager.DnsSdTxtRecordListener}. To use Upnp, call 
  {@link android.net.wifi.p2p.WifiP2pManager#setUpnpServiceResponseListener
  setUpnpServiceResponseListener()}, which takes a {@link
  android.net.wifi.p2p.WifiP2pManager.UpnpServiceResponseListener}.</p>

<p>Before you can start discovering services on local devices, you also need to call {@link android.net.wifi.p2p.WifiP2pManager#addServiceRequest addServiceRequest()}. When the {@link android.net.wifi.p2p.WifiP2pManager.ActionListener} you pass to this method receives a
successful callback, you can then begin discovering services on local devices by calling {@link
android.net.wifi.p2p.WifiP2pManager#discoverServices discoverServices()}.</p>

<p>When local services are discovered, you'll receive a callback to either the {@link
android.net.wifi.p2p.WifiP2pManager.DnsSdServiceResponseListener} or {@link
android.net.wifi.p2p.WifiP2pManager.UpnpServiceResponseListener}, depending on whether you
registered to use Bonjour or Upnp. The callback received in either case contains a
{@link android.net.wifi.p2p.WifiP2pDevice} object representing the peer device.</p>
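<p>A condensed sketch of the Bonjour path; the {@code mServiceListener} and {@code mTxtListener}
fields stand in for your listener implementations:</p>

```java
final WifiP2pManager manager =
        (WifiP2pManager) getSystemService(Context.WIFI_P2P_SERVICE);
final WifiP2pManager.Channel channel =
        manager.initialize(this, getMainLooper(), null);

manager.setDnsSdResponseListeners(channel, mServiceListener, mTxtListener);

manager.addServiceRequest(channel, WifiP2pDnsSdServiceRequest.newInstance(),
        new WifiP2pManager.ActionListener() {
            @Override
            public void onSuccess() {
                // The request was registered; discovery can begin.
                manager.discoverServices(channel, null);
            }

            @Override
            public void onFailure(int reason) {
                // Handle the error.
            }
        });
```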




<h3 id="NetworkUsage">Network usage</h3>

<p>The new method {@link android.net.ConnectivityManager#isActiveNetworkMetered} allows you to
check whether the device is currently connected to a metered network. By checking this state
before performing intensive network transactions, you can help manage data usage that may cost
your users money and make informed decisions about whether to perform the transactions now or
later (such as when the device becomes connected to Wi-Fi).</p>
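<p>For example, before a large transfer (the download helpers here are hypothetical):</p>

```java
ConnectivityManager cm =
        (ConnectivityManager) getSystemService(Context.CONNECTIVITY_SERVICE);
if (cm.isActiveNetworkMetered()) {
    // The user may be paying per byte; defer the transfer.
    scheduleDownloadForLater();
} else {
    startDownloadNow();
}
```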





<h2 id="A11y">Accessibility</h2>

<h3 id="A11yServices">Accessibility service APIs</h3>

<p>Android 4.1 significantly increases the reach of accessibility service APIs. You can now
build services that monitor and respond to more input events, such as complex gestures
using {@link android.accessibilityservice.AccessibilityService#onGesture onGesture()} and other
input events through additions to the {@link android.view.accessibility.AccessibilityEvent}, {@link
android.view.accessibility.AccessibilityNodeInfo}, and {@link
android.view.accessibility.AccessibilityRecord} classes.</p>

<p>Accessibility services can also perform actions on behalf of the user, including clicking,
scrolling, and stepping through text using {@link
android.view.accessibility.AccessibilityNodeInfo#performAction performAction()} and {@link
android.view.accessibility.AccessibilityNodeInfo#setMovementGranularities
setMovementGranularities()}. The {@link
android.accessibilityservice.AccessibilityService#performGlobalAction performGlobalAction()} method
also allows services to perform actions such as Back, Home, and opening Recent
Apps and Notifications.</p>


<h3 id="A11yCustomNav">Customizable app navigation</h3>

<p>When building an Android app, you can now customize navigation schemes by finding focusable
elements and input widgets using {@link
android.view.accessibility.AccessibilityNodeInfo#findFocus findFocus()} and {@link
android.view.accessibility.AccessibilityNodeInfo#focusSearch focusSearch()}, and set focus
using {@link android.view.accessibility.AccessibilityNodeInfo#setAccessibilityFocused
setAccessibilityFocused()}.</p>


<h3 id="A11yStructure">More accessible widgets</h3>

<p>The new {@link android.view.accessibility.AccessibilityNodeProvider} class allows you to
surface complex custom views to accessibility services so they can present the information in a
more accessible way. The {@link android.view.accessibility.AccessibilityNodeProvider} allows a
custom widget with advanced content, such as a calendar grid, to present a logical semantic
structure for accessibility services that is completely separate from the widget’s layout
structure. This semantic structure allows accessibility services to present a more useful
interaction model for users who are visually impaired.</p>



<h2 id="CopyPaste">Copy and Paste</h2>

<h3 id="CopyIntent">Copy and paste with intents</h3>

<p>You can now associate a {@link android.content.ClipData} object with an {@link
android.content.Intent} using the {@link android.content.Intent#setClipData setClipData()} method.
This is especially useful when using an intent to transfer multiple {@code content:} URIs to another
application, such as when sharing multiple documents. The {@code content:} URIs supplied
this way also respect the intent's flags to offer read or write access, allowing you to grant
access to multiple URIs in the intent. When starting an {@link
android.content.Intent#ACTION_SEND} or {@link
android.content.Intent#ACTION_SEND_MULTIPLE} intent, the URIs supplied in the intent are now
automatically propagated to the {@link android.content.ClipData} so that the receiver can have
access granted to them.</p>
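<p>For example, to grant a receiver read access to two document URIs (here {@code firstUri} and
{@code secondUri} are assumed {@code content:} URIs):</p>

```java
Intent intent = new Intent(Intent.ACTION_VIEW);
ClipData clip = ClipData.newUri(getContentResolver(), "documents", firstUri);
clip.addItem(new ClipData.Item(secondUri));
intent.setClipData(clip);
// The flag applies to every URI carried in the clip.
intent.addFlags(Intent.FLAG_GRANT_READ_URI_PERMISSION);
startActivity(intent);
```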


<h5>Support for HTML and string styles</h5>

<p>The {@link android.content.ClipData} class now supports styled text (either as HTML or
Android <a
href="{@docRoot}guide/topics/resources/string-resource.html#FormattingAndStyling">styled
strings</a>). You can add HTML styled text to the {@link android.content.ClipData} with {@link
android.content.ClipData#newHtmlText newHtmlText()}.</p>



<h2 id="Renderscript">Renderscript</h2>

<p>Renderscript computation functionality has been enhanced with the following features:</p>
<ul>
  <li>Support for multiple kernels within one script.</li>
  <li>Support for reading from allocations with filtered samplers from a compute script, via the
new script API {@code rsSample}.</li>
  <li>Support for different levels of FP precision in {@code #pragma}.</li>
  <li>Support for querying additional information from RS objects from a compute script.</li>
  <li>Numerous performance improvements.</li>
</ul>

<p>New pragmas are also available to define the floating point precision required by your
compute Renderscripts. This lets you enable NEON-like operations, such as fast vector math
operations on the CPU path, that wouldn’t otherwise be possible with the full IEEE 754-2008
standard.</p>

<p class="note"><strong>Note:</strong> The experimental Renderscript graphics engine is now
deprecated.</p>



<h2 id="Animation">Animation</h2>

<h3 id="ActivityOptions">Activity launch animations</h3>

<p>You can now launch an {@link android.app.Activity} using zoom animations or
your own custom animations. To specify the animation you want, use the {@link 
android.app.ActivityOptions} APIs to build a {@link android.os.Bundle} that you can
then pass to any of the
methods that start an activity, such as {@link
android.app.Activity#startActivity(Intent,Bundle) startActivity()}.</p>

<p>The {@link android.app.ActivityOptions} class includes a different method for each
type of animation you may want to show as your activity opens:</p>
<dl>
  <dt>{@link android.app.ActivityOptions#makeScaleUpAnimation makeScaleUpAnimation()}</dt>
    <dd>Creates an animation that scales up the activity window from a specified starting
    position on the screen and a specified starting size. For example, the home screen in
  Android 4.1 uses this when opening an app.</dd>
  <dt>{@link android.app.ActivityOptions#makeThumbnailScaleUpAnimation
    makeThumbnailScaleUpAnimation()}</dt>
    <dd>Creates an animation that scales up the activity window starting from a specified 
      position and a provided thumbnail image. For example, the Recent Apps window in 
      Android 4.1 uses this when returning to an app.</dd>
  <dt>{@link android.app.ActivityOptions#makeCustomAnimation
    makeCustomAnimation()}</dt>
    <dd>Creates an animation defined by your own resources: one that defines the animation for
    the activity opening and another for the activity being stopped.</dd>
</dl>
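<p>For example, to launch an activity with your own animations (the animation resources and
{@code DetailActivity} are hypothetical):</p>

```java
Bundle options = ActivityOptions.makeCustomAnimation(
        this, R.anim.zoom_enter, R.anim.zoom_exit).toBundle();
startActivity(new Intent(this, DetailActivity.class), options);
```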



<h3 id="TimeAnimator">Time animator</h3>

<p>The new {@link android.animation.TimeAnimator} provides a simple callback 
  mechanism with the {@link android.animation.TimeAnimator.TimeListener} that notifies
  you upon every frame of the animation. There is no duration, interpolation, or object value-setting with this Animator. The listener's callback receives information for each frame including
  total elapsed time and the elapsed time since the previous animation frame.</p>
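
<p>For example, a minimal sketch of a per-frame callback:</p>

<pre>
TimeAnimator animator = new TimeAnimator();
animator.setTimeListener(new TimeAnimator.TimeListener() {
    public void onTimeUpdate(TimeAnimator animation, long totalTime, long deltaTime) {
        // totalTime is the ms elapsed since start();
        // deltaTime is the ms elapsed since the previous frame
    }
});
animator.start();
</pre>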




<h2 id="UI">User Interface</h2>


<h3 id="Notifications">Notifications</h3>

<p>In Android 4.1, you can create notifications with larger content regions, big image previews,
  multiple action buttons, and configurable priority.</p>

<h5>Notification styles</h5>

<p>The new method {@link android.app.Notification.Builder#setStyle setStyle()} allows you to specify
  one of three new styles for your notification that each offer a larger content region. To
specify the style for your large content region, pass {@link 
android.app.Notification.Builder#setStyle setStyle()} one of the following objects:</p>
<dl>
  <dt>{@link android.app.Notification.BigPictureStyle}</dt>
    <dd>For notifications that include a large image attachment.</dd>
  <dt>{@link android.app.Notification.BigTextStyle}</dt>
    <dd>For notifications that include a lot of text, such as a single email.</dd>
  <dt>{@link android.app.Notification.InboxStyle}</dt>
    <dd>For notifications that include a list of strings, such as snippets from multiple emails.</dd>
</dl>
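
<p>For example, here's a sketch using {@link android.app.Notification.BigTextStyle}.
The icon resource and the {@code longText} string are hypothetical names for illustration.</p>

<pre>
Notification notification = new Notification.Builder(context)
        .setSmallIcon(R.drawable.ic_notify)   // hypothetical icon resource
        .setContentTitle("New message")
        .setContentText("Tap to read")
        .setStyle(new Notification.BigTextStyle().bigText(longText))
        .build();
</pre>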

<h5>Notification actions</h5>

<p>There's now support for up to two action buttons that appear at the bottom of the
  notification message, whether your notification uses the normal or larger style.</p>

<p>To add an action button, call {@link android.app.Notification.Builder#addAction
  addAction()}. This method takes three arguments: a drawable resource for an icon,
  text for the button, and a {@link android.app.PendingIntent} that defines the action
  to perform.</p>
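
<p>For example, here's a sketch of adding an action button. The icon resources and
{@code replyIntent} are hypothetical names for illustration.</p>

<pre>
Notification notification = new Notification.Builder(context)
        .setSmallIcon(R.drawable.ic_notify)       // hypothetical icon resource
        .setContentTitle("New message")
        .addAction(R.drawable.ic_reply, "Reply",  // hypothetical icon and label
                replyIntent)                      // a PendingIntent you've created
        .build();
</pre>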


<h5>Priorities</h5>

<p>You can now hint to the system how important your notification is, which affects
  the order of your notification in the list, by setting
the priority with {@link android.app.Notification.Builder#setPriority setPriority()}. You
can pass this one of five different priority levels defined by {@code PRIORITY_*} constants
in the {@link android.app.Notification} class. The default is {@link 
android.app.Notification#PRIORITY_DEFAULT}, and there are two levels higher and two levels lower.</p>

<p>High priority notifications are things that users generally want to respond to quickly,
such as a new instant message, text message, or impending event reminder. Low priority
notifications are things like expired calendar events or app promotions.</p>
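
<p>For example, on a {@link android.app.Notification.Builder} you're already using:</p>

<pre>
// On a Notification.Builder you've already configured:
builder.setPriority(Notification.PRIORITY_HIGH);  // e.g., an incoming instant message
</pre>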

<h3 id="SystemUI">Controls for system UI</h3>

<p>Android 4.0 (Ice Cream Sandwich) added new flags to control the visibility of the system UI
elements, such as to dim the appearance of the system bar or make it disappear completely on handsets.
Android 4.1 adds a few more flags that allow you to further control the appearance of system
UI elements and your activity layout in relation to them by calling {@link
android.view.View#setSystemUiVisibility setSystemUiVisibility()}
and passing the following flags:</p>

<dl>
  <dt>{@link android.view.View#SYSTEM_UI_FLAG_FULLSCREEN}</dt>
    <dd>Hides non-critical system UI (such as the status bar). 
      If your activity uses the action bar in overlay mode (by
      enabling <a href="{@docRoot}reference/android/R.attr.html#windowActionBarOverlay">{@code
      android:windowActionBarOverlay}</a>), then this flag also hides the action bar,
      animating both in and out with a coordinated animation.</dd>
  
  <dt>{@link android.view.View#SYSTEM_UI_FLAG_LAYOUT_FULLSCREEN}</dt>
    <dd>Sets your activity layout to use the same screen area that's available when you've
    enabled {@link android.view.View#SYSTEM_UI_FLAG_FULLSCREEN} even if the system UI elements
    are still visible. Although parts of your layout will be overlaid by the
    system UI, this is useful if your app often hides and shows the system UI with 
  {@link android.view.View#SYSTEM_UI_FLAG_FULLSCREEN}, because it prevents your layout from
  adjusting to the new layout bounds each time the system UI hides or appears.</dd>

  <dt>{@link android.view.View#SYSTEM_UI_FLAG_LAYOUT_HIDE_NAVIGATION}</dt>
    <dd>Sets your activity layout to use the same screen area that's available when you've
    enabled {@link android.view.View#SYSTEM_UI_FLAG_HIDE_NAVIGATION} (added in Android 4.0)
    even if the system UI elements are still visible. Although parts of your layout will be
    overlaid by the
    navigation bar, this is useful if your app often hides and shows the navigation bar
  with {@link android.view.View#SYSTEM_UI_FLAG_HIDE_NAVIGATION}, because it prevents your layout from
  adjusting to the new layout bounds each time the navigation bar hides or appears.</dd>

  <dt>{@link android.view.View#SYSTEM_UI_FLAG_LAYOUT_STABLE}</dt>
    <dd>You might want to add this flag if you're using {@link
      android.view.View#SYSTEM_UI_FLAG_LAYOUT_FULLSCREEN} and/or {@link
      android.view.View#SYSTEM_UI_FLAG_LAYOUT_HIDE_NAVIGATION} to ensure that when you call
      {@link android.view.View#fitSystemWindows fitSystemWindows()} on a view, the
    bounds it defines remain consistent with regard to the available screen space.
    That is, with this flag set, {@link android.view.View#fitSystemWindows
    fitSystemWindows()} behaves as if the visibility of system UI elements is unchanged
    even after you hide all system UI.</dd>
</dl>
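
<p>For example, here's a sketch that hides the status bar while keeping the layout
bounds stable:</p>

<pre>
View decorView = getWindow().getDecorView();
decorView.setSystemUiVisibility(
        View.SYSTEM_UI_FLAG_LAYOUT_STABLE
        | View.SYSTEM_UI_FLAG_LAYOUT_FULLSCREEN
        | View.SYSTEM_UI_FLAG_FULLSCREEN);
</pre>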

<p>For more discussion about the other related system UI flags, read about
  those added in <a href="{@docRoot}about/versions/android-4.0.html#SystemUI">Android 4.0</a>.</p>


<h3 id="RemoteViews">Remote views</h3>

<p>{@link android.widget.GridLayout} and {@link android.view.ViewStub} 
  are now remotable views so you can use them in layouts for your
  app widgets and notification custom layouts.</p>



<h3 id="Fonts">Font families</h3>

<p>Android 4.1 adds several more variants of the Roboto font style for a total of 10 variants, 
  and they're all usable by apps. Your apps now have access to the full set of both light and
condensed variants.</p>

<p>The complete set of Roboto font variants available is:</p>

<ul>
  <li>Regular</li>
  <li>Italic</li>
  <li>Bold</li>
  <li>Bold-italic</li>
  <li>Light</li>
  <li>Light-italic</li>
  <li>Condensed regular</li>
  <li>Condensed italic</li>
  <li>Condensed bold</li>
  <li>Condensed bold-italic</li>
</ul>

<p>You can apply any one of these with the new {@link android.R.attr#fontFamily}
  attribute in combination with the {@link android.R.attr#textStyle} attribute.</p>

<p>Supported values for {@link android.R.attr#fontFamily} are:</p>
<ul>
  <li>{@code "sans-serif"} for regular Roboto</li>
  <li>{@code "sans-serif-light"} for Roboto Light</li>
  <li>{@code "sans-serif-condensed"} for Roboto Condensed</li>
</ul>

<p>You can then apply bold and/or italic with {@link android.R.attr#textStyle} values
{@code "bold"} and {@code "italic"}. You can apply both like so: {@code
android:textStyle="bold|italic"}.</p>
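
<p>For example, here's a {@link android.widget.TextView} that uses the light italic
variant:</p>

<pre>
&lt;TextView
    android:layout_width="wrap_content"
    android:layout_height="wrap_content"
    android:fontFamily="sans-serif-light"
    android:textStyle="italic"
    android:text="Hello" />
</pre>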

<p>You can also use {@link android.graphics.Typeface#create Typeface.create()}. 
  For example, {@code Typeface.create("sans-serif-light", Typeface.NORMAL)}.</p>





    
<h2 id="Input">Input Framework</h2>


<h3 id="InputDevice">Multiple input devices</h3>

<p>The new {@link android.hardware.input.InputManager} class allows you to query the
set of input devices currently connected and register to be notified when a device
is added, changed, or removed. This is particularly useful if you're building a game
that supports multiple players and you want to detect how many controllers are connected
and when there are changes to the number of controllers.</p>

<p>You can query all input devices connected by calling
{@link android.hardware.input.InputManager#getInputDeviceIds()}. This returns
an array of integers, each of which is an ID for a different input device. You can then call
{@link android.hardware.input.InputManager#getInputDevice getInputDevice()} to acquire
an {@link android.view.InputDevice} for a specified input device ID.</p>

<p>If you want to be informed when new input devices are connected, changed, or disconnected,
implement the {@link android.hardware.input.InputManager.InputDeviceListener} interface and
register it with {@link android.hardware.input.InputManager#registerInputDeviceListener
registerInputDeviceListener()}.</p>
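
<p>For example, here's a sketch that counts the connected game controllers:</p>

<pre>
InputManager inputManager =
        (InputManager) getSystemService(Context.INPUT_SERVICE);
int gameControllers = 0;
for (int deviceId : inputManager.getInputDeviceIds()) {
    InputDevice device = inputManager.getInputDevice(deviceId);
    int sources = device.getSources();
    // Count devices that report gamepad or joystick sources
    if ((sources &amp; InputDevice.SOURCE_GAMEPAD) == InputDevice.SOURCE_GAMEPAD
            || (sources &amp; InputDevice.SOURCE_JOYSTICK) == InputDevice.SOURCE_JOYSTICK) {
        gameControllers++;
    }
}
</pre>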

<h3 id="Vibrate">Vibrate for input controllers</h3>

<p>If connected input devices have their own vibrate capabilities, you can now control
the vibration of those devices using the existing {@link android.os.Vibrator} APIs simply
by calling {@link android.view.InputDevice#getVibrator()} on the {@link android.view.InputDevice}.</p>
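
<p>For example:</p>

<pre>
InputDevice device = InputDevice.getDevice(deviceId);
Vibrator vibrator = device.getVibrator();
if (vibrator.hasVibrator()) {
    vibrator.vibrate(500);  // vibrate the controller for half a second
}
</pre>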



<h2 id="Permissions">Permissions</h2>

<p>The following are new permissions:</p>

<dl>
  <dt>{@link android.Manifest.permission#READ_EXTERNAL_STORAGE}</dt>
  <dd>Provides protected read access to external storage. In Android 4.1, all
    applications still have read
access by default. A future release will change this to require that applications explicitly
request read access using this permission. If your application already requests write access, it
will automatically get read access as well. There is a new developer option to turn on the read
access restriction, so developers can test their applications against how Android will behave in
the future.</dd>
  <dt>{@link android.Manifest.permission#READ_USER_DICTIONARY}</dt>
    <dd>Allows an application to read the user dictionary. This should only be required by an
IME, or a dictionary editor like the Settings app.</dd>
  <dt>{@link android.Manifest.permission#READ_CALL_LOG}</dt>
    <dd>Allows an application to read the system's call log that contains information about
      incoming and outgoing calls.</dd>
  <dt>{@link android.Manifest.permission#WRITE_CALL_LOG}</dt>
    <dd>Allows an application to modify the system's call log stored on the device.</dd>
  <dt>{@link android.Manifest.permission#WRITE_USER_DICTIONARY}</dt>
    <dd>Allows an application to write to the user's word dictionary.</dd>
</dl>
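
<p>For example, to explicitly request read access to external storage, declare the
permission in your manifest file:</p>

<pre>
&lt;manifest ... >
    &lt;uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
    ...
&lt;/manifest>
</pre>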


<h2 id="DeviceFeatures">Device Features</h2>

<p>Android 4.1 includes a new feature declaration for devices that are dedicated
to displaying the user interface on a television screen: {@link
android.content.pm.PackageManager#FEATURE_TELEVISION}. To declare that your app requires
a television interface, declare this feature in your manifest file with the <a href="{@docRoot}guide/topics/manifest/uses-feature-element.html">{@code &lt;uses-feature>}</a> element:</p>
<pre>
&lt;manifest ... >
    &lt;uses-feature android:name="android.hardware.type.television"
                  android:required="true" />
    ...
&lt;/manifest>
</pre>

<p>This feature defines "television" to be a typical living room television experience: 
  displayed on a big screen, where the user is sitting far away and the dominant form of 
  input is something like a d-pad, and generally not touch or a 
  mouse/pointer device.</p>