Merge "docs: M Runtime Permissions overview." into mnc-preview-docs
diff --git a/docs/html/preview/api-overview.jd b/docs/html/preview/api-overview.jd
index 5ab4b89..d1639aa 100644
--- a/docs/html/preview/api-overview.jd
+++ b/docs/html/preview/api-overview.jd
@@ -14,8 +14,7 @@
         <span class="less" style="display:none">show less</span></a></h2>
 
 <ol id="toc44" class="hide-nested">
-  <li><a href="#backup">Auto Backup for Apps</a></li>
-  <li><a href="#notifications">Notifications</a></li>
+  <li><a href="#backup">Automatic App Data Backup</a></li>
   <li><a href="#authentication">Authentication</a>
     <ul>
       <li><a href="#fingerprint-authentication">Fingerprint Authentication</a></li>
@@ -24,14 +23,27 @@
   </li>
   <li><a href="#direct-share">Direct Share</a></li>
   <li><a href="#voice-interactions">Voice Interactions</a></li>
+  <li><a href="#assist">Assist API</a></li>
+  <li><a href="#notifications">Notifications</a></li>
   <li><a href="#bluetooth-stylus">Bluetooth Stylus Support</a></li>
-  <li><a href="#audio">New Audio Features</a></li>
-  <li><a href="#afw">New Android for Work Features</a></li>
+  <li><a href="#ble-scanning">Improved Bluetooth Low Energy Scanning</a></li>
+  <li><a href="#hotspot">Hotspot 2.0 Release 1 Support</a></li>
+  <li><a href="#4K-display">4K Display Mode</a></li>
+  <li><a href="#behavior-themeable-colorstatelists">Themeable ColorStateLists</a></li>
+  <li><a href="#audio">Audio Features</a></li>
+  <li><a href="#video">Video Features</a></li>
+  <li><a href="#camera">Camera Features</a>
+    <ul>
+      <li><a href="#flashlight">Flashlight API</a></li>
+      <li><a href="#reprocessing">Camera Reprocessing</a></li>
+    </ul>
+  </li>
+  <li><a href="#afw">Android for Work Features</a></li>
 </ol>
 
 <h2>API Differences</h2>
 <ol>
-<li><a href="">API level 22 to M &raquo;</a> </li>
+<li><a href="{@docRoot}preview/download.html">API level 22 to M Preview &raquo;</a> </li>
 </ol>
 
 </div>
@@ -55,40 +67,24 @@
 href="{@docRoot}">developer.android.com</a>. These API elements are
 formatted in {@code code style} in this document (without hyperlinks). For the
 preliminary API documentation for these elements, download the <a
-href="http://storage.googleapis.com/androiddevelopers/preview/m-developer-preview-reference.zip">
-preview reference</a>.</p>
+href="{@docRoot}preview/download.html#docs"> preview reference</a>.</p>
 
 <h3>Important behavior changes</h3>
 
 <p>If you have previously published an app for Android, be aware that your app might be affected
-by changes in M.</p>
+by changes in the platform.</p>
 
-<p>Please see <a href="api-changes.html">Behavior Changes</a> for complete information.</p>
+<p>Please see <a href="behavior-changes.html">Behavior Changes</a> for complete information.</p>
 
-<h2 id="backup">Auto Backup for Apps</h2>
+<h2 id="backup">Automatic App Data Backup</h2>
 <p>The system now performs automatic full data backup and restore for apps. This behavior is
-enabled by default for apps targeting M; you do not need to add any additional code. If users
-delete their Google account, their backup data is deleted as well.</p>
-<p>To learn how this feature works and how to configure what to back up on the file system,
-see the <a href="">App Backup for Apps guide</a>.</p>
-
-<h2 id="notifications">Notifications</h2>
-<p>M adds the following API changes for notifications:</p>
-<ul>
-  <li>New {@code NotificationListenerService.INTERRUPTION_FILTER_ALARMS} filter level that
-    corresponds to the new <em>Alarms only</em> do not disturb mode.</li>
-  <li>New {@code Notification.CATEGORY_REMINDER} category value that is used to distinguish
-  user-scheduled reminders from other events
-  ({@link android.app.Notification#CATEGORY_EVENT}) and alarms
-  ({@link android.app.Notification#CATEGORY_ALARM}).</li>
-  <li>New {@code android.graphics.drawable.Icon} class which can be attached to your notifications
-    via the Notification.Builder.setIcon() and Notification.Builder.setLargeIcon() methods.</li>
-  <li>New {@code NotificationManager.getActiveNotifications()} method that allows your apps to
-    find out which of their notifications are currently alive.</li>
-</ul>
+enabled by default for apps targeting M Preview; you do not need to add any additional code. If
+users delete their Google accounts, their backup data is deleted as well. To learn how this feature
+works and how to configure what to back up on the file system, see
+<a href="{@docRoot}preview/backup/index.html">Automatic App Data Backup</a>.</p>
 
 <h2 id="authentication">Authentication</h2>
-<p>The M release offers new APIs to let you authenticate users by using their fingerprint scans on
+<p>This preview offers new APIs to let you authenticate users by using their fingerprint scans on
 supported devices, and check how recently the user was last authenticated using a device unlocking
 mechanism (such as a lockscreen password). Use these APIs in conjunction with
 the <a href="{@docRoot}training/articles/keystore.html">Android Keystore system</a>.</p>
@@ -97,17 +93,15 @@
 
 <p>To authenticate users via fingerprint scan, get an instance of the new
 {@code android.hardware.fingerprint.FingerprintManager} class and call the
-{@code FingerprintManager.authenticate()} method. Your app must be running on a device with a
-fingerprint sensor. You must implement the user interface for the fingerprint
-authentication flow on your app, and use the standard fingerprint Android icon in your UI.
-If you are developing multiple apps that use fingerprint authentication, note that each app must
-authenticate the user’s fingerprint independently.
+{@code FingerprintManager.authenticate()} method. Your app must be running on a compatible
+device with a fingerprint sensor. You must implement the user interface for the fingerprint
+authentication flow in your app, and use the standard Android fingerprint icon in your UI.
+The Android fingerprint icon ({@code ic_fp_40px.png}) is included in the
+<a href="https://github.com/googlesamples/android-FingerprintDialog"
+class="external-link">sample app</a>. If you are developing multiple apps that use fingerprint
+authentication, note that each app must authenticate the user’s fingerprint independently.
 </p>
 
-<img src="{@docRoot}preview/images/fingerprint-screen_2x.png"
-srcset="{@docRoot}preview/images/fingerprint-screen.png 1x, preview/images/fingerprint-screen_2x.png 2x"
-style="margin:0 0 10px 20px" width="282" height="476" />
-
 <p>To use this feature in your app, first add the {@code USE_FINGERPRINT} permission in your
 manifest.</p>
 
@@ -116,34 +110,34 @@
         android:name="android.permission.USE_FINGERPRINT" /&gt;
 </pre>
 
-<p>The following snippet shows how you might listen for fingerprint events in your
-{@code FingerprintManager.AuthenticationCallback} implementation.</p>
+<img src="{@docRoot}preview/images/fingerprint-screen.png"
+srcset="{@docRoot}preview/images/fingerprint-screen.png 1x, {@docRoot}preview/images/fingerprint-screen_2x.png 2x"
+style="float:right; margin:0 0 10px 20px" width="282" height="476" />
 
-<pre>
-// Call this to start listening for fingerprint events
-public void startListening(FingerprintManager.CryptoObject cryptoObject) {
-    if (!isFingerprintAuthAvailable()) {
-        return;
-    }
-    mCancellationSignal = new CancellationSignal();
-    mSelfCancelled = false;
-    mFingerprintManager.authenticate(cryptoObject,
-            mCancellationSignal, this, 0 /* flags */);
-    // Icon to display when prompting users to start a fingerprint scan
-    mIcon.setImageResource(R.drawable.ic_fp_40px);
-}
+<p>To see an app implementation of fingerprint authentication, refer to the
+<a href="https://github.com/googlesamples/android-FingerprintDialog" class="external-link">
+  Fingerprint Dialog sample</a>.</p>
 
-// Helper method to check if the device supports fingerprint
-// scanning and if the user has enrolled at least one fingerprint.
-public boolean isFingerprintAuthAvailable() {
-    return mFingerprintManager.isHardwareDetected()
-        &amp;&amp; mFingerprintManager.hasEnrolledFingerprints();
-}
+<p>If you are testing this feature, follow these steps:</p>
+<ol>
+<li>Enroll a new fingerprint in the emulator by going to
+<strong>Settings > Security > Fingerprint</strong> and following the enrollment instructions.</li>
+<li>Install Android SDK Tools Revision 24.3, if you have not done so.</li>
+<li>Use an emulator to emulate fingerprint touch events with the
+following command. Use the same command to emulate fingerprint touch events on the lockscreen or
+in your app.
+<pre class="no-prettyprint">
+adb -e emu finger touch &lt;finger_id&gt;
 </pre>
+<p>On Windows, you may have to run {@code telnet 127.0.0.1 &lt;emulator-id&gt;} followed by
+  {@code finger touch &lt;finger_id&gt;}.
+</p>
+</li>
+</ol>
 
 <h3 id="confirm-credentials">Confirm Credentials</h3>
 <p>Your app can authenticate users based on how recently they last unlocked their device. You can
-use the same public or secret key to authenticate users into multiple apps. This feature frees
+use the same public or secret key to authenticate users. This feature frees
 users from having to remember additional app-specific passwords, and avoids the need for you to
 implement your own authentication user interface.</p>
 
@@ -152,46 +146,14 @@
 {@code android.security.KeyPairGeneratorSpec.Builder} and
 {@code android.security.KeyGeneratorSpec.Builder} classes for public key pairs and secret keys
 respectively. If you are importing keys, use the {@link android.security.KeyStoreParameter.Builder}
-class to set your constraints.</p>
+class to set your constraints. You can use the
+{@link android.app.KeyguardManager#createConfirmDeviceCredentialIntent(java.lang.CharSequence, java.lang.CharSequence) createConfirmDeviceCredentialIntent()}
+method to re-authenticate the user within your app if the timeout has expired.
+</p>
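+
+<p>For illustration, the following minimal sketch (the title, description, and request code are
+placeholders) launches the confirmation screen from an activity:</p>
+
+<pre>
+KeyguardManager keyguardManager =
+        (KeyguardManager) getSystemService(Context.KEYGUARD_SERVICE);
+Intent intent = keyguardManager.createConfirmDeviceCredentialIntent(
+        "Unlock required",                        // placeholder title
+        "Confirm your screen lock to continue");  // placeholder description
+if (intent != null) {
+    // REQUEST_CONFIRM_CREDENTIALS is an arbitrary request code defined by your app
+    startActivityForResult(intent, REQUEST_CONFIRM_CREDENTIALS);
+}
+</pre>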
 
-<p>The following example shows how you might create a symmetric key in the Keystore which can only be
-used if the user has successfully unlocked the device within the last 5 minutes.</p>
-
-<pre>
-private void createKey() {
-    // Generate a key to decrypt payment credentials, tokens, etc.
-    // This will most likely be a registration step for the user when
-    // they are setting up your app.
-    try {
-        KeyStore ks = KeyStore.getInstance("AndroidKeyStore");
-        ks.load(null);
-        KeyGenerator keyGenerator = KeyGenerator.getInstance("AES",
-                "AndroidKeyStore");
-        keyGenerator.init(new KeyGeneratorSpec.Builder(this)
-                // Alias of the entry in Android KeyStore where the key will appear
-                .setAlias(KEY_NAME)
-                // Key use constraints
-                .setPurposes(KeyStoreKeyProperties.Purpose.ENCRYPT
-                    | KeyStoreKeyProperties.Purpose.DECRYPT)
-                .setBlockModes("CBC")
-                .setUserAuthenticationRequired(true)
-                // Require that the user has unlocked in the last 5 minutes
-                .setUserAuthenticationValidityDurationSeconds(5 * 60)
-                .setEncryptionPaddings("PKCS7Padding")
-                .build());
-        keyGenerator.generateKey();
-    } catch (NoSuchAlgorithmException | NoSuchProviderException
-            | InvalidAlgorithmParameterException | KeyStoreException
-            | CertificateException | IOException e) {
-          throw new RuntimeException(e);
-    }
-}
-</pre>
-
-<p>To determine the last time users logged into their account, call the
-{@code android.accounts.AccountManager.confirmCredentials()} method. If the call is successful, the
-method returns an bundle that includes a {@code KEY_LAST_AUTHENTICATED_TIME} value which indicates
-the last time, in milliseconds, that the credential for that account was validated or created.</p>
+<p>To see an app implementation of this feature, refer to the
+<a href="https://github.com/googlesamples/android-ConfirmDeviceCredentials" class="external-link">
+  Confirm Device Credentials sample</a>.</p>
 
 <h2 id="direct-share">Direct Share</h2>
 
@@ -199,7 +161,7 @@
 srcset="{@docRoot}preview/images/direct-share-screen.png 1x, preview/images/direct-share-screen_2x.png 2x"
 style="float:right; margin:0 0 20px 30px" width="312" height="385" />
 
-<p>This release provides you with APIs to makes sharing intuitive and quick for users. You can now
+<p>This preview provides you with APIs to make sharing intuitive and quick for users. You can now
 define <em>deep links</em> that target a specific activity in your app. These deep links are
 exposed to users via the <em>Share</em> menu. This feature allows users to share content to
 targets, such as contacts, within other apps. For example, the deep link might launch an
@@ -214,9 +176,6 @@
 {@code SERVICE_INTERFACE} action.</p>
 <p>The following example shows how you might declare the {@code ChooserTargetService} in your
 manifest.</p>
-<br>
-<br>
-<br>
 <pre>
 &lt;service android:name=".ChooserTargetService"
         android:label="&#64;string/service_name"
@@ -243,37 +202,142 @@
         android:value=".ChooserTargetService" /&gt;
 &lt;/activity>
 </pre>
+<p>To see an app implementation of this feature, refer to the
+<a href="https://github.com/googlesamples/android-DeepLinkSharing" class="external-link">
+  Deep Link Sharing sample</a>.</p>
+
 
 <h2 id="voice-interactions">Voice Interactions</h2>
 <p>
-This release provides a new voice interaction API which, together with
+This preview provides a new voice interaction API which, together with
 <a href="https://developers.google.com/voice-actions/" class="external-link">Voice Actions</a>,
 allows you to build conversational voice experiences into your apps. Call the
 {@code android.app.Activity.isVoiceInteraction()} method to determine if your activity was
 started in response to a voice action. If so, your app can use the
 {@code android.app.VoiceInteractor} class to request a voice confirmation from the user, select
-from a list of options, and more.</p>
-<p>To learn more about implementing voice actions, see the voice interaction API
+from a list of options, and more. To learn more about implementing voice actions, see the
 <a href="https://developers.google.com/voice-actions/interaction/"
-class="external-link">guide</a>.
+class="external-link">Voice Actions developer site</a>.
 </p>
 
+<h2 id="assist">Assist API</h2>
+<p>
+This preview offers a new way for users to engage with your apps through an assistant. To use this
+feature, the user must enable the assistant to use the current context. Once enabled, the user
+can summon the assistant within any app by long-pressing the <strong>Home</strong> button.</p>
+<p>The platform passes the current context to the assistant. In addition to the standard set of
+information that the platform passes to the assistant, your app can share additional information
+by using the new {@code android.app.Activity.AssistContent} class.</p>
+
+<p>To provide the assistant with additional context from your app, follow these steps:</p>
+
+<ol>
+<li>Implement the {@link android.app.Application.OnProvideAssistDataListener} interface.</li>
+<li>Register this listener by using
+{@link android.app.Application#registerOnProvideAssistDataListener(android.app.Application.OnProvideAssistDataListener) registerOnProvideAssistDataListener()}.</li>
+<li>To provide activity-specific contextual information, override the
+{@link android.app.Activity#onProvideAssistData(android.os.Bundle) onProvideAssistData()}
+callback and, optionally, the new {@code Activity.onProvideAssistContent()} callback, as shown in
+the sketch after this list.</li>
+</ol>
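+
+<p>The following is a minimal sketch of step 3, assuming a hypothetical activity with a
+{@code currentItemId} field; the extra key and value are illustrative only:</p>
+
+<pre>
+&#64;Override
+public void onProvideAssistData(Bundle data) {
+    super.onProvideAssistData(data);
+    // Hypothetical example: pass the identifier of the item the user is viewing
+    // so the assistant has more context about the current screen.
+    data.putString("com.example.ASSIST_ITEM_ID", currentItemId);
+}
+</pre>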
+
+<h2 id="notifications">Notifications</h2>
+<p>This preview adds the following API changes for notifications:</p>
+<ul>
+  <li>New {@code NotificationListenerService.INTERRUPTION_FILTER_ALARMS} filter level that
+    corresponds to the new <em>Alarms only</em> do not disturb mode.</li>
+  <li>New {@code Notification.CATEGORY_REMINDER} category value that is used to distinguish
+  user-scheduled reminders from other events
+  ({@link android.app.Notification#CATEGORY_EVENT}) and alarms
+  ({@link android.app.Notification#CATEGORY_ALARM}).</li>
+  <li>New {@code android.graphics.drawable.Icon} class which can be attached to your notifications
+    via the {@code Notification.Builder.setSmallIcon(Icon)} and
+    {@code Notification.Builder.setLargeIcon(Icon)} methods.</li>
+  <li>New {@code NotificationManager.getActiveNotifications()} method that allows your apps to
+    find out which of their notifications are currently alive. To see an app implementation that
+    uses this feature, see the <a href="https://github.com/googlesamples/android-ActiveNotifications"
+    class="external-link">Active Notifications sample</a>.</li>
+</ul>
+
 <h2 id="bluetooth-stylus">Bluetooth Stylus Support</h2>
-<p>The M release provides improved support for user input using a Bluetooth stylus. If the user
-touches a stylus with a button on the screen of your app, the
-{@link android.view.MotionEvent#getToolType(int) getTooltype()} method now returns
-{@code TOOL_TYPE_STYLUS}. The {@link android.view.MotionEvent#getButtonState() getButtonState()}
-method returns {@link android.view.MotionEvent#BUTTON_SECONDARY} when the user
+<p>This preview provides improved support for user input using a Bluetooth stylus. Users can pair
+and connect a compatible Bluetooth stylus with their phone or tablet.  While connected, position
+information from the touch screen is fused with pressure and button information from the stylus to
+provide a greater range of expression than with the touch screen alone. Your app can listen for
+stylus button presses and perform secondary actions by registering the new
+{@code View.onStylusButtonPressListener} and {@code GestureDetector.OnStylusButtonPressListener}
+callbacks in your activity.</p>
+
+<p>Use the {@link android.view.MotionEvent} methods and constants to detect stylus button
+interactions:</p>
+<ul>
+<li>If the user touches a stylus with a button on the screen of your app, the
+{@link android.view.MotionEvent#getToolType(int) getToolType()} method returns
+{@link android.view.MotionEvent#TOOL_TYPE_STYLUS}.</li>
+<li>For apps targeting M Preview, the
+{@link android.view.MotionEvent#getButtonState() getButtonState()}
+method returns {@code MotionEvent.STYLUS_BUTTON_PRIMARY} when the user
 presses the primary stylus button. If the stylus has a second button, the same method returns
-{@link android.view.MotionEvent#BUTTON_TERTIARY} when the user presses it. If the user presses
-both buttons simultaneously, the method returns both these values. In addition, the system reports
-the user button-press action to the new {@code View.onStylusButtonPressListener} and
-{@code GestureDetector.OnStylusButtonPressListener} callbacks in your activity, if you have
-registered these listeners in your app.</p>
+{@code MotionEvent.STYLUS_BUTTON_SECONDARY} when the user presses it. If the user presses
+both buttons simultaneously, the method returns both values OR'ed together
+({@code STYLUS_BUTTON_PRIMARY|STYLUS_BUTTON_SECONDARY}).</li>
+<li>
+For apps targeting a lower platform version, the
+{@link android.view.MotionEvent#getButtonState() getButtonState()} method returns
+{@link android.view.MotionEvent#BUTTON_SECONDARY} (for primary stylus button press),
+{@link android.view.MotionEvent#BUTTON_TERTIARY} (for secondary stylus button press), or both.
+</li>
+</ul>
 
-<h2 id="audio">New Audio Features</h2>
+<h2 id="ble-scanning">Improved Bluetooth Low Energy Scanning</h2>
+<p>
+If your app performs Bluetooth Low Energy scans, you can use the new
+{@code android.bluetooth.le.ScanSettings.Builder.setCallbackType()} method to specify that
+you want callbacks to only be notified when an advertisement packet matching the set
+{@link android.bluetooth.le.ScanFilter} is first found, and when it is not seen for a period of
+time. This approach to scanning is more power-efficient than what’s provided in the previous
+platform version.
+</p>
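+
+<p>As a rough sketch (the scanner, callback, and device-name filter below are placeholders), you
+might configure a scan that only reports first-match and match-lost events:</p>
+
+<pre>
+ScanSettings settings = new ScanSettings.Builder()
+        .setCallbackType(ScanSettings.CALLBACK_TYPE_FIRST_MATCH
+                | ScanSettings.CALLBACK_TYPE_MATCH_LOST)
+        .build();
+ScanFilter filter = new ScanFilter.Builder()
+        .setDeviceName("Example Beacon")  // placeholder filter
+        .build();
+bluetoothLeScanner.startScan(Arrays.asList(filter), settings, scanCallback);
+</pre>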
 
-<p>This release adds enhancements to audio processing on Android, including: </p>
+<h2 id="hotspot">Hotspot 2.0 Release 1 Support</h2>
+<p>
+This preview adds support for the Hotspot 2.0 Release 1 spec on Nexus 6 and Nexus 9 devices. To
+provision Hotspot 2.0 credentials in your app, use the new methods of the
+{@link android.net.wifi.WifiEnterpriseConfig} class, such as {@code setPlmn()} and
+{@code setRealm()}. In the {@link android.net.wifi.WifiConfiguration} object, you can set the
+{@link android.net.wifi.WifiConfiguration#FQDN} and the {@code providerFriendlyName} fields.
+The new {@code ScanResult.PasspointNetwork} property indicates if a detected
+network represents a Hotspot 2.0 access point.
+</p>
+
+<h2 id="4K-display">4K Display Mode</h2>
+<p>The platform now allows apps to request that the display resolution be upgraded to 4K rendering
+on compatible hardware. To query the current physical resolution, use the new
+{@code android.view.Display.Mode} APIs. If the UI is drawn at a lower logical resolution and is
+upscaled to a larger physical resolution, be aware that the physical resolution the
+{@code Display.Mode.getPhysicalWidth()} method returns may differ from the logical
+resolution reported by {@link android.view.Display#getSize(android.graphics.Point) getSize()}.</p>
+
+<p>You can request that the system change the physical resolution in your app as it runs by setting
+the {@code WindowManager.LayoutParams.preferredDisplayModeId} property of your app’s window. This
+feature is useful if you want to switch to 4K display resolution. While in 4K display mode, the
+UI continues to be rendered at the original resolution (such as 1080p) and is upscaled to 4K, but
+{@link android.view.SurfaceView} objects may show content at the native resolution.</p>
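+
+<p>For illustration only (the width check is a simplification, and the code assumes it runs inside
+an activity), an app might switch its window to a 4K display mode like this:</p>
+
+<pre>
+Display display = getWindowManager().getDefaultDisplay();
+for (Display.Mode mode : display.getSupportedModes()) {
+    if (mode.getPhysicalWidth() &gt;= 3840) {  // simplistic 4K check for illustration
+        WindowManager.LayoutParams params = getWindow().getAttributes();
+        params.preferredDisplayModeId = mode.getModeId();
+        getWindow().setAttributes(params);
+        break;
+    }
+}
+</pre>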
+
+<p>To test the new 4K display mode, simulate a secondary display of a larger resolution using the
+<strong>Developer Options</strong> settings.</p>
+
+<h2 id="behavior-themeable-colorstatelists">Themeable ColorStateLists</h2>
+<p>Theme attributes are now supported in
+{@link android.content.res.ColorStateList} for devices running the M Preview. The
+{@link android.content.res.Resources#getColorStateList(int) getColorStateList()} and
+{@link android.content.res.Resources#getColor(int) getColor()} methods have been deprecated. If
+you are calling these APIs, call the new {@code Context.getColorStateList()} or
+{@code Context.getColor()} methods instead. These methods are also available in the
+v4 appcompat library via {@link android.support.v4.content.ContextCompat}.</p>
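+
+<p>As a quick sketch (the color resource names are placeholders), the theme-aware calls and their
+support library equivalents look like this:</p>
+
+<pre>
+// Theme-aware lookups on the M Preview:
+int accentColor = context.getColor(R.color.accent);
+ColorStateList buttonColors = context.getColorStateList(R.color.button_text);
+
+// Equivalent calls through the support library on older releases:
+int compatAccentColor = ContextCompat.getColor(context, R.color.accent);
+ColorStateList compatButtonColors =
+        ContextCompat.getColorStateList(context, R.color.button_text);
+</pre>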
+
+<h2 id="audio">Audio Features</h2>
+
+<p>This preview adds enhancements to audio processing on Android, including: </p>
 <ul>
   <li>Support for the <a href="http://en.wikipedia.org/wiki/MIDI" class="external-link">MIDI</a>
 protocol, with the new {@code android.media.midi} APIs. Use these APIs to send and receive MIDI
@@ -293,8 +357,78 @@
 when an audio device is connected or disconnected.</li>
 </ul>
 
-<h2 id="afw">New Android for Work Features</h2>
-<p>This release includes the following new APIs for Android for Work:</p>
+<h2 id="video">Video Features</h2>
+<p>This preview adds new capabilities to the video processing APIs, including:</p>
+<ul>
+<li>New {@code android.media.MediaSync} class which helps applications synchronously render
+audio and video streams. The audio buffers are submitted in a non-blocking fashion and are
+returned via a callback. It also supports dynamic playback rate.
+</li>
+<li>New {@code MediaDrm.EVENT_SESSION_RECLAIMED} event, which indicates that a session opened by
+the app has been reclaimed by the resource manager. If your app uses DRM sessions, you should
+handle this event and make sure not to use a reclaimed session.
+</li>
+<li>New {@code MediaCodec.CodecException.ERROR_RECLAIMED} error code, which indicates that the
+resource manager reclaimed the media resource used by the codec. With this exception, the codec
+must be released, as it has moved to a terminal state.
+</li>
+<li>New {@code MediaCodecInfo.CodecCapabilities.getMaxSupportedInstances()} method to get a
+hint of the maximum number of supported concurrent codec instances.
+</li>
+<li>New {@code MediaPlayer.setPlaybackParams()} method to set the media playback rate for fast or
+slow motion playback. It also stretches or speeds up the audio playback automatically in
+conjunction with the video.</li>
+</ul>
+
+<h2 id="camera">Camera Features</h2>
+<p>This preview includes the following new APIs for accessing the camera’s flashlight and for
+camera reprocessing of images:</p>
+
+<h3 id="flashlight">Flashlight API</h3>
+<p>If a camera device has a flash unit, you can call the {@code CameraManager.setTorchMode()}
+method to switch the flash unit’s torch mode on or off without opening the camera device. The app
+does not have exclusive ownership of the flash unit or the camera device. The torch mode is turned
+off and becomes unavailable whenever the camera device becomes unavailable, or when other camera
+resources keeping the torch on become unavailable. Other apps can also call {@code setTorchMode()}
+to turn off the torch mode. When the last app that turned on the torch mode is closed, the torch
+mode is turned off.</p>
+
+<p>You can register a callback to be notified about torch mode status by calling the
+{@code CameraManager.registerTorchCallback()} method. The first time the callback is registered,
+it is immediately called with the torch mode status of all currently known camera devices with a
+flash unit. If the torch mode is turned on or off successfully, the
+{@code CameraManager.TorchCallback.onTorchModeChanged()} method is invoked.</p>
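+
+<p>The following minimal sketch (error handling is simplified) turns on the torch of the first
+camera that reports a flash unit:</p>
+
+<pre>
+CameraManager manager =
+        (CameraManager) context.getSystemService(Context.CAMERA_SERVICE);
+try {
+    for (String id : manager.getCameraIdList()) {
+        Boolean hasFlash = manager.getCameraCharacteristics(id)
+                .get(CameraCharacteristics.FLASH_INFO_AVAILABLE);
+        if (Boolean.TRUE.equals(hasFlash)) {
+            manager.setTorchMode(id, true);  // turn the torch on without opening the camera
+            break;
+        }
+    }
+} catch (CameraAccessException e) {
+    // The torch or camera device is currently unavailable
+}
+</pre>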
+
+<h3 id="reprocessing">Reprocessing API</h3>
+<p>The {@link android.hardware.camera2 Camera2} API is extended to support YUV and private
+opaque format image reprocessing. Your app can determine whether the reprocessing capabilities are available
+via {@code CameraCharacteristics.REQUEST_AVAILABLE_CAPABILITIES}. If a device supports reprocessing,
+you can create a reprocessable camera capture session by calling
+{@code CameraDevice.createReprocessableCaptureSession()}, and create requests for input
+buffer reprocessing.</p>
+
+<p>Use the {@code ImageWriter} class to connect the input buffer flow to the camera reprocessing
+input. To fill and submit an input buffer, follow this programming model:</p>
+
+<ol>
+<li>Call the {@code ImageWriter.dequeueInputImage()} method.</li>
+<li>Fill the data into the input buffer.</li>
+<li>Send the buffer to the camera by calling the {@code ImageWriter.queueInputImage()} method.</li>
+</ol>
+
+<p>If you are using an {@code ImageWriter} object together with an
+{@code android.graphics.ImageFormat.PRIVATE} image, your app cannot access the image
+data directly. Instead, pass the {@code ImageFormat.PRIVATE} image directly to the
+{@code ImageWriter} by calling the {@code ImageWriter.queueInputImage()} method without any
+buffer copy.</p>
+
+<p>The {@code ImageReader} class now supports {@code android.graphics.ImageFormat.PRIVATE} format
+image streams. This support allows your app to maintain a circular image queue of
+{@code ImageReader} output images, select one or more images, and send them to the
+{@code ImageWriter} for camera reprocessing.</p>
+
+<h2 id="afw">Android for Work Features</h2>
+<p>This preview includes the following new APIs for Android for Work:</p>
 <ul>
   <li><strong>Enhanced controls for Corporate-Owned, Single-Use devices:</strong> The Device Owner
 can now control the following settings to improve management of
@@ -325,13 +459,13 @@
 <li><strong>Auto-acceptance of system updates.</strong> By setting a system update policy with
 {@code DevicePolicyManager.setSystemUpdatePolicy()}, a Device Owner can now auto-accept a system
 update, for instance in the case of a kiosk device, or postpone the update and prevent it being
-taken by the user for up to 30 days. Furthermore, an administrator can set a time window in which an
-update must be taken, for example during the hours when a kiosk device is not in use. When a
-system update is available, the system checks if the Work Policy Controller app has set a system
+taken by the user for up to 30 days. Furthermore, an administrator can set a daily time window in
+which an update must be taken, for example during the hours when a kiosk device is not in use. When
+a system update is available, the system checks if the Work Policy Controller app has set a system
 update policy, and behaves accordingly.
 </li>
 <li>
-<strong>Delegated certificate installation.</strong> A Profile or Device Owner can now grant a
+<strong>Delegated certificate installation:</strong> A Profile or Device Owner can now grant a
 third-party app the ability to call these {@link android.app.admin.DevicePolicyManager} certificate
 management APIs:
 <ul>
@@ -349,21 +483,49 @@
 installKeyPair()}</li>
 </ul>
 </li>
-<li><strong>Enterprise factory reset protection.</strong> When provisioning a Device Owner, you can
-now configure parameters for bypassing Factory Reset Protection (FRP), by setting the
+<li><strong>Enterprise factory reset protection:</strong> When provisioning a Device Owner, you can
+now configure parameters to unlock Factory Reset Protection (FRP) by setting the
 {@code DeviceManagerPolicy.EXTRA_PROVISIONING_RESET_PROTECTION_PARAMETERS} bundle. An NFC Programmer
-app can provide these parameters after a device has been reset to bypass FRP and provision the device,
+app can provide these parameters after a device has been reset to unlock FRP and provision the device,
 without requiring the previously configured Google account. If you don't modify these parameters,
 FRP remains in-place and prevents the device from being activated without the previously activated
-Google credentials.</li>
-<li><strong>Data usage tracking.</strong> A Profile or Device Owner can now query for the data
-usage statistics visible in <em>Settings > Data</em> usage by using the new
+Google credentials.
+<p>Additionally, by setting app restrictions on Google Play services, Device Owners can specify
+alternative Google accounts for unlocking FRP to replace the ones activated on the device.</p>
+</li>
+<li><strong>Data usage tracking:</strong> A Profile or Device Owner can now query for the
+data usage statistics visible in <strong>Settings > Data usage</strong> by using the new
 {@code android.app.usage.NetworkStatsManager} methods. Profile Owners are automatically granted
 permission to query data on the profile they manage, while Device Owners get access to usage data
 of the managed primary user.</li>
+<li><strong>Runtime permission management:</strong> With the new runtime permissions model, a
+Profile or Device Owner can now silently grant or revoke an app’s permissions by calling
+{@code DevicePolicyManager.setPermissionGranted()}. Granting or revoking a single permission applies
+that setting to all permissions within that runtime permission group; the user is not prompted
+at runtime when any permission from that runtime permission group is required. Furthermore, the
+user cannot modify the selection made by the Profile or Device Owner within the app’s permissions
+screen in <strong>Settings</strong>.
+<img src="{@docRoot}preview/images/work-profile-screen_2x.png"
+srcset="{@docRoot}preview/images/work-profile-screen.png 1x, preview/images/work-profile-screen_2x.png 2x"
+style="float:right; margin:0 0 10px 20px" width="282" height="476" />
+<p>A Profile or Device Owner can also set a permission policy
+for all runtime requests of all applications using
+{@code DevicePolicyManager.setPermissionPolicy()}, to either prompt the user to grant the
+permission as normal or automatically grant or deny the permission silently. If the latter policy
+is set, the user cannot modify the selection made by the Profile or Device Owner within the
+app’s permissions screen in <strong>Settings</strong>.</p></li>
+<li><strong>VPN in Settings:</strong> VPN apps are now visible in
+    <strong>Settings > More > VPN</strong>.
+Additionally, the notifications that accompany VPN usage are now specific to whether that VPN is
+configured for a managed profile or the entire device.</li>
+<li><strong>Work status notification:</strong> A status bar briefcase icon now appears whenever
+an app from the managed profile has an activity in the foreground. Furthermore, if the device is
+unlocked directly to the activity of an app in the managed profile, a toast is displayed notifying
+the user that they are within the work profile.
+</li>
 </ul>
 
 <p class="note">
   For a detailed view of all API changes in the M Developer Preview, see the <a href=
-  "{@docRoot}preview/reference.html">API Differences Report</a>.
+  "{@docRoot}preview/download.html">API Differences Report</a>.
 </p>
\ No newline at end of file
diff --git a/docs/html/preview/behavior-changes.jd b/docs/html/preview/behavior-changes.jd
new file mode 100644
index 0000000..0dd549b
--- /dev/null
+++ b/docs/html/preview/behavior-changes.jd
@@ -0,0 +1,407 @@
+page.title=Behavior Changes
+page.keywords=preview,sdk,compatibility
+sdk.platform.apiLevel=23
+@jd:body
+
+<div id="qv-wrapper">
+<div id="qv">
+
+<h2>In this document</h2>
+
+<ol id="toc44" class="hide-nested">
+    <li><a href="#behavior-runtime-permissions">Runtime Permissions</a></li>
+    <li><a href="#behavior-project-volta">Project Volta</a>
+        <ol>
+            <li><a href="#behavior-doze">Doze mode</a></li>
+            <li><a href="#behavior-app-standby">App Standby</a></li>
+        </ol>
+    </li>
+    <li><a href="#behavior-adoptable-storage">Adoptable Storage Devices</a></li>
+    <li><a href="#behavior-apache-http-client">Apache HTTP Client Removal</a></li>
+    <li><a href="#behavior-audiomanager-Changes">AudioManager Changes</a></li>
+    <li><a href="#behavior-test-selection">Text Selection</a></li>
+    <li><a href="#behavior-keystore">Android Keystore Changes</a></li>
+    <li><a href="#night-mode">Night Mode</a></li>
+    <li><a href="#behavior-network">Wi-Fi and Networking Changes</a></li>
+    <li><a href="#behavior-camera">Camera Service Changes</a></li>
+    <li><a href="#behavior-art-runtime">ART Runtime</a></li>
+    <li><a href="#behavior-apk-validation">APK Validation</a></li>
+    <li><a href="#behavior-afw">Android for Work Changes</a></li>
+</ol>
+
+<h2>API Differences</h2>
+<ol>
+<li><a href="{@docRoot}preview/download.html">API level 22 to M Preview &raquo;</a> </li>
+</ol>
+
+
+<h2>See Also</h2>
+<ol>
+<li><a href="{@docRoot}preview/api-overview.html">M Developer Preview API Overview</a> </li>
+</ol>
+
+</div>
+</div>
+
+<p>API Level: M</p>
+<p>Along with new features and capabilities, the M Developer Preview includes a variety of
+system changes and API behavior changes. This document highlights
+some of the key changes that you should understand and account for in your apps.</p>
+
+<p>If you have previously published an app for Android, be aware that your app
+  might be affected by these changes in the platform.</p>
+
+<h2 id="behavior-runtime-permissions">Runtime Permissions</h1>
+<p>This preview introduces a new runtime permissions model, where users can now directly manage
+their app permissions at runtime. This model gives users improved visibility and control over
+permissions, while streamlining the installation and auto-update processes for app developers.
+Users can turn permissions on or off for any app running on the M Preview. However, apps that
+don’t target the M Preview cannot request permissions at runtime.</p>
+
+<p>In apps that target the M Preview, make sure to check for and request permissions at
+runtime. To determine whether your app has been granted a permission, call the
+new {@code Context.checkSelfPermission()} method. To request a permission, call the new
+{@code Activity.requestPermissions()} method.</p>
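+
+<p>As a minimal sketch (using the {@code READ_CONTACTS} permission as an example and an
+arbitrary request code), the runtime check and request might look like this:</p>
+
+<pre>
+private static final int REQUEST_READ_CONTACTS = 1;  // arbitrary request code
+
+void ensureContactsPermission(Activity activity) {
+    if (activity.checkSelfPermission(Manifest.permission.READ_CONTACTS)
+            != PackageManager.PERMISSION_GRANTED) {
+        // The result is delivered to the activity's onRequestPermissionsResult() callback
+        activity.requestPermissions(
+                new String[]{Manifest.permission.READ_CONTACTS},
+                REQUEST_READ_CONTACTS);
+    }
+}
+</pre>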
+
+<p>For more information on supporting the new permissions model in your app, see
+<a href="{@docRoot}preview/features/runtime-permissions.html">
+Android M Preview Runtime Permissions</a>.</p>
+
+<h2 id="behavior-project-volta">Project Volta</h2>
+<p>This preview introduces new power-saving optimizations for idle devices and apps.</p>
+
+<h3 id="behavior-doze">Doze mode</h3>
+<p>If a device is unplugged and left stationary with the screen off for a period of time, it
+goes into <em>Doze</em> mode where it attempts to keep the system in a sleep state. In this mode,
+devices periodically resume normal operations for brief periods of time so that app syncing can
+occur and the system can perform any pending operations.</p>
+
+<p>The following restrictions apply to your apps while in Doze mode:</p>
+<ul>
+<li>Network access is disabled, unless your app receives a high priority Google Cloud Messaging
+tickle.</li>
+<li><a href="{@docRoot}reference/android/os/PowerManager.WakeLock.html">Wake locks</a> are ignored.</li>
+<li>Alarms scheduled with the {@link android.app.AlarmManager} class are disabled, except for
+alarms that you've set with the {@link android.app.AlarmManager#setAlarmClock setAlarmClock()}
+method and {@code AlarmManager.setAndAllowWhileIdle()}.</li>
+<li>WiFi scans are not performed.</li>
+<li>Syncs and jobs for your sync adapters and {@link android.app.job.JobScheduler} are not
+permitted to run.</li>
+</ul>
+<p>When the device exits Doze mode, it executes any jobs and syncs that are pending.</p>
+<p>You can test this feature by connecting a device running the M Preview to your development
+machine and calling the following commands:
+</p>
+<pre class="no-prettyprint">
+$ adb shell dumpsys battery unplug
+$ adb shell dumpsys deviceidle step
+$ adb shell dumpsys deviceidle -h
+</pre>
+<p class="note"><strong>Note</strong>: The upcmoning
+<a href="{@docRoot}google/gcm/index.html">Google Cloud Messaging</a> release lets you designate
+high-priority messages. If your app receives high-priority GCM messages, it’s granted
+brief network access even when the device is in Doze mode.
+</p>
+
+<h3 id="behavior-app-standby">App standby</h3>
+<p>With this preview, the system may determine that apps are idle when they are not in active
+use. Your app is considered idle after a period of time, unless the system detects
+any of these signals:</p>
+
+<ul>
+<li>The app has a process currently in the foreground (either as an activity or foreground service,
+or in use by another activity or foreground service).</li>
+<li>The app generates a notification that users see on the lock screen or in the
+notification tray.</li>
+<li>The user explicitly asks for the app to be exempt from optimizations,
+via <strong>Settings</strong>.</li>
+</ul>
+
+<p>If the device is unplugged, apps deemed idle will have their network access disabled
+and their syncs and jobs suspended. When the device is plugged into a power supply, these apps are
+allowed network access and can execute any jobs and syncs that are pending. If the
+device is idle for long periods of time, idle apps are allowed network access around once a day.</p>
+
+<p>You can test this feature by connecting a device running the M Preview to your development
+machine and calling the following commands:
+</p>
+<pre class="no-prettyprint">
+$ adb shell am broadcast -a android.os.action.DISCHARGING
+$ adb shell am set-idle &lt;packageName&gt; true
+$ adb shell am set-idle &lt;packageName&gt; false
+$ adb shell am get-idle &lt;packageName&gt;
+</pre>
+
+<p class="note"><strong>Note</strong>: The upcoming
+<a href="{@docRoot}google/gcm/index.html">Google Cloud Messaging</a> (GCM) release lets you
+designate high-priority messages. If your app receives high-priority GCM messages, it’s granted
+brief network access even when the app is idle.
+</p>
+
+<h2 id="behavior-adoptable-storage">Adoptable Storage Devices</h2>
+<p>
+With this preview, users can <em>adopt</em> external storage devices such as SD cards. Adopting an
+external storage device encrypts and formats the device to behave like internal storage. This
+feature allows users to move both apps and private data of those apps between storage devices. When
+moving apps, the system respects the
+<a href="{@docRoot}guide/topics/manifest/manifest-element.html#install">{@code android:installLocation}</a>
+preference in the manifest.</p>
+
+<p>If your app accesses the following APIs or fields, be aware that the file paths they return
+will dynamically change when the app is moved between internal and external storage devices.
+When building file paths, it is strongly recommended that you always call these APIs dynamically.
+Don’t use hardcoded file paths or persist fully-qualified file paths that were built previously.</p>
+
+<ul>
+<li>{@link android.content.Context} methods:
+    <ul>
+        <li>{@link android.content.Context#getFilesDir() getFilesDir()}</li>
+        <li>{@link android.content.Context#getCacheDir() getCacheDir()}</li>
+        <li>{@link android.content.Context#getCodeCacheDir() getCodeCacheDir()}</li>
+        <li>{@link android.content.Context#getDatabasePath(java.lang.String) getDatabasePath()}</li>
+        <li>{@link android.content.Context#getDir(java.lang.String,int) getDir()}</li>
+        <li>{@link android.content.Context#getNoBackupFilesDir() getNoBackupFilesDir()}</li>
+        <li>{@link android.content.Context#getFileStreamPath(java.lang.String) getFileStreamPath()}</li>
+        <li>{@link android.content.Context#getPackageCodePath() getPackageCodePath()}</li>
+        <li>{@link android.content.Context#getPackageResourcePath() getPackageResourcePath()}</li>
+    </ul>
+</li>
+<li>{@link android.content.pm.ApplicationInfo} fields:
+    <ul>
+        <li>{@link android.content.pm.ApplicationInfo#dataDir dataDir}</li>
+        <li>{@link android.content.pm.ApplicationInfo#sourceDir sourceDir}</li>
+        <li>{@link android.content.pm.ApplicationInfo#nativeLibraryDir nativeLibraryDir}</li>
+        <li>{@link android.content.pm.ApplicationInfo#publicSourceDir publicSourceDir}</li>
+        <li>{@link android.content.pm.ApplicationInfo#splitSourceDirs splitSourceDirs}</li>
+        <li>{@link android.content.pm.ApplicationInfo#splitPublicSourceDirs splitPublicSourceDirs}</li>
+    </ul>
+</li>
+</ul>
+
+<p>To debug this feature in the developer preview, you can enable adoption of a USB drive that is
+connected to an Android device through a USB On-The-Go (OTG) cable, by running this command:</p>
+
+<pre class="no-prettyprint">
+$ adb shell sm set-force-adoptable true
+</pre>
+
+<h2 id="behavior-apache-http-client">Apache HTTP Client Removal</h2>
+<p>This preview removes support for the Apache HTTP client. If your app is using this client and
+targets Android 2.3 (API level 9) or higher, use the {@link java.net.HttpURLConnection} class
+instead. This API is more efficient because it reduces network use through transparent compression
+and response caching, and minimizes power consumption. To continue using the Apache HTTP APIs, you
+must first declare the following compile-time dependency in your {@code build.gradle} file:
+</p>
+<pre>
+android {
+    useLibrary 'org.apache.http.legacy'
+}
+</pre>
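+
+<p>For apps moving off the Apache client, a minimal {@code HttpURLConnection} request might look
+like the following sketch (the endpoint is a placeholder):</p>
+
+<pre>
+URL url = new URL("https://www.example.com/api/items");  // placeholder endpoint
+HttpURLConnection connection = (HttpURLConnection) url.openConnection();
+try {
+    InputStream in = new BufferedInputStream(connection.getInputStream());
+    // Read and parse the response stream here.
+} finally {
+    connection.disconnect();
+}
+</pre>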
+<p>Android is moving away from OpenSSL to the
+<a href="https://boringssl.googlesource.com/boringssl/" class="external-link">BoringSSL</a>
+library. If you’re using the Android NDK in your app, don't link against cryptographic libraries
+that are not a part of the NDK API, such as {@code libcrypto.so} and {@code libssl.so}. These
+libraries are not public APIs, and may change or break without notice across releases and devices.
+In addition, you may expose yourself to security vulnerabilities. Instead, modify your
+native code to call the Java cryptography APIs via JNI or to statically link against a
+cryptography library of your choice.</p>
+
+<h2 id="behavior-audiomanager-Changes">AudioManager Changes</h2>
+<p>Setting the volume directly or muting specific streams via the {@link android.media.AudioManager}
+class is no longer supported. The {@link android.media.AudioManager#setStreamSolo(int,boolean)
+setStreamSolo()} method is deprecated, and you should call the
+{@code AudioManager.requestAudioFocus()} method instead. Similarly, the
+{@link android.media.AudioManager#setStreamMute(int,boolean) setStreamMute()} method is
+deprecated; instead, call the {@code AudioManager.adjustStreamVolume()} method
+and pass in the direction value {@code ADJUST_MUTE} or {@code ADJUST_UNMUTE}.</p>
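+
+<p>As a brief sketch, muting and unmuting the music stream with the new direction values looks
+like this:</p>
+
+<pre>
+AudioManager audioManager =
+        (AudioManager) context.getSystemService(Context.AUDIO_SERVICE);
+
+// Instead of setStreamMute(AudioManager.STREAM_MUSIC, true):
+audioManager.adjustStreamVolume(AudioManager.STREAM_MUSIC,
+        AudioManager.ADJUST_MUTE, 0 /* flags */);
+
+// Instead of setStreamMute(AudioManager.STREAM_MUSIC, false):
+audioManager.adjustStreamVolume(AudioManager.STREAM_MUSIC,
+        AudioManager.ADJUST_UNMUTE, 0 /* flags */);
+</pre>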
+
+<h2 id="behavior-test-selection">Text Selection</h2>
+
+<img src="{@docRoot}preview/images/text-selection.gif"
+style="float:right; margin:0 0 20px 30px" width="360" height="640" />
+
+<p>When users select text in your app, you can now display text selection actions such as
+<em>Cut</em>, <em>Copy</em>, and <em>Paste</em> in a
+<a href="http://www.google.com/design/spec/patterns/selection.html#selection-text-selection"
+class="external-link">floating toolbar</a>. The user interaction implementation is similar to that
+for the contextual action bar, as described in
+<a href="{@docRoot}guide/topics/ui/menus.html#CABforViews">
+Enabling the contextual action mode for individual views</a>.</p>
+
+<p>To implement a floating toolbar for text selection, make the following changes in your existing
+apps; a short sketch follows the list:</p>
+<ol>
+<li>In your {@link android.view.View} or {@link android.app.Activity} object, change your
+{@link android.view.ActionMode} calls from
+{@code startActionMode(Callback)} to {@code startActionMode(Callback, ActionMode.TYPE_FLOATING)}.</li>
+<li>Take your existing implementation of {@code ActionMode.Callback} and make it extend
+{@code ActionMode.Callback2} instead.</li>
+<li>Override the {@code Callback2.onGetContentRect()} method to provide the coordinates of the
+content {@link android.graphics.Rect} object (such as a text selection rectangle) in the view.</li>
+<li>If the rectangle positioning is no longer valid, and this is the only element to be invalidated,
+call the {@code ActionMode.invalidateContentRect()} method.</li>
+</ol>
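+
+<p>The following sketch combines these steps for a custom view; the menu resource and selection
+bounds are placeholders:</p>
+
+<pre>
+view.startActionMode(new ActionMode.Callback2() {
+    &#64;Override
+    public boolean onCreateActionMode(ActionMode mode, Menu menu) {
+        // R.menu.selection_actions is a placeholder menu resource
+        mode.getMenuInflater().inflate(R.menu.selection_actions, menu);
+        return true;
+    }
+
+    &#64;Override
+    public boolean onPrepareActionMode(ActionMode mode, Menu menu) {
+        return false;
+    }
+
+    &#64;Override
+    public boolean onActionItemClicked(ActionMode mode, MenuItem item) {
+        return false;
+    }
+
+    &#64;Override
+    public void onDestroyActionMode(ActionMode mode) {
+    }
+
+    &#64;Override
+    public void onGetContentRect(ActionMode mode, View view, Rect outRect) {
+        // Report the bounds of the selected content; these values are placeholders.
+        outRect.set(selectionLeft, selectionTop, selectionRight, selectionBottom);
+    }
+}, ActionMode.TYPE_FLOATING);
+</pre>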
+
+<p>If you are using <a href="{@docRoot}tools/support-library/index.html">
+Android Support Library</a> revision 22.2, be aware that floating toolbars are not
+backward-compatible and appcompat takes control over {@link android.view.ActionMode} objects by
+default. This prevents floating toolbars from being displayed. To enable
+{@link android.view.ActionMode} support in an
+{@link android.support.v7.app.AppCompatActivity}, call
+{@code android.support.v7.app.AppCompatActivity.getDelegate()}, then call
+{@code android.support.v7.app.AppCompatDelegate.setHandleNativeActionModesEnabled()} on the returned
+{@link android.support.v7.app.AppCompatDelegate} object and set the input
+parameter to {@code false}. This call returns control of {@link android.view.ActionMode} objects to
+the framework. On devices running the M Preview, this allows the framework to support
+{@link android.support.v7.app.ActionBar} or floating toolbar modes, while on pre-M Preview devices,
+only the {@link android.support.v7.app.ActionBar} modes are supported.</p>
+
+<h2 id="behavior-keystore">Android Keystore Changes</h2>
+<p>With this preview, the
+<a href="{@docRoot}training/articles/keystore.html">Android Keystore provider</a> no longer supports
+DSA. ECDSA is still supported.</p>
+
+<p>Keys which do not require encryption at rest will no longer be deleted when the secure lock screen
+is disabled or reset (for example, by the user or a Device Administrator). Keys which require
+encryption at rest will be deleted during these events.</p>
+
+<h2 id="night-mode">Night Mode (User-configurable Dark Theme)</h2>
+<p>
+Support for the {@code -night} resource qualifier has been updated. Previously, night mode was
+only available when a device was docked and in car mode. With this preview, night mode is
+available on
+all devices and is user-configurable via <strong>Settings > Display > Theme</strong>. You can adjust
+this setting globally using {@link android.app.UiModeManager#setNightMode(int) setNightMode()}. The
+Dark theme corresponds to {@link android.app.UiModeManager#MODE_NIGHT_YES}. When the device is in
+night mode, the resource framework prefers resources that have the {@code -night} qualifier. To
+take advantage of user-configurable Dark mode in your app, extend from the
+{@code Theme.Material.DayNight} set of themes rather than {@code Theme.Material} or
+{@code Theme.Material.Light}.
+</p>
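+
+<p>As a minimal sketch, an app with the appropriate access could switch the device into night mode
+globally like this:</p>
+
+<pre>
+UiModeManager uiModeManager =
+        (UiModeManager) context.getSystemService(Context.UI_MODE_SERVICE);
+uiModeManager.setNightMode(UiModeManager.MODE_NIGHT_YES);
+</pre>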
+
+<h2 id="behavior-network">Wi-Fi and Networking Changes</h2>
+
+<p>This preview introduces the following behavior changes to the Wi-Fi and networking APIs.</p>
+<ul>
+<li>Your apps can now change the state of {@link android.net.wifi.WifiConfiguration} objects only
+if you created these objects. You are not permitted to modify or delete
+{@link android.net.wifi.WifiConfiguration} objects created by the user or by other apps.
+</li>
+<li>
+Previously, if an app forced the device to connect to a specific Wi-Fi network by using
+{@link android.net.wifi.WifiManager#enableNetwork(int,boolean) enableNetwork()} with the
+{@code disableAllOthers=true} setting, the device disconnected from other networks such as
+cellular data. In this preview, the device no longer disconnects from such other networks. If
+your app’s {@code targetSdkVersion} is {@code “20”} or lower, it is pinned to the selected
+Wi-Fi network. If your app’s {@code targetSdkVersion} is {@code “21”} or higher, use the
+multinetwork APIs (such as
+{@link android.net.Network#openConnection(java.net.URL) openConnection()},
+{@link android.net.Network#bindSocket(java.net.Socket) bindSocket()}, and the new
+{@code ConnectivityManager.bindProcessToNetwork()} method) to ensure that its network traffic is
+sent on the selected network.</li>
+</ul>
+
+<h2 id="behavior-camera">Camera Service Changes</h2>
+<p>In this preview, the model for accessing shared resources in the camera service has been changed
+from the previous “first come, first serve” access model to an access model where high-priority
+processes are favored.  Changes to the service behavior include:</p>
+<ul>
+<li>Access to camera subsystem resources, including opening and configuring a camera device, is
+awarded based on the “priority” of the client application process. Application processes with
+user-visible or foreground activities are generally given a higher priority, making camera resource
+acquisition and use more dependable.</li>
+<li>Active camera clients for lower priority apps may be “evicted” when a higher priority
+application attempts to use the camera.  In the deprecated {@link android.hardware.Camera} API,
+this results in
+{@link android.hardware.Camera.ErrorCallback#onError(int,android.hardware.Camera) onError()} being
+called for the evicted client. In the {@link android.hardware.camera2 Camera2} API, it results in
+{@link android.hardware.camera2.CameraDevice.StateCallback#onDisconnected(android.hardware.camera2.CameraDevice) onDisconnected()}
+being called for the evicted client.</li>
+<li>On devices with appropriate camera hardware, separate application processes are able to
+independently open and use separate camera devices simultaneously. However, multi-process use
+cases, where simultaneous access causes significant degradation of performance or capabilities of
+any of the open camera devices, are now detected and disallowed by the camera service. This change
+may result in “evictions” for lower priority clients even when no other app is directly
+attempting to access the same camera device.
+</li>
+<li>
+Changing the current user causes active camera clients in apps owned by the previous user account
+to be evicted.  Access to the camera is limited to user profiles owned by the current device user.
+In practice, this means that a “Guest” account, for example, will not be able to leave running
+processes that use the camera subsystem when the user has switched to a different account.
+</li>
+</ul>
+
+<h2 id="behavior-art-runtime">ART Runtime</h2>
+<p>The ART runtime now properly implements access rules for the
+{@link java.lang.reflect.Constructor#newInstance(java.lang.Object...) newInstance()} method. This
+change fixes a problem where Dalvik was checking access rules incorrectly in previous versions.
+If your app uses the
+{@link java.lang.reflect.Constructor#newInstance(java.lang.Object...) newInstance()} method and you
+want to override access checks, call the
+{@link java.lang.reflect.Constructor#setAccessible(boolean) setAccessible()} method with the input
+parameter set to {@code true}. If your app uses the
+<a href="{@docRoot}tools/support-library/features.html#v7-appcompat">v7 appcompat library</a> or the
+<a href="{@docRoot}tools/support-library/features.html#v7-recyclerview">v7 recyclerview library</a>,
+you must update your app to use the latest versions of these libraries. Otherwise, make sure that
+any custom classes referenced from XML are updated so that their class constructors are accessible.</p>
+
+<p>This preview updates the behavior of the dynamic linker. The dynamic linker now understands the
+difference between a library’s {@code soname} and its path
+(<a href="https://code.google.com/p/android/issues/detail?id=6670" class="external-link">
+public bug 6670</a>), and search by {@code soname} is now
+implemented. Apps which previously worked that have bad {@code DT_NEEDED} entries
+(usually absolute paths on the build machine’s file system) may fail when loaded.</p>
+
+<p>The {@code dlopen(3) RTLD_LOCAL} flag is now correctly implemented. Note that
+{@code RTLD_LOCAL} is the default, so calls to {@code dlopen(3)} that didn’t explicitly use
+{@code RTLD_LOCAL} will be affected (unless your app explicitly used {@code RTLD_GLOBAL}). With
+{@code RTLD_LOCAL}, symbols will not be made available to libraries loaded by later calls to
+{@code dlopen(3)} (as opposed to being referenced by {@code DT_NEEDED} entries).</p>
+
+<h2 id="behavior-apk-validation">APK Validation</h2>
+<p>The platform now performs stricter validation of APKs. An APK is considered corrupt if a file is
+declared in the manifest but not present in the APK itself. An APK must be re-signed if any of the
+contents are removed.</p>
+
+<h2 id="behavior-afw">Android for Work Changes</h2>
+<p>This preview includes the following behavior changes for Android for Work:</p>
+<ul>
+<li><strong>Work contacts in personal contexts.</strong> The Google Dialer
+Call Log now displays work contacts when the user views past calls. Both
+work and personal contacts are now available to devices over Bluetooth, but you can hide work
+profile contacts through a device policy by calling the new
+{@code DevicePolicyManager.setBluetoothContactSharingDisabled()} method. Initiating a call still
+shows personal contacts, consistent with the experience in Android 5.0.
+</li>
+<li><strong>WiFi configuration removal:</strong> WiFi configurations added by a Profile Owner
+(for example, through calls to the
+{@link android.net.wifi.WifiManager#addNetwork(android.net.wifi.WifiConfiguration)
+addNetwork()} method) are now removed if that work profile is deleted.</li>
+<li><strong>WiFi configuration lockdown:</strong> Any WiFi configuration created by an active Device
+Owner can no longer be modified or deleted by the user. The user can still create and
+modify their own WiFi configurations, so long as the {@link android.os.UserManager} constant
+{@link android.os.UserManager#DISALLOW_CONFIG_WIFI} has not been set for that user.</li>
+<li><strong>Download Work Policy Controller via Google account addition:</strong> When a Google
+account that requires management via a Work Policy Controller (WPC) app is added to a device
+outside of a managed context, the add account flow now prompts the user to install the
+appropriate WPC. This behavior also applies to accounts added via
+<strong>Settings > Accounts</strong> in the initial device setup wizard.</li>
+<li><strong>Changes to specific DevicePolicyManager API behaviors:</strong>
+Calling the {@link android.app.admin.DevicePolicyManager#setCameraDisabled(android.content.ComponentName,boolean) setCameraDisabled()}
+method affects the camera for the calling user only; calling it from the managed profile doesn’t
+affect camera apps running on the primary user. In addition, the
+{@link android.app.admin.DevicePolicyManager#setKeyguardDisabledFeatures(android.content.ComponentName,int) setKeyguardDisabledFeatures()}
+method is now available for Profile Owners, in addition to Device Owners. A Profile Owner can set
+these keyguard restrictions:
+<ul>
+<li>{@link android.app.admin.DevicePolicyManager#KEYGUARD_DISABLE_TRUST_AGENTS} and
+    {@link android.app.admin.DevicePolicyManager#KEYGUARD_DISABLE_FINGERPRINT}, which affect the
+    keyguard settings for the profile’s parent user.</li>
+<li>{@link android.app.admin.DevicePolicyManager#KEYGUARD_DISABLE_UNREDACTED_NOTIFICATIONS}, which
+    only affects notifications generated by applications in the managed profile.</li>
+</ul>
+</li>
+</ul>
diff --git a/docs/html/preview/images/perf-test-frame-latency.png b/docs/html/preview/images/perf-test-frame-latency.png
new file mode 100644
index 0000000..87d1cfc
--- /dev/null
+++ b/docs/html/preview/images/perf-test-frame-latency.png
Binary files differ
diff --git a/docs/html/preview/images/perf-test-framestats.png b/docs/html/preview/images/perf-test-framestats.png
new file mode 100644
index 0000000..589a923
--- /dev/null
+++ b/docs/html/preview/images/perf-test-framestats.png
Binary files differ
diff --git a/docs/html/preview/images/work-profile-screen.png b/docs/html/preview/images/work-profile-screen.png
new file mode 100644
index 0000000..c3e4e44
--- /dev/null
+++ b/docs/html/preview/images/work-profile-screen.png
Binary files differ
diff --git a/docs/html/preview/images/work-profile-screen_2x.png b/docs/html/preview/images/work-profile-screen_2x.png
new file mode 100644
index 0000000..5dcf610
--- /dev/null
+++ b/docs/html/preview/images/work-profile-screen_2x.png
Binary files differ
diff --git a/docs/html/preview/preview_toc.cs b/docs/html/preview/preview_toc.cs
index 7e9f292..0371932 100644
--- a/docs/html/preview/preview_toc.cs
+++ b/docs/html/preview/preview_toc.cs
@@ -38,10 +38,9 @@
   </li>
 
   <li class="nav-section">
-    <div class="nav-section-header empty"><a href="<?cs var:toroot ?>preview/api-changes.html">
+    <div class="nav-section-header empty"><a href="<?cs var:toroot ?>preview/behavior-changes.html">
       Behavior Changes</a></div>
   </li>
-
   <li class="nav-section">
     <div class="nav-section-header empty"><a href="<?cs var:toroot ?>preview/samples.html">
       Samples</a></div>
diff --git a/docs/html/preview/testing/guide.jd b/docs/html/preview/testing/guide.jd
new file mode 100644
index 0000000..1879268
--- /dev/null
+++ b/docs/html/preview/testing/guide.jd
@@ -0,0 +1,176 @@
+page.title=Testing Guide
+page.image=images/cards/card-set-up_16-9_2x.png
+
+@jd:body
+
+<div id="qv-wrapper">
+  <div id="qv">
+    <h2>In this document</h2>
+      <ol>
+        <li><a href="#runtime-permissions">Testing Runtime Permissions</a></li>
+        <li><a href="#doze-standby">Testing Doze and App Standby</a></li>
+      </ol>
+  </div>
+</div>
+
+<p>
+  The Android M Developer Preview gives you an opportunity to ensure your apps work with the next
+  version of the platform. This preview includes a number of APIs and behavior changes that can
+  impact your app, as described in the <a href="{@docRoot}preview/api-overview.html">API
+  Overview</a> and <a href="{@docRoot}preview/behavior-changes.html">Behavior Changes</a>. In testing
+  your app with the preview, there are some specific system changes that you should focus on to
+  ensure that users have a good experience.
+</p>
+
+<p>
+  This guide describes what preview features to test and how to test them with your app. You should
+  prioritize testing of the following preview features, due to their high potential impact on your
+  app's behavior:
+</p>
+
+<ul>
+  <li><a href="#runtime-permissions">Runtime Permissions</a>
+  </li>
+  <li><a href="#doze-mode">Doze and App Standby</a>
+  </li>
+</ul>
+
+<p>
+  For more information about how to set up devices or virtual devices with a preview system image
+  for testing, see <a href="{@docRoot}preview/setup-sdk.html">Set up the Preview SDK</a>.
+</p>
+
+
+<h2 id="runtime-permissions">Testing Runtime Permissions</h2>
+
+<p>
+  The <a href="{@docRoot}preview/features/runtime-permissions.html">Runtime Permissions</a> feature
+  changes the way that permissions are allocated to your app by the user. Instead of granting all
+  permissions during the install procedure, your app must ask the user for individual permissions
+  at runtime. For users this behavior provides more granular control over each app’s activities, as
+  well as better context for understanding why the app is requesting a specific permission. Users
+  can grant or revoke an app's permissions individually at any time. This preview feature is the
+  one most likely to impact your app's behavior; it may prevent some of your app's features from
+  working, or cause them to work in a degraded state.
+</p>
+
+<p>
+  This change affects all apps running on the new platform, even those not targeting the new
+  platform version. The platform provides a limited compatibility behavior for legacy apps, but you
+  should begin planning your app’s migration to the new permissions model now, with a goal of
+  publishing an updated version of your app at the official platform launch.
+</p>
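+
+<p>
+  As a point of reference while testing, the following is a minimal sketch of the
+  check-then-request pattern for a single permission. The activity name, request code, and
+  helper methods are hypothetical placeholders for this example; see the
+  <a href="{@docRoot}preview/features/runtime-permissions.html">Runtime Permissions</a> page for
+  the complete pattern.
+</p>
+
+<pre>
+public class MapActivity extends Activity {
+    private static final int REQUEST_FINE_LOCATION = 0;  // arbitrary request code
+
+    private void loadMap() {
+        // Check whether the user has already granted the permission.
+        if (checkSelfPermission(Manifest.permission.ACCESS_FINE_LOCATION)
+                != PackageManager.PERMISSION_GRANTED) {
+            // Ask at runtime; the result is delivered to onRequestPermissionsResult().
+            requestPermissions(new String[]{Manifest.permission.ACCESS_FINE_LOCATION},
+                    REQUEST_FINE_LOCATION);
+            return;
+        }
+        showMapWithLocation();  // hypothetical helper for the full experience
+    }
+}
+</pre>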
+
+
+<h3 id="permission-test-tips">Test tips</h3>
+
+<p>
+  Use the following test tips to help you plan and execute testing of your app with the new
+  permissions behavior.
+</p>
+
+<ul>
+  <li>Identify your app’s current permissions and the related code paths.</li>
+  <li>Test user flows across permission-protected services and data.</li>
+  <li>Test with various combinations of granted and revoked permissions.</li>
+  <li>Use the {@code adb} tool to manage permissions from the command line:
+    <ul>
+      <li>List permissions and status by group:
+        <pre>adb shell pm list permissions -d -g</pre>
+      </li>
+      <li>Grant or revoke one or more permissions using the following syntax:<br>
+        <pre>adb shell pm [grant|revoke] &lt;permission.name&gt; ...</pre>
+      </li>
+    </ul>
+  </li>
+  <li>Analyze your app for services that use permissions.</li>
+</ul>
+
+<h3 id="permission-test-strategy">Test strategy</h3>
+
+<p>
+  The Runtime Permissions change affects the structure and design of your app, as well as
+  the user experience and flows you provide to users. You should assess your app’s current
+  permissions use and start planning for the new flows you want to offer. The official release of
+  the platform provides compatibility behavior, but you should plan on updating your app and not
+  rely on these behaviors.
+</p>
+
+<p>
+  Identify the permissions that your app actually needs and uses, and then find the various code
+  paths that use the permission-protected services. You can do this through a combination of
+  testing on the new platform and code analysis. In testing, you should focus on opting in to
+  runtime permissions by changing the app’s {@code targetSdkVersion} to the preview version. For
+  more information, see <a href="{@docRoot}preview/setup-sdk.html">Set up the Preview SDK</a>.
+</p>
+
+<p>
+  Test with various combinations of permissions revoked and added, to highlight the user flows that
+  depend on permissions. Where a dependency is not obvious or logical, you should consider
+  refactoring or compartmentalizing that flow to eliminate the dependency or make it clear why the
+  permission is needed.
+</p>
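+
+<p>
+  Continuing the earlier sketch, the following illustrates one way to remove a hard dependency:
+  handle a denied or revoked permission by degrading gracefully instead of failing. The request
+  code and helper methods are hypothetical placeholders.
+</p>
+
+<pre>
+@Override
+public void onRequestPermissionsResult(int requestCode, String[] permissions,
+        int[] grantResults) {
+    if (requestCode == REQUEST_FINE_LOCATION) {
+        if (grantResults.length &gt; 0
+                &amp;&amp; grantResults[0] == PackageManager.PERMISSION_GRANTED) {
+            showMapWithLocation();     // full experience
+        } else {
+            // Permission denied or revoked: keep the flow usable without location.
+            showMapWithoutLocation();  // degraded but functional
+        }
+    }
+}
+</pre>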
+
+<p>
+  For more information on the behavior of Runtime Permissions, testing, and best practices, see the
+  <a href="{@docRoot}preview/features/runtime-permissions.html">Runtime Permissions</a> developer
+  preview page.
+</p>
+
+
+<h2 id="doze-standby">Testing Doze and App Standby</h2>
+
+<p>
+  The Doze and App Standby power-saving features limit the amount of background processing that
+  your app can perform when a device is in an idle state or while your app is not in focus. The
+  restrictions the system may impose on apps include limited or no network access,
+  suspended background tasks, suspended notifications, and ignored wake requests and alarms. To ensure
+  that your app behaves properly with these power saving optimizations, you should test your app by
+  simulating these low power states.
+</p>
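+
+<p>
+  For example, one way to keep background work resilient to these restrictions is to schedule it
+  through {@code JobScheduler}, which the system can defer and batch while the device is idle.
+  The sketch below is illustrative only; {@code SyncJobService}, the job ID, and the constraints
+  are hypothetical placeholders.
+</p>
+
+<pre>
+// Defer non-urgent sync work to the system job scheduler instead of running it freely.
+// 'context' is any Context, such as your Activity or Application.
+ComponentName service = new ComponentName(context, SyncJobService.class);
+JobInfo job = new JobInfo.Builder(42, service)
+        .setRequiredNetworkType(JobInfo.NETWORK_TYPE_ANY)  // wait until a network is usable
+        .setPeriodic(6 * 60 * 60 * 1000L)                   // rough cadence, not exact timing
+        .build();
+JobScheduler scheduler =
+        (JobScheduler) context.getSystemService(Context.JOB_SCHEDULER_SERVICE);
+scheduler.schedule(job);
+</pre>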
+
+<h4 id="doze">Testing your app with Doze</h4>
+
+<p>To test Doze with your app:</p>
+
+<ol>
+<li>Configure a hardware device or virtual device with an M Preview system image.</li>
+<li>Connect the device to your development machine and install your app.</li>
+<li>Run your app and leave it active.</li>
+<li>Simulate the device going into Doze mode by running the following commands:
+
+<pre>
+$ adb shell dumpsys battery unplug
+$ adb shell dumpsys deviceidle step
+$ adb shell dumpsys deviceidle -h
+</pre>
+
+  </li>
+  <li>Observe the behavior of your app when the device is re-activated. Make sure it
+    recovers gracefully when the device exits Doze.</li>
+</ol>
+
+
+<h4 id="standby">Testing apps with App Standby</h4>
+
+<p>To test the App Standby mode with your app:</p>
+
+<ol>
+  <li>Configure a hardware device or virtual device with an M Preview system image.</li>
+  <li>Connect the device to your development machine and install your app.</li>
+  <li>Run your app and leave it active.</li>
+  <li>Simulate the app going into standby mode by running the following commands:
+
+<pre>
+$ adb shell am broadcast -a android.os.action.DISCHARGING
+$ adb shell am set-idle &lt;packageName&gt; true
+</pre>
+
+  </li>
+  <li>Simulate waking your app using the following command:
+    <pre>$ adb shell am set-idle &lt;packageName&gt; false</pre>
+  </li>
+  <li>Observe the behavior of your app when it is woken. Make sure it recovers gracefully
+    from standby mode. In particular, you should check if your app's Notifications and background
+    jobs continue to function as expected.</li>
+</ol>
diff --git a/docs/html/preview/testing/performance.jd b/docs/html/preview/testing/performance.jd
new file mode 100644
index 0000000..a61091f
--- /dev/null
+++ b/docs/html/preview/testing/performance.jd
@@ -0,0 +1,667 @@
+page.title=Testing Display Performance
+
+@jd:body
+
+
+<div id="qv-wrapper">
+  <div id="qv">
+    <h2>In this document</h2>
+      <ol>
+        <li><a href="#measure">Measuring UI Performance</a>
+          <ul>
+            <li><a href="#aggregate">Aggregate frame stats</a></li>
+            <li><a href="#timing-info">Precise frame timing info</a></li>
+            <li><a href="#timing-dump">Simple frame timing dump</a></li>
+            <li><a href="#collection-window">Controlling the window of stat collection</a></li>
+            <li><a href="#diagnose">Diagnosing performance regressions</a></li>
+            <li><a href="#resources">Additional resources</a></li>
+          </ul>
+        </li>
+        <li><a href="#automate">Automating UI Perfomance Tests</a>
+          <ul>
+            <li><a href="#ui-tests">Setting up UI tests</a></li>
+            <li><a href="#automated-tests">Setting up automated UI testing</a></li>
+            <li><a href="#triage">Triaging and fixing observed problems</a></li>
+          </ul>
+        </li>
+      </ol>
+  </div>
+</div>
+
+
+<p>
+  User interface (UI) performance testing ensures that your app not only meets its functional
+  requirements, but that user interactions with your app are buttery smooth, running at a
+  consistent 60 frames per second (<a href=
+  "https://www.youtube.com/watch?v=CaMTIgxCSqU&amp;index=25&amp;list=PLWz5rJ2EKKc9CBxr3BVjPTPoDPLdPIFCE">why
+  60fps?</a>), without any dropped or delayed frames, or as we like to call it, <em>jank</em>. This
+  document explains tools available to measure UI performance, and lays out an approach to
+  integrate UI performance measurements into your testing practices.
+</p>
+
+
+<h2 id="measure">Measuring UI Performance</h2>
+
+<p>
+  In order to improve performance, you first need the ability to measure the performance of
+  your system, and then to diagnose and identify problems that may arise from various parts of your
+  pipeline.
+</p>
+
+<p>
+  <em><a href="https://source.android.com/devices/tech/debug/dumpsys.html">dumpsys</a></em> is an
+  Android tool that runs on the device and dumps interesting information about the status of system
+  services. Passing the <em>gfxinfo</em> command to dumpsys provides an output in logcat with
+  performance information relating to frames of animation that are occurring during the recording
+  phase.
+</p>
+
+<pre>
+&gt; adb shell dumpsys gfxinfo &lt;PACKAGE_NAME&gt;
+</pre>
+
+<p>
+  This command can produce multiple different variants of frame timing data.
+</p>
+
+<h3 id="aggregate">Aggregate frame stats</h3>
+
+<p>
+  With the M Preview, the command prints out an aggregated analysis of frame data to logcat, collected
+  across the entire lifetime of the process. For example:
+</p>
+
+<pre class="noprettyprint">
+Stats since: 752958278148ns
+Total frames rendered: 82189
+Janky frames: 35335 (42.99%)
+90th percentile: 34ms
+95th percentile: 42ms
+99th percentile: 69ms
+Number Missed Vsync: 4706
+Number High input latency: 142
+Number Slow UI thread: 17270
+Number Slow bitmap uploads: 1542
+Number Slow draw: 23342
+</pre>
+
+<p>
+  These statistics convey, at a high level, the rendering performance of the app, as well
+  as its stability across many frames.
+</p>
+
+
+<h3 id="timing-info">Precise frame timing info</h3>
+
+<p>
+  The M Preview adds a new gfxinfo command, <em>framestats</em>, which provides
+  extremely detailed frame timing information for recent frames, so that you can track down and
+  debug problems more accurately.
+</p>
+
+<pre>
+&gt;adb shell dumpsys gfxinfo &lt;PACKAGE_NAME&gt; framestats
+</pre>
+
+<p>
+  This command prints out frame timing information, with nanosecond timestamps, from the last 120
+  frames produced by the app. Below is example raw output from <code>adb shell dumpsys gfxinfo
+  &lt;PACKAGE_NAME&gt; framestats</code>:
+</p>
+
+<pre class="noprettyprint">
+0,49762224585003,49762241251670,9223372036854775807,0,49762257627204,49762257646058,49762257969704,49762258002100,49762265541631,49762273951162,49762300914808,49762303675954,
+0,49762445152142,49762445152142,9223372036854775807,0,49762446678818,49762446705589,49762447268818,49762447388037,49762453551527,49762457134131,49762474889027,49762476150120,
+0,49762462118845,49762462118845,9223372036854775807,0,49762462595381,49762462619287,49762462919964,49762462968454,49762476194547,49762476483454,49762480214964,49762480911527,
+0,49762479085548,49762479085548,9223372036854775807,0,49762480066370,49762480099339,49762481013089,49762481085850,49762482232152,49762482478350,49762485657620,49762486116683,
+</pre>
+
+<p>
+  Each line of this output represents a frame produced by the app. Each line has a fixed number of
+  columns describing time spent in each stage of the frame-producing pipeline. The next section
+  describes this format in detail, including what each column represents.
+</p>
+
+
+<h4 id="fs-data-format">Framestats data format</h4>
+
+<p>
+  Since the block of data is output in CSV format, it's very straightforward to paste it into your
+  spreadsheet tool of choice, or to collect and parse it with a script. The following list explains the
+  format of the output data columns. All timestamps are in nanoseconds.
+</p>
+
+<ul>
+  <li>FLAGS
+    <ul>
+      <li>Rows with a ‘0’ for the FLAGS column can have their total frame time computed by
+      subtracting the INTENDED_VSYNC column from the FRAME_COMPLETED column.
+      </li>
+
+      <li>If this is non-zero, the row should be ignored, as the frame has been determined to be
+      an outlier from normal performance, where it is expected that layout &amp; draw take longer
+      than 16ms. Here are a few reasons this could occur:
+        <ul>
+          <li>The window layout changed (such as the first frame of the application or after a
+          rotation)
+          </li>
+
+          <li>It is also possible the frame was skipped, in which case some of the values will have
+          garbage timestamps. A frame can be skipped if, for example, it is out-running 60fps or if
+          nothing on-screen ended up being dirty; this is not necessarily a sign of a problem in
+          the app.
+          </li>
+        </ul>
+      </li>
+    </ul>
+  </li>
+
+  <li>VSYNC
+    <ul>
+      <li>The time value that was used in all the vsync listeners and drawing for the frame
+      (Choreographer frame callbacks, animations, View.getDrawingTime(), etc…)
+      </li>
+
+      <li>To understand more about VSYNC and how it influences your application, check out the
+      <a href=
+      "https://www.youtube.com/watch?v=1iaHxmfZGGc&amp;list=PLOU2XLYxmsIKEOXh5TwZEv89aofHzNCiu&amp;index=23">
+        Understanding VSYNC</a> video.
+      </li>
+    </ul>
+  </li>
+
+
+  <li>INTENDED_VSYNC
+    <ul>
+      <li>The intended start point for the frame. If this value is different from VSYNC, there
+      was work occurring on the UI thread that prevented it from responding to the vsync signal
+      in a timely fashion.
+      </li>
+    </ul>
+  </li>
+
+  <li>OLDEST_INPUT_EVENT
+    <ul>
+      <li>The timestamp of the oldest input event in the input queue, or Long.MAX_VALUE if
+      there were no input events for the frame.
+      </li>
+
+      <li>This value is primarily intended for platform work and has limited usefulness to app
+      developers.
+      </li>
+    </ul>
+  </li>
+
+  <li>NEWEST_INPUT_EVENT
+    <ul>
+      <li>The timestamp of the newest input event in the input queue, or 0 if there were no
+      input events for the frame.
+      </li>
+
+      <li>This value is primarily intended for platform work and has limited usefulness to app
+      developers.
+      </li>
+
+      <li>However it’s possible to get a rough idea of how much latency the app is adding by
+      looking at (FRAME_COMPLETED - NEWEST_INPUT_EVENT).
+      </li>
+    </ul>
+  </li>
+
+  <li>HANDLE_INPUT_START
+    <ul>
+      <li>The timestamp at which input events were dispatched to the application.
+      </li>
+
+      <li>By looking at the time between this and ANIMATION_START it is possible to measure how
+      long the application spent handling input events.
+      </li>
+
+      <li>If this number is high (&gt;2ms), this indicates the app is spending an unusually
+      long time processing input events, such as View.onTouchEvent(), which may indicate this
+      work needs to be optimized, or offloaded to a different thread. Note that there are some
+      scenarios, such as click events that launch new activities or similar, where it is
+      expected and acceptable that this number is large.
+      </li>
+    </ul>
+  </li>
+
+  <li>ANIMATION_START
+    <ul>
+      <li>The timestamp at which animations registered with Choreographer were run.
+      </li>
+
+      <li>By looking at the time between this and PERFORM_TRAVERSALS_START it is possible to
+      determine how long it took to evaluate all the animators (ObjectAnimator,
+      ViewPropertyAnimator, and Transitions being the common ones) that are running.
+      </li>
+
+      <li>If this number is high (&gt;2ms), check to see if your app has written any custom
+      animators or what fields ObjectAnimators are animating and ensure they are appropriate
+      for an animation.
+      </li>
+
+      <li>To learn more about Choreographer, check out the <a href=
+      "https://developers.google.com/events/io/sessions/325418001">For Butter or Worse</a>
+      video.
+      </li>
+    </ul>
+  </li>
+
+  <li>PERFORM_TRAVERSALS_START
+    <ul>
+      <li>If you subtract this value from DRAW_START, you can extract how long the layout
+      &amp; measure phases took to complete. (Note that during a scroll or animation, you would
+      hope this is close to zero.)
+      </li>
+
+      <li>To learn more about the measure &amp; layout phases of the rendering pipeline, check
+      out the <a href=
+      "https://www.youtube.com/watch?v=we6poP0kw6E&amp;list=PLOU2XLYxmsIKEOXh5TwZEv89aofHzNCiu&amp;index=27">
+        Invalidations, Layouts and Performance</a> video
+      </li>
+    </ul>
+  </li>
+
+  <li>DRAW_START
+    <ul>
+      <li>The time at which the draw phase of performTraversals started. This is the start
+      point of recording the display lists of any views that were invalidated.
+      </li>
+
+      <li>The time between this and SYNC_START is how long it took to call View.draw() on all
+      the invalidated views in the tree.
+      </li>
+
+      <li>For more information on the drawing model, see <a href=
+      "{@docRoot}guide/topics/graphics/hardware-accel.html#hardware-model">Hardware Acceleration</a>
+      or the <a href=
+      "https://www.youtube.com/watch?v=we6poP0kw6E&amp;list=PLOU2XLYxmsIKEOXh5TwZEv89aofHzNCiu&amp;index=27">
+        Invalidations, Layouts and Performance</a> video
+      </li>
+    </ul>
+  </li>
+
+  <li>SYNC_START
+    <ul>
+      <li>The time at which the sync phase of the drawing started.
+      </li>
+
+      <li>If the time between this and ISSUE_DRAW_COMMANDS_START is substantial (&gt;0.4ms or
+      so), it typically indicates a lot of new Bitmaps were drawn which must be uploaded to the
+      GPU.
+      </li>
+
+      <li>To understand more about the sync phase, check out the <a href=
+      "https://www.youtube.com/watch?v=VzYkVL1n4M8&amp;index=24&amp;list=PLOU2XLYxmsIKEOXh5TwZEv89aofHzNCiu">
+        Profile GPU Rendering</a> video
+      </li>
+    </ul>
+  </li>
+
+  <li>ISSUE_DRAW_COMMANDS_START
+    <ul>
+      <li>The time at which the hardware renderer started issuing drawing commands to the GPU.
+      </li>
+
+      <li>The time between this and FRAME_COMPLETED gives a rough idea of how much GPU work the
+      app is producing. Problems like too much overdraw or inefficient rendering effects show
+      up here.
+      </li>
+    </ul>
+  </li>
+
+  <li>SWAP_BUFFERS
+    <ul>
+      <li>The time at which eglSwapBuffers was called, relatively uninteresting outside of
+      platform work.
+      </li>
+    </ul>
+  </li>
+
+  <li>FRAME_COMPLETED
+    <ul>
+      <li>All done! The total time spent working on this frame can be computed by doing
+      FRAME_COMPLETED - INTENDED_VSYNC.
+      </li>
+    </ul>
+  </li>
+
+</ul>
+
+<p>
+  You can use this data in different ways. One simple but useful visualization is a
+  histogram showing the distribution of frame times (FRAME_COMPLETED - INTENDED_VSYNC) in
+  different latency buckets (see the figure below). This graph tells us at a glance that most
+  frames were very good, well below the 16ms deadline (depicted in red), but a few frames
+  were significantly over the deadline. We can look at changes in this histogram over time
+  to see wholesale shifts or new outliers being created. You can also graph input latency,
+  time spent in layout, or other similar interesting metrics based on the many timestamps
+  in the data.
+</p>
+
+<img src="{@docRoot}preview/images/perf-test-framestats.png">
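+
+<p>
+  As a rough illustration of how to post-process this data, the following sketch reads saved
+  framestats output and counts how many frames missed the 16ms deadline. The column indices
+  (0 = FLAGS, 1 = INTENDED_VSYNC, 12 = FRAME_COMPLETED) follow the order described above; the
+  class name and input file path are placeholder choices for this example.
+</p>
+
+<pre>
+import java.io.BufferedReader;
+import java.io.FileReader;
+
+public class FramestatsSummary {
+    public static void main(String[] args) throws Exception {
+        int total = 0;
+        int janky = 0;
+        BufferedReader in = new BufferedReader(new FileReader(args[0]));
+        String line;
+        while ((line = in.readLine()) != null) {
+            String[] cols = line.trim().split(",");
+            if (cols.length &lt; 13 || !"0".equals(cols[0])) {
+                continue;  // skip non-CSV lines and flagged (outlier or skipped) frames
+            }
+            // Total frame time in milliseconds: FRAME_COMPLETED - INTENDED_VSYNC.
+            double frameTimeMs = (Long.parseLong(cols[12]) - Long.parseLong(cols[1])) / 1e6;
+            total++;
+            if (frameTimeMs &gt; 16.0) {
+                janky++;
+            }
+        }
+        in.close();
+        System.out.println(total + " frames, " + janky + " over the 16ms deadline");
+    }
+}
+</pre>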
+
+
+<h3 id="timing-dump">Simple frame timing dump</h3>
+
+<p>
+  If <strong>Profile GPU rendering</strong> is set to <strong>In adb shell dumpsys gfxinfo</strong>
+  in Developer Options, the <code>adb shell dumpsys gfxinfo</code> command prints out timing
+  information for the most recent 120 frames, broken into a few different categories with
+  tab-separated-values. This data can be useful for indicating which parts of the drawing pipeline
+  may be slow at a high level.
+</p>
+
+<p>
+  Similar to <a href="#fs-data-format">framestats</a> above, this data is very
+  straightforward to paste into your spreadsheet tool of choice, or to collect and parse with
+  a script. The following graph shows a breakdown of where many frames produced by the app
+  were spending their time.
+</p>
+
+<img src="{@docRoot}preview/images/perf-test-frame-latency.png">
+
+<p>
+  The figure above shows the result of running gfxinfo, copying the output, pasting it into a
+  spreadsheet application, and graphing the data as stacked bars.
+</p>
+
+<p>
+  Each vertical bar represents one frame of animation; its height represents the number of
+  milliseconds it took to compute that frame of animation. Each colored segment of the bar
+  represents a different stage of the rendering pipeline, so that you can see what parts of
+  your application may be creating a bottleneck. For more information on understanding the
+  rendering pipeline, and how to optimize for it, see the <a href=
+  "https://www.youtube.com/watch?v=we6poP0kw6E&amp;index=27&amp;list=PLWz5rJ2EKKc9CBxr3BVjPTPoDPLdPIFCE">
+  Invalidations Layouts and Performance</a> video.
+</p>
+
+
+<h3 id="collection-window">Controlling the window of stat collection</h3>
+
+<p>
+  Both the framestats and simple frame timings gather data over a very short window, about
+  two seconds' worth of rendering. In order to precisely control this window of time (for
+  example, to constrain the data to a particular animation), you can reset all counters
+  and aggregated statistics gathered so far.
+</p>
+
+<pre>
+&gt;adb shell dumpsys gfxinfo &lt;PACKAGE_NAME&gt; reset
+</pre>
+
+<p>
+  This can also be used in conjunction with the dumping commands themselves to collect and
+  reset at a regular cadence, capturing less-than-two-second windows of frames
+  continuously.
+</p>
+
+
+<h3 id="diagnose">Diagnosing performance regressions</h3>
+
+<p>
+  Identification of regressions is a good first step to tracking down problems, and
+  maintaining high application health. However, dumpsys just identifies the existence and
+  relative severity of problems. You still need to diagnose the particular cause of the
+  performance problems, and find appropriate ways to fix them. For that, it’s highly
+  recommended to use the <a href="{@docRoot}tools/help/systrace.html">systrace</a> tool.
+</p>
+
+
+<h3 id="resources">Additional resources</h3>
+
+<p>
+  For more information on how Android’s rendering pipeline works, common problems that you
+  can find there, and how to fix them, some of the following resources may be useful to
+  you:
+</p>
+
+<ul>
+  <li>Rendering Performance 101
+  </li>
+  <li>Why 60fps?
+  </li>
+  <li>Android UI and the GPU
+  </li>
+  <li>Invalidations, Layouts and Performance
+  </li>
+  <li>Analyzing UI Performance with Systrace
+  </li>
+</ul>
+
+
+<h2 id="automate">Automating UI Perfomance Tests</h2>
+
+<p>
+  One approach to UI performance testing is to simply have a human tester perform a set of
+  user operations on the target app, and either visually look for jank, or spend a very
+  large amount of time using a tool-driven approach to find it. But this manual approach is
+  fraught with peril: human ability to perceive frame rate changes varies tremendously,
+  and this is also time consuming, tedious, and error prone.
+</p>
+
+<p>
+  A more efficient approach is to log and analyze key performance metrics from automated UI
+  tests. The Android M developer preview includes new logging capabilities which make it
+  easy to determine the amount and severity of jank in your application’s animations, and
+  that can be used to build a rigorous process to determine your current performance and
+  track future performance objectives.
+</p>
+
+<p>
+  This article walks you through a recommended approach to using that data to automate your
+  performance testing.
+</p>
+
+<p>
+  This mostly breaks down into two key actions: first, identifying what you're
+  testing and how you're testing it; and second, setting up and maintaining an
+  automated testing environment.
+</p>
+
+
+<h3 id="ui-tests">Setting up UI tests</h3>
+
+<p>
+  Before you can get started with automated testing, it's important to make a few high-level
+  decisions, in order to properly understand your test space and the needs you may have.
+</p>
+
+<h4>
+  Identify key animations / flows to test
+</h4>
+
+<p>
+  Remember that bad performance is most visible to users when it interrupts a smooth
+  animation. As such, when identifying what types of UI actions to test for, it’s useful to
+  focus on the key animations that users see most, or are most important to their
+  experience. For example, here are some common scenarios that may be useful to identify:
+</p>
+
+<ul>
+  <li>Scrolling a primary ListView or RecyclerView
+  </li>
+
+  <li>Animations during async wait cycles
+  </li>
+
+  <li>Any animation that may have bitmap loading / manipulation in it
+  </li>
+
+  <li>Animations including Alpha Blending
+  </li>
+
+  <li>Custom View drawing with Canvas
+  </li>
+</ul>
+
+<p>
+  Work with engineers, designers, and product managers on your team to prioritize these key
+  product animations for test coverage.
+</p>
+
+<h4>
+  Define your future objectives and track against them
+</h4>
+
+<p>
+  From a high level, it's important to identify your specific performance goals, and to
+  focus on writing tests and collecting data around them. For example:
+</p>
+
+<ul>
+  <li>Do you just want to begin tracking UI performance for the first time to learn more?
+  </li>
+
+  <li>Do you want to prevent regressions that might be introduced in the future?
+  </li>
+
+  <li>Are you at 90% of smooth frames today and want to get to 98% this quarter?
+  </li>
+
+  <li>Are you at 98% smooth frames and don’t want to regress?
+  </li>
+
+  <li>Is your goal to improve performance on low end devices?
+  </li>
+</ul>
+
+<p>
+  In all of these cases, you’ll want historical tracking which shows performance across
+  multiple versions of your application.
+</p>
+
+<h4>
+  Identify devices to test on
+</h4>
+
+<p>
+  Application performance varies depending on the device it's running on. Some devices may
+  contain less memory, less powerful GPUs, or slower CPUs. This means that animations
+  that perform well on one set of hardware may not on another and, worse, may be bottlenecked
+  in a different part of the pipeline. So, to account for this
+  variation in what a user might see, pick a range of devices to execute tests on, including
+  current high-end devices, low-end devices, and tablets. Look for variation in CPU
+  performance, RAM, screen density, size, and so on. Tests that pass on a high-end device
+  may fail on a low-end device.
+</p>
+
+<h4>
+  Basic frameworks for UI Testing
+</h4>
+
+<p>
+  Tool suites like <a href=
+  "https://developer.android.com/tools/testing-support-library/index.html">UIAutomator</a>,
+  and <a href="https://code.google.com/p/android-test-kit/">Espresso</a> are built to help
+  automate the action of a user moving through your application. These are simple
+  frameworks which mimic user interaction with your device. To use these frameworks, you
+  effectively create unique scripts, which run through a set of user-actions, and play them
+  out on the device itself.
+</p>
+
+<p>
+  By combining these automated tests with <code>dumpsys gfxinfo</code>, you can quickly
+  create a reproducible system that allows you to execute a test and measure the
+  performance information for that particular condition.
+</p>
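+
+<p>
+  For example, the following is a minimal UI Automator sketch that launches an app and
+  fling-scrolls its first scrollable container, so that a subsequent
+  <code>dumpsys gfxinfo</code> dump covers the scroll animation. The package name, class name,
+  and iteration counts are hypothetical placeholders.
+</p>
+
+<pre>
+import android.content.Context;
+import android.content.Intent;
+import android.support.test.InstrumentationRegistry;
+import android.support.test.runner.AndroidJUnit4;
+import android.support.test.uiautomator.By;
+import android.support.test.uiautomator.UiDevice;
+import android.support.test.uiautomator.UiScrollable;
+import android.support.test.uiautomator.UiSelector;
+import android.support.test.uiautomator.Until;
+
+import org.junit.Test;
+import org.junit.runner.RunWith;
+
+@RunWith(AndroidJUnit4.class)
+public class ScrollPerfTest {
+    private static final String TARGET_PACKAGE = "com.example.app";  // hypothetical
+
+    @Test
+    public void flingMainList() throws Exception {
+        UiDevice device = UiDevice.getInstance(InstrumentationRegistry.getInstrumentation());
+        device.pressHome();
+
+        // Launch the app under test and wait for it to appear.
+        Context context = InstrumentationRegistry.getContext();
+        Intent intent = context.getPackageManager().getLaunchIntentForPackage(TARGET_PACKAGE);
+        intent.addFlags(Intent.FLAG_ACTIVITY_CLEAR_TASK);
+        context.startActivity(intent);
+        device.wait(Until.hasObject(By.pkg(TARGET_PACKAGE).depth(0)), 5000);
+
+        // Exercise the animation under test: fling the first scrollable view up and down.
+        UiScrollable list = new UiScrollable(new UiSelector().scrollable(true));
+        for (int i = 0; i &lt; 5; i++) {
+            list.flingForward();
+        }
+        for (int i = 0; i &lt; 5; i++) {
+            list.flingBackward();
+        }
+    }
+}
+</pre>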
+
+
+<h3 id="automated-tests">Setting up automated UI testing</h3>
+
+<p>
+  Once you have the ability to execute a UI test, and a pipeline to gather the data from a
+  single test, the next important step is to embrace a framework which can execute that
+  test multiple times, across multiple devices, and aggregate the resulting performance
+  data for further analysis by your development team.
+</p>
+
+<h4>
+  A framework for test automation
+</h4>
+
+<p>
+  It's worth noting that UI testing frameworks (like <a href=
+  "https://developer.android.com/tools/testing-support-library/index.html">UIAutomator</a>)
+  run directly on the target device or emulator, while the performance-information gathering done
+  by <em>dumpsys gfxinfo</em> is driven by a host machine sending commands over ADB. To
+  help bridge the automation of these separate entities, the <a href=
+  "{@docRoot}tools/help/monkeyrunner_concepts.html">monkeyrunner</a> framework was
+  developed: a scripting system that runs on your host machine and can issue commands to
+  a set of connected devices, as well as receive data from them.
+</p>
+
+<p>
+  At a minimum, a set of scripts for proper automation of UI performance testing should be able
+  to use monkeyrunner (or similar host-side tooling, as in the sketch after this list) to
+  accomplish the following tasks:
+</p>
+
+<ul>
+  <li>Load and launch a desired APK onto a target device, set of devices, or emulator.
+  </li>
+
+  <li>Launch a UIAutomator UI test, and allow it to be executed
+  </li>
+
+  <li>Collect performance information through <em>dumpsys gfxinfo</em>.
+  </li>
+
+  <li>Aggregate information and display it back in a useful fashion to the developer.
+  </li>
+</ul>
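+
+<p>
+  As a host-side illustration of these tasks (using plain Java and the {@code adb} command line
+  rather than monkeyrunner), the following sketch installs the app and test APKs, resets the
+  frame counters, runs an instrumentation test, and saves the framestats dump for later
+  aggregation. The package names, APK file names, and output path are hypothetical placeholders.
+</p>
+
+<pre>
+import java.io.File;
+import java.io.IOException;
+
+public class PerfTestRunner {
+    static void adb(String... args) throws IOException, InterruptedException {
+        String[] cmd = new String[args.length + 1];
+        cmd[0] = "adb";
+        System.arraycopy(args, 0, cmd, 1, args.length);
+        new ProcessBuilder(cmd).inheritIO().start().waitFor();
+    }
+
+    static void adbToFile(File out, String... args) throws IOException, InterruptedException {
+        String[] cmd = new String[args.length + 1];
+        cmd[0] = "adb";
+        System.arraycopy(args, 0, cmd, 1, args.length);
+        new ProcessBuilder(cmd).redirectOutput(out).start().waitFor();
+    }
+
+    public static void main(String[] args) throws Exception {
+        String appPkg = "com.example.app";        // hypothetical app under test
+        String testPkg = "com.example.app.test";  // hypothetical test APK package
+
+        adb("install", "-r", "app-debug.apk");
+        adb("install", "-r", "app-debug-androidTest.apk");
+        // Reset gfxinfo counters so the dump only covers the test run.
+        adb("shell", "dumpsys", "gfxinfo", appPkg, "reset");
+        // Run the UI test (for example, the ScrollPerfTest sketch above).
+        adb("shell", "am", "instrument", "-w",
+                testPkg + "/android.support.test.runner.AndroidJUnitRunner");
+        // Collect per-frame timing data for aggregation and analysis.
+        adbToFile(new File("gfxinfo-framestats.txt"),
+                "shell", "dumpsys", "gfxinfo", appPkg, "framestats");
+    }
+}
+</pre>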
+
+
+<h3 id="triage">Triaging and fixing observed problems</h3>
+
+<p>
+  Once problem patterns or regressions are identified, the next step is identifying and
+  applying the fix. If your automated test framework preserves precise timing breakdowns
+  for frames, it can help you scrutinize recent suspicious code/layout changes (in the case
+  of regression), or narrow down the part of the system you’re analyzing when you switch to
+  manual investigation. For manual investigation, <a href=
+  "{@docRoot}tools/help/systrace.html">systrace</a> is a great place to start, showing
+  precise timing information about every stage of the rendering pipeline, every thread and
+  core in the system, as well as any custom event markers you define.
+</p>
+
+<h4>
+  Properly profiling temporal timings
+</h4>
+
+<p>
+  It is important to note the difficulties in obtaining and measuring timings that come from
+  rendering performance. These numbers are, by nature, nondeterministic, and often
+  fluctuate depending on the state of the system, the amount of memory available, thermal
+  throttling, and the last time a solar flare hit your area of the earth. The point is that
+  you can run the same test twice and get slightly different numbers that may be close to
+  each other, but not exact.
+</p>
+
+<p>
+  Properly gathering and profiling data in this manner means running the same test
+  multiple times and accumulating the results as an average or median value (for the
+  sake of simplicity, let's call this a <em>batch</em>). This gives you a rough approximation of
+  the performance of the test, without needing exact timings.
+</p>
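+
+<p>
+  As a small illustration, the following sketch summarizes a batch by taking the median of a
+  per-run metric, such as the janky-frame percentage reported by the aggregate
+  <code>dumpsys gfxinfo</code> output. The sample values are made up for this example; the
+  anomalous run (12.9%) shows why a median is more robust than a mean.
+</p>
+
+<pre>
+import java.util.Arrays;
+
+public class BatchSummary {
+    static double median(double[] values) {
+        double[] sorted = values.clone();
+        Arrays.sort(sorted);
+        int mid = sorted.length / 2;
+        return (sorted.length % 2 == 1)
+                ? sorted[mid]
+                : (sorted[mid - 1] + sorted[mid]) / 2.0;
+    }
+
+    public static void main(String[] args) {
+        // Janky-frame percentage from ten runs of the same test; one run hit a device hiccup.
+        double[] jankyPercentPerRun = {4.1, 3.8, 5.2, 4.4, 4.0, 12.9, 4.3, 4.2, 3.9, 4.5};
+        System.out.println("Median janky frames: " + median(jankyPercentPerRun) + "%");
+    }
+}
+</pre>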
+
+<p>
+  Batches can be compared between code changes to see the relative impact of those changes on
+  performance. If the average frame time for the pre-change batch is larger than that of the
+  post-change batch, then you generally have an overall performance win for that
+  particular change.
+</p>
+
+<p>
+  This means that any automated UI testing you do should take this concept into
+  consideration, and also account for any anomalies that might occur during a test. For
+  example, if your application performance suddenly dips due to some device issue (one that
+  isn't caused by your application), then you may want to re-run the batch in order to get
+  less chaotic timings.
+</p>
+
+<p>
+  So, how many times should you run a test before the measurements become meaningful? Ten
+  times should be the minimum, with higher numbers like 50 or 100 yielding more accurate
+  results (of course, you're now trading off time for accuracy).
+</p>