Merge "docs: App Translation API updates" into mnc-docs
diff --git a/docs/html/distribute/essentials/_book.yaml b/docs/html/distribute/essentials/_book.yaml
index e8b7811..22418b9 100644
--- a/docs/html/distribute/essentials/_book.yaml
+++ b/docs/html/distribute/essentials/_book.yaml
@@ -18,6 +18,9 @@
 - title: Auto App Quality
   path: /distribute/essentials/quality/auto.html
 
+- title: Building for Billions
+  path: /distribute/essentials/quality/billions.html
+
 - title: Launch Checklist
   path: /distribute/tools/launch-checklist.html
   custom_link_attributes:
diff --git a/docs/html/distribute/essentials/essentials_toc.cs b/docs/html/distribute/essentials/essentials_toc.cs
index a78252d..374f338 100644
--- a/docs/html/distribute/essentials/essentials_toc.cs
+++ b/docs/html/distribute/essentials/essentials_toc.cs
@@ -29,6 +29,12 @@
           </a>
     </div>
   </li>
+  <li class="nav-section">
+    <div class="nav-section-header empty" style="font-weight:normal"><a href="<?cs var:toroot?>distribute/essentials/quality/billions.html">
+            <span class="en">Building for Billions</span>
+          </a>
+    </div>
+  </li>
 
   <li class="nav-section">
     <div class="nav-section empty" style="font-weight:normal"><a href="<?cs var:toroot?>distribute/tools/launch-checklist.html" zh-cn-lang="发布检查清单">
diff --git a/docs/html/distribute/essentials/quality/billions.jd b/docs/html/distribute/essentials/quality/billions.jd
new file mode 100644
index 0000000..7042143
--- /dev/null
+++ b/docs/html/distribute/essentials/quality/billions.jd
@@ -0,0 +1,788 @@
+page.title=Building for Billions
+page.metaDescription=Best practices for optimizing Android apps for low- and no-bandwidth conditions and for low-cost devices.
+page.image=/distribute/images/billions-guidelines.png
+
+@jd:body
+
+<!-- table of contents -->
+<div id="qv-wrapper"><div id="qv">
+<h2><a href="#connectivity">Connectivity</a></h2>
+ <ol>
+  <li><a href="#images">Optimize images</a></li>
+  <li><a href="#network">Optimize networking</a></li>
+  <li><a href="#transfer">Fine-tune data transfer</a></li>
+ </ol>
+<h2><a href="#capability">Device Capability</a></h2>
+ <ol>
+  <li><a href="#screens">Support varying screen sizes</a></li>
+  <li><a href="#compatibility">Backward compatibility</a></li>
+  <li><a href="#memory">Efficient memory usage</a></li>
+ </ol>
+  
+<h2><a href="#cost">Data Cost</a></h2>
+ <ol>
+  <li><a href="#appsize">Reduce app size</a></li>
+  <li><a href="#configurablenetwork">Offer configurable network usage</a></li>
+ </ol>
+
+<h2><a href="#consumption">Battery Consumption</a></h2>
+ <ol>
+  <li><a href="#consumption-reduce">Reduce battery consumption</a></li>
+  <li><a href="#consumption-benchmark">Benchmark battery usage</a></li>
+ </ol>
+
+<h2><a href="#contentsection">Content</a></h2>
+ <ol>
+  <li><a href="#content-responsive">Fast and responsive UI</a></li>
+  <li><a href="#ui">UI Best practices</a></li>
+  <li><a href="#localization">Localization</a></li>
+ </ol>
+</div>
+</div>
+
+<!-- intro -->
+<p>Internet use—and smartphone penetration—is growing fastest in markets with
+ low, intermittent, or expensive connectivity. To succeed in these markets, 
+ apps need to perform across a variety of speeds and devices, conserve 
+ battery and data, and give users visibility into their consumption.</p>
+
+<p>To help you address these considerations, we’ve compiled the following
+ checklist. The items are in no particular order, and, as always, it’s a 
+ good idea to research the particulars of any market or country you’re 
+ targeting. 
+</p>
+
+<!-- connectivity -->
+<div class="headerLine">
+  <h2 id="connectivity">Connectivity</h2>
+</div>
+
+<p>Over half of the users in the world still experience your app over 2G
+ connections. To improve their experience, optimize for offline use and 
+ low connection speeds: store data locally, queue outbound requests, and 
+ handle images for optimal performance.
+</p>
+
+<h3 id="images">Optimize images</h3>
+<h4 id="images-format">Serve WebP images</h4>
+ <ul>
+  <li>Serve <a
+   href="https://developers.google.com/speed/webp/">WebP</a> files over the 
+   network. WebP reduces image load times, saves network bandwidth, and often 
+   results in smaller file sizes than its PNG and JPG counterparts, with at 
+   least the same image quality. Even at lossy settings, WebP can produce a 
+   nearly identical image. Android has had lossy <a 
+   href="{@docRoot}guide/appendix/media-formats.html">WebP support</a> since 
+   Android 4.0 (API level 14: Ice Cream Sandwich) and support for lossless / 
+   transparent WebP since Android 4.2 (API level 17: Jelly Bean).</li>
+ </ul>
+<h4 id="images-sizing">Dynamic image sizing</h4>
+ <ul>
+  <li>Have your apps request images at the targeted rendering size, and have
+   your server provide those images to fit; the target rendering size will 
+   vary based on device specifications. Doing this minimizes the network 
+   overhead and reduces the amount of memory needed to hold each image, 
+   resulting in improved performance and user satisfaction.</li>
+  <li>Your user experience degrades when users are waiting for images to
+   download. Using appropriate image sizes helps to address these issues. 
+   Consider making image size requests based on network type or network 
+   quality; this size could be smaller than the target rendering size.</li>
+  <li>Dynamic placeholders like <a
+   href="{@docRoot}reference/android/support/v7/graphics/Palette.html">
+   pre-computed palette values</a> or low-resolution thumbnails can improve 
+   the user experience while the image is being fetched.</li>
+ </ul>
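The sizing decision above can be sketched in plain Java. This helper and its halving heuristic are illustrative, not an Android API; on a device the network signal would come from `ConnectivityManager`, but here it is a plain parameter so the logic is visible:

```java
// Hypothetical helper: decide what image width (in px) to request from
// your image server, based on the target rendering size and network quality.
public class ImageSizer {
    public static int requestWidth(int targetRenderWidthPx, boolean fastNetwork) {
        if (fastNetwork) {
            return targetRenderWidthPx;          // request exactly the render size
        }
        // On slow networks, request a half-width image and let the view
        // scale it up, trading sharpness for a faster download.
        return Math.max(1, targetRenderWidthPx / 2);
    }
}
```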
+<h4 id="images-libraries">Use image loading libraries</h4>
+ <ul>
+  <li>Your app should not have to fetch any image more than once. Image
+   loading libraries such as <a class="external-link" 
+   href="https://github.com/bumptech/glide">Glide</a> and  <a 
+   class="external-link" href="http://square.github.io/picasso/">Picasso</a> 
+   fetch the image, cache it, and provide hooks into your Views to show 
+   placeholder images until the actual images are ready. Because images are 
+   cached, these libraries return the local copy the next time they are 
+   requested.</li>
+  <li>Image-loading libraries manage their cache, holding onto the most recent
+   images so that your app storage doesn’t grow indefinitely.</li>
+ </ul>
+
+<h3 id="network">Optimize networking</h3>
+<h4 id="network-offline">Make your app usable offline</h4>
+ <ul>
+  <li>In places like subways, planes, elevators, and parking garages, it is
+   common for devices to lose network connectivity. A useful offline state 
+   lets users interact with the app at all times by presenting cached 
+   information. Ensure that your app is usable offline or when network 
+   connectivity is poor by storing data locally, caching data, and queuing 
+   outbound requests for when connectivity is restored.</li>
+  <li>Where possible, apps should not notify users that connectivity has
+   been lost. Notify the user only when they perform an operation for which 
+   connectivity is essential.</li>
+  <li>When a device lacks connectivity, your app should batch up network
+   requests&mdash;on behalf of the user&mdash;that can be executed when 
+   connectivity is restored. An example of this is an email client that allows 
+   users to compose, send, read, move, and delete existing mails even when the 
+   device is offline. These operations can be cached and executed when 
+   connectivity is restored. In doing so, the app is able to provide a similar 
+   user experience whether the device is online or offline.</li>
+ </ul>
+<h4 id="network-arch">Use GcmNetworkManager and/or Content Providers</h4>
+ <ul>
+  <li>Ensure that your app stores all data on disk via a database or similar
+   structure so that it performs optimally regardless of network conditions 
+   (for example, via SQLite + ContentProvider). The <a 
+   href="https://developers.google.com/cloud-messaging/network-manager">
+   GCM Network Manager</a> 
+   (<a href="https://developers.google.com/android/reference/com/google/android/gms/gcm/GcmNetworkManager">
+   <code>GcmNetworkManager</code></a>) provides a robust mechanism to 
+   sync data with servers, while <a 
+   href="{@docRoot}guide/topics/providers/content-providers.html">content 
+   providers</a> ({@link android.content.ContentProvider}) cache that data, 
+   combining to provide an architecture that enables a useful offline state.</li>
+  <li>Apps should cache content that is fetched from the network. Before making
+   subsequent requests, apps should display locally cached data. This ensures 
+   that the app is functional regardless of whether the device is offline or 
+   on a slow/unreliable network.</li>
+ </ul>
+<h4 id="network-duplicate">Deduplicate network requests</h4>
+ <ul>
+  <li>An offline-first architecture initially tries to fetch data from local
+   storage and, failing that, requests the data from the network. After being 
+   retrieved from the network, the data is cached locally for future 
+   retrieval. This helps to ensure that network requests for the same piece of 
+   data only occur once—the rest of the requests are satisfied locally. To 
+   achieve this, use a local database for long-lived data (usually 
+   {@link android.database.sqlite.SQLiteDatabase} or 
+   {@link android.content.SharedPreferences}).</li>
+  <li>An offline-first architecture always looks for data locally first, then
+   makes the request over the network. The response is cached and then returned 
+   locally. Such an architecture simplifies an app’s flow between offline and 
+   online states as one side fetches from the network to the cache, while the 
+   other retrieves data from the cache to present to the user.</li>
+  <li>For transitory data, use a bounded disk cache such as a <a class="external-link"
+   href="https://github.com/JakeWharton/DiskLruCache"><code>DiskLruCache</code>
+   </a>. Data that doesn’t typically change should only be requested once over 
+   the network and cached for future use. Examples of such data are images and 
+   non-temporal documents like news articles or social posts.</li>
+ </ul>
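The offline-first flow described above can be sketched with plain collections. The store and its names are illustrative, not a specific library; the "network" is any function so the cache-first behavior is easy to see:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

// Minimal offline-first fetch: look in the local cache first; only on a
// miss go to the network, then cache the result for future calls.
public class OfflineFirstStore {
    private final Map<String, String> cache = new HashMap<>();
    private int networkCalls = 0;

    public String fetch(String key, Function<String, String> network) {
        String local = cache.get(key);
        if (local != null) {
            return local;                  // satisfied locally, no request
        }
        networkCalls++;
        String fresh = network.apply(key); // single network round trip
        cache.put(key, fresh);
        return fresh;
    }

    public int networkCalls() { return networkCalls; }
}
```

With this shape, repeated requests for the same key never hit the network twice.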
+
+<h3 id="transfer">Fine-tune data transfer</h3>
+<h4 id="transfer-prioritize">Prioritize bandwidth</h4>
+ <ul>
+  <li>Do not assume that any network the device is connected to is
+   long-lasting or reliable. Prioritize network requests so that the most 
+   useful information reaches the user as soon as possible.</li>
+  <li>Presenting users with visible and relevant information immediately is a
+   better user experience than making them wait for information that might not 
+   be necessary. This reduces the time that the user has to wait and 
+   increases the usefulness of the app on slow networks.</li>
+  <li>To achieve this, sequence your network requests such that text is
+   fetched before rich media. Text requests tend to be smaller, compress 
+   better, and hence transfer faster, meaning that your app can display useful 
+   content quickly. For more information on managing network requests, visit 
+   the Android training on <a 
+   href="{@docRoot}training/basics/network-ops/managing.html">Managing Network 
+   Usage</a>.</li>
+ </ul>
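The text-before-media ordering can be sketched with a priority queue. The `Request` type and the kind constants are illustrative, not an Android API:

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;
import java.util.PriorityQueue;

// Drain queued requests so small text payloads go out before rich media.
public class RequestQueue {
    public static final int TEXT = 0, THUMBNAIL = 1, FULL_IMAGE = 2;

    private static final class Request {
        final int kind; final String url;
        Request(int kind, String url) { this.kind = kind; this.url = url; }
    }

    private final PriorityQueue<Request> pending =
        new PriorityQueue<>(Comparator.comparingInt((Request r) -> r.kind));

    public void enqueue(int kind, String url) {
        pending.add(new Request(kind, url));
    }

    // Returns URLs in dispatch order: all text first, then media.
    public List<String> drain() {
        List<String> order = new ArrayList<>();
        while (!pending.isEmpty()) order.add(pending.poll().url);
        return order;
    }
}
```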
+<h4 id="network-bandwidth">Use less bandwidth on slower connections</h4>
+ <ul>
+  <li>The ability for your app to transfer data in a timely fashion is
+   dependent on the network connection. Detecting the quality of the network 
+   and adjusting the way your app uses it can help provide an excellent user 
+   experience.</li>
+  <li>You can use the following methods to detect the underlying network
+   quality. Using the data from these methods, your app should tailor its use 
+   of the network to continue to provide a timely response to user actions:
+    <ul>
+     <li>{@link android.net.ConnectivityManager} &gt;
+     {@link android.net.ConnectivityManager#isActiveNetworkMetered}</li>
+     <li>{@link android.net.ConnectivityManager} &gt;
+     {@link android.net.ConnectivityManager#getActiveNetworkInfo}</li>
+     <li>{@link android.net.ConnectivityManager} &gt;
+     {@link android.net.ConnectivityManager#getNetworkCapabilities}</li>
+     <li>{@link android.telephony.TelephonyManager} &gt;
+     {@link android.telephony.TelephonyManager#getNetworkType}</li>
+    </ul>
+  </li>
+  <li>On slower connections, consider downloading only lower-resolution media
+   or perhaps none at all. This ensures that your users are still able to use 
+   the app on slow connections. Where you don’t have an image or the image is 
+   still loading, you should always show a placeholder. You can create a 
+   dynamic placeholder by using the <a 
+   href="{@docRoot}tools/support-library/features.html#v7-palette">
+   Palette library</a> to generate placeholder colors that match the target 
+   image.</li>
+  <li>As noted above, prioritize requests so that text is fetched before
+   rich media; smaller, well-compressed text transfers let your app display 
+   useful content quickly even on slow connections. For more information on 
+   adjusting bandwidth based on network connection, see the Android training 
+   on <a 
+   href="{@docRoot}training/basics/network-ops/managing.html">Managing Network 
+   Usage</a>.</li>
+ </ul>
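One hedged way to turn the signals above into a media policy is a small decision function. The policy names and tiers are illustrative; on a device the inputs would come from <code>ConnectivityManager.isActiveNetworkMetered()</code> and <code>TelephonyManager.getNetworkType()</code>, but they are plain parameters here so the logic stands alone:

```java
// Map connection signals to a media-quality tier.
public class MediaPolicy {
    public static final int FULL_RES = 2, LOW_RES = 1, PLACEHOLDER_ONLY = 0;

    public static int forConnection(boolean metered, boolean is2g) {
        if (is2g) return PLACEHOLDER_ONLY; // text plus palette placeholders only
        if (metered) return LOW_RES;       // smaller images on mobile data
        return FULL_RES;                   // Wi-Fi and other unmetered networks
    }
}
```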
+<h4 id="network-behavior">Detect network changes, then change app behavior</h4>
+ <ul>
+  <li>Network quality is not static; it changes based on location, network
+   traffic, and local population density. Apps should detect changes in 
+   network and adjust bandwidth accordingly. By doing so, your app can tailor 
+   the user experience to the network quality. Detect network state using 
+   these methods:
+    <ul>
+     <li>{@link android.net.ConnectivityManager} &gt;
+     {@link android.net.ConnectivityManager#getActiveNetworkInfo}</li>
+     <li>{@link android.net.ConnectivityManager} &gt;
+     {@link android.net.ConnectivityManager#getNetworkCapabilities}</li>
+     <li>{@link android.telephony.TelephonyManager} &gt;
+     {@link android.telephony.TelephonyManager#getDataState}</li>
+    </ul>
+  </li>
+  <li>As the network quality degrades, scale down the number and size of
+   requests. As the connection quality improves, you can scale up your 
+   requests to optimal levels.</li>
+  <li>On higher quality, unmetered networks, consider <a
+   href="{@docRoot}training/efficient-downloads/efficient-network-access.html#PrefetchData">
+   prefetching data</a> to make it available ahead of time. From a user 
+   experience standpoint, this might mean that news reader apps only fetch 
+   three articles at a time on 2G but fetch twenty articles at a time on 
+   Wi-Fi. For more information on adjusting app behavior based on network changes, 
+   visit the Android training on <a 
+   href="{@docRoot}training/monitoring-device-state/connectivity-monitoring.html">
+   Monitoring the Connectivity Status</a>.</li>
+  <li>The broadcast <a
+   href="{@docRoot}reference/android/net/ConnectivityManager.html#CONNECTIVITY_ACTION">
+   <code>CONNECTIVITY_CHANGE</code></a> is sent when a change in network 
+   connectivity occurs. When your app is in the foreground, you can call <a 
+   href="{@docRoot}reference/android/content/Context.html#registerReceiver(android.content.BroadcastReceiver,%20android.content.IntentFilter)">
+   <code>registerReceiver</code></a> to receive this broadcast. After receiving 
+   the broadcast, you should reevaluate the current network state and adjust 
+   your UI and network usage appropriately. You should not declare this receiver 
+   in your manifest, as it will no longer function beginning with Android N. 
+   For more details see <a href="{@docRoot}preview/behavior-changes.html">
+   Android N behavior changes</a>.</li>
+ </ul>
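The scale-up/scale-down behavior can be sketched as pure logic. The class is hypothetical; in a real app, <code>onConnectivityChanged()</code> would be driven by a <code>CONNECTIVITY_CHANGE</code> receiver registered at runtime. The 3-versus-20 batch sizes are the example figures from the text:

```java
// Recompute the prefetch batch size whenever connectivity changes.
public class PrefetchTuner {
    private int batchSize = 3; // conservative default until the network is known

    public void onConnectivityChanged(boolean connected, boolean unmeteredFast) {
        if (!connected) {
            batchSize = 0;          // offline: serve only cached content
        } else if (unmeteredFast) {
            batchSize = 20;         // e.g. Wi-Fi: prefetch aggressively
        } else {
            batchSize = 3;          // e.g. 2G or metered: fetch sparingly
        }
    }

    public int batchSize() { return batchSize; }
}
```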
+
+<h3 class="rel-resources clearfloat">Related resources</h3>
+<div class="resource-widget resource-flow-layout col-13"
+  data-query="collection:distribute/essentials/billionsquality/connectivity"
+  data-sortOrder="-timestamp"
+  data-cardSizes="6x3"
+  data-maxResults="6"></div>
+
+<!-- capability -->
+<div class="headerLine">
+  <h2 id="capability">Device Capability</h2>
+</div>
+<p>Reaching new users means supporting an increasing variety of Android
+ platform versions and device specifications. Optimize for common RAM and 
+ screen sizes and resolutions to improve the user experience. </p>
+
+<h3 id="screens">Support varying screen sizes</h3>
+<h4 id="screens-dp">Use density-independent pixels (dp)</h4>
+ <ul>
+  <li>Defining layout dimensions with pixels is a problem because different
+   screens have different pixel densities, so the same number of pixels may 
+   correspond to different physical sizes on different devices. The 
+   density-independent pixel (dp) corresponds to the physical size of a pixel 
+   at 160 dots per inch (mdpi density).</li>
+  <li>Defining layouts with dp ensures that the physical size of your user
+   interface is consistent regardless of device. Visit the Android 
+   guide on <a 
+   href="{@docRoot}guide/practices/screens_support.html">
+   Supporting Multiple Screens</a> for best practices using 
+   density-independent pixels.</li>
+ </ul>
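The dp definition above implies a simple conversion: one dp is one pixel at 160 dpi (mdpi), so px = dp × (dpi / 160). On Android you would normally read the precomputed <code>DisplayMetrics.density</code> factor rather than compute it from dpi; this helper is illustrative:

```java
// Convert density-independent pixels to physical pixels for a given dpi.
public class Density {
    public static int dpToPx(int dp, int screenDpi) {
        // 160 dpi (mdpi) is the baseline density where 1 dp == 1 px.
        return Math.round(dp * (screenDpi / 160f));
    }
}
```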
+<h4 id="screens-density">Test graphics on ldpi/mdpi screen densities</h4>
+ <ul>
+  <li>Ensure that your app layouts work well on low- and medium-density
+   (ldpi/mdpi) screens because these are <a 
+   href="{@docRoot}about/dashboards/index.html#Screens">
+   common densities</a>, especially in lower-cost devices. Testing on 
+   lower-density screens helps to validate that your layouts are legible on 
+   lower-density screens.</li>
+  <li>Lower-density screens can result in unclear text where the finer details
+   aren't visible. The Material Design guidelines describe <a 
+   class="external-link" href="https://www.google.com/design/spec/layout/metrics-keylines.html">
+   metrics and keylines</a> to ensure that your layouts can scale across 
+   screen densities.</li>
+  <li>Devices with lower-density screens tend to have lower hardware
+   specifications. To ensure that your app performs well on these devices, 
+   consider reducing or eliminating heavy loads, such as animations and 
+   transitions. For more information on supporting different densities, see 
+   the Android training on <a 
+   href="{@docRoot}training/multiscreen/screendensities.html">
+   Supporting Different Densities</a>.</li>
+ </ul>
+<h4 id="screens-sizes">Test layouts on small/medium screen sizes</h4>
+ <ul>
+  <li>Validate that your layouts scale down by testing on smaller screens. As
+   screen sizes shrink, be very selective about visible UI elements, because 
+   there is limited space for them.</li>
+  <li>Devices with smaller screens tend to have lower hardware specifications.
+   To ensure that your app performs well on these devices, try reducing or 
+   eliminating heavy loads, such as animations or transitions. For more 
+   information on supporting different screen sizes, see the Android 
+   training on <a 
+   href="{@docRoot}training/multiscreen/screensizes.html">
+   Supporting Different Screen Sizes</a>.</li>
+ </ul>
+
+<h3 id="compatibility">Backward compatibility</h3>
+<h4 id="compatibility-sdkversion">Set your targetSdkVersion and minSdkVersion
+ appropriately</h4>
+ <ul>
+  <li>Build against and target a recent version of Android so that your
+   app gets the most current behavior on newer devices while remaining 
+   backward compatible with older versions. Here are the best practices for 
+   targeting API levels appropriately:
+    <ul>
+     <li><a
+      href="{@docRoot}guide/topics/manifest/uses-sdk-element.html#target">
+      {@code targetSdkVersion}</a> should be the latest version of Android. 
+      Targeting the most recent version ensures that your app inherits newer 
+      runtime behaviors when running newer versions of Android. Be sure to 
+      test your app on newer Android versions when updating the 
+      targetSdkVersion as it can affect app behavior.</li>
+     <li><a
+      href="{@docRoot}guide/topics/manifest/uses-sdk-element.html#min">
+      {@code minSdkVersion}</a> sets the minimum supported Android version. 
+      Use Android 4.0 (API level 14: Ice Cream Sandwich) or Android 4.1 (API 
+      level 16: Jelly Bean)—these versions give maximum coverage for modern 
+      devices. Setting {@code minSdkVersion} also results in the Android build 
+      tools reporting incorrect use of new APIs that might not be available in 
+      older versions of the platform. By doing so, developers are protected 
+      from inadvertently breaking backward compatibility.</li>
+    </ul>
+  </li>
+  <li>Consult the <a
+   href="{@docRoot}about/dashboards/index.html#Platform">
+   Android dashboards</a>, the <a class="external-link" 
+   href="https://play.google.com/apps/publish/">Google Play Developer 
+   Console</a> for your app, and industry research in your target markets to 
+   gauge which versions of Android to target, based on your target users.</li>
+ </ul>
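As a sketch, the recommendations above map to a <code>build.gradle</code> fragment like the following; the specific version numbers are illustrative for the time of writing, so always check the current release:

```groovy
android {
    compileSdkVersion 23      // compile against a recent SDK
    defaultConfig {
        minSdkVersion 14      // Android 4.0: broad coverage of modern devices
        targetSdkVersion 23   // opt in to the latest runtime behavior
    }
}
```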
+<h4 id="compatibility-libraries">Use the Android Support libraries</h4>
+ <ul>
+  <li>Ensure your app provides a consistent experience across OS versions by
+   using the Google-provided support libraries such as AppCompat and the Design
+    Support Library. The Android Support Library package is a set of code 
+    libraries that provides backward-compatible versions of Android framework 
+    APIs as well as features that are only available through the library APIs.
+    </li>
+  <li>Some of the highlights include:
+  <ul>
+   <li>v4 &amp; v7 support libraries: backward-compatible versions of many
+    framework APIs, such as {@link android.support.v4.view.ViewPager}, 
+    {@link android.support.v7.app.ActionBar}, 
+    {@link android.support.v7.widget.RecyclerView}, and 
+    {@link android.support.v7.graphics.Palette}.</li>
+   <li><a href="{@docRoot}tools/support-library/features.html#design">Design
+    Support</a> library: APIs to support adding Material Design components 
+    and patterns to your apps.</li>
+   <li><a href="{@docRoot}tools/support-library/features.html#multidex">
+    Multidex Support</a> library: provides support for large apps that have 
+    more than 65K methods. This can happen if your app is using many 
+    libraries.</li> 
+  </ul>
+  </li>
+  <li>For more information on the available support libraries, see the <a
+   href="{@docRoot}tools/support-library/features.html">
+   Support Libraries Features</a> section of the Android Developer site.</li>
+ </ul>
+<h4 id="compatibility-playservices">Use Google Play services</h4>
+ <ul>
+  <li>Google Play services delivers Google APIs independently of the
+   Android platform version. Consider using features from Google Play services 
+   to offer the most streamlined Google experience on Android devices.</li>
+  <li>Google Play services also includes useful APIs such as <a 
+   href="https://developers.google.com/android/reference/com/google/android/gms/gcm/GcmNetworkManager">
+   <code>GcmNetworkManager</code></a>, which provides much of Android 5.0’s 
+   {@link android.app.job.JobScheduler} API for older versions of Android.</li>
+  <li>Updates to Google Play services are distributed automatically by the
+   Google Play Store, and new versions of the client library are delivered 
+   through the Android SDK Manager. </li>
+ </ul>
+<h3 id="memory">Efficient memory usage</h3>
+<h4 id="memory-footprint">Reduce memory footprint on low-cost devices</h4>
+ <ul>
+  <li>Adjusting your memory footprint dynamically helps to ensure compatibility
+   across devices with different RAM configurations.</li>
+  <li>Methods such as {@link android.app.ActivityManager#isLowRamDevice} and
+   {@link android.app.ActivityManager#getMemoryClass()} help determine memory 
+   constraints at runtime. Based on this information, you can scale down your 
+   memory usage. As an example, you can use lower resolution images on low memory 
+   devices.</li>
+  <li>For more information on managing your app’s memory, see the Android
+   training on <a href="{@docRoot}training/articles/memory.html">Managing 
+   Your App's Memory</a>.</li>
+ </ul>
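A minimal sketch of this runtime scaling: <code>CacheSizer</code> and its quarter/eighth heuristics are illustrative, and the inputs correspond to <code>ActivityManager.isLowRamDevice()</code> and <code>ActivityManager.getMemoryClass()</code>, passed as plain parameters here so the scaling is testable:

```java
// Scale a bitmap-cache budget from the device's memory signals.
public class CacheSizer {
    // Returns a cache budget in megabytes.
    public static int cacheBudgetMb(boolean lowRam, int memoryClassMb) {
        if (lowRam) {
            // Keep only a minimal cache on low-RAM devices.
            return Math.min(4, memoryClassMb / 8);
        }
        // Common heuristic: spend about a quarter of the memory class.
        return memoryClassMb / 4;
    }
}
```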
+<h4 id="memory-longprocesses">Avoid long-running processes</h4>
+ <ul>
+  <li>Long-running processes stay resident in memory and can slow the
+   device down. In most situations, your app should wake up for a given 
+   event, process data, and shut down. Use <a 
+   href="https://developers.google.com/cloud-messaging">Google Cloud Messaging 
+   (GCM)</a> and/or <a 
+   href="https://developers.google.com/android/reference/com/google/android/gms/gcm/GcmNetworkManager">
+   <code>GcmNetworkManager</code></a> to avoid long-running background 
+   services and reduce memory pressure on the user’s device.</li>
+ </ul>
+<h4 id="memory-benchmark">Benchmark memory usage</h4>
+ <ul>
+  <li>Android Studio provides memory benchmarking and profiling tools, enabling
+   you to measure memory usage at run time. Benchmarking your app’s memory
+    footprint enables you to monitor memory usage over multiple versions of 
+    the app. This can help catch unintentional memory footprint growth. These 
+    tools can be used in the following ways:
+  <ul>
+   <li>Use the <a
+    href="{@docRoot}tools/performance/memory-monitor/index.html">Memory 
+    Monitor</a> tool to find out whether undesirable garbage collection (GC) 
+    event patterns might be causing performance problems.</li> 
+   <li>Run <a
+    href="{@docRoot}tools/performance/heap-viewer/index.html">Heap Viewer</a>
+    to identify object types that get or stay allocated unexpectedly or 
+    unnecessarily.</li>
+   <li>Use <a
+   href="{@docRoot}tools/performance/allocation-tracker/index.html">
+   Allocation Tracker</a> to identify where in your code the problem might 
+   be.</li>
+  </ul>
+  </li>
+  <li>For more information on benchmarking memory usage, see the <a
+   href="{@docRoot}tools/performance/comparison.html">
+   Memory Profilers</a> tools on the Android Developers site.</li>
+ </ul>
+
+<h3 class="rel-resources clearfloat">Related resources</h3>
+<div class="resource-widget resource-flow-layout col-13"
+  data-query="collection:distribute/essentials/billionsquality/capability"
+  data-sortOrder="-timestamp"
+  data-cardSizes="6x3"
+  data-maxResults="6"></div>
+
+<!-- cost -->
+<div class="headerLine">
+  <h2 id="cost">Data Cost</h2>
+</div>
+<p>Data plans in some countries can cost upwards of 10% of monthly income.
+ Reduce your app’s data consumption and give users control over how the 
+ app uses data.</p>
+
+<h3 id="appsize">Reduce app size</h3>
+<h4 id="appsize-graphics">Reduce APK graphical asset size</h4>
+ <ul>
+  <li>Graphical assets are often the largest contributor to the size of the
+   APK. Optimizing these can result in smaller downloads and thus faster 
+   installation times for users.</li>
+  <li>For graphical assets like icons, use Scalable Vector Graphics (SVG)
+   format. SVG images are relatively tiny in size and can be rendered at 
+   runtime to any resolution. The <a 
+   href="{@docRoot}tools/support-library/index.html">Android Support</a> 
+   library provides a backward-compatible implementation for vector resources as 
+   far back as Android 2.1 (API level 7). Get started with vectors with <a 
+   class="external-link" 
+   href="https://medium.com/@chrisbanes/appcompat-v23-2-age-of-the-vectors-91cbafa87c88">
+   this Medium post</a>. </li>
+  <li>For non-vector images, like photos, use <a
+   href="https://developers.google.com/speed/webp/">WebP</a>, which 
+   typically produces smaller files than its PNG and JPG counterparts at 
+   the same or better image quality (see <a href="#images-format">Serve 
+   WebP images</a> above for version support).</li>
+  <li>If you have many large images across multiple densities, consider
+   using <a href="{@docRoot}google/play/publishing/multiple-apks.html">Multiple 
+   APK support</a> to split your APK by density. This results in builds 
+   targeted for specific densities, meaning users with low-density devices 
+   won’t have to incur the penalty of unused high-density assets.</li>
+  <li>A detailed guide on reducing your APK size can be found in this <a
+   class="external-link" href="https://medium.com/@wkalicinski/smallerapk-part-4-multi-apk-through-abi-and-density-splits-477083989006">
+   series of Medium posts</a>.</li>
+ </ul>
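Density splits are configured in <code>build.gradle</code>; a minimal sketch (see the Multiple APK support guide for the full set of options, such as excluding specific buckets):

```groovy
android {
    splits {
        density {
            enable true   // build one APK per density bucket
        }
    }
}
```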
+<h4 id="appsize-code">Reduce code size</h4>
+ <ul>
+  <li>Be careful about using external libraries because not all libraries are
+   meant to be used in mobile apps. Ensure that the libraries your app is 
+   using are optimized for mobile use.</li>
+  <li>Every library in your Android project potentially adds unused code
+   to your APK, and libraries that aren’t designed with mobile development 
+   in mind can contribute significant APK bloat.</li>
+  <li>Consider optimizing your compiled code using a tool such as <a
+   href="{@docRoot}tools/help/proguard.html">ProGuard</a>. ProGuard identifies 
+   code that isn’t being used and removes it from your APK. Also <a 
+   class="external-link" 
+   href="http://tools.android.com/tech-docs/new-build-system/resource-shrinking">
+   enable resource shrinking</a> at build time by setting 
+   <code>minifyEnabled=true</code>, <code>shrinkResources=true</code> in 
+   <code>build.gradle</code>—this automatically removes unused resources from 
+   your APK.</li>
+  <li>When using Google Play services, you should <a
+   href="{@docRoot}google/play-services/setup.html#add_google_play_services_to_your_project">
+   selectively include</a> only the necessary APIs into your APK.</li>
+  <li>For more information on reducing code size in your APK, see the Android
+   training on how to <a 
+   href="{@docRoot}training/articles/memory.html#DependencyInjection">Avoid 
+   dependency injection frameworks</a>.</li>
+ </ul>
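Together, the ProGuard and resource-shrinking settings look something like this in a release build type (<code>proguard-rules.pro</code> is your project’s own rules file):

```groovy
android {
    buildTypes {
        release {
            minifyEnabled true     // ProGuard removes unused code
            shrinkResources true   // drop resources nothing references
            proguardFiles getDefaultProguardFile('proguard-android.txt'),
                    'proguard-rules.pro'
        }
    }
}
```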
+<h4 id="appsize-external">Allow your app to be moved to external (SD) storage</h4>
+ <ul>
+  <li>Low-cost devices often come with little on-device storage. Users can
+   extend this with SD cards; however, apps need to explicitly declare that 
+   they support being installed to external storage before users can move them.
+  </li>
+  <li>Allow your app to be installed to external storage using the <a
+   href="{@docRoot}guide/topics/manifest/manifest-element.html#install"><code>
+   android:installLocation</code></a> flag in your AndroidManifest. For more 
+   information on enabling your app to be moved to external storage, see the 
+   Android guide on <a 
+   href="{@docRoot}guide/topics/data/install-location.html">App Install 
+   Location</a>.</li>
+ </ul>
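In the manifest, this is a single attribute on the root element; the package name below is a placeholder:

```xml
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    package="com.example.app"
    android:installLocation="preferExternal">
    <!-- "auto" lets the system decide; "preferExternal" requests SD storage.
         Application element and the rest of the manifest follow. -->
</manifest>
```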
+
+<h4 id="appsize-postinstall">Reduce post-install app disk usage</h4>
+ <ul>
+  <li>Keeping your app’s disk usage low means that users are less likely to
+   uninstall your app when the device is low on free space. When using caches, 
+   it’s important to apply bounds around your caches—this prevents your app’s 
+   disk usage from growing indefinitely. Be sure you put your cached data in 
+   {@link android.content.Context#getCacheDir()}—the system can delete files 
+   placed here as needed, so they won’t show up as storage committed to the 
+   app.</li>
+ </ul>
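A bounded cache can be as simple as an access-ordered <code>LinkedHashMap</code>. This generic sketch is not an Android API; a real app would back it with files under <code>getCacheDir()</code> (or use a disk-backed cache such as <code>DiskLruCache</code>):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// LRU map that evicts the least recently used entry past a fixed cap,
// keeping the cache footprint from growing indefinitely.
public class BoundedCache<K, V> extends LinkedHashMap<K, V> {
    private final int maxEntries;

    public BoundedCache(int maxEntries) {
        super(16, 0.75f, true); // true = access order (LRU semantics)
        this.maxEntries = maxEntries;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        return size() > maxEntries;
    }
}
```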
+
+<h3 id="configurablenetwork">Offer configurable network usage</h3>
+<h4 id="configurablenetwork-onboarding">Provide onboarding experiences for 
+subjective user choices</h4>
+ <ul>
+  <li>Apps that allow users to reduce data usage are well received, even if
+   they have heavy data requirements. If your app uses a considerable amount 
+   of bandwidth (for example, video streaming apps), you can provide an 
+   onboarding experience for users to configure network usage. For example, 
+   you could allow the user to force lower-bitrate video streams on cellular 
+   networks. </li>
+  <li>Additional settings that let users control data syncing, prefetching,
+   and network usage behavior (for example, prefetch all starred news 
+   categories on Wi-Fi only) also help users tailor your app’s behavior to 
+   their needs.</li>
+  <li>For more information on managing network usage, see the Android training
+   on <a href="{@docRoot}training/basics/network-ops/managing.html">Managing 
+   Network Usage</a>.</li>
+ </ul>
+<h4 id="configurablenetwork-preferences">Provide a network preferences 
+screen</h4>
+ <ul>
+  <li>A network preferences screen lets users reach your app’s network
+   settings from outside the app. This screen can be invoked from either the 
+   system settings screen or the system data usage screen.</li>
+  <li>To provide a network preferences screen that users can access from within
+   your app as well as from the system settings, in your app include an 
+   activity that supports the 
+   {@link android.content.Intent#ACTION_MANAGE_NETWORK_USAGE} action.</li>
+  <li>For further information on adding a network preferences screen, see the
+   Android training on <a 
+   href="{@docRoot}training/basics/network-ops/managing.html#prefs">
+   Implementing a Preferences Activity</a>.</li>
+ </ul>
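<p>A minimal sketch of such an activity declaration in the manifest; the activity name is hypothetical:</p>

```xml
<activity android:name=".NetworkPreferencesActivity">
    <intent-filter>
        <action android:name="android.intent.action.MANAGE_NETWORK_USAGE" />
        <category android:name="android.intent.category.DEFAULT" />
    </intent-filter>
</activity>
```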
+
+
+
+<h3 class="rel-resources clearfloat">Related resources</h3>
+<div class="resource-widget resource-flow-layout col-13"
+  data-query="collection:distribute/essentials/billionsquality/cost"
+  data-sortOrder="-timestamp"
+  data-cardSizes="6x3"
+  data-maxResults="6"></div>
+
+
+<!-- consumption -->
+<div class="headerLine">
+  <h2 id="consumption">Battery Consumption</h2>
+</div>
+<p>Access to reliable power supplies varies, and outages can disrupt planned 
+charges. Defend your users’ batteries against unnecessary drain by benchmarking 
+your battery use, avoiding wake locks, scheduling tasks, and monitoring sensor 
+requests.</p>
+<h3 id="consumption-reduce">Reduce battery consumption</h3>
+ <ul>
+  <li>Your app should perform minimal work when it is in the background and
+   when the device is running on battery power.</li>
+  <li><a href="{@docRoot}reference/android/os/PowerManager.WakeLock.html">Wake
+   locks</a> are mechanisms to keep devices on so that they can perform 
+   background activities. Avoid using wake locks because they prevent the 
+   device from going into low-power states.</li>
+  <li>To reduce the number of device wake-ups, batch network activity. For more
+   information on batching, see the Android training on <a 
+   href="{@docRoot}training/efficient-downloads/efficient-network-access.html">
+   Optimizing Downloads for Efficient Network Access</a>.</li>
+  <li><a 
+   href="https://developers.google.com/android/reference/com/google/android/gms/gcm/GcmNetworkManager">
+   <code>GcmNetworkManager</code></a> schedules tasks and lets Google Play 
+   services batch operations across the system. This greatly 
+   simplifies the implementation of common patterns, such as waiting for 
+   network connectivity, device charging state, retries, and backoff. Use 
+   <code>GcmNetworkManager</code> to perform non-essential background activity 
+   when the device is charging and is connected to an unmetered network.</li>
+  <li>Sensors, such as GPS, can also drain the battery significantly. The
+   recommended way to request location is to use the FusedLocationProvider API. 
+   The <a 
+   href="https://developers.google.com/android/reference/com/google/android/gms/location/FusedLocationProviderApi">FusedLocationProvider</a> API manages the 
+   underlying location technology and provides a simple API so that you can 
+   specify requirements&mdash;like high accuracy or low power&mdash;at a high 
+   level. It also optimizes the device's use of battery power by caching 
+   locations and batching requests across apps. For more information on the 
+   ideal ways to request location, see the <a 
+   href="{@docRoot}training/location/retrieve-current.html">Getting the Last 
+   Known Location</a> training guide.
+  </li>
+ </ul>
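<p>As a hedged sketch, scheduling deferrable work with <code>GcmNetworkManager</code> might look like the following; <code>MyUploadService</code> (a <code>GcmTaskService</code> subclass) and the task tag are hypothetical:</p>

```java
// Run non-essential work only when charging and on an unmetered network.
OneoffTask task = new OneoffTask.Builder()
        .setService(MyUploadService.class)
        .setTag("upload-logs")
        .setExecutionWindow(0L, 3600L) // any time within the next hour
        .setRequiredNetwork(Task.NETWORK_STATE_UNMETERED)
        .setRequiresCharging(true)
        .build();
GcmNetworkManager.getInstance(context).schedule(task);
```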
+<h3 id="consumption-benchmark">Benchmark battery usage</h3>
+ <ul>
+  <li>Benchmarking your app’s usage in a controlled environment helps you
+   understand the battery-heavy tasks in your app. It is a good practice to 
+   benchmark your app’s battery usage to gauge efficiency and track changes 
+   over time.
+</li>
+  <li><a
+   href="{@docRoot}tools/performance/batterystats-battery-historian/index.html">
+   Batterystats</a> collects battery data about your apps, and <a 
+   href="{@docRoot}tools/performance/batterystats-battery-historian/index.html">
+   Battery Historian</a> converts that data into an HTML visualization. For 
+   more information on reducing battery usage, see the Android training on <a 
+   href="{@docRoot}training/monitoring-device-state/index.html">Optimizing 
+   Battery Life</a>.</li>
+ </ul>
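<p>A typical benchmarking session with these tools, assuming <code>adb</code> is on your path and a device is attached:</p>

```shell
# Clear accumulated battery data, then exercise the app on the device.
adb shell dumpsys batterystats --reset
# Afterwards, dump the collected data for analysis (e.g., with Battery Historian).
adb shell dumpsys batterystats > batterystats.txt
```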
+
+<h3 class="rel-resources clearfloat">Related resources</h3>
+<div class="resource-widget resource-flow-layout col-13"
+  data-query="collection:distribute/essentials/billionsquality/consumption"
+  data-sortOrder="-timestamp"
+  data-cardSizes="6x3"
+  data-maxResults="6"></div>
+
+<!-- content -->
+<div class="headerLine">
+  <h2 id="contentsection">Content</h2>
+</div>
+<p>Make sure that your app works well on a variety of screens: offering good,
+ crisp graphics and appropriate layouts on low resolution and physically small 
+ screens. Ensure that your app is designed to be easily localized by 
+ accommodating the variations between languages: allow for spacing, density, 
+ order, emphasis, and wording variations. Also make sure that date, time, and 
+ the like are internationalized and displayed according to the phone’s 
+ settings.</p>
+
+<h3 id="content-responsive">Fast and responsive UI</h3>
+<h4 id="content-feedback">Touch feedback on all touchable items</h4>
+ <ul>
+  <li>Touch feedback adds a tactile feeling to the user interface. You should
+   ensure your app provides touch feedback on all touchable elements to reduce 
+   the perceived app latency as much as possible.
+</li>
+  <li><a
+   href="https://www.google.com/design/spec/animation/responsive-interaction.html">
+   Responsive interaction</a> encourages deeper exploration of an app by 
+   creating timely, logical, and delightful screen reactions to user input. 
+   Responsive interaction elevates an app from an information-delivery service 
+   to an experience that communicates using multiple visual and tactile 
+   responses.</li>
+  <li>For more information, see the Android training on <a
+   href="{@docRoot}training/material/animations.html#Touch">Customizing Touch 
+   Feedback</a>.</li>
+ </ul>
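<p>A common way to get platform-standard touch feedback is to use the theme's selectable item background; this layout snippet is illustrative:</p>

```xml
<Button
    android:layout_width="wrap_content"
    android:layout_height="wrap_content"
    android:text="@string/action_subscribe"
    android:background="?android:attr/selectableItemBackground" />
```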
+<h4 id="content-interactive">UI should always be interactive</h4>
+ <ul>
+  <li>Apps that are unresponsive when performing background activity feel slow
+   and reduce user satisfaction. Ensure your app always has a responsive UI 
+   regardless of any background activity. Achieve this by performing network 
+   operations or any heavy-duty operations in a background thread—keep the UI 
+   thread as idle as you can.</li>
+  <li>Material Design apps minimize visual changes while loading content by
+   representing each operation with a single activity indicator. Avoid dialogs 
+   that block the UI; use inline <a 
+   href="https://www.google.com/design/spec/components/progress-activity.html">
+   loading indicators</a> instead.</li>
+  <li><a
+   href="http://www.google.com/design/spec/patterns/empty-states.html">Empty 
+   states</a> occur when the regular content of a view can’t be shown. It might 
+   be a list that has no items or a search that returns no results. Avoid 
+   completely empty states. The most basic empty state displays a 
+   non-interactive image and a text tagline. Where you don’t have an image, or 
+   the image is still loading, you should always show either a static 
+   placeholder, or create a dynamic placeholder by using the <a 
+   href="{@docRoot}tools/support-library/features.html#v7-palette">Palette 
+   library</a> to generate placeholder colors that match the target image.</li>
+  <li>For more information, see the Android training on <a
+   href="{@docRoot}training/articles/perf-anr.html">Keeping Your App 
+   Responsive</a>.</li>
+ </ul>
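<p>The pattern above—heavy work off the UI thread with a callback when done—can be sketched in plain Java; on Android, the callback would typically post its result back to the main thread via a <code>Handler</code>. Names here are illustrative:</p>

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.function.Consumer;

/** Sketch: run slow work on a background thread so the caller stays responsive. */
public class BackgroundLoader {
    private final ExecutorService executor = Executors.newSingleThreadExecutor();

    public void load(String request, Consumer<String> onResult) {
        executor.submit(() -> {
            // Simulated slow operation (network fetch, disk read, parsing...).
            String result = "response-for:" + request;
            onResult.accept(result);
        });
    }

    public void shutdown() {
        executor.shutdown();
    }
}
```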
+<h4 id="content-60fps">Target 60 frames per second on low-cost devices</h4>
+ <ul>
+  <li>Ensure that your app always runs fast and smoothly, even on low-cost
+   devices.</li>
+  <li>Overdraw can significantly slow down your app—it occurs when the same
+   pixel is drawn more than once in a single frame, for example, when a button 
+   is placed on top of an image. While some overdraw is unavoidable, it should 
+   be minimized to ensure a smooth frame rate. Use the 
+   <a href="{@docRoot}tools/performance/debug-gpu-overdraw/index.html">Debug 
+   GPU overdraw</a> tool to check that overdraw in your app is minimized.</li>
+  <li>Android devices refresh the screen at 60 frames per second (fps), meaning
+   your app has to update the screen within roughly 16 milliseconds. <a 
+   href="{@docRoot}tools/performance/profile-gpu-rendering/index.html">Profile 
+   your app</a> using on-device tools to see if and when your app is not 
+   meeting this 16-ms average.</li>
+  <li>Reduce or remove animations on low-cost devices to lessen the burden on
+   the device’s CPU and GPU.  For more information, see the Android training on 
+   <a href="{@docRoot}training/improving-layouts/index.html">Improving Layout 
+   Performance</a>. </li>
+ </ul>
+<h4 id="content-firstload">If anticipated startup is slow, use a launch screen 
+on first load</h4>
+ <ul>
+  <li>The launch screen is a user’s first experience of your application.
+   Launching your app while displaying a blank canvas increases its perceived 
+   loading time, so consider using a placeholder UI or a branded launch screen 
+   to reduce the perceived loading time.</li>
+  <li>A <a href="https://www.google.com/design/spec/patterns/launch-screens.html#launch-screens-placeholder-ui">
+   placeholder UI</a> is the most seamless launch transition, appropriate for 
+   both app launches and in-app activity transitions.</li>
+  <li><a
+   href="https://www.google.com/design/spec/patterns/launch-screens.html#launch-screens-types-of-launch-screens">
+   Branded launch screens</a> provide momentary brand exposure, freeing the UI 
+   to focus on content.</li>
+  <li>For more information on implementing splash screens, see the <a
+   href="https://www.google.com/design/spec/patterns/launch-screens.html">
+   Launch screens</a> section of the Material Design spec.</li>
+ </ul>
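<p>Branded launch screens are commonly implemented through the starting activity's theme, so the artwork appears before any layout inflation; the style and drawable names here are illustrative:</p>

```xml
<!-- res/values/styles.xml -->
<style name="AppTheme.Launch" parent="AppTheme">
    <item name="android:windowBackground">@drawable/launch_screen</item>
</style>
```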
+<h3 id="ui">UI best practices</h3>
+ <ul>
+  <li><a
+   href="https://www.google.com/design/spec/material-design/introduction.html">
+   Material Design</a> is a visual language that synthesizes the classic 
+   principles of good design with the innovation and possibility of technology 
+   and science. Material Design aims to develop a single underlying system that 
+   allows for a unified experience across platforms and device sizes. Consider 
+   using key Material Design components so that users intuitively know how to 
+   use your app.</li>
+  <li>Ready-to-use Material Design components are available via the <a
+   href="{@docRoot}tools/support-library/features.html#design">Design Support 
+   library</a>. These components are supported in Android 2.1 (API level 7) and 
+   above.</li>
+ </ul>
+<h3 id="localization">Localization</h3>
+ <ul>
+  <li>Your users could be from any part of the world, and their first language
+   may not be the same as yours. If you don’t present your app in a language 
+   that your users can read, it is a missed opportunity. You should therefore 
+   localize your app for key regional languages.</li>
+  <li>To learn more, visit the Android training on <a 
+ href="{@docRoot}training/basics/supporting-devices/languages.html">
+ Supporting Different Languages</a>.</li>
+ </ul>
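<p>Localization builds on the resource system: provide a default <code>strings.xml</code> plus qualified alternatives, and the framework picks the right one at runtime. The Spanish values here are illustrative:</p>

```xml
<!-- res/values/strings.xml (default) -->
<resources>
    <string name="welcome_message">Welcome!</string>
</resources>

<!-- res/values-es/strings.xml (Spanish) -->
<resources>
    <string name="welcome_message">¡Bienvenido!</string>
</resources>
```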
+
+<h3 class="rel-resources clearfloat">Related resources</h3>
+<div class="resource-widget resource-flow-layout col-13"
+  data-query="collection:distribute/essentials/billionsquality/content"
+  data-sortOrder="-timestamp"
+  data-cardSizes="6x3"
+  data-maxResults="6"></div>
diff --git a/docs/html/distribute/images/billions-guidelines.png b/docs/html/distribute/images/billions-guidelines.png
new file mode 100644
index 0000000..05f71b6
--- /dev/null
+++ b/docs/html/distribute/images/billions-guidelines.png
Binary files differ
diff --git a/docs/html/jd_collections.js b/docs/html/jd_collections.js
index ef9ac98..dc09cfd 100644
--- a/docs/html/jd_collections.js
+++ b/docs/html/jd_collections.js
@@ -351,7 +351,7 @@
       "distribute/essentials/quality/tv.html",
       "distribute/essentials/quality/wear.html",
       "distribute/essentials/quality/auto.html",
-      "https://developers.google.com/edu/guidelines"
+      "distribute/essentials/quality/billions.html"
     ]
   },
   "distribute/essentials/zhcn": {
@@ -483,7 +483,7 @@
       "distribute/essentials/quality/wear.html",
       "distribute/essentials/quality/tv.html",
       "distribute/essentials/quality/auto.html",
-      "https://developers.google.com/edu/guidelines"
+      "distribute/essentials/quality/billions.html"
     ]
   },
   "distribute/essentials/tools": {
@@ -984,6 +984,44 @@
       "google/play/filters.html"
     ]
   },
+ "distribute/essentials/billionsquality/connectivity": {
+    "title": "",
+    "resources": [
+      "training/basics/network-ops/managing.html",
+      "training/monitoring-device-state/connectivity-monitoring.html",
+      "guide/topics/providers/content-providers.html"
+    ]
+  },
+  "distribute/essentials/billionsquality/capability": {
+    "title": "",
+    "resources": [
+      "guide/practices/screens_support.html",
+      "training/multiscreen/screendensities.html",
+      "training/articles/memory.html"
+    ]
+  },
+  "distribute/essentials/billionsquality/cost": {
+    "title": "",
+    "resources": [
+      "https://medium.com/@wkalicinski/smallerapk-part-6-image-optimization-zopfli-webp-4c462955647d#.23hlddo3x",
+      "training/basics/network-ops/managing.html"
+    ]
+  },
+  "distribute/essentials/billionsquality/consumption": {
+    "title": "",
+    "resources": [
+      "training/efficient-downloads/efficient-network-access.html",
+      "training/monitoring-device-state/index.html"
+    ]
+  },
+  "distribute/essentials/billionsquality/content": {
+    "title": "",
+    "resources": [
+      "training/material/animations.html#Touch",
+      "training/articles/perf-anr.html",
+      "training/improving-layouts/index.html"
+    ]
+  },
   "distribute/essentials/tabletguidelines": {
     "title": "",
     "resources": [
diff --git a/docs/html/jd_extras.js b/docs/html/jd_extras.js
index 82f1912..9286875 100644
--- a/docs/html/jd_extras.js
+++ b/docs/html/jd_extras.js
@@ -39,6 +39,17 @@
     "type":"medium"
   },
   {
+    "title":"SmallerAPK, Part 6: Image optimization, Zopfli & WebP",
+    "category":"",
+    "summary":"Series of posts on minimizing your APK size.",
+    "url":"https://medium.com/@wkalicinski/smallerapk-part-6-image-optimization-zopfli-webp-4c462955647d#.23hlddo3x",
+    "group":"",
+    "keywords": [],
+    "tags": [],
+    "image":"https://cdn-images-1.medium.com/max/2000/1*chMiA9mGa_FBUOoesHHk3Q.png",
+    "type":"medium"
+  },
+  {
     "title":"Measure your app’s user acquisition channels",
     "titleFriendly":"",
     "summary":"Get details on how to use the Developer Console User Acquisitions reports to discover where your users come from.",
@@ -1006,6 +1017,19 @@
     "lang": "en",
     "group": "",
     "tags": [],
+    "url": "training/material/animations.html#Touch",
+    "timestamp": 1194884220000,
+    "image": null,
+    "title": "Customize Touch Feedback",
+    "summary": "Provide visual confirmation when users interact with your UI.",
+    "keywords": [],
+    "type": "develop",
+    "category": "guide"
+  },
+  {
+    "lang": "en",
+    "group": "",
+    "tags": [],
     "url": "guide/topics/manifest/uses-feature-element.html#testing",
     "timestamp": 1194884220000,
     "image": null,
diff --git a/docs/html/jd_extras_en.js b/docs/html/jd_extras_en.js
index d727b2c..200da47 100644
--- a/docs/html/jd_extras_en.js
+++ b/docs/html/jd_extras_en.js
@@ -17,6 +17,18 @@
  /* TODO Remove standard resources from here, such as below
  */
   {
+    "title":"SmallerAPK, Part 6: Image optimization, Zopfli & WebP",
+    "category":"",
+    "summary":"Series of posts on minimizing your APK size.",
+    "url":"https://medium.com/@wkalicinski/smallerapk-part-6-image-optimization-zopfli-webp-4c462955647d#.23hlddo3x",
+    "group":"",
+    "keywords": [],
+    "tags": [],
+    "image":"https://cdn-images-1.medium.com/max/2000/1*chMiA9mGa_FBUOoesHHk3Q.png",
+    "type":"medium"
+  },
+
+  {
     "title":"Measure your app’s user acquisition channels",
     "category":"google",
     "summary":"Get details on how to use the Developer Console User Acquisitions reports to discover where your users come from.",
@@ -891,6 +903,20 @@
     "lang": "en",
     "group": "",
     "tags": [],
+    "url": "training/material/animations.html#Touch",
+    "timestamp": 1194884220000,
+    "image": null,
+    "title": "Customize Touch Feedback",
+    "summary": "Provide visual confirmation when users interact with your UI.",
+    "keywords": [],
+    "type": "develop",
+    "category": "guide"
+  },
+
+  {
+    "lang": "en",
+    "group": "",
+    "tags": [],
     "url": "guide/topics/manifest/uses-feature-element.html#testing",
     "timestamp": 1194884220000,
     "image": null,
@@ -3540,6 +3566,7 @@
       "distribute/essentials/quality/tv.html",
       "distribute/essentials/quality/wear.html",
       "distribute/essentials/quality/auto.html",
+      "distribute/essentials/quality/billions.html",
       "https://developers.google.com/edu/guidelines"
     ]
   },
@@ -3662,6 +3689,7 @@
       "distribute/essentials/quality/wear.html",
       "distribute/essentials/quality/tv.html",
       "distribute/essentials/quality/auto.html",
+      "distribute/essentials/quality/billions.html",
       "https://developers.google.com/edu/guidelines"
     ]
   },
@@ -4119,6 +4147,44 @@
       "distribute/tools/promote/device-art.html"
     ]
   },
+ "distribute/essentials/billionsquality/connectivity": {
+    "title": "",
+    "resources": [
+      "training/basics/network-ops/managing.html",
+      "training/monitoring-device-state/connectivity-monitoring.html",
+      "guide/topics/providers/content-providers.html"
+    ]
+  },
+  "distribute/essentials/billionsquality/capability": {
+    "title": "",
+    "resources": [
+      "guide/practices/screens_support.html",
+      "training/multiscreen/screendensities.html",
+      "training/articles/memory.html"
+    ]
+  },
+  "distribute/essentials/billionsquality/cost": {
+    "title": "",
+    "resources": [
+      "https://medium.com/@wkalicinski/smallerapk-part-4-multi-apk-through-abi-and-density-splits-477083989006#.23hlddo3x",
+      "training/basics/network-ops/managing.html"
+    ]
+  },
+  "distribute/essentials/billionsquality/consumption": {
+    "title": "",
+    "resources": [
+      "training/efficient-downloads/efficient-network-access.html",
+      "training/monitoring-device-state/index.html"
+    ]
+  },
+  "distribute/essentials/billionsquality/content": {
+    "title": "",
+    "resources": [
+      "training/material/animations.html#Touch",
+      "training/articles/perf-anr.html",
+      "training/improving-layouts/index.html"
+    ]
+  },
   "distribute/getusers/notifications": {
     "title": "",
     "resources": [
diff --git a/docs/html/ndk/guides/_book.yaml b/docs/html/ndk/guides/_book.yaml
index fdcfe46..85c54ee 100644
--- a/docs/html/ndk/guides/_book.yaml
+++ b/docs/html/ndk/guides/_book.yaml
@@ -60,6 +60,16 @@
     path: /ndk/guides/audio/basics.html
   - title: OpenSL ES for Android
     path: /ndk/guides/audio/opensl-for-android.html
+  - title: Audio Input Latency
+    path: /ndk/guides/audio/input-latency.html
+  - title: Audio Output Latency
+    path: /ndk/guides/audio/output-latency.html
+  - title: Floating-Point Audio
+    path: /ndk/guides/audio/floating-point.html
+  - title: Sample Rates
+    path: /ndk/guides/audio/sample-rates.html
+  - title: OpenSL ES Programming Notes
+    path: /ndk/guides/audio/opensl-prog-notes.html
 
 - title: Graphics
   path: /ndk/guides/graphics/index.html
diff --git a/docs/html/ndk/guides/audio/basics.jd b/docs/html/ndk/guides/audio/basics.jd
index a5f0ff5..bdb85fb 100644
--- a/docs/html/ndk/guides/audio/basics.jd
+++ b/docs/html/ndk/guides/audio/basics.jd
@@ -1,4 +1,4 @@
-page.title=OpenSL ES™ Basics
+page.title=High-Performance Audio Basics
 @jd:body
 
 <div id="qv-wrapper">
@@ -6,26 +6,51 @@
       <h2>On this page</h2>
 
       <ol>
+        <li><a href="#overview">Building Great Audio Apps</a></li>
         <li><a href="#adding">Adding OpenSL ES to Your App</a></li>
         <li><a href="#building">Building and Debugging</a></li>
+        <li><a href="#power">Audio Power Consumption</a></li>
         <li><a href="#samples">Samples</a></li>
       </ol>
     </div>
   </div>
 
+<a href="https://www.youtube.com/watch?v=d3kfEeMZ65c" class="notice-developers-video">
+<div>
+    <h3>Video</h3>
+    <p>Google I/O 2013 - High Performance Audio</p>
+</div>
+</a>
 
 <p>
-The Khronos Group's OpenSL ES standard exposes audio features
+The Khronos Group's OpenSL ES™ standard exposes audio features
 similar to those in the {@link android.media.MediaPlayer} and {@link android.media.MediaRecorder}
 APIs in the Android Java framework. OpenSL ES provides a C language interface as well as
 C++ bindings, allowing you to call it from code written in either language.
 </p>
 
 <p>
-This page describes how to add these audio APIs into your app's source code, and how to incorporate
-them into the build process.
+This page describes the typical use cases for these high-performance audio APIs, how to add them
+into your app's source code, and how to incorporate them into the build process.
 </p>
 
+<h2 id="overview">Building Great Audio Apps</h2>
+
+<p>
+The OpenSL ES APIs are available to help you develop and improve your app's audio performance.
+ Some typical use cases include the following:</p>
+
+<ul>
+  <li>Digital Audio Workstations (DAWs).</li>
+  <li>Synthesizers.</li>
+  <li>Drum machines.</li>
+  <li>Music learning apps.</li>
+  <li>Karaoke apps.</li>
+  <li>DJ mixing.</li>
+  <li>Audio effects.</li>
+  <li>Video/audio conferencing.</li>
+</ul>
+
 <h2 id="adding">Adding OpenSL ES to your App</h2>
 
 <p>
@@ -45,6 +70,18 @@
 #include &lt;SLES/OpenSLES_Android.h&gt;
 </pre>
 
+<p>
+When you include the {@code OpenSLES_Android.h} header file, the following headers are included
+automatically:
+</p>
+<pre>
+#include &lt;SLES/OpenSLES_AndroidConfiguration.h&gt;
+#include &lt;SLES/OpenSLES_AndroidMetadata.h&gt;
+</pre>
+
+<p class="note"><strong>Note: </strong>
+These headers are not required, but are shown as an aid in learning the API.
+</p>
 
 <h2 id="building">Building and Debugging</h2>
 
@@ -69,9 +106,9 @@
 </p>
 
 <p>
-We use asserts in our <a href="https://github.com/googlesamples/android-ndk">examples</a>, because
-they help catch unrealistic conditions that would indicate a coding error. We have used explicit
-error handling for other conditions more likely to occur in production.
+We use asserts in our <a class="external-link" href="https://github.com/googlesamples/android-ndk">
+examples</a>, because they help catch unrealistic conditions that would indicate a coding error. We
+have used explicit error handling for other conditions more likely to occur in production.
 </p>
 
 <p>
@@ -91,18 +128,25 @@
 </pre>
 
 <p>
-To examine the log from Android Studio, either click the <em>Logcat</em> tab in the
-<a href="{@docRoot}tools/debugging/debugging-studio.html#runDebug"><em>Debug</em></a>
-window, or click the <em>Devices | logcat</em> tab in the
-<a href="{@docRoot}tools/debugging/debugging-studio.html#systemLogView"><em>Android DDMS</em></a>
+To examine the log from Android Studio, either click the <strong>Logcat</strong> tab in the
+<a href="{@docRoot}tools/debugging/debugging-studio.html#runDebug">Debug</a>
+window, or click the <strong>Devices | logcat</strong> tab in the
+<a href="{@docRoot}tools/debugging/debugging-studio.html#systemLogView">Android DDMS</a>
 window.
 </p>
-
+<h2 id="power">Audio Power Consumption</h2>
+<p>Constantly outputting audio incurs significant power consumption. Ensure that you stop the
+ output in the
+ <a href="{@docRoot}reference/android/app/Activity.html#onPause()">onPause()</a> method.
+ Also consider pausing the silent output after some period of user inactivity.
+</p>
 <h2 id="samples">Samples</h2>
 
 <p>
 Supported and tested example code that you can use as a model for your own code resides both locally
-and on GitHub. The local examples are located in
+and on
+<a class="external-link" href="https://github.com/googlesamples/android-audio-high-performance/">
+GitHub</a>. The local examples are located in
 {@code platforms/android-9/samples/native-audio/}, under your NDK root installation directory.
 On GitHub, they are available from the
 <a class="external-link" href="https://github.com/googlesamples/android-ndk">{@code android-ndk}</a>
@@ -122,4 +166,4 @@
 For more information on differences between the reference specification and the
 Android implementation, see
 <a href="{@docRoot}ndk/guides/audio/opensl-for-android.html">
-OpenSL ES™ for Android</a>.
+OpenSL ES for Android</a>.
diff --git a/docs/html/ndk/guides/audio/floating-point.jd b/docs/html/ndk/guides/audio/floating-point.jd
new file mode 100644
index 0000000..76efce3
--- /dev/null
+++ b/docs/html/ndk/guides/audio/floating-point.jd
@@ -0,0 +1,101 @@
+page.title=Floating-Point Audio
+@jd:body
+
+<div id="qv-wrapper">
+    <div id="qv">
+      <h2>On this page</h2>
+
+      <ol>
+        <li><a href="#best">Best Practices for Floating-Point Audio</a></li>
+        <li><a href="#support">Floating-Point Audio in Android SDK</a></li>
+        <li><a href="#more">For More Information</a></li>
+      </ol>
+    </div>
+  </div>
+
+<a href="https://www.youtube.com/watch?v=sIcieUqMml8" class="notice-developers-video">
+<div>
+    <h3>Video</h3>
+    <p>Will it Float? The Glory and Shame of Floating-Point Audio</p>
+</div>
+</a>
+
+<p>Using floating-point numbers to represent audio data can significantly enhance audio
+ quality in high-performance audio applications. Floating point offers the following
+ advantages:</p>
+
+<ul>
+<li>Wider dynamic range.</li>
+<li>Consistent accuracy across the dynamic range.</li>
+<li>More headroom to avoid clipping during intermediate calculations and transients.</li>
+</ul>
+
+<p>While floating-point can enhance audio quality, it does present certain disadvantages:</p>
+
+<ul>
+<li>Floating-point numbers use more memory.</li>
+<li>Floating-point operations have unexpected properties; for example, addition is
+ not associative.</li>
+<li>Floating-point calculations can sometimes lose arithmetic precision due to rounding or
+ numerically unstable algorithms.</li>
+<li>Using floating-point effectively requires greater understanding to achieve accurate
+ and reproducible results.</li>
+</ul>
+
+<p>
+  Formerly, floating-point was notorious for being unavailable or slow. This is
+  still true for low-end and embedded processors. But processors on modern
+  mobile devices now have hardware floating-point with performance similar
+  to (or in some cases even faster than) integer arithmetic. Modern CPUs also support
+  <a href="http://en.wikipedia.org/wiki/SIMD" class="external-link">SIMD</a>
+  (Single instruction, multiple data), which can improve performance further.
+</p>
+
+<h2 id="best">Best Practices for Floating-Point Audio</h2>
+<p>The following best practices help you avoid problems with floating-point calculations:</p>
+<ul>
+<li>Use double precision floating-point for infrequent calculations,
+such as computing filter coefficients.</li>
+<li>Pay attention to the order of operations.</li>
+<li>Declare explicit variables for intermediate values.</li>
+<li>Use parentheses liberally.</li>
+<li>If you get a NaN or infinity result, use binary search to discover
+where it was introduced.</li>
+</ul>
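<p>The associativity caveat above is easy to demonstrate: at a magnitude of 1e8, neighboring <code>float</code> values are 8 apart, so a small addend is kept or lost depending on grouping. A minimal, self-contained illustration:</p>

```java
/** Demonstrates that float addition is not associative. */
public class FloatAssociativity {
    // (big + small) - big: small is absorbed into big, so the result is 0.0f.
    public static float smallAddedFirst(float big, float small) {
        return (big + small) - big;
    }

    // (big - big) + small: the cancellation happens first, so small survives.
    public static float smallAddedLast(float big, float small) {
        return (big - big) + small;
    }
}
```

<p>With <code>big = 1e8f</code> and <code>small = 1f</code>, the two groupings return 0.0f and 1.0f respectively.</p>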
+
+<h2 id="support">Floating-Point Audio in Android SDK</h2>
+
+<p>For floating-point audio, the audio format encoding
+ <code>AudioFormat.ENCODING_PCM_FLOAT</code> is used in the same way as
+ <code>ENCODING_PCM_16_BIT</code> or <code>ENCODING_PCM_8_BIT</code> to specify
+ AudioTrack data formats. The corresponding overloaded method
+ <code>AudioTrack.write()</code> takes a float array to deliver data.</p>
+
+<pre>
+   public int write(float[] audioData,
+        int offsetInFloats,
+        int sizeInFloats,
+        int writeMode)
+</pre>
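<p>For context, a hedged SDK-side sketch of streaming float samples (API level 21 and higher); the sample rate and channel configuration are illustrative:</p>

```java
int sampleRate = 48000;
int minBuf = AudioTrack.getMinBufferSize(sampleRate,
        AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_FLOAT);
AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC, sampleRate,
        AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_FLOAT,
        minBuf, AudioTrack.MODE_STREAM);

float[] buffer = new float[minBuf / 4]; // samples in the range [-1.0, 1.0]
track.play();
track.write(buffer, 0, buffer.length, AudioTrack.WRITE_BLOCKING);
```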
+
+<h2 id="more">For More Information</h2>
+
+<p>The following Wikipedia pages are helpful in understanding floating-point audio:</p>
+
+<ul>
+<li><a href="http://en.wikipedia.org/wiki/Audio_bit_depth" class="external-link" >Audio bit depth</a></li>
+<li><a href="http://en.wikipedia.org/wiki/Floating_point" class="external-link" >Floating point</a></li>
+<li><a href="http://en.wikipedia.org/wiki/IEEE_floating_point" class="external-link" >IEEE 754 floating-point</a></li>
+<li><a href="http://en.wikipedia.org/wiki/Loss_of_significance" class="external-link" >Loss of significance</a>
+ (catastrophic cancellation)</li>
+<li><a href="https://en.wikipedia.org/wiki/Numerical_stability" class="external-link" >Numerical stability</a></li>
+</ul>
+
+<p>The following article provides information on those aspects of floating-point that have a
+ direct impact on designers of computer systems:</p>
+<ul>
+<li><a href="http://docs.oracle.com/cd/E19957-01/806-3568/ncg_goldberg.html" class="external-link" >What every
+ computer scientist should know about floating-point arithmetic</a>
+by David Goldberg, Xerox PARC (edited reprint).</li>
+</ul>
diff --git a/docs/html/ndk/guides/audio/index.jd b/docs/html/ndk/guides/audio/index.jd
index ac6e539..12d9320 100644
--- a/docs/html/ndk/guides/audio/index.jd
+++ b/docs/html/ndk/guides/audio/index.jd
@@ -1,15 +1,27 @@
-page.title=NDK Audio: OpenSL ES&#8482
+page.title=NDK High-Performance Audio
 @jd:body
 
 <p>The NDK package includes an Android-specific implementation of the
-<a href="https://www.khronos.org/opensles/">OpenSL ES</a> API
-specification from the <a href="https://www.khronos.org">Khronos Group</a>. This library
-allows you to use C or C++ to implement high-performance, low-latency audio in your game or other
-demanding app.</p>
+<a class="external-link" href="https://www.khronos.org/opensles/">OpenSL ES™</a> API
+specification from the <a class="external-link" href="https://www.khronos.org">Khronos Group</a>.
+This library allows you to use C or C++ to implement high-performance, low-latency audio, whether
+you are writing a synthesizer, digital audio workstation, karaoke, game,
+ or other real-time app.</p>
 
 <p>This section begins by providing some
-<a href="{@docRoot}ndk/guides/audio/basics.html">basic information</a> about the API, including how
-to incorporate it into your app. It then explains what you need to know about the
-<a href="{@docRoot}ndk/guides/audio/opensl-for-android.html">Android-specific implementation</a>
-of OpenSL ES, focusing on differences between this implementation and the reference specification.
-</p>
+<a href="{@docRoot}ndk/guides/audio/basics.html">basic information</a> about the API, including
+typical use cases and how to incorporate it into your app. It then explains what you need to know
+about the <a href="{@docRoot}ndk/guides/audio/opensl-for-android.html">Android-specific
+implementation</a> of OpenSL ES, focusing on the differences between this implementation and the
+reference specification. Next, you'll learn how to minimize
+<a href="{@docRoot}ndk/guides/audio/input-latency.html">input latency</a>
+when using built-in or external microphones, and about some actions that you can take to minimize
+<a href="{@docRoot}ndk/guides/audio/output-latency.html">output latency</a>.
+The section then describes the reasons that you should use
+<a href="{@docRoot}ndk/guides/audio/floating-point.html">floating-point</a>
+numbers to represent your audio data, and it provides information to help you choose the
+optimal <a href="{@docRoot}ndk/guides/audio/sample-rates.html">sample rate</a>. This section
+concludes with some supplemental <a href="{@docRoot}ndk/guides/audio/opensl-prog-notes.html">
+programming notes</a> to ensure a proper implementation of OpenSL ES.
+</p>
diff --git a/docs/html/ndk/guides/audio/input-latency.jd b/docs/html/ndk/guides/audio/input-latency.jd
new file mode 100644
index 0000000..f1103fc
--- /dev/null
+++ b/docs/html/ndk/guides/audio/input-latency.jd
@@ -0,0 +1,95 @@
+page.title=Audio Input Latency
+@jd:body
+
+<div id="qv-wrapper">
+    <div id="qv">
+      <h2>On this page</h2>
+
+      <ol>
+        <li><a href="#check-list">Checklist</a></li>
+        <li><a href="#ways">Ways to Reduce Audio Input Latency</a></li>
+        <li><a href="#avoid">What to Avoid</a></li>
+      </ol>
+    </div>
+  </div>
+
+
+<p>This page provides guidelines to help you reduce audio input latency when recording with a
+built-in microphone or an external headset microphone.</p>
+
+<h2 id="check-list">Checklist</h2>
+
+<p>Here are a few important prerequisites:</p>
+
+<ul>
+  <li>You must use the Android-specific implementation of the
+  <a class="external-link" href="https://www.khronos.org/opensles/">OpenSL ES™</a> API.</li>
+
+  <li>If you haven't already done so, download and install the
+  <a href="{@docRoot}tools/sdk/ndk/index.html">Android NDK</a>.</li>
+
+  <li>Many of the same requirements for low-latency audio output also apply to low-latency input,
+  so read the requirements for low-latency output in
+  <a href="{@docRoot}ndk/guides/audio/output-latency.html">Audio Output Latency</a>.</li>
+</ul>
+
+<h2 id="ways">Ways to Reduce Audio Input Latency</h2>
+
+<p>The following are some methods to help ensure low audio input latency:</p>
+
+<ul>
+  <li>If your app relies on low-latency audio, suggest that your users use a headset
+(for example, by displaying a <em>Best with headphones</em> screen on first run). Note
+that just using the headset doesn’t guarantee the lowest possible latency. You may need to
+perform other steps to remove any unwanted signal processing from the audio path, such as
+using the <a href="http://developer.android.com/reference/android/media/MediaRecorder.AudioSource.html#VOICE_RECOGNITION">
+VOICE_RECOGNITION</a> preset when recording.</li>
+
+  <li>It's difficult to test audio input and output latency in isolation. The best way to
+determine the lowest possible audio input latency is to measure round-trip audio latency and
+divide it by two.</li>
+  <li>Be prepared to handle nominal sample rates of 44,100 and 48,000 Hz as reported by
+<a href="{@docRoot}reference/android/media/AudioManager.html#getProperty(java.lang.String)">
+getProperty(String)</a> for
+<a href="{@docRoot}reference/android/media/AudioManager.html#PROPERTY_OUTPUT_SAMPLE_RATE">
+PROPERTY_OUTPUT_SAMPLE_RATE</a>. Other sample rates are possible, but rare.</li>
+
+  <li>Be prepared to handle the buffer size reported by
+<a href="{@docRoot}reference/android/media/AudioManager.html#getProperty(java.lang.String)">
+getProperty(String)</a> for
+<a href="{@docRoot}reference/android/media/AudioManager.html#PROPERTY_OUTPUT_FRAMES_PER_BUFFER">
+PROPERTY_OUTPUT_FRAMES_PER_BUFFER</a>. Typical buffer sizes include 96, 128, 160, 192, 240, 256,
+or 512 frames, but other values are possible.</li>
+</ul>
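<p>
One way to handle an arbitrary reported buffer size is to size your app's processing block as a
whole multiple of the device's frames-per-buffer value, so that device callbacks and app blocks
stay in phase. The following is a minimal C sketch; the device value would come from
{@code PROPERTY_OUTPUT_FRAMES_PER_BUFFER} (queried in Java and passed down through JNI), and the
values shown in the comments are hypothetical.
</p>

```c
/* Round the app's preferred processing block up to a whole multiple of the
 * device's native frames-per-buffer. Using an exact multiple avoids the
 * glitches that can occur when callback periods and app blocks drift in and
 * out of phase. device_frames stands in for the value reported by
 * PROPERTY_OUTPUT_FRAMES_PER_BUFFER; 192 and 480 below are hypothetical. */
int round_up_to_device_multiple(int preferred_frames, int device_frames)
{
    int blocks = (preferred_frames + device_frames - 1) / device_frames;
    return blocks * device_frames;
}
/* Example: a device reporting 192 frames per buffer and an app preferring
 * 480-frame blocks would process 576-frame blocks (3 device buffers). */
```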
+
+<h2 id="avoid">What to Avoid</h2>
+
+<p>Be sure to take these things into account to help avoid latency issues:</p>
+
+<ul>
+  <li>Don’t assume that the speakers and microphones used in mobile devices have good
+acoustics. Because these components are small, their acoustics are generally poor, so signal
+processing is added to improve the sound quality. This signal processing introduces latency.</li>
+
+  <li>Don't assume that your input and output callbacks are synchronized. For simultaneous input
+and output, separate buffer queue completion handlers are used for each side. There is no
+guarantee of the relative order of these callbacks or the synchronization of the audio clocks,
+even when both sides use the same sample rate. Your application should buffer the data with
+proper buffer synchronization.</li>
+
+  <li>Don't assume that the actual sample rate exactly matches the nominal sample rate. For
+example, if the nominal sample rate is 48,000 Hz, it is normal for the audio clock to advance
+at a slightly different rate than the operating system {@code CLOCK_MONOTONIC}. This is because
+the audio and system clocks may derive from different crystals.</li>
+
+  <li>Don't assume that the actual playback sample rate exactly matches the actual capture sample
+rate, especially if the endpoints are on separate paths. For example, if you are capturing from
+the on-device microphone at 48,000 Hz nominal sample rate, and playing on USB audio
+at 48,000 Hz nominal sample rate, the actual sample rates are likely to be slightly different
+from each other.</li>
+</ul>
+
+<p>A consequence of potentially independent audio clocks is the need for asynchronous sample rate
+conversion. A simple (though not ideal for audio quality) technique for asynchronous sample rate
+conversion is to duplicate or drop samples as needed near a zero-crossing point. More
+sophisticated conversions are possible.</p>
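<p>
The simple duplicate-or-drop technique described above can be sketched in a few lines of C. This
is an illustrative sketch only, not production resampling code; a real converter would
interpolate rather than discard samples.
</p>

```c
#include <stddef.h>
#include <string.h>

/* Drop one sample near a zero crossing to compensate for a capture clock
 * that runs slightly faster than the playback clock. Dropping at a zero
 * crossing keeps the discontinuity (and audible artifact) small.
 * Returns the new buffer length in samples. */
size_t drop_sample_at_zero_crossing(short *buf, size_t len)
{
    for (size_t i = 1; i < len; i++) {
        if ((buf[i - 1] <= 0 && buf[i] >= 0) ||
            (buf[i - 1] >= 0 && buf[i] <= 0)) {
            /* Shift the tail left by one sample, removing buf[i]. */
            memmove(&buf[i], &buf[i + 1], (len - i - 1) * sizeof(short));
            return len - 1;
        }
    }
    return len; /* no zero crossing found; leave the buffer unchanged */
}
```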
diff --git a/docs/html/ndk/guides/audio/opensl-for-android.jd b/docs/html/ndk/guides/audio/opensl-for-android.jd
index 763da5a..fa5e260 100644
--- a/docs/html/ndk/guides/audio/opensl-for-android.jd
+++ b/docs/html/ndk/guides/audio/opensl-for-android.jd
@@ -1,4 +1,4 @@
-page.title=Native Audio: OpenSL ES&#8482; for Android
+page.title=OpenSL ES for Android
 @jd:body
 
 <div id="qv-wrapper">
@@ -6,23 +6,158 @@
       <h2>On this page</h2>
 
       <ol>
+        <li><a href="#getstarted">Getting Started</a></li>
         <li><a href="#inherited">Features Inherited from the Reference Specification</a></li>
+        <li><a href="#planning">Planning for Future Versions of OpenSL ES</a></li>
         <li><a href="#ae">Android Extensions</a></li>
+        <li><a href="#notes">Programming Notes</a></li>
+        <li><a href="#platform-issues">Platform Issues</a></li>
       </ol>
     </div>
   </div>
 
 <p>
-This page provides details about how the NDK implementation of OpenSL ES™ differs
-from the reference specification for OpenSL ES 1.0.1. When using sample code from the
+This page provides details about how the
+<a href="{@docRoot}tools/sdk/ndk/index.html">NDK</a> implementation of OpenSL
+ES™ differs from the reference specification for OpenSL ES 1.0.1. When using sample code from the
 specification, you may need to modify it to work on Android.
 </p>
 
+<p>
+Unless otherwise noted, all features are available at Android 2.3 (API level 9) and higher.
+ Some features are available only at Android 4.0 (API level 14) and higher; these are noted.
+</p>
+
+<p class="note"><strong>Note: </strong>
+The Android Compatibility Definition Document (CDD) enumerates the hardware and software
+requirements of a compatible Android device. See
+<a class="external-link" href="https://source.android.com/compatibility/">Android Compatibility</a>
+for more information on the overall compatibility program, and
+<a class="external-link" href="https://static.googleusercontent.com/media/source.android.com/en//compatibility/android-cdd.pdf">
+CDD</a> for the actual CDD document.
+</p>
+
+<p>
+<a class="external-link" href="https://www.khronos.org/opensles/">OpenSL ES</a> provides a C
+language interface that is also accessible using C++. It exposes features similar to the audio
+portions of these Android Java APIs:
+</p>
+
+<ul>
+  <li><a href="{@docRoot}reference/android/media/MediaPlayer.html">
+  android.media.MediaPlayer</a></li>
+  <li><a href="{@docRoot}reference/android/media/MediaRecorder.html">
+  android.media.MediaRecorder</a></li>
+</ul>
+
+<p>
+As with all of the Android Native Development Kit (NDK), the primary purpose of OpenSL ES for
+Android is to facilitate the implementation of shared libraries to be called using the Java Native
+Interface (<a class="external-link" href="https://en.wikipedia.org/wiki/Java_Native_Interface">JNI
+</a>). The NDK is not intended for writing pure C/C++ applications. However, OpenSL ES is a
+full-featured API, and we expect that you will be able to accomplish most of your audio needs
+using only this API, without up-calls to code running in the Android runtime.
+</p>
+
+<p class="note"><strong>Note: </strong>
+Though based on OpenSL ES, the Android native audio (high-performance audio) API is not a
+conforming implementation of any OpenSL ES 1.0.1 profile (game, music, or phone). This is because
+Android does not implement all of the features required by any one of the profiles. Any known cases
+where Android behaves differently than the specification are described in the <a href="#ae">
+Android extensions</a> section below.
+</p>
+
+<h2 id="getstarted">Getting Started</h2>
+
+<p>
+This section provides the information needed to get started using the OpenSL ES APIs.
+</p>
+
+<h3>Example code</h3>
+
+<p>
+We recommend using the supported and tested example code as a model for your own code. It is
+located in the NDK folder {@code platforms/android-9/samples/native-audio/}, as well
+as in the
+<a class="external-link" href="https://github.com/googlesamples/android-ndk/tree/master/audio-echo">audio-echo</a>
+and
+<a class="external-link" href="https://github.com/googlesamples/android-ndk/tree/master/native-audio">native-audio</a>
+folders of the
+<a class="external-link" href="https://github.com/googlesamples/android-ndk">android-ndk</a> GitHub
+repository.
+</p>
+
+<p class="caution"><strong>Caution: </strong>
+The OpenSL ES 1.0.1 specification contains example code in the appendices (see
+<a class="external-link" href="https://www.khronos.org/registry/sles/">Khronos OpenSL ES Registry</a>
+for more details). However, the examples in <em>Appendix B: Sample Code</em> and
+<em>Appendix C: Use Case Sample Code</em> use features that are not supported by Android. Some
+examples also contain typographical errors, or use APIs that are likely to change. Proceed with
+caution when referring to these; though the code may be helpful in understanding the full OpenSL ES
+standard, it should not be used as-is with Android.
+</p>
+
+<h3>Makefile</h3>
+
+<p>
+Modify your {@code Android.mk} file as follows:
+</p>
+<pre>
+LOCAL_LDLIBS += -lOpenSLES
+</pre>
+
+<h3>Audio content</h3>
+
+<p>
+The following are some of the many ways to package audio content for your application:
+</p>
+
+<ul>
+  <li><strong>Resources</strong>: By placing your audio files into the {@code res/raw/} folder,
+  they can be accessed easily by the associated APIs for
+  <a href="{@docRoot}reference/android/content/res/Resources.html">Resources</a>.
+  However, there is no direct native access to resources, so you must write Java
+  programming language code to copy them out before use.</li>
+  <li><strong>Assets</strong>: By placing your audio files into the {@code assets/} folder, they
+  are directly accessible by the Android native asset manager APIs. See the header files {@code
+  android/asset_manager.h} and {@code android/asset_manager_jni.h} for more information on these
+  APIs. The example code located in the NDK folder {@code platforms/android-9/samples/native-audio/}
+  uses these native asset manager APIs in conjunction with the Android file descriptor data
+  locator.</li>
+  <li><strong>Network</strong>: You can use the URI data locator to play audio content directly
+  from the network. However, be sure to read the <a href="#sandp">Security and permissions</a>
+  section below.</li>
+  <li><strong>Local file system</strong>: The URI data locator supports the {@code file:} scheme
+  for local files, provided the files are accessible by the application. Note that the Android
+  security framework restricts file access via the Linux user ID and group ID mechanisms.</li>
+  <li><strong>Recorded</strong>: Your application can record audio data from the microphone input,
+  store this content, and then play it back later. The example code uses this method for the <em>
+  Playback</em> clip.</li>
+  <li><strong>Compiled and linked inline</strong>: You can link your audio content directly into
+  the shared library, and then play it using an audio player with a buffer queue data locator. This
+  is most suitable for short PCM format clips. The example code uses this technique for the <em>
+  Hello</em> and <em>Android</em> clips. The PCM data was converted to hex strings using a
+  {@code bin2c} tool (not supplied).</li>
+  <li><strong>Real-time synthesis</strong>: Your application can synthesize PCM data on the fly and
+  then play it using an audio player with a buffer queue data locator. This is a relatively advanced
+  technique, and the details of audio synthesis are beyond the scope of this article.</li>
+</ul>
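<p>
The {@code bin2c} conversion mentioned above is straightforward to sketch: it reads raw bytes
(for example, a short PCM clip) and emits a C array initializer that can be compiled into the
shared library. The function below is an illustrative stand-in, not the actual tool used for the
NDK samples.
</p>

```c
#include <stdio.h>

/* bin2c-style conversion sketch: emit raw bytes as a C array initializer.
 * The resulting text can be compiled and linked directly into the shared
 * library, then played through a buffer queue data locator. */
void emit_c_array(FILE *out, const char *name,
                  const unsigned char *data, size_t len)
{
    fprintf(out, "static const unsigned char %s[] = {", name);
    for (size_t i = 0; i < len; i++) {
        if (i % 12 == 0)
            fprintf(out, "\n    ");          /* 12 bytes per line */
        fprintf(out, "0x%02x,", data[i]);
    }
    fprintf(out, "\n};\n");
}
```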
+
+<p class="note"><strong>Note: </strong>
+Finding or creating useful audio content for your application is beyond the scope of this article.
+You can use web search terms such as <em>interactive audio</em>, <em>game audio</em>, <em>sound
+design</em>, and <em>audio programming</em> to locate more information. 
+</p>
+<p class="caution"><strong>Caution:</strong> It is your responsibility
+to ensure that you are legally permitted to play or record content. There may be privacy
+considerations for recording content.
+</p>
+
 <h2 id="inherited">Features Inherited from the Reference Specification</h2>
 
 <p>
 The Android NDK implementation of OpenSL ES inherits much of the feature set from
-the reference specification, although with certain limitations.
+the reference specification, with certain limitations.
 </p>
 
 <h3>Global entry points</h3>
@@ -44,8 +179,9 @@
 <h3>Objects and interfaces</h3>
 
 <p>
-Table 1 shows which objects and interfaces the Android NDK implementation of
-OpenSL ES supports. Green cells indicate features available in this implementation.
+Table 1 shows the objects and interfaces that the Android NDK implementation of
+OpenSL ES supports. If a <em>Yes</em> appears in the cell, then the feature is available in this
+implementation.
 </p>
 
 <p class="table-caption" id="Objects-and-interfaces">
@@ -214,7 +350,9 @@
   </tr>
   </table>
 
-The next section explains limitations of some of these features.
+<p>
+The next section explains the limitations for some of these features.
+</p>
 
 <h3>Limitations</h3>
 
@@ -265,7 +403,7 @@
 to either <code>NULL</code> or a valid UTF-8 string. You must also initialize
 <code>containerType</code> to a valid value.
 In the absence of other considerations, such as portability to other
-implementations, or content format that an app cannot identify by header,
+implementations or content format that an app cannot identify by header,
 we recommend that you
 set <code>mimeType</code> to <code>NULL</code> and <code>containerType</code>
 to <code>SL_CONTAINERTYPE_UNSPECIFIED</code>.
@@ -275,30 +413,32 @@
 Android platform supports them as well:</p>
 
 <ul>
-<li>WAV PCM</li>
-<li>WAV alaw</li>
-<li>WAV ulaw</li>
-<li>MP3 Ogg Vorbis</li>
-<li>AAC LC</li>
-<li>HE-AACv1 (AAC+)</li>
-<li>HE-AACv2 (enhanced AAC+)</li>
-<li>AMR</li>
-<li>FLAC</li>
+<li><a class="external-link" href="https://en.wikipedia.org/wiki/WAV">WAV</a> PCM.</li>
+<li>WAV alaw.</li>
+<li>WAV ulaw.</li>
+<li>MP3.</li>
+<li>Ogg Vorbis.</li>
+<li>AAC LC.</li>
+<li>HE-AACv1 (AAC+).</li>
+<li>HE-AACv2 (enhanced AAC+).</li>
+<li>AMR.</li>
+<li>FLAC.</li>
 </ul>
 
-<p>
+<p class="note"><strong>Note: </strong>
 For a list of audio formats that Android supports, see
 <a href="{@docRoot}guide/appendix/media-formats.html">Supported Media Formats</a>.
 </p>
 
 <p>
-The following limitations apply to handling of these and other formats in this
+The following limitations apply to the handling of these and other formats in this
 implementation of OpenSL ES:
 </p>
 
 <ul>
-<li>AAC formats must be reside within an MP4 or ADTS container.</li>
-<li>OpenSL ES for Android does not support MIDI.</li>
+<li><a class="external-link" href="https://en.wikipedia.org/wiki/Advanced_Audio_Coding">AAC</a>
+formats must reside within an MP4 or ADTS container.</li>
+<li>OpenSL ES for Android does not support
+<a class="external-link" href="https://source.android.com/devices/audio/midi.html">MIDI</a>.</li>
 <li>WMA is not part of <a class="external-link" href="https://source.android.com/">AOSP</a>, and we
 have not verified its compatibility with OpenSL ES for Android.</li>
 <li>The Android NDK implementation of OpenSL ES does not support direct
@@ -333,13 +473,23 @@
 <li>8-bit unsigned or 16-bit signed.</li>
 <li>Mono or stereo.</li>
 <li>Little-endian byte ordering.</li>
-<li>Sample rates of: 8,000, 11,025, 12,000, 16,000, 22,050, 24,000, 32,000, 44,100, or
-48,000 Hz.</li>
+<li>Sample rates of:
+  <ul>
+    <li>8,000 Hz.</li>
+    <li>11,025 Hz.</li>
+    <li>12,000 Hz.</li>
+    <li>16,000 Hz.</li>
+    <li>22,050 Hz.</li>
+    <li>24,000 Hz.</li>
+    <li>32,000 Hz.</li>
+    <li>44,100 Hz.</li>
+    <li>48,000 Hz.</li>
+  </ul></li>
 </ul>
 
 <p>
 The configurations that OpenSL ES for Android supports for recording are
-device-dependent; usually, 16,000 Hz mono 16-bit signed is available regardless of device.
+device-dependent; usually, 16,000 Hz mono/16-bit signed is available regardless of the device.
 </p>
 <p>
 The value of the <code>samplesPerSec</code> field is in units of milliHz, despite the misleading
@@ -393,7 +543,7 @@
 An audio player or recorder with a data locator for a buffer queue supports PCM data format only.
 </p>
 
-<h4>I/O Device data locator</h4>
+<h4>I/O device data locator</h4>
 
 <p>
 OpenSL ES for Android only supports use of an I/O device data locator when you have
@@ -421,6 +571,150 @@
 We have not verified support for {@code rtsp:} with audio on the Android platform.
 </p>
 
+<h4>Data structures</h4>
+
+<p>
+Android supports these OpenSL ES 1.0.1 data structures:
+</p>
+<ul>
+  <li>{@code SLDataFormat_MIME}</li>
+  <li>{@code SLDataFormat_PCM}</li>
+  <li>{@code SLDataLocator_BufferQueue}</li>
+  <li>{@code SLDataLocator_IODevice}</li>
+  <li>{@code SLDataLocator_OutputMix}</li>
+  <li>{@code SLDataLocator_URI}</li>
+  <li>{@code SLDataSink}</li>
+  <li>{@code SLDataSource}</li>
+  <li>{@code SLEngineOption}</li>
+  <li>{@code SLEnvironmentalReverbSettings}</li>
+  <li>{@code SLInterfaceID}</li>
+</ul>
+
+<h4>Platform configuration</h4>
+
+<p>
+OpenSL ES for Android is designed for multi-threaded applications and is thread-safe. It supports a
+single engine per application, and up to 32 objects per engine. Available device memory and CPU may
+further restrict the usable number of objects.
+</p>
+
+<p>
+These engine options are recognized but ignored by {@code slCreateEngine}:
+</p>
+
+<ul>
+  <li>{@code SL_ENGINEOPTION_THREADSAFE}</li>
+  <li>{@code SL_ENGINEOPTION_LOSSOFCONTROL}</li>
+</ul>
+
+<p>
+OpenMAX AL and OpenSL ES may be used together in the same application. In this case, there is
+a single shared engine object internally, and the 32 object limit is shared between OpenMAX AL
+and OpenSL ES. The application should first create both engines, use both engines, and finally
+destroy both engines. The implementation maintains a reference count on the shared engine so that
+it is correctly destroyed during the second destroy operation.
+</p>
+
+<h2 id="planning">Planning for Future Versions of OpenSL ES</h2>
+
+<p>
+The Android high-performance audio APIs are based on
+<a class="external-link" href="https://www.khronos.org/registry/sles/">Khronos Group OpenSL ES
+1.0.1</a>. Khronos has released a revised version 1.1 of the standard. The
+revised version includes new features, clarifications, corrections of typographical errors, and
+some incompatibilities. Most of the expected incompatibilities are relatively minor or are in
+areas of OpenSL ES that are not supported by Android.
+</p>
+
+<p>
+An application developed with the current version of OpenSL ES should work on
+future versions of the Android platform, provided
+that you follow the guidelines that are outlined in the <a href="#binary-compat">Planning for
+binary compatibility</a> section below.
+</p>
+
+<p class="note"><strong>Note: </strong>
+Future source compatibility is not a goal. That is, if you upgrade to a newer version of the NDK,
+you may need to modify your application source code to conform to the new API. We expect that most
+such changes will be minor; see details below.
+</p>
+
+<h3 id="binary-compat">Planning for binary compatibility</h3>
+
+<p>
+We recommend that your application follow these guidelines to improve future binary compatibility:
+</p>
+
+<ul>
+  <li>Use only the documented subset of Android-supported features from OpenSL ES 1.0.1.</li>
+  <li>Do not depend on a particular result code for an unsuccessful operation; be prepared to deal
+  with a different result code.</li>
+  <li>Application callback handlers generally run in a restricted context. They should be written
+  to perform their work quickly, and then return as soon as possible. Do not run complex operations
+  within a callback handler. For example, within a buffer queue completion callback, you can
+  enqueue another buffer, but do not create an audio player.</li>
+  <li>Callback handlers should be prepared to be called more or less frequently and to receive
+  additional event types, and they should ignore event types that they do not recognize. Callbacks
+  that are configured with an event mask of enabled event types should be prepared to be called
+  with multiple event type bits set simultaneously. Use "&amp;" to test for each event bit rather
+  than a switch case.</li>
+  <li>Use prefetch status and callbacks as a general indication of progress, but do not depend on
+  specific hard-coded fill levels or callback sequences. The meaning of the prefetch status fill
+  level, and the behavior for errors that are detected during prefetch, may change.</li>
+</ul>
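<p>
The event-mask guideline above looks like the following in practice. The event bit names here are
hypothetical placeholders; real OpenSL ES event masks are defined in {@code SLES/OpenSLES.h}.
</p>

```c
/* Hypothetical event bits for illustration only; real event masks come
 * from <SLES/OpenSLES.h>. Testing each bit with "&" keeps the callback
 * correct when several bits arrive set at once, and lets it silently
 * ignore event types it does not recognize. */
#define EVT_HEAD_AT_END  (1u << 0)
#define EVT_HEAD_MOVING  (1u << 1)

int events_handled;  /* counts recognized events, for illustration */

void on_play_event(unsigned event_mask)
{
    if (event_mask & EVT_HEAD_AT_END)
        events_handled++;        /* handle end-of-content */
    if (event_mask & EVT_HEAD_MOVING)
        events_handled++;        /* handle head movement */
    /* bits we do not recognize are deliberately ignored */
}
```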
+
+<p class="note"><strong>Note: </strong>
+See the <a href="#bq-behavior">Buffer queue behavior</a> section below for more details.
+</p>
+
+<h3>Planning for source compatibility</h3>
+
+<p>
+As mentioned, source code incompatibilities are expected in the next version of OpenSL ES from
+Khronos Group. The likely areas of change include:
+</p>
+
+<ul>
+  <li>The buffer queue interface is expected to have significant changes, especially in the areas
+  of {@code BufferQueue::Enqueue}, the parameter list for {@code slBufferQueueCallback}, and the
+  name of field {@code SLBufferQueueState.playIndex}. We recommend that your application code use
+  Android simple buffer queues instead. In the example
+  code that is supplied with the NDK, we have used Android simple buffer queues for playback for
+  this reason. (We also use Android simple buffer queue for recording and decoding to PCM, but that
+  is because standard OpenSL ES 1.0.1 does not support record or decode to a buffer queue data
+  sink.)</li>
+  <li>There will be an addition of {@code const} to the input parameters passed by reference, and
+  to {@code SLchar *} struct fields used as input values. This should not require any changes to
+  your code.</li>
+  <li>There will be a substitution of unsigned types for some parameters that are currently signed.
+  You may need to change a parameter type from {@code SLint32} to {@code SLuint32} or similar, or
+  add a cast.</li>
+  <li>{@code Equalizer::GetPresetName} copies the string to application memory instead of returning
+  a pointer to implementation memory. This will be a significant change, so we recommend that you
+  either avoid calling this method, or isolate your use of it.</li>
+  <li>There will be additional fields in the struct types. For output parameters, these new fields
+  can be ignored, but for input parameters the new fields will need to be initialized. Fortunately,
+  all of these are expected to be in areas that are not supported by Android.</li>
+  <li>Interface <a class="external-link" href="http://en.wikipedia.org/wiki/Globally_unique_identifier">
+  GUIDs</a> will change. Refer to interfaces by symbolic name rather than GUID to avoid a
+  dependency.</li>
+  <li>{@code SLchar} will change from {@code unsigned char} to {@code char}. This primarily affects
+  the URI data locator and MIME data format.</li>
+  <li>{@code SLDataFormat_MIME.mimeType} will be renamed to {@code pMimeType}, and
+  {@code SLDataLocator_URI.URI} will be renamed to {@code pURI}. We recommend that you initialize
+  the {@code SLDataFormat_MIME} and {@code SLDataLocator_URI} data structures using a
+  brace-enclosed, comma-separated list of values, rather than by field name, to isolate your code
+  from this change. This technique is used in the example code.</li>
+  <li>{@code SL_DATAFORMAT_PCM} does not permit the application to specify the representation of
+  the data as signed integer, unsigned integer, or floating-point. The Android implementation
+  assumes that 8-bit data is unsigned integer and 16-bit is signed integer. In addition, the field
+  {@code samplesPerSec} is a misnomer, as the actual units are milliHz. These issues are expected
+  to be addressed in the next OpenSL ES version, which will introduce a new extended PCM data
+  format that permits the application to explicitly specify the representation and corrects the
+  field name. As this will be a new data format, and the current PCM data format will still be
+  available (though deprecated), it should not require any immediate changes to your code.</li>
+</ul>
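<p>
The struct-initialization advice above can be illustrated with a short sketch. The type below is a
simplified stand-in for {@code SLDataLocator_URI}, not the real definition from
{@code SLES/OpenSLES.h}.
</p>

```c
/* Simplified stand-in for SLDataLocator_URI, for illustration only.
 * Positional brace initialization never names the URI field, so the
 * planned rename (URI -> pURI) would not break this code. */
typedef unsigned int SLuint32_sketch;

typedef struct {
    SLuint32_sketch locatorType;
    char           *URI;   /* expected to become pURI in a later version */
} SLDataLocator_URI_sketch;

SLDataLocator_URI_sketch make_uri_locator(char *uri)
{
    /* brace-enclosed, comma-separated list: no field names to go stale */
    SLDataLocator_URI_sketch loc = { 1 /* SL_DATALOCATOR_URI */, uri };
    return loc;
}
```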
+
 <h2 id="ae">Android Extensions</h2>
 
 <p>
@@ -444,8 +738,8 @@
 
 <p>
 Table 2 shows the Android-specific interfaces and data locators that Android OpenSL ES supports
-for each object type. Green cells indicate interfaces and data locators available for each
-object type.
+for each object type. The <em>Yes</em> values in the cells indicate the interfaces and data
+locators that are available for each object type.
 </p>
 
 <p class="table-caption" id="Android-extensions">
@@ -523,7 +817,7 @@
   </tr>
 </table>
 
-<h3>Android configuration interface</h3>
+<h3 id="configuration-interface">Android configuration interface</h3>
 
 <p>
 The Android configuration interface provides a means to set
@@ -581,6 +875,11 @@
 that they provide.
 </p>
 
+<p>
+Portable applications should use the OpenSL ES 1.0.1 APIs for audio effects instead of the Android
+effect extensions.
+</p>
+
 <h3>Android file descriptor data locator</h3>
 
 <p>
@@ -597,9 +896,9 @@
 <p>
 The Android simple buffer queue data locator and interface are
 identical to those in the OpenSL ES 1.0.1 reference specification, with two exceptions: You
-can also use Android simple buffer queues with both audio players and audio recorders.  Also, PCM
+can also use Android simple buffer queues with both audio players and audio recorders. Also, PCM
 is the only data format you can use with these queues.
-In the reference specification, buffer queues are for audio players only, but
+In the reference specification, buffer queues are for audio players only, but they are
 compatible with data formats beyond PCM.
 </p>
 <p>
@@ -613,7 +912,7 @@
 buffer queues instead of OpenSL ES 1.0.1 buffer queues.
 </p>
 
-<h3>Dynamic interfaces at object creation</h3>
+<h3 id="dynamic-interfaces">Dynamic interfaces at object creation</h3>
 
 <p>
 For convenience, the Android implementation of OpenSL ES 1.0.1
@@ -622,7 +921,7 @@
 to add these interfaces after instantiation.
 </p>
 
-<h3>Buffer queue behavior</h3>
+<h3 id="bq-behavior">Buffer queue behavior</h3>
 
 <p>
 The Android implementation does not include the
@@ -641,7 +940,7 @@
 <p>
 Similarly, there is no specification governing whether the trigger for a buffer queue callback must
 be a transition to <code>SL_PLAYSTATE_STOPPED</code> or execution of
-<code>BufferQueue::Clear()</code>. Therefore, we recommend against creating a dependency on
+<code>BufferQueue::Clear()</code>. Therefore, we recommend that you do not create a dependency on
 one or the other; instead, your app should be able to handle both.
 </p>
 
@@ -679,27 +978,34 @@
 </tr>
 <tr>
   <td>13 and below</td>
-  <td>An open-source codec with a suitable license.</td>
+  <td>An open-source codec with a suitable license</td>
 </tr>
 <tr>
   <td>14 to 15</td>
-  <td>An open-source codec with a suitable license.</td>
+  <td>An open-source codec with a suitable license</td>
 </tr>
 <tr>
   <td>16 to 20</td>
   <td>
-    The {@link android.media.MediaCodec} class or an open-source codec with a suitable license.
+    The {@link android.media.MediaCodec} class or an open-source codec with a suitable license
   </td>
 </tr>
 <tr>
   <td>21 and above</td>
   <td>
     NDK MediaCodec in the {@code &lt;media/NdkMedia*.h&gt;} header files, the
-    {@link android.media.MediaCodec} class, or an open-source codec with a suitable license.
+    {@link android.media.MediaCodec} class, or an open-source codec with a suitable license
   </td>
 </tr>
 </table>
 
+<p class="note"><strong>Note: </strong>
+There is currently no documentation for the NDK version of the {@code MediaCodec} API. However,
+you can refer to the
+<a class="external-link" href="https://github.com/googlesamples/android-ndk/tree/master/native-codec">
+native-codec</a> sample code for an example.
+</p>
+
 <p>
 A standard audio player plays back to an audio device, specifying the output mix as the data sink.
 The Android extension differs in that an audio player instead
@@ -710,17 +1016,18 @@
 
 <p>
 This feature is primarily intended for games to pre-load their audio assets when changing to a
-new game level, similar to the functionality that the {@link android.media.SoundPool} class
-provides.
+new game level, which is similar to the functionality that the {@link android.media.SoundPool}
+class provides.
 </p>
 
 <p>
 The application should initially enqueue a set of empty buffers in the Android simple
-buffer queue. After that, the app fills the buffers with with PCM data. The Android simple
+buffer queue. After that, the app fills the buffers with PCM data. The Android simple
 buffer queue callback fires after each buffer is filled. The callback handler processes
 the PCM data, re-enqueues the now-empty buffer, and then returns. The application is responsible for
 keeping track of decoded buffers; the callback parameter list does not include
-sufficient information to indicate which buffer contains data or which buffer to enqueue next.
+sufficient information to indicate the buffer that contains data or the buffer that should be
+enqueued next.
 </p>
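<p>
The buffer bookkeeping described above can be sketched as a simple ring index. The buffer count
and size here are hypothetical, and the callback wiring is omitted; the point is only that the app,
not the callback parameters, tracks which buffer just completed.
</p>

```c
/* The buffer queue callback does not identify which buffer completed, so
 * the app keeps its own ring index. NUM_BUFFERS and the 512-frame size
 * are hypothetical values for illustration. */
#define NUM_BUFFERS 4

short pcm_buffers[NUM_BUFFERS][512];
int next_buffer;  /* index of the buffer the next callback corresponds to */

short *on_buffer_filled(void)
{
    short *filled = pcm_buffers[next_buffer];
    next_buffer = (next_buffer + 1) % NUM_BUFFERS;  /* advance the ring */
    return filled;  /* process this PCM data, then re-enqueue the buffer */
}
```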
 
 <p>
@@ -753,8 +1060,8 @@
 To decode an encoded stream to PCM but not play back immediately, for apps running on
 Android 4.x (API levels 16&ndash;20), we recommend using the {@link android.media.MediaCodec} class.
 For new applications running on Android 5.0 (API level 21) or higher, we recommend using the NDK
-equivalent, {@code &lt;NdkMedia*.h&gt;}. These header files reside under
-the {@code media/} directory, under your installation root.
+equivalent, {@code &lt;NdkMedia*.h&gt;}. These header files reside in
+the {@code media/} directory under your installation root.
 </p>
 
 <h3>Decode streaming ADTS AAC to PCM</h3>
@@ -796,7 +1103,7 @@
 The Android buffer queue callback fires after each buffer is emptied.
 The callback handler should refill and re-enqueue the buffer, and then return.
 The application need not keep track of encoded buffers; the callback parameter
-list includes sufficient information to indicate which buffer to enqueue next.
+list includes sufficient information to indicate the buffer that should be enqueued next.
 The end of stream is explicitly marked by enqueuing an EOS item.
 After EOS, no more enqueues are permitted.
 </p>
@@ -812,6 +1119,7 @@
 In all respects except for the data source, the streaming decode method is the same as
 the one that <a href="#da">Decode audio to PCM</a> describes.
 </p>
+
 <p>
 Despite the similarity in names, an Android buffer queue is <em>not</em>
 the same as an <a href="#simple">Android simple buffer queue</a>. The streaming decoder
@@ -840,8 +1148,8 @@
 practice is to query for the key indices in the main thread after calling the {@code
 Object::Realize} method, and to read the PCM format metadata values in the Android simple
 buffer queue callback handler when calling it for the first time. Consult the
-<a href="https://github.com/googlesamples/android-ndk">example code in the NDK package</a>
-for examples of working with this interface.
+<a class="external-link" href="https://github.com/googlesamples/android-ndk">example code in the
+NDK package</a> for examples of working with this interface.
 </p>
 
 <p>
@@ -879,3 +1187,25 @@
 audiosrc.pLocator = ...
 audiosrc.pFormat = &amp;pcm;
 </pre>
+
+<h2 id="notes">Programming Notes</h2>
+<p><a href="{@docRoot}ndk/guides/audio/opensl-prog-notes.html">OpenSL ES Programming Notes</a>
+ provides supplemental information to ensure proper implementation of OpenSL ES.</p>
+<p class="note"><strong>Note: </strong>
+For your convenience, we have included a copy of the OpenSL ES 1.0.1 specification with the NDK in
+{@code docs/opensles/OpenSL_ES_Specification_1.0.1.pdf}.
+</p>
+
+<h2 id="platform-issues">Platform Issues</h2>
+
+<p>
+This section describes known issues in the initial platform release that supports these APIs.
+</p>
+
+<h3>Dynamic interface management</h3>
+
+<p>
+{@code DynamicInterfaceManagement::AddInterface} does not work. Instead, specify the interface in
+the array that is passed to Create, as shown in the example code for environmental reverb.
+</p>
+
diff --git a/docs/html/ndk/guides/audio/opensl-prog-notes.jd b/docs/html/ndk/guides/audio/opensl-prog-notes.jd
new file mode 100644
index 0000000..3263145
--- /dev/null
+++ b/docs/html/ndk/guides/audio/opensl-prog-notes.jd
@@ -0,0 +1,461 @@
+page.title=OpenSL ES Programming Notes
+@jd:body
+
+<div id="qv-wrapper">
+    <div id="qv">
+      <h2>On this page</h2>
+
+      <ol>
+        <li><a href="#init">Objects and Interface Initialization</a></li>
+        <li><a href="#prefetch">Audio Player Prefetch</a></li>
+        <li><a href="#destroy">Destroy</a></li>
+        <li><a href="#panning">Stereo Panning</a></li>
+        <li><a href="#callbacks">Callbacks and Threads</a></li>
+        <li><a href="#perform">Performance</a></li>
+        <li><a href="#sandp">Security and Permissions</a></li>
+      </ol>
+    </div>
+</div>
+
+<p>
+The notes in this section supplement the
+<a class="external-link" href="https://www.khronos.org/registry/sles/">OpenSL ES 1.0.1
+specification</a>.
+</p>
+
+<h2 id="init">Objects and Interface Initialization</h2>
+
+<p>
+Two aspects of the OpenSL ES programming model that may be unfamiliar to new developers are the
+distinction between objects and interfaces, and the initialization sequence.
+</p>
+
+<p>
+Briefly, an OpenSL ES object is similar to the object concept in
+ programming languages such as Java
+and C++, except that an OpenSL ES object is visible only via its associated interfaces.
+ This includes
+the initial interface for all objects, called {@code SLObjectItf}.
+ There is no handle for an object
+itself, only a handle to the {@code SLObjectItf} interface of the object.
+</p>
+
+<p>
+An OpenSL ES object is first <em>created</em>, which returns an {@code SLObjectItf}, then
+<em>realized</em>. This is similar to the common programming pattern of first constructing an
+object (which should never fail other than for lack of memory or invalid parameters), and then
+completing initialization (which may fail due to lack of resources). The realize step gives the
+implementation a logical place to allocate additional resources if needed.
+</p>
+
+<p>
+As part of the API to create an object, an application specifies an array of desired interfaces
+that it plans to acquire later. Note that this array does not automatically
+ acquire the interfaces;
+it merely indicates a future intention to acquire them. Interfaces are distinguished as
+<em>implicit</em> or <em>explicit</em>. An explicit interface must be listed in the array if it
+will be acquired later. An implicit interface need not be listed in the
+ object create array, but
+there is no harm in listing it there. OpenSL ES has one more kind of interface called
+<em>dynamic</em>, which does not need to be specified in the object
+ create array and can be added
+later after the object is created. The Android implementation provides
+ a convenience feature to
+avoid this complexity; see the
+ <a href="{@docRoot}ndk/guides/audio/opensl-for-android.html#dynamic-interfaces">Dynamic
+ interfaces at object creation</a> section of OpenSL ES for Android.
+</p>
+
+<p>
+After the object is created and realized, the application should acquire interfaces for each
+feature it needs, using {@code GetInterface} on the initial {@code SLObjectItf}.
+</p>
+
+<p>
+Finally, the object is available for use via its interfaces, though note that
+ some objects require
+further setup. In particular, an audio player with URI data source needs a bit
+ more preparation in
+order to detect connection errors. See the
+ <a href="#prefetch">Audio player prefetch</a> section for details.
+</p>
+
+<p>
+After your application is done with the object, you should explicitly destroy it; see the
+<a href="#destroy">Destroy</a> section below.
+</p>
+
+<h2 id="prefetch">Audio Player Prefetch</h2>
+
+<p>
+For an audio player with URI data source, {@code Object::Realize} allocates
+ resources but does not
+connect to the data source (<em>prepare</em>) or begin pre-fetching data. These occur once the
+player state is set to either {@code SL_PLAYSTATE_PAUSED} or {@code SL_PLAYSTATE_PLAYING}.
+</p>
+
+<p>
+Some information may still be unknown until relatively late in this sequence. In
+particular, initially {@code Player::GetDuration} returns {@code SL_TIME_UNKNOWN} and
+{@code MuteSolo::GetChannelCount} either returns successfully with channel count zero or the
+error result {@code SL_RESULT_PRECONDITIONS_VIOLATED}. These APIs return the proper values
+once they are known.
+</p>
+
+<p>
+Other properties that are initially unknown include the sample rate and
+ actual media content type
+based on examining the content's header (as opposed to the
+ application-specified MIME type and
+container type). These are also determined later during
+ prepare/prefetch, but there are no APIs to
+retrieve them.
+</p>
+
+<p>
+The prefetch status interface is useful for detecting when all information
+ is available, or your
+application can poll periodically. Note that some information, such as the
+ duration of a streaming
+MP3, may <em>never</em> be known.
+</p>
+
+<p>
+The prefetch status interface is also useful for detecting errors. Register a callback
+ and enable
+at least the {@code SL_PREFETCHEVENT_FILLLEVELCHANGE} and {@code SL_PREFETCHEVENT_STATUSCHANGE}
+events. If both of these events are delivered simultaneously, and
+{@code PrefetchStatus::GetFillLevel} reports a zero level, and
+{@code PrefetchStatus::GetPrefetchStatus} reports {@code SL_PREFETCHSTATUS_UNDERFLOW},
+ then this
+indicates a non-recoverable error in the data source. This includes the inability to
+ connect to the
+data source because the local filename does not exist or the network URI is invalid.
+</p>
+
+<p>
+The next version of OpenSL ES is expected to add more explicit support for
+ handling errors in the
+data source. However, for future binary compatibility, we intend to continue
+ to support the current
+method for reporting a non-recoverable error.
+</p>
+
+<p>
+In summary, a recommended code sequence is:
+</p>
+
+<ol>
+  <li>{@code Engine::CreateAudioPlayer}</li>
+  <li>{@code Object::Realize}</li>
+  <li>{@code Object::GetInterface} for {@code SL_IID_PREFETCHSTATUS}</li>
+  <li>{@code PrefetchStatus::SetCallbackEventsMask}</li>
+  <li>{@code PrefetchStatus::SetFillUpdatePeriod}</li>
+  <li>{@code PrefetchStatus::RegisterCallback}</li>
+  <li>{@code Object::GetInterface} for {@code SL_IID_PLAY}</li>
+  <li>{@code Play::SetPlayState} to {@code SL_PLAYSTATE_PAUSED}, or
+  {@code SL_PLAYSTATE_PLAYING}</li>
+</ol>
+
+<p class="note"><strong>Note: </strong>
+Preparation and prefetching occur here; during this time your callback is called with
+periodic status updates.
+</p>
+
+<h2 id="destroy">Destroy</h2>
+
+<p>
+Be sure to destroy all objects when exiting from your application.
+ Objects should be destroyed in
+reverse order of their creation, as it is not safe to destroy an object that has any dependent
+objects. For example, destroy in this order: audio players and recorders, output mix, and then
+finally the engine.
+</p>
+
+<p>
+OpenSL ES does not support automatic garbage collection or
+<a class="external-link" href="http://en.wikipedia.org/wiki/Reference_counting">reference
+counting</a> of interfaces. After you call {@code Object::Destroy}, all extant
+ interfaces that are
+derived from the associated object become undefined.
+</p>
+
+<p>
+The Android OpenSL ES implementation does not detect the incorrect use of such interfaces.
+Continuing to use such interfaces after the object is destroyed can cause your application to
+crash or behave in unpredictable ways.
+</p>
+
+<p>
+We recommend that you explicitly set both the primary object interface and all associated
+interfaces to NULL as part of your object destruction sequence, which prevents the accidental
+misuse of a stale interface handle.
+</p>
+
+<h2 id="panning">Stereo Panning</h2>
+
+<p>
+When {@code Volume::EnableStereoPosition} is used to enable stereo panning of a mono source,
+ there is a 3-dB reduction in total
+<a class="external-link" href="http://en.wikipedia.org/wiki/Sound_power_level">sound power
+level</a>. This is needed to permit the total sound power level to remain constant as
+ the source is
+panned from one channel to the other. Therefore, only enable stereo positioning if you need
+it. See the Wikipedia article on
+<a class="external-link" href="http://en.wikipedia.org/wiki/Panning_(audio)">audio panning</a>
+ for more information.
+</p>
+
+<h2 id="callbacks">Callbacks and Threads</h2>
+
+<p>
+Callback handlers are generally called synchronously with respect to the event; that is, at the
+moment and location where the implementation detects the event. This point is
+asynchronous with respect to the application, so you should use a non-blocking synchronization
+mechanism to control access to any variables shared between the application and the callback
+handler. In the example code, such as for buffer queues, we have either omitted this
+synchronization or used blocking synchronization in the interest of simplicity. However, proper
+non-blocking synchronization is critical for any production code.
+</p>
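<p>The non-blocking synchronization described above can be sketched with a
single-producer/single-consumer ring buffer: the callback handler writes samples, the application
thread reads them, and neither side ever blocks. This is an illustrative sketch in plain Java;
{@code SpscRingBuffer} is a hypothetical name, not an Android or OpenSL ES API:</p>

```java
import java.util.concurrent.atomic.AtomicInteger;

// Hypothetical sketch: a single-producer/single-consumer ring buffer that a
// buffer queue callback (producer) and an application thread (consumer) could
// use to exchange PCM samples without blocking.
final class SpscRingBuffer {
    private final short[] data;
    private final AtomicInteger writeIndex = new AtomicInteger(0);
    private final AtomicInteger readIndex = new AtomicInteger(0);

    SpscRingBuffer(int capacity) {
        data = new short[capacity]; // one slot is kept empty to distinguish full from empty
    }

    // Called by the producer thread only; returns false instead of blocking when full.
    boolean offer(short sample) {
        int w = writeIndex.get();
        int next = (w + 1) % data.length;
        if (next == readIndex.get()) {
            return false; // full: drop or count an overrun, never block
        }
        data[w] = sample;
        writeIndex.set(next); // volatile write publishes data[w] to the consumer
        return true;
    }

    // Called by the consumer thread only; returns null when the buffer is empty.
    Short poll() {
        int r = readIndex.get();
        if (r == writeIndex.get()) {
            return null; // empty
        }
        short sample = data[r];
        readIndex.set((r + 1) % data.length);
        return sample;
    }
}
```

<p>When the buffer is full, the producer drops data (or counts an overrun) instead of waiting,
which keeps the callback's execution time bounded.</p>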
+
+<p>
+Callback handlers are called from internal non-application threads that are not attached to the
+Android runtime, so they are ineligible to use JNI. Because these internal threads are
+critical to
+the integrity of the OpenSL ES implementation, a callback handler should also not block
+ or perform
+excessive work.
+</p>
+
+<p>
+If your callback handler needs to use JNI or execute work that is not proportional to the
+callback, the handler should instead post an event for another thread to process. Examples of
+acceptable callback workload include rendering and enqueuing the next output buffer
+(for an AudioPlayer), processing the just-filled input buffer and enqueueing the next
+ empty buffer
+(for an AudioRecorder), or simple APIs such as most of the <em>Get</em> family. See the
+<a href="#perform">Performance</a> section below regarding the workload.
+</p>
+
+<p>
+Note that the converse is safe: an Android application thread that has entered JNI
+ is allowed to
+directly call OpenSL ES APIs, including those that block. However, blocking calls are not
+recommended from the main thread, as they may result in
+ <em>Application Not Responding</em> (ANR).
+</p>
+
+<p>
+The determination regarding the thread that calls a callback handler is largely left up to the
+implementation. The reason for this flexibility is to permit future optimizations,
+ especially on
+multi-core devices.
+</p>
+
+<p>
+The thread on which the callback handler runs is not guaranteed to have the same
+ identity across
+different calls. Therefore, do not rely on the {@code pthread_t} returned by
+{@code pthread_self()} or the {@code pid_t} returned by {@code gettid()} to be consistent
+across calls. For the same reason,
+ do not use
+the thread local storage (TLS) APIs such as {@code pthread_setspecific()} and
+{@code pthread_getspecific()} from a callback.
+</p>
+
+<p>
+The implementation guarantees that concurrent callbacks of the same kind, for the same object,
+do not occur. However, concurrent callbacks of different kinds for the same object are possible on
+different threads.
+</p>
+
+<h2 id="perform">Performance</h2>
+
+<p>
+As OpenSL ES is a native C API, non-runtime application threads that call OpenSL ES have no
+runtime-related overhead such as garbage collection pauses. Aside from this, and one exception
+described below, the use of OpenSL ES provides no additional performance benefit. In particular,
+the use of OpenSL ES does not guarantee enhancements such as lower audio latency and higher
+scheduling priority over that which the platform generally provides. On the other hand, as the
+Android platform and specific device implementations continue to evolve, an OpenSL ES application
+can expect to benefit from any future system performance improvements.
+</p>
+
+<p>
+One such evolution is support for reduced
+<a href="{@docRoot}ndk/guides/audio/output-latency.html">audio output latency</a>.
+The underpinnings for reduced
+output latency were first included in Android 4.1 (API level 16), and then
+continued progress occurred in Android 4.2 (API level 17). These improvements are available via
+OpenSL ES for device implementations that
+ claim feature {@code android.hardware.audio.low_latency}.
+If the device doesn't claim this feature but supports Android 2.3 (API level 9)
+or later, then you can still use the OpenSL ES APIs but the output latency may be higher.
+ The lower
+output latency path is used only if the application requests a buffer size and sample rate
+ that are
+compatible with the device's native output configuration. These parameters are
+ device-specific and
+should be obtained as described below.
+</p>
+
+<p>
+Beginning with Android 4.2 (API level 17), an application can query for the
+platform native or optimal output sample rate and buffer size for the device's primary output
+stream. When combined with the feature test just mentioned, an app can now configure itself
+appropriately for lower latency output on devices that claim support.
+</p>
+
+<p>
+For Android 4.2 (API level 17) and earlier, a buffer count of two or more is
+required for lower latency. Beginning with Android 4.3 (API level 18), a buffer
+count of one is sufficient for lower latency.
+</p>
+
+<p>
+All OpenSL ES interfaces for output effects preclude the lower latency path.
+</p>
+
+<p>
+The recommended sequence is as follows:
+</p>
+
+<ol>
+  <li>Check for API level 9 or higher to confirm the use of OpenSL ES.</li>
+  <li>Check for the {@code android.hardware.audio.low_latency} feature using code such as this:
+    <pre>import android.content.pm.PackageManager;
+...
+PackageManager pm = getContext().getPackageManager();
+boolean claimsFeature = pm.hasSystemFeature(PackageManager.FEATURE_AUDIO_LOW_LATENCY);
+    </pre></li>
+  <li>Check for API level 17 or higher to confirm the use of
+  {@code android.media.AudioManager.getProperty()}.</li>
+  <li>Get the native or optimal output sample rate and buffer size for this device's
+  primary output
+  stream using code such as this:
+    <pre>import android.media.AudioManager;
+...
+AudioManager am = (AudioManager) getSystemService(Context.AUDIO_SERVICE);
+String sampleRate = am.getProperty(AudioManager.PROPERTY_OUTPUT_SAMPLE_RATE);
+String framesPerBuffer = am.getProperty(AudioManager.PROPERTY_OUTPUT_FRAMES_PER_BUFFER);
+    </pre>
+  Note that {@code sampleRate} and {@code framesPerBuffer} are <em>strings</em>. First check for
+  null and then convert to int using {@code Integer.parseInt()}.</li>
+    <li>Now use OpenSL ES to create an AudioPlayer with PCM buffer queue data locator.</li>
+</ol>
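<p>As a sketch of the null check and conversion mentioned in step 4, a small helper can
centralize the fallback. {@code AudioProps.parseOrDefault} is a hypothetical helper, not part of
the Android API, and the default values are arbitrary:</p>

```java
// Hypothetical helper: AudioManager.getProperty() returns a String that may be
// null on devices that do not report the property; fall back to a caller-chosen
// default in that case, or if the value does not parse as an integer.
final class AudioProps {
    static int parseOrDefault(String value, int defaultValue) {
        if (value == null) {
            return defaultValue;
        }
        try {
            return Integer.parseInt(value);
        } catch (NumberFormatException e) {
            return defaultValue;
        }
    }
}
```

<p>For example: {@code int sampleRate = AudioProps.parseOrDefault(am.getProperty(AudioManager.PROPERTY_OUTPUT_SAMPLE_RATE), 44100);}
where 44100 is an arbitrary fallback.</p>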
+
+<p class="note"><strong>Note: </strong>
+You can use the
+<a class="external-link"
+ href="https://play.google.com/store/apps/details?id=com.levien.audiobuffersize">
+ Audio Buffer Size</a>
+test app to determine the native buffer size and sample rate for OpenSL ES audio
+applications on your audio device. You can also visit GitHub to view <a class="external-link"
+href="https://github.com/gkasten/high-performance-audio/tree/master/audio-buffer-size">
+audio-buffer-size</a> samples.
+</p>
+
+<p>
+The number of lower latency audio players is limited. If your application requires more 
+than a few
+audio sources, consider mixing your audio at the application level. Be sure to destroy your audio
+players when your activity is paused, as they are a global resource shared with other apps.
+</p>
+
+<p>
+To avoid audible glitches, the buffer queue callback handler must execute within a small and
+predictable time window. This typically implies no unbounded blocking on mutexes, conditions,
+or I/O operations. Instead consider <em>try locks</em>, locks and waits with timeouts, and
+<a class="external-link"
+ href="https://source.android.com/devices/audio/avoiding_pi.html#nonBlockingAlgorithms">
+ non-blocking algorithms</a>.
+</p>
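<p>The try-lock idea above can be sketched in plain Java. {@code BoundedLockDemo} and its shared
counter are illustrative placeholders, and a timed wait would use
{@code tryLock(timeout, unit)} instead:</p>

```java
import java.util.concurrent.locks.ReentrantLock;

// Sketch of the try-lock pattern: the callback attempts the lock and skips the
// update instead of blocking when another thread holds it, so its execution
// time stays bounded.
final class BoundedLockDemo {
    private final ReentrantLock lock = new ReentrantLock();
    private int sharedCounter = 0;

    // Returns true if the shared state was updated, false if the lock was busy.
    boolean tryUpdateSharedState() {
        if (lock.tryLock()) {
            try {
                sharedCounter++; // touch shared state only while holding the lock
                return true;
            } finally {
                lock.unlock();
            }
        }
        return false; // contended: skip this update rather than risk a glitch
    }
}
```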
+
+<p>
+The computation required to render the next buffer (for AudioPlayer) or consume the previous
+buffer (for AudioRecord) should take approximately the same amount of time for each callback.
+Avoid algorithms that execute in a non-deterministic amount of time or are <em>bursty</em> in
+their computations. A callback computation is bursty if the CPU time spent in any given callback
+is significantly larger than the average. In summary, the ideal is for the CPU execution time of
+the handler to have variance near zero, and for the handler to not block for unbounded times.
+</p>
+
+<p>
+Lower latency audio is possible for these outputs only:
+</p>
+
+<ul>
+  <li>On-device speakers.</li>
+  <li>Wired headphones.</li>
+  <li>Wired headsets.</li>
+  <li>Line out.</li>
+  <li><a class="external-link" href="https://source.android.com/devices/audio/usb.html">
+  USB digital
+  audio</a>.</li>
+</ul>
+
+<p>
+On some devices, speaker latency is higher than other paths due to digital signal processing for
+speaker correction and protection.
+</p>
+
+<p>
+As of API level 21,
+<a href="{@docRoot}ndk/guides/audio/input-latency.html">lower latency audio input</a>
+ is supported
+on select devices. To take advantage of
+this feature, first confirm that lower latency output is available as described above. The
+capability for lower latency output is a prerequisite for the lower latency input feature. Then,
+create an AudioRecorder with the same sample rate and buffer size as would be used for output.
+OpenSL ES interfaces for input effects preclude the lower latency path. The record preset
+{@code SL_ANDROID_RECORDING_PRESET_VOICE_RECOGNITION} must be used for lower latency; this preset
+disables device-specific digital signal processing that may add latency to the input path. For
+more information on record presets, see the
+<a href="{@docRoot}ndk/guides/audio/opensl-for-android.html#configuration-interface">Android
+configuration interface</a> section of OpenSL ES for Android.
+</p>
+
+<p>
+For simultaneous input and output, separate buffer queue completion handlers are used for each
+side. There is no guarantee of the relative order of these callbacks, or the synchronization of
+the audio clocks, even when both sides use the same sample rate. Your application
+ should buffer the
+data with proper buffer synchronization.
+</p>
+
+<p>
+One consequence of potentially independent audio clocks is the need for asynchronous sample rate
+conversion. A simple (though not ideal for audio quality) technique for asynchronous sample rate
+conversion is to duplicate or drop samples as needed near a zero-crossing point.
+ More sophisticated
+conversions are possible.
+</p>
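<p>The duplicate-or-drop technique can be sketched as follows. This minimal version ignores the
zero-crossing refinement mentioned above and performs no interpolation or filtering, so it is
for illustration only; {@code NaiveResampler} is a hypothetical name:</p>

```java
// Minimal sketch of duplicate-or-drop sample rate conversion: walk the input
// at the ratio inRate/outRate, repeating or skipping samples as needed.
// Real converters interpolate and filter; this only illustrates the idea.
final class NaiveResampler {
    static short[] resample(short[] in, int inRate, int outRate) {
        int outLen = (int) ((long) in.length * outRate / inRate);
        short[] out = new short[outLen];
        for (int i = 0; i < outLen; i++) {
            // Nearest earlier input sample for this output position.
            int src = (int) ((long) i * inRate / outRate);
            out[i] = in[src];
        }
        return out;
    }
}
```

<p>Upsampling duplicates each input sample, and downsampling drops the samples in between;
the audible artifacts of this approach are why more sophisticated conversions are preferred.</p>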
+
+<h2 id="sandp">Security and Permissions</h2>
+
+<p>
+In terms of who can do what, Android enforces security at the process level. Code written in the
+Java programming language can do nothing more than native code can, and native code can do
+nothing more than Java programming language code can. The only differences between them are the
+APIs that are available.
+</p>
+
+<p>
+Applications using OpenSL ES must request the permissions that they would need for similar
+non-native APIs. For example, if your application records audio, then it needs the
+{@code android.permission.RECORD_AUDIO} permission. Applications that use audio effects need
+{@code android.permission.MODIFY_AUDIO_SETTINGS}. Applications that play network URI resources
+need {@code android.permission.INTERNET}. See
+<a href="https://developer.android.com/training/permissions/index.html">Working with System
+Permissions</a> for more information.
+</p>
+
+<p>
+Depending on the platform version and implementation, media content parsers and
+ software codecs may
+run within the context of the Android application that calls OpenSL ES (hardware codecs are
+abstracted but are device-dependent). Malformed content designed to exploit parser and codec
+vulnerabilities is a known attack vector. We recommend that you play media only from trustworthy
+sources or that you partition your application such that code that handles media from
+untrustworthy sources runs in a relatively <em>sandboxed</em> environment. For example, you could
+process media from untrustworthy sources in a separate process. Though both processes would still
+run under the same UID, this separation does make an attack more difficult.
+</p>
diff --git a/docs/html/ndk/guides/audio/output-latency.jd b/docs/html/ndk/guides/audio/output-latency.jd
new file mode 100644
index 0000000..4aa97a6
--- /dev/null
+++ b/docs/html/ndk/guides/audio/output-latency.jd
@@ -0,0 +1,310 @@
+page.title=Audio Output Latency
+@jd:body
+
+<div id="qv-wrapper">
+    <div id="qv">
+      <h2>On this page</h2>
+
+      <ol>
+        <li><a href="#prereq">Prerequisites</a></li>
+        <li><a href="#low-lat-track">Obtain a Low-Latency Track</a></li>
+        <li><a href="#buffer-size">Use the Optimal Buffer Size When Enqueuing Audio Data</a></li>
+        <li><a href="#warmup-lat">Avoid Warmup Latency</a></li>
+      </ol>
+      <h2>Also read</h2>
+
+      <ol>
+        <li><a href="https://source.android.com/devices/audio/latency_app.html" class="external-link">
+        Audio Latency for App Developers</a></li>
+        <li><a href="https://source.android.com/devices/audio/latency_contrib.html" class="external-link">
+        Contributors to Audio Latency</a></li>
+        <li><a href="https://source.android.com/devices/audio/latency_measure.html" class="external-link">
+        Measuring Audio Latency</a></li>
+        <li><a href="https://source.android.com/devices/audio/warmup.html" class="external-link">
+        Audio Warmup</a></li>
+        <li><a href="https://en.wikipedia.org/wiki/Latency_%28audio%29" class="external-link">
+        Latency (audio)</a></li>
+        <li><a href="https://en.wikipedia.org/wiki/Round-trip_delay_time" class="external-link">
+        Round-trip delay time</a></li>
+      </ol>
+    </div>
+  </div>
+
+<a href="https://www.youtube.com/watch?v=PnDK17zP9BI" class="notice-developers-video">
+<div>
+    <h3>Video</h3>
+    <p>Audio latency: buffer sizes</p>
+</div>
+</a>
+
+<a href="https://www.youtube.com/watch?v=92fgcUNCHic" class="notice-developers-video">
+<div>
+    <h3>Video</h3>
+    <p>Building great multi-media experiences on Android</p>
+</div>
+</a>
+
+<p>This page describes how to develop your audio app for low-latency output and how to avoid
+warmup latency.</p>
+
+<h2 id="prereq">Prerequisites</h2>
+
+<p>Low-latency audio is currently supported only when you use Android's implementation of the
+OpenSL ES™ API specification together with the Android NDK:
+</p>
+
+<ol>
+  <li>Download and install the <a href="{@docRoot}tools/sdk/ndk/index.html">Android NDK</a>.</li>
+  <li>Read the <a href="{@docRoot}ndk/guides/audio/opensl-for-android.html">OpenSL ES
+  documentation</a>.</li>
+</ol>
+
+<h2 id="low-lat-track">Obtain a Low-Latency Track</h2>
+
+<p>Latency is the time it takes for a signal to travel through a system. These are the common
+types of latency related to audio apps:</p>
+
+<ul>
+  <li><strong>Audio output latency</strong> is the time between an audio sample being generated by an
+app and the sample being played through the headphone jack or built-in speaker.</li>
+
+  <li><strong>Audio input latency</strong> is the time between an audio signal being received by a
+device’s audio input, such as the microphone, and that same audio data being available to an
+app.</li>
+
+  <li><strong>Round-trip latency</strong> is the sum of input latency, app processing time, and
+  output latency.</li>
+
+  <li><strong>Touch latency</strong> is the time between a user touching the screen and that
+touch event being received by an app.</li>
+</ul>
+
+<p>It is difficult to test audio output latency in isolation since it requires knowing exactly
+when the first sample is sent into the audio path (although this can be done using a
+<a href="https://source.android.com/devices/audio/testing_circuit.html" class="external-link">
+light testing circuit</a> and an oscilloscope). If you know the round-trip audio latency, you can
+use the rough rule of thumb: <strong>audio output latency is half the round-trip audio latency
+over paths without signal processing</strong>.
+</p>
+
+<p>To obtain the lowest latency, you must supply audio data that matches the device's optimal
+sample rate and buffer size. For more information, see
+<a href="https://source.android.com/devices/audio/latency_design.html" class="external-link">
+Design For Reduced Latency</a>.</p>
+
+<h3>Obtain the optimal sample rate</h3>
+
+<p>In Java, you can obtain the optimal sample rate from AudioManager as shown in the following
+code example:</p>
+
+<pre>
+AudioManager am = (AudioManager) getSystemService(Context.AUDIO_SERVICE);
+String sampleRateStr = am.getProperty(AudioManager.PROPERTY_OUTPUT_SAMPLE_RATE);
+// The property may be null on devices that do not report it
+int sampleRate = (sampleRateStr == null) ? 44100 : Integer.parseInt(sampleRateStr);
+if (sampleRate == 0) sampleRate = 44100; // Use a default value if property not found
+</pre>
+
+<p class="note">
+  <strong>Note:</strong> The sample rate refers to the rate of each stream. If your source audio
+  has two channels (stereo), then you will have one stream playing a pair of samples (frame) at
+  <a href="{@docRoot}reference/android/media/AudioManager.html#PROPERTY_OUTPUT_SAMPLE_RATE">
+  PROPERTY_OUTPUT_SAMPLE_RATE</a>.
+</p>
+
+<h3>Use the optimal sample rate when creating your audio player</h3>
+
+<p>Once you have the optimal output sample rate, you can supply it when creating your player
+using OpenSL ES:</p>
+
+<pre>
+// create buffer queue audio player
+void Java_com_example_audio_generatetone_MainActivity_createBufferQueueAudioPlayer
+        (JNIEnv* env, jclass clazz, jint sampleRate, jint framesPerBuffer)
+{
+   ...
+   // specify the audio source format
+   SLDataFormat_PCM format_pcm;
+   format_pcm.numChannels = 2;
+   format_pcm.samplesPerSec = (SLuint32) sampleRate * 1000;
+   ...
+}
+</pre>
+
+<p class="note">
+  <strong>Note:</strong> {@code samplesPerSec} refers to the <em>sample rate per channel in
+  millihertz</em> (1 Hz = 1000 mHz).
+</p>
+
+<h3>Avoid adding output interfaces that involve signal processing</h3>
+
+<p>Only these interfaces are supported by the fast mixer:</p>
+
+<ul>
+  <li>SL_IID_ANDROIDSIMPLEBUFFERQUEUE</li>
+  <li>SL_IID_VOLUME</li>
+  <li>SL_IID_MUTESOLO</li>
+</ul>
+
+<p>These interfaces are not allowed because they involve signal processing and cause
+your request for a fast track to be rejected:</p>
+
+<ul>
+  <li>SL_IID_BASSBOOST</li>
+  <li>SL_IID_EFFECTSEND</li>
+  <li>SL_IID_ENVIRONMENTALREVERB</li>
+  <li>SL_IID_EQUALIZER</li>
+  <li>SL_IID_PLAYBACKRATE</li>
+  <li>SL_IID_PRESETREVERB</li>
+  <li>SL_IID_VIRTUALIZER</li>
+  <li>SL_IID_ANDROIDEFFECT</li>
+  <li>SL_IID_ANDROIDEFFECTSEND</li>
+</ul>
+
+<p>When you create your player, make sure you only add <em>fast</em> interfaces, as shown in
+the following example:</p>
+
+<pre>
+const SLInterfaceID interface_ids[2] = { SL_IID_ANDROIDSIMPLEBUFFERQUEUE, SL_IID_VOLUME };
+</pre>
+
+<h3>Verify you're using a low-latency track</h3>
+
+<p>Complete these steps to verify that you have successfully obtained a low-latency track:</p>
+
+<ol>
+  <li>Launch your app and then run the following command:</li>
+
+<pre>
+adb shell ps | grep your_app_name
+</pre>
+
+  <li>Make a note of your app's process ID.</li>
+
+  <li>Now, play some audio from your app. You have approximately three seconds to run the
+following command from the terminal:</li>
+
+<pre>
+adb shell dumpsys media.audio_flinger
+</pre>
+
+  <li>Scan for your process ID. If you see an <em>F</em> in the <em>Name</em> column, it's on a
+low-latency track (the F stands for <em>fast track</em>).</li>
+
+</ol>
+
+<h3>Measure round-trip latency</h3>
+
+<p>You can measure round-trip audio latency by creating an app that generates an audio signal,
+listens for that signal, and measures the time between sending it and receiving it.
+Alternatively, you can install this
+<a href="https://play.google.com/store/apps/details?id=org.drrickorang.loopback" class="external-link">
+latency testing app</a>. The app performs a round-trip latency test using the
+<a href="https://source.android.com/devices/audio/latency_measure.html#larsenTest" class="external-link">
+Larsen test</a>. You can also
+<a href="https://github.com/gkasten/drrickorang/tree/master/LoopbackApp" class="external-link">
+view the source code</a> for the latency testing app.</p>
+
+<p>Since the lowest latency is achieved over audio paths with minimal signal processing, you may
+also want to use an
+<a href="https://source.android.com/devices/audio/latency_measure.html#loopback" class="external-link">
+Audio Loopback Dongle</a>, which allows the test to be run over the headset connector.</p>
+
+<p>The lowest possible round-trip audio latency varies greatly depending on device model and
+Android build. You can measure it yourself using the latency testing app and loopback
+dongle. When creating apps for <em>Nexus devices</em>, you can also use the
+<a href="https://source.android.com/devices/audio/latency_measurements.html" class="external-link">
+published measurements</a>.</p>
+
+<p>You can also get a rough idea of audio performance by testing whether the device reports
+support for the
+<a href="http://developer.android.com/reference/android/content/pm/PackageManager.html#FEATURE_AUDIO_LOW_LATENCY">
+low_latency</a> and
+<a href="http://developer.android.com/reference/android/content/pm/PackageManager.html#FEATURE_AUDIO_PRO">
+pro</a> hardware features.</p>
+
+<h3>Review the CDD and audio latency</h3>
+
+<p>The Android Compatibility Definition Document (CDD) enumerates the hardware and software
+requirements of a compatible Android device.
+See <a href="https://source.android.com/compatibility/" class="external-link">
+Android Compatibility</a> for more information on the overall compatibility program, and the
+<a href="https://static.googleusercontent.com/media/source.android.com/en//compatibility/android-cdd.pdf" class="external-link">
+CDD</a> for the document itself.</p>
+
+<p>In the CDD, round-trip latency is specified as 20&nbsp;ms or lower, even though musicians
+generally require 10&nbsp;ms, because important use cases become feasible at
+20&nbsp;ms.</p>
+
+<p>There is currently no API to determine audio latency over any path on an Android device at
+runtime. You can, however, use the following hardware feature flags to find out whether the
+device makes any guarantees for latency:</p>
+
+<ul>
+  <li><a href="http://developer.android.com/reference/android/content/pm/PackageManager.html#FEATURE_AUDIO_LOW_LATENCY">
+android.hardware.audio.low_latency</a> indicates a continuous output latency of 45&nbsp;ms or
+less.</li>
+
+  <li><a href="http://developer.android.com/reference/android/content/pm/PackageManager.html#FEATURE_AUDIO_PRO">
+android.hardware.audio.pro</a> indicates a continuous round-trip latency of 20&nbsp;ms or
+less.</li>
+</ul>
+
+<p>The criteria for reporting these flags are defined in the CDD in sections <em>5.6 Audio
+Latency</em> and <em>5.10 Professional Audio</em>.</p>
+
+<p>Here’s how to check for these features in Java:</p>
+
+<pre>
+boolean hasLowLatencyFeature =
+    getPackageManager().hasSystemFeature(PackageManager.FEATURE_AUDIO_LOW_LATENCY);
+
+boolean hasProFeature =
+    getPackageManager().hasSystemFeature(PackageManager.FEATURE_AUDIO_PRO);
+</pre>
+
+<p>The {@code android.hardware.audio.low_latency} feature is a prerequisite for
+{@code android.hardware.audio.pro}. A device can implement
+{@code android.hardware.audio.low_latency} without {@code android.hardware.audio.pro}, but not
+vice versa.</p>
+
+<h2 id="buffer-size">Use the Optimal Buffer Size When Enqueuing Audio Data</h2>
+
+<p>You can obtain the optimal buffer size in a similar way to the optimal sample rate, using the
+AudioManager API:</p>
+
+<pre>
+AudioManager am = (AudioManager) getSystemService(Context.AUDIO_SERVICE);
+String framesPerBuffer = am.getProperty(AudioManager.PROPERTY_OUTPUT_FRAMES_PER_BUFFER);
+// getProperty() can return null if the property is not supported
+int framesPerBufferInt =
+    (framesPerBuffer == null) ? 256 : Integer.parseInt(framesPerBuffer);
+if (framesPerBufferInt == 0) framesPerBufferInt = 256; // Use default
+</pre>
+
+<p>The
+<a href="{@docRoot}reference/android/media/AudioManager.html#PROPERTY_OUTPUT_FRAMES_PER_BUFFER">
+PROPERTY_OUTPUT_FRAMES_PER_BUFFER</a> property indicates the number of audio frames
+that the HAL (Hardware Abstraction Layer) buffer can hold. You should construct your audio
+buffers so that they contain an exact multiple of this number. If you use the correct number
+of audio frames, your callbacks occur at regular intervals, which reduces jitter.</p>
+
+<p>It is important to use the API to determine buffer size rather than using a hardcoded value,
+ because HAL buffer sizes differ across devices and across Android builds.</p>
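<p>As an illustration of sizing buffers as an exact multiple of the HAL buffer size, the
following minimal sketch rounds a requested buffer length up to the nearest whole number of
HAL buffers. The class and method names here are illustrative, not part of the Android API.</p>

```java
// Round a requested buffer size (in frames) up to an exact multiple of the
// HAL buffer size, so that callbacks occur at regular intervals and jitter
// is reduced. Helper name is hypothetical, for illustration only.
public class BufferSizer {
    static int roundToHalMultiple(int requestedFrames, int framesPerBurst) {
        if (framesPerBurst <= 0) framesPerBurst = 256; // fallback default
        // Integer ceiling division: number of whole HAL buffers needed
        int bursts = (requestedFrames + framesPerBurst - 1) / framesPerBurst;
        return Math.max(1, bursts) * framesPerBurst;
    }

    public static void main(String[] args) {
        // 1000 frames requested, HAL buffer of 192 frames -> 6 buffers = 1152
        System.out.println(roundToHalMultiple(1000, 192)); // prints 1152
    }
}
```

<p>You would pass the value obtained from
<code>PROPERTY_OUTPUT_FRAMES_PER_BUFFER</code> as the second argument.</p>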
+
+<h2 id="warmup-lat">Avoid Warmup Latency</h2>
+
+<p>When you enqueue audio data for the first time, it takes a small, but still significant,
+amount of time for the device audio circuit to warm up. To avoid this warmup latency, you should
+enqueue buffers of audio data containing silence, as shown in the following code example:</p>
+
+<pre>
+#define CHANNELS 1
+static short *silenceBuffer;
+int numSamples = frames * CHANNELS;
+silenceBuffer = malloc(sizeof(*silenceBuffer) * numSamples);
+for (int i = 0; i < numSamples; i++) {
+    silenceBuffer[i] = 0;
+}
+</pre>
+
+<p>At the point when audio should be produced, you can switch to enqueuing buffers containing
+real audio data.</p>
+
diff --git a/docs/html/ndk/guides/audio/sample-rates.jd b/docs/html/ndk/guides/audio/sample-rates.jd
new file mode 100644
index 0000000..da68597
--- /dev/null
+++ b/docs/html/ndk/guides/audio/sample-rates.jd
@@ -0,0 +1,151 @@
+page.title=Sample Rates
+@jd:body
+
+<div id="qv-wrapper">
+    <div id="qv">
+      <h2>On this page</h2>
+
+      <ol>
+        <li><a href="#best">Best Practices for Sampling and Resampling</a></li>
+        <li><a href="#info">For More Information</a></li>
+      </ol>
+    </div>
+  </div>
+
+<a class="notice-developers-video" href="https://www.youtube.com/watch?v=6Dl6BdrA-sQ">
+<div>
+    <h3>Video</h3>
+    <p>Sample Rates: Why Can't We All Just Agree?</p>
+</div>
+</a>
+
+<p>As of Android 5.0 (Lollipop), the audio resamplers are based entirely
+on FIR filters derived from a Kaiser windowed-sinc function. The Kaiser windowed-sinc
+offers the following properties:</p>
+<ul>
+    <li>Its design parameters (stopband ripple, transition bandwidth, cutoff
+    frequency, filter length) are straightforward to calculate.</li>
+<li>It is nearly optimal for reduction of stopband energy compared to overall
+energy.</li>
+</ul>
+<p>See P.P. Vaidyanathan, <a class="external-link"
+href="https://drive.google.com/file/d/0B7tBh7YQV0DGak9peDhwaUhqY2c/view">
+<i>Multirate Systems and Filter Banks</i></a>, p. 50 for a discussion of the
+Kaiser window, its optimality, and its relationship to prolate spheroidal
+windows.</p>
+
+<p>The design parameters are automatically computed based on internal
+quality determination and the sampling ratios desired. Based on the
+design parameters, the windowed-sinc filter is generated.  For music use,
+the resampler for 44.1 to 48 kHz and vice versa is generated at a higher
+quality than for arbitrary frequency conversion.</p>
+
+<p>The audio resamplers provide increased quality, along with the speed needed
+to achieve that quality. However, resamplers can introduce small amounts
+of passband ripple and aliasing harmonic noise, and they can cause some high-frequency
+loss in the transition band, so avoid using them unnecessarily.</p>
+
+<h2 id="best">Best Practices for Sampling and Resampling</h2>
+<p>This section describes some best practices to help you avoid problems with sampling rates.</p>
+<h3>Choose the sampling rate to fit the device</h3>
+
+<p>In general, it is best to choose the sampling rate to fit the device,
+typically 44.1 kHz or 48 kHz.  Use of a sample rate greater than
+48 kHz will typically result in decreased quality because a resampler must be
+used to play back the file.</p>
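<p>One way to match the device is to query its native output rate with the
<code>AudioManager.PROPERTY_OUTPUT_SAMPLE_RATE</code> property. The following sketch shows the
parsing and fallback logic; the class name and the 44100 fallback are illustrative choices,
not Android requirements.</p>

```java
// Sketch: choose a sample rate that matches the device's native output rate,
// falling back to 44100 Hz when the property is unavailable.
public class NativeRate {
    // On a device, the argument would come from:
    //   am.getProperty(AudioManager.PROPERTY_OUTPUT_SAMPLE_RATE)
    static int parseRate(String prop) {
        return (prop == null || prop.isEmpty()) ? 44100 : Integer.parseInt(prop);
    }

    public static void main(String[] args) {
        System.out.println(parseRate("48000")); // prints 48000
        System.out.println(parseRate(null));    // prints 44100 (fallback)
    }
}
```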
+
+<h3>Use simple resampling ratios (fixed versus interpolated polyphases)</h3>
+
+<p>The resampler operates in one of the following modes:</p>
+<ul>
+    <li>Fixed polyphase mode. The filter coefficients for each polyphase are precomputed.</li>
+    <li>Interpolated polyphase mode. The filter coefficients for each polyphase must
+be interpolated from the nearest two precomputed polyphases.</li>
+</ul>
+<p>The resampler is fastest in fixed polyphase mode, when the ratio of input
+rate to output rate, L/M (reduced by their greatest common divisor),
+has M less than 256.  For example, for 44,100 to 48,000 conversion, L = 147,
+M = 160.</p>
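<p>The reduced ratio can be computed with a standard greatest-common-divisor step, as in this
small sketch (the class is illustrative, not an Android API):</p>

```java
// Reduce the input/output rate ratio L/M by the greatest common divisor;
// fixed polyphase mode applies when the reduced M is less than 256.
public class ResampleRatio {
    static int gcd(int a, int b) { return b == 0 ? a : gcd(b, a % b); }

    public static void main(String[] args) {
        int in = 44100, out = 48000;
        int g = gcd(in, out);             // 300 for this pair
        int l = in / g, m = out / g;      // L = 147, M = 160
        System.out.println(l + "/" + m + " fixed=" + (m < 256)); // prints 147/160 fixed=true
    }
}
```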
+
+<p>In fixed polyphase mode, the sampling rate conversion is exact and does not
+drift, no matter how many samples are converted.  In interpolated polyphase
+mode, the sampling rate is approximate. The drift is generally on the
+order of one sample over a few hours of playback on a 48-kHz device.
+This is not usually a concern, because the approximation error is much less than
+the frequency error of internal quartz oscillators, thermal drift, or jitter
+(typically tens of ppm).</p>
+
+<p>Choose simple-ratio sampling rates such as 24 kHz (1:2) and 32 kHz (2:3) when playing back
+ on a 48-kHz device, even though other sampling
+rates and ratios may be permitted through AudioTrack.</p>
+
+<h3>Use upsampling rather than downsampling when changing sample rates</h3>
+
+<p>Sampling rates can be changed on the fly. The granularity of
+such change is based on the internal buffering (typically a few hundred
+samples), not on a sample-by-sample basis. This can be used for effects.</p>
+
+<p>Do not dynamically change sampling rates when
+downsampling. When changing sample rates after an audio track is
+created, differences of around 5 to 10 percent from the original rate may
+trigger a filter recomputation when downsampling (to properly suppress
+aliasing). This can consume computing resources and may cause an audible click
+if the filter is replaced in real time.</p>
+
+<h3>Limit downsampling to no more than 6:1</h3>
+
+<p>Downsampling is typically triggered by hardware device requirements. When the
+sample rate converter is used for downsampling,
+try to limit the downsampling ratio to no more than 6:1 for good aliasing
+suppression (for example, downsample no further than 48,000 to 8,000). The filter
+lengths adjust to match the downsampling ratio, but you sacrifice more
+transition bandwidth at higher downsampling ratios to avoid excessively
+increasing the filter length. There are no similar aliasing concerns for
+upsampling.  Note that some parts of the audio pipeline
+may prevent downsampling greater than 2:1.</p>
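<p>The 6:1 guideline can be expressed as a simple check, sketched below with an illustrative
helper (not an Android API):</p>

```java
// Check whether a downsampling ratio stays within the recommended 6:1 limit
// for good aliasing suppression.
public class DownsampleCheck {
    static boolean withinLimit(int inputRate, int outputRate) {
        // Only downsampling (input > output) is constrained; 6:1 is the guideline.
        return inputRate <= 6 * outputRate;
    }

    public static void main(String[] args) {
        System.out.println(withinLimit(48000, 8000)); // 6:1 exactly -> true
        System.out.println(withinLimit(48000, 7000)); // ~6.9:1 -> false
    }
}
```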
+
+<h3 id="latency">If you are concerned about latency, do not resample</h3>
+
+<p>Resampling prevents the track from being placed in the FastMixer
+path, which means that significantly higher latency occurs due to the additional,
+ larger buffer in the ordinary Mixer path. Furthermore,
+ there is an implicit delay from the filter length of the resampler,
+ though this is typically on the order of one millisecond or less,
+ which is not as large as the additional buffering for the ordinary Mixer path
+ (typically 20 milliseconds).</p>
+
+<h2 id="info">For More Information</h2>
+<p>This section lists some additional resources about sampling and resampling.</p>
+
+<h3>Sample rates</h3>
+
+<p>
+<a href="http://en.wikipedia.org/wiki/Sampling_%28signal_processing%29" class="external-link" >
+Sampling (signal processing)</a> at Wikipedia.</p>
+
+<h3>Resampling</h3>
+
+<p><a href="http://en.wikipedia.org/wiki/Sample_rate_conversion" class="external-link" >
+Sample rate conversion</a> at Wikipedia.</p>
+
+<p><a href="http://source.android.com/devices/audio/src.html" class="external-link" >
+Sample Rate Conversion</a> at source.android.com.</p>
+
+<h3>The high bit-depth and high kHz controversy</h3>
+
+<p><a href="http://people.xiph.org/~xiphmont/demo/neil-young.html" class="external-link" >
+24/192 Music Downloads ... and why they make no sense</a>
+by Christopher "Monty" Montgomery of Xiph.Org.</p>
+
+<p><a href="https://www.youtube.com/watch?v=cIQ9IXSUzuM" class="external-link" >
+D/A and A/D | Digital Show and Tell</a>
+video by Christopher "Monty" Montgomery of Xiph.Org.</p>
+
+<p><a href="http://www.trustmeimascientist.com/2013/02/04/the-science-of-sample-rates-when-higher-is-better-and-when-it-isnt/" class="external-link">
+The Science of Sample Rates (When Higher Is Better - And When It Isn't)</a>.</p>
+
+<p><a href="http://www.image-line.com/support/FLHelp/html/app_audio.htm" class="external-link" >
+Audio Myths &amp; DAW Wars</a></p>
+
+<p><a href="http://forums.stevehoffman.tv/threads/192khz-24bit-vs-96khz-24bit-debate-interesting-revelation.317660/" class="external-link">
+192kHz/24bit vs. 96kHz/24bit "debate"- Interesting revelation</a></p>
diff --git a/docs/html/ndk/guides/guides_toc.cs b/docs/html/ndk/guides/guides_toc.cs
index 98fc54d..09b2a12 100644
--- a/docs/html/ndk/guides/guides_toc.cs
+++ b/docs/html/ndk/guides/guides_toc.cs
@@ -63,13 +63,23 @@
       </ul>
    </li>
 
-      <li class="nav-section">
+   <li class="nav-section">
       <div class="nav-section-header"><a href="<?cs var:toroot ?>ndk/guides/audio/index.html">
       <span class="en">Audio</span></a></div>
       <ul>
       <li><a href="<?cs var:toroot ?>ndk/guides/audio/basics.html">Basics</a></li>
       <li><a href="<?cs var:toroot ?>ndk/guides/audio/opensl-for-android.html">OpenSL ES for
       Android</a></li>
+      <li><a href="<?cs var:toroot ?>ndk/guides/audio/input-latency.html">Audio Input
+      Latency</a></li>
+      <li><a href="<?cs var:toroot ?>ndk/guides/audio/output-latency.html">Audio Output
+      Latency</a></li>
+      <li><a href="<?cs var:toroot ?>ndk/guides/audio/floating-point.html">Floating-Point
+      Audio</a></li>
+      <li><a href="<?cs var:toroot ?>ndk/guides/audio/sample-rates.html">Sample Rates
+      </a></li>
+      <li><a href="<?cs var:toroot ?>ndk/guides/audio/opensl-prog-notes.html">OpenSL ES Programming Notes
+      </a></li>
       </ul>
    </li>
 
diff --git a/docs/html/sdk/sdk_vars.cs b/docs/html/sdk/sdk_vars.cs
index 6e58ddd..f0f25b6 100644
--- a/docs/html/sdk/sdk_vars.cs
+++ b/docs/html/sdk/sdk_vars.cs
@@ -1,27 +1,27 @@
 <?cs
-set:studio.version='2.1.0.9' ?><?cs
-set:studio.release.date='April 26, 2016' ?><?cs
+set:studio.version='2.1.1.0' ?><?cs
+set:studio.release.date='May 11, 2016' ?><?cs
 
 
-set:studio.linux_bundle_download='android-studio-ide-143.2790544-linux.zip' ?><?cs
-set:studio.linux_bundle_bytes='298122012' ?><?cs
-set:studio.linux_bundle_checksum='45dad9b76ad0506c354483aaa67ea0e2468d03a5' ?><?cs
+set:studio.linux_bundle_download='android-studio-ide-143.2821654-linux.zip' ?><?cs
+set:studio.linux_bundle_bytes='298125051' ?><?cs
+set:studio.linux_bundle_checksum='55d69ad2da0068d818718b26ba43550fbcbeb7e9' ?><?cs
 
-set:studio.mac_bundle_download='android-studio-ide-143.2790544-mac.dmg' ?><?cs
-set:studio.mac_bundle_bytes='298589307' ?><?cs
-set:studio.mac_bundle_checksum='d667d93ae2e4e0f3fc1b95743329a46222dbf11d' ?><?cs
+set:studio.mac_bundle_download='android-studio-ide-143.2821654-mac.dmg' ?><?cs
+set:studio.mac_bundle_bytes='298597716' ?><?cs
+set:studio.mac_bundle_checksum='4a7ca7532a95c65ee59ed50193c0e976f0272472' ?><?cs
 
-set:studio.win_bundle_download='android-studio-ide-143.2790544-windows.zip' ?><?cs
-set:studio.win_bundle_bytes='300627540' ?><?cs
-set:studio.win_bundle_checksum='9689ba415e5f09e2dcf5263ea302e7b1d98a8fc6' ?><?cs
+set:studio.win_bundle_download='android-studio-ide-143.2821654-windows.zip' ?><?cs
+set:studio.win_bundle_bytes='300630577' ?><?cs
+set:studio.win_bundle_checksum='9bec4905e40f0ac16ac7fde63a50f3fbc1eec4d9' ?><?cs
 
-set:studio.win_bundle_exe_download='android-studio-bundle-143.2790544-windows.exe' ?><?cs
-set:studio.win_bundle_exe_bytes='1238568304' ?><?cs
-set:studio.win_bundle_exe_checksum='c6abe7980dbb7d1d9887f7341a2942c9e506f891' ?><?cs
+set:studio.win_bundle_exe_download='android-studio-bundle-143.2821654-windows.exe' ?><?cs
+set:studio.win_bundle_exe_bytes='1238569296' ?><?cs
+set:studio.win_bundle_exe_checksum='6f7fcdc30800bd8b3fbd5a14c2b9857243144650' ?><?cs
 
-set:studio.win_notools_exe_download='android-studio-ide-143.2790544-windows.exe' ?><?cs
-set:studio.win_notools_exe_bytes='283804056' ?><?cs
-set:studio.win_notools_exe_checksum='a2065ba737ddcfb96f4921fee6a038278f46d2a7' ?><?cs
+set:studio.win_notools_exe_download='android-studio-ide-143.2821654-windows.exe' ?><?cs
+set:studio.win_notools_exe_bytes='283805040' ?><?cs
+set:studio.win_notools_exe_checksum='d8cb3968814b6155f4effe727baf23b18b9f8360' ?><?cs
 
 
 
diff --git a/docs/html/tools/help/emulator.jd b/docs/html/tools/help/emulator.jd
index 08e3f6f..5ff3367 100644
--- a/docs/html/tools/help/emulator.jd
+++ b/docs/html/tools/help/emulator.jd
@@ -42,6 +42,7 @@
       </li>
       <li><a href="#console">Using the Emulator Console</a>
         <ol>
+          <li><a href="#console-session">Starting and Stopping a Console Session</a></li>
           <li><a href="#portredirection">Port Redirection</a></li>
           <li><a href="#geo">Geo Location Provider Emulation</a></li>
           <li><a href="#events">Hardware Events Emulation</a></li>
@@ -1440,32 +1441,106 @@
 
 <p>Each running emulator instance provides a console that lets you query and control the emulated
 device environment. For example, you can use the console to manage port redirection, network
-characteristics, and telephony events while your application is running on the emulator. To
-access the console and enter commands, use telnet to connect to the console's port number.</p>
+characteristics, and telephony events while your application is running on the emulator.</p>
 
-<p>To connect to the console of any running emulator instance at any time, use this command: </p>
+<h3 id="console-session">Starting and Stopping a Console Session</h3>
+<p>To access the console and enter commands, from a terminal window, use <code>telnet</code> to
+connect to the
+console port and provide your authentication token.</p>
 
-<pre>telnet localhost &lt;console-port&gt;</pre>
 
-<p>An emulator instance occupies a pair of adjacent ports: a console port and an  {@code adb} port.
-The port numbers differ by 1, with the  {@code adb} port having the higher port number. The console
-of the first emulator instance running on a given machine uses console port 5554 and  {@code adb}
+<p>To connect to the console of a running emulator instance:</p>
+
+<ol>
+<li>Open a terminal window and enter the following command: </li>
+
+<pre>telnet localhost <em>console-port</em></pre>
+
+<p>An emulator instance occupies a pair of adjacent ports: a console port and an <code>adb</code> port.
+The port numbers differ by 1, with the <code>adb</code> port having the higher port number. The console
+of the first emulator instance running on a particular machine uses console port 5554 and <code>adb</code>
 port 5555. Subsequent instances use port numbers increasing by two &mdash; for example, 5556/5557,
 5558/5559, and so on. Up to 16 concurrent emulator instances can run a console facility. </p>
 
-<p>To connect to the emulator console, you must specify a valid console port. If multiple emulator instances are running, you need to determine the console port of the emulator instance you want to connect to. You can find the instance's console port listed in the title of the instance window. For example, here's the window title for an instance whose console port is 5554:</p>
+<p>To connect to the emulator console, you must specify a valid console port. If multiple emulator
+  instances are running, you need to determine the console port of the emulator instance you want
+  to connect to. The emulator window title lists the console port number. For example, the
+  window title for an emulator using console port 5554
+  could be <code>5554:Nexus_5X_API_23</code>.</p>
 
-<p><code>Android Emulator (5554)</code></p>
+<p>Alternatively, you can use the <code>adb devices</code> command, which prints a list of
+  running emulator instances and their console port numbers. For more information, see
+  <a href="{@docRoot}tools/help/adb.html#devicestatus">Querying for Emulator/Device Instances</a>.
+</p>
 
-<p>Alternatively, you can use the <code>adb devices</code> command, which prints a list of running emulator instances and their console port numbers. For more information, see <a href="{@docRoot}tools/help/adb.html#devicestatus">Querying for Emulator/Device Instances</a> in the adb documentation.</p>
+<p class="note">Note: The emulator listens for connections on ports 5554 to 5587 and accepts
+  connections from localhost only.</p>
 
-<p class="note">Note: The emulator listens for connections on ports 5554-5587 and accepts connections only from localhost.</p>
+<li>After the console displays <code>OK</code>, enter the <code>auth
+<em>auth_token</em></code> command.</li>
 
-<p>Once you are connected to the console, you can then enter <code>help [command]</code> to see a list of console commands and learn about specific commands. </p>
+<p>Before you can enter console commands, the emulator console requires authentication.
+  <code><em>auth_token</em></code> must
+  match the contents of the <code>.emulator_console_auth_token</code> file in your home directory.
+</p>
 
-<p>To exit the console session, use <code>quit</code> or <code>exit</code>.</p>
+<p>If that file doesn't exist, the <code>telnet localhost <em>console-port</em></code>
+  command creates the file, which contains a randomly generated authentication token.</p>
 
-<p>The following sections below describe the major functional areas of the console.</p>
+<p>To disable authentication, delete the token from the
+  <code>.emulator_console_auth_token</code> file or create an empty file if it doesn't exist.</p>
+
+<li>After you're connected to the console, enter console commands.</li>
+
+<p>Enter <code>help</code> to see a list of console commands, or <code>help
+<em>command</em></code> to learn about a specific command.</p>
+
+<li>To exit the console session, enter <code>quit</code> or <code>exit</code>.</li>
+</ol>
+
+<p>Here's an example session:</p>
+
+<pre class="no-pretty-print">
+me-macbook$ <strong>telnet localhost 5554</strong>
+Trying ::1...
+telnet: connect to address ::1: Connection refused
+Trying 127.0.0.1...
+Connected to localhost.
+Escape character is '^]'.
+Android Console: Authentication required
+Android Console: type 'auth &lt;auth_token&gt;' to authenticate
+Android Console: you can find your &lt;auth_token&gt; in
+'/Users/me/.emulator_console_auth_token'
+OK
+<strong>auth 123456789ABCdefZ</strong>
+Android Console: type 'help' for a list of commands
+OK
+<strong>help</strong>
+Android console command help:
+
+    help|h|?         print a list of commands
+    crash            crash the emulator instance
+    kill             kill the emulator instance
+    quit|exit        quit control session
+    redir            manage port redirections
+    power            power related commands
+    event            simulate hardware events
+    avd              control virtual device execution
+    finger           manage emulator fingerprint
+    geo              Geo-location commands
+    sms              SMS related commands
+    cdma             CDMA related commands
+    gsm              GSM related commands
+    rotate           rotate the screen by 90 degrees
+
+try 'help &lt;command&gt;' for command-specific help
+OK
+<strong>exit</strong>
+Connection closed by foreign host.
+me-macbook$
+</pre>
+
+<p>The following sections describe the major functional areas of the console.</p>
 
 
 <h3 id="portredirection">Port Redirection</h3>
diff --git a/docs/html/tools/revisions/studio.jd b/docs/html/tools/revisions/studio.jd
index 5747f52..9c9ac44 100755
--- a/docs/html/tools/revisions/studio.jd
+++ b/docs/html/tools/revisions/studio.jd
@@ -53,6 +53,18 @@
 <div class="toggle-content open">
   <p><a href="#" onclick="return toggleContent(this)">
     <img src="{@docRoot}assets/images/styles/disclosure_up.png" class="toggle-content-img"
+      alt=""/>Android Studio v2.1.1</a> <em>(May 2016)</em>
+  </p>
+  <div class="toggle-content-toggleme">
+    <p>Security release update.</p>
+  </div>
+</div>
+
+
+
+<div class="toggle-content closed">
+  <p><a href="#" onclick="return toggleContent(this)">
+    <img src="{@docRoot}assets/images/styles/disclosure_down.png" class="toggle-content-img"
       alt=""/>Android Studio v2.1.0</a> <em>(April 2016)</em>
   </p>
   <div class="toggle-content-toggleme">
diff --git a/docs/html/tools/sdk/tools-notes.jd b/docs/html/tools/sdk/tools-notes.jd
index f72d3f3..ac1f4ce 100644
--- a/docs/html/tools/sdk/tools-notes.jd
+++ b/docs/html/tools/sdk/tools-notes.jd
@@ -25,9 +25,43 @@
 </p>
 
 <div class="toggle-content opened">
+  <p><a href="#" onclick="return toggleContent(this)">
+    <img src="/assets/images/styles/disclosure_up.png" class="toggle-content-img"
+      alt=""/>SDK Tools, Revision 25.1.6</a> <em>(May 2016)</em>
+  </p>
+
+  <div class="toggle-content-toggleme">
+
+    <dl>
+    <dt>Dependencies:</dt>
+
+    <dd>
+      <ul>
+        <li>Android SDK Platform-tools revision 23 or later.</li>
+      </ul>
+    </dd>
+
+    <dt>General Notes:</dt>
+    <dd>
+      <ul>
+        <li>The Android Emulator Console now requires
+        <a href="/tools/help/emulator.html#console-session">authentication</a>
+        before you can enter commands. Enter the <code>auth <em>auth_token</em></code> command after
+          you <code>telnet</code> to an emulator instance. <code><em>auth_token</em></code> must
+        match the contents of the <code>.emulator_console_auth_token</code> file in your
+        home directory.
+        </li>
+      </ul>
+    </dd>
+
+  </div>
+</div>
+
+
+<div class="toggle-content closed">
   <p>
     <a href="#" onclick="return toggleContent(this)"><img src=
-    "{@docRoot}assets/images/styles/disclosure_up.png" class=
+    "{@docRoot}assets/images/styles/disclosure_down.png" class=
     "toggle-content-img" alt="">SDK Tools, Revision 25.0.0</a>
     <em>(April 2016)</em>
   </p>
diff --git a/docs/html/tools/support-library/index.jd b/docs/html/tools/support-library/index.jd
index 64d43b8..dfa8863 100644
--- a/docs/html/tools/support-library/index.jd
+++ b/docs/html/tools/support-library/index.jd
@@ -185,6 +185,135 @@
 <p>This section provides details about the Support Library package releases.</p>
 
 <div class="toggle-content opened">
+  <p id="rev23-4-0">
+    <a href="#" onclick="return toggleContent(this)"><img src=
+    "{@docRoot}assets/images/styles/disclosure_up.png" class=
+    "toggle-content-img" alt="">Android Support Library, revision 23.4.0</a>
+    <em>(May 2016)</em>
+  </p>
+
+  <div class="toggle-content-toggleme">
+    <dl>
+      <dt>
+        Changes for <a href=
+        "{@docRoot}tools/support-library/features.html#v4">v4 Support
+        Library</a>:
+      </dt>
+
+      <dd>
+        <ul>
+          <li>Fixed issue where fragments were added in the wrong order.
+          (<a class="external-link" href=
+          "https://code.google.com/p/android/issues/detail?id=206901">Issue
+          206901</a>)
+          </li>
+
+          <li>Fixed issue where app bar wasn't drawn after being scrolled
+          offscreen. (<a class="external-link" href=
+          "https://code.google.com/p/android/issues/detail?id=178037">Issue
+          178037</a>)
+          </li>
+        </ul>
+      </dd>
+
+      <dt>
+        Changes for <a href=
+        "{@docRoot}tools/support-library/features.html#v7-appcompat">v7
+        appcompat library</a>:
+      </dt>
+
+      <dd>
+        <ul>
+          <li>Added <!-- TODO: Link to method -->
+             <code><a href=
+            "{@docRoot}reference/android/support/v7/app/AppCompatDelegate.html">
+            AppCompatDelegate</a>.setCompatVectorFromResourcesEnabled()</code>
+            method to re-enable usage of vector drawables in {@link
+            android.graphics.drawable.DrawableContainer} objects on devices
+            running Android 4.4 (API level 19) and lower. See <a href=
+            "https://medium.com/@chrisbanes/appcompat-v23-2-age-of-the-vectors-91cbafa87c88#.44uulkfal"
+            class="external-link">AppCompat v23.2 — Age of the vectors</a> for
+            more information.
+          </li>
+
+          <li>Fixed an issue in API 23 with <a href=
+          "{@docRoot}reference/android/support/v7/app/AppCompatDelegate.html#setDefaultNightMode(int)">
+            <code>AppCompatDelegate.setDefaultNightMode()</code></a> not
+            loading correct resources in API level 23. (<a class=
+            "external-link" href=
+            "https://code.google.com/p/android/issues/detail?id=206573">Issue
+            206573</a>)
+          </li>
+
+          <li>Fixed issue that could cause {@link
+          java.lang.NullPointerException}. (<a class="external-link" href=
+          "https://code.google.com/p/android/issues/detail?id=207638">Issue
+          207638</a>)
+          </li>
+        </ul>
+      </dd>
+
+      <dt>
+        Changes for <a href=
+        "{@docRoot}tools/support-library/features.html#design">Design Support
+        Library</a>:
+      </dt>
+
+      <dd>
+        <ul>
+          <li>Fixed an issue where {@link
+          android.support.design.widget.TextInputLayout} doesn't clear error
+          tint after {@link
+          android.support.design.widget.TextInputLayout#setErrorEnabled
+          setErrorEnabled(false)} on API level 21 - 22 (<a class=
+          "external-link" href=
+          "https://code.google.com/p/android/issues/detail?id=202829">Issue
+          202829</a>)
+          </li>
+
+          <li>Fixed an issue where {@link
+          android.support.design.widget.FloatingActionButton} does not return
+          when animations are disabled. (<a class="external-link" href=
+          "https://code.google.com/p/android/issues/detail?id=206416">Issue
+          206416</a>)
+          </li>
+
+          <li>Fixed issue in {@link android.support.design.widget.AppBarLayout}
+          snap functionality when used with <code>{@link
+                    android.support.design.R.id#scroll}|{@link
+                    android.support.design.R.id#enterAlways}|{@link
+                    android.support.design.R.id#enterAlwaysCollapsed}|{@link
+                    android.support.design.R.id#snap}</code> scroll flags.
+          (<a class="external-link" href=
+          "https://code.google.com/p/android/issues/detail?id=207398">Issue
+          207398</a>)
+          </li>
+        </ul>
+      </dd>
+
+      <dt>
+        Changes for <!-- TODO: Add link -->Vector Drawable library:
+      </dt>
+
+      <dd>
+        <ul>
+          <li>Fixed a bug where <!-- TODO: Javadoc link -->
+             <code>VectorDrawableCompat</code> does not render correctly in
+            {@link android.widget.TextView} on API level 23. (<a class=
+            "external-link" href=
+            "https://code.google.com/p/android/issues/detail?id=206227">Issue
+            206227</a>)
+          </li>
+        </ul>
+      </dd>
+    </dl>
+  </div>
+</div>
+
+<!-- end of collapsible section: 23.4.0 -->
+
+
+<div class="toggle-content closed">
   <p id="rev23-3-0">
     <a href="#" onclick="return toggleContent(this)"><img src=
     "{@docRoot}assets/images/styles/disclosure_up.png" class="toggle-content-img"