Merge "Docs: Vulkan updates for N, arch reorg" into nyc-dev
diff --git a/src/compatibility/source/android-cdd-cover.html b/src/compatibility/source/android-cdd-cover.html
index ee76ef8..12c0db0 100644
--- a/src/compatibility/source/android-cdd-cover.html
+++ b/src/compatibility/source/android-cdd-cover.html
@@ -1,6 +1,6 @@
<!DOCTYPE html>
<head>
-<title>Android 6.0 Compatibility Definition</title>
+<title>Android 7.0 Compatibility Definition</title>
<link rel="stylesheet" type="text/css" href="android-cdd-cover.css"/>
</head>
@@ -17,15 +17,16 @@
<tr>
<td>
-<img src="images/android-marshmallow-1.png" alt="Marshmallow logo" style="border-top: 5px solid orange; border-bottom: 5px solid orange"/>
+<img src="images/android-nougat-dark.png" alt="Nougat cover images"
+style="border-top: 5px solid orange; border-bottom: 5px solid orange"/>
</td>
</tr>
<tr>
<td>
-<p class="subtitle">Android 6.0</p>
-<p class="cover-text">Last updated: October 7th, 2015</p>
-<p class="cover-text">Copyright © 2015, Google Inc. All rights reserved.</p>
+<p class="subtitle">Android 7.0</p>
+<p class="cover-text">Last updated: July 8th, 2016</p>
+<p class="cover-text">Copyright © 2016, Google Inc. All rights reserved.</p>
<p class="cover-text"><a href="mailto:compatibility@android.com">compatibility@android.com</a></p>
</td>
</tr>
diff --git a/src/compatibility/source/images/android-nougat-dark.png b/src/compatibility/source/images/android-nougat-dark.png
new file mode 100644
index 0000000..31a76ed
--- /dev/null
+++ b/src/compatibility/source/images/android-nougat-dark.png
Binary files differ
diff --git a/src/compatibility/source/images/android-nougat-light.png b/src/compatibility/source/images/android-nougat-light.png
new file mode 100644
index 0000000..8cb7e43
--- /dev/null
+++ b/src/compatibility/source/images/android-nougat-light.png
Binary files differ
diff --git a/src/devices/audio/implement-policy.jd b/src/devices/audio/implement-policy.jd
new file mode 100644
index 0000000..ae6ede2
--- /dev/null
+++ b/src/devices/audio/implement-policy.jd
@@ -0,0 +1,446 @@
+page.title=Configuring Audio Policies
+@jd:body
+
+<!--
+ Copyright 2016 The Android Open Source Project
+
+ Licensed under the Apache License, Version 2.0 (the "License");
+ you may not use this file except in compliance with the License.
+ You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
+-->
+<div id="qv-wrapper">
+ <div id="qv">
+ <h2>In this document</h2>
+ <ol id="auto-toc">
+ </ol>
+ </div>
+</div>
+
+<p>Android 7.0 introduces a new audio policy configuration file format (XML) for
+describing your audio topology.</p>
+
+<p>Previous Android releases required using the
+<code>device/<company>/<device>/audio/audio_policy.conf</code> file
+to declare the audio devices present on your product (you can see an example of
+this file for the Galaxy Nexus audio hardware in
+<code>device/samsung/tuna/audio/audio_policy.conf</code>). However, .conf is a
+simple proprietary format that is too limited to describe complex topologies for
+applications such as televisions and automobiles.</p>
+
+<p>Android 7.0 deprecates the <code>audio_policy.conf</code> file and adds support
+for defining audio topology using an XML file format that is more
+human-readable, has a wide range of editing and parsing tools, and is flexible
+enough to describe complex audio topologies.</p>
+
+<p class="note".<strong>Note:</strong> Android 7.0 preserves support for using
+<code>audio_policy.conf</code>; this legacy format is used by default. To use
+the XML file format, include the build option <code>USE_XML_AUDIO_POLICY_CONF
+:= 1</code> in device makefile.</p>
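+
+<p>For example, to opt in to the XML format, you might set the build option in
+the device makefile as follows (the path shown is for illustration only):</p>
+<pre>
+# device/<company>/<device>/device.mk
+USE_XML_AUDIO_POLICY_CONF := 1
+</pre>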
+
+<h2 id=xml_advantages>Advantages of the XML format</h2>
+<p>As in the .conf file, the new XML file enables defining the number and types
+of output and input stream profiles, devices usable for playback and capture, and
+audio attributes. In addition, the XML format offers the following enhancements:
+</p>
+
+<ul>
+<li>Audio profiles are now structured similarly to HDMI Simple Audio Descriptors
+and enable a different set of sampling rates/channel masks for each audio
+format (see the sketch after this list).</li>
+<li>Explicit definitions of all possible connections between devices and
+streams. Previously, an implicit rule made it possible to interconnect all
+devices attached to the same HAL module, preventing the audio policy from
+controlling connections requested with audio patch APIs. In the XML format, the
+topology description now defines connection limitations.</li>
+<li>Support for <em>includes</em> avoids repeating standard A2DP, USB, or
+reroute submix definitions.</li>
+<li>Customizable volume curves. Previously, volume tables were hardcoded. In the
+XML format, volume tables are described and can be customized.</li>
+</ul>
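+
+<p>For example, a single HDMI device port might declare one profile per audio
+format. The following sketch is illustrative and is not taken verbatim from the
+AOSP template:</p>
+<pre>
+<devicePort tagName="HDMI" type="AUDIO_DEVICE_OUT_HDMI" role="sink">
+    <profile name="" format="AUDIO_FORMAT_PCM_16_BIT"
+             samplingRates="44100,48000" channelMasks="AUDIO_CHANNEL_OUT_STEREO"/>
+    <profile name="" format="AUDIO_FORMAT_AC3"
+             samplingRates="32000,44100,48000"
+             channelMasks="AUDIO_CHANNEL_OUT_5POINT1"/>
+</devicePort>
+</pre>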
+
+<p>The template at
+<code>frameworks/av/services/audiopolicy/config/audio_policy_configuration.xml</code>
+shows many of these features in use.</p>
+
+<h2 id=xml_file_format>File format and location</h2>
+<p>The new audio policy configuration file is
+<code>audio_policy_configuration.xml</code> and is located in
+<code>/system/etc</code>. To view a simple audio policy configuration in the new
+XML file format, see the example below.</p>
+
+<p>
+<div class="toggle-content closed">
+ <p><a href="#" onclick="return toggleContent(this)">
+ <img src="{@docRoot}assets/images/triangle-closed.png" class="toggle-content-img" />
+ <strong><span class="toggle-content-text">Show audio policy example</span>
+ <span class="toggle-content-text" style="display:none;">Hide audio policy
+ example</span></strong>
+ </a></p>
+
+ <div class="toggle-content-toggleme">
+<pre class="prettyprint">
+<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
+<audioPolicyConfiguration version="1.0" xmlns:xi="http://www.w3.org/2001/XInclude">
+ <globalConfiguration speaker_drc_enabled="true"/>
+ <modules>
+ <module name="primary" halVersion="3.0">
+ <attachedDevices>
+ <item>Speaker</item>
+ <item>Earpiece</item>
+ <item>Built-In Mic</item>
+ </attachedDevices>
+ <defaultOutputDevice>Speaker</defaultOutputDevice>
+ <mixPorts>
+ <mixPort name="primary output" role="source" flags="AUDIO_OUTPUT_FLAG_PRIMARY">
+ <profile name="" format="AUDIO_FORMAT_PCM_16_BIT"
+ samplingRates="48000" channelMasks="AUDIO_CHANNEL_OUT_STEREO"/>
+ </mixPort>
+ <mixPort name="primary input" role="sink">
+ <profile name="" format="AUDIO_FORMAT_PCM_16_BIT"
+ samplingRates="8000,16000,48000"
+ channelMasks="AUDIO_CHANNEL_IN_MONO"/>
+ </mixPort>
+ </mixPorts>
+ <devicePorts>
+ <devicePort tagName="Earpiece" type="AUDIO_DEVICE_OUT_EARPIECE" role="sink">
+ <profile name="" format="AUDIO_FORMAT_PCM_16_BIT"
+ samplingRates="48000" channelMasks="AUDIO_CHANNEL_IN_MONO"/>
+ </devicePort>
+ <devicePort tagName="Speaker" role="sink" type="AUDIO_DEVICE_OUT_SPEAKER" address="">
+ <profile name="" format="AUDIO_FORMAT_PCM_16_BIT"
+ samplingRates="48000" channelMasks="AUDIO_CHANNEL_OUT_STEREO"/>
+ </devicePort>
+ <devicePort tagName="Wired Headset" type="AUDIO_DEVICE_OUT_WIRED_HEADSET" role="sink">
+ <profile name="" format="AUDIO_FORMAT_PCM_16_BIT"
+ samplingRates="48000" channelMasks="AUDIO_CHANNEL_OUT_STEREO"/>
+ </devicePort>
+ <devicePort tagName="Built-In Mic" type="AUDIO_DEVICE_IN_BUILTIN_MIC" role="source">
+ <profile name="" format="AUDIO_FORMAT_PCM_16_BIT"
+ samplingRates="8000,16000,48000"
+ channelMasks="AUDIO_CHANNEL_IN_MONO"/>
+ </devicePort>
+ <devicePort tagName="Wired Headset Mic" type="AUDIO_DEVICE_IN_WIRED_HEADSET" role="source">
+ <profile name="" format="AUDIO_FORMAT_PCM_16_BIT"
+ samplingRates="8000,16000,48000"
+ channelMasks="AUDIO_CHANNEL_IN_MONO"/>
+ </devicePort>
+ </devicePorts>
+ <routes>
+ <route type="mix" sink="Earpiece" sources="primary output"/>
+ <route type="mix" sink="Speaker" sources="primary output"/>
+ <route type="mix" sink="Wired Headset" sources="primary output"/>
+ <route type="mix" sink="primary input" sources="Built-In Mic,Wired Headset Mic"/>
+ </routes>
+ </module>
+ <xi:include href="a2dp_audio_policy_configuration.xml"/>
+ </modules>
+
+ <xi:include href="audio_policy_volumes.xml"/>
+ <xi:include href="default_volume_tables.xml"/>
+</audioPolicyConfiguration>
+</pre></div></div>
+</p>
+
+<p>The top level structure contains modules that correspond to each audio HAL
+hardware module, where each module has a list of mix ports, device ports, and
+routes:</p>
+<ul>
+<li><strong>Mix ports</strong> describe the possible config profiles for streams
+that can be opened at the audio HAL for playback and capture.</li>
+<li><strong>Device ports</strong> describe the devices that can be attached with
+their type (and optionally address and audio properties, if relevant).</li>
+<li><strong>Routes</strong> (new) are now separated from the mix port
+descriptors, enabling description of routes from device to device or stream to
+device.</li>
+</ul>
+
+<p>Volume tables are simple lists of points defining the curve used to translate
+from a UI index to a volume in dB. A separate include file provides default
+curves, but each curve for a given use case and device category can be
+overridden.</p>
+
+<div class="toggle-content closed">
+ <p><a href="#" onclick="return toggleContent(this)">
+ <img src="{@docRoot}assets/images/triangle-closed.png" class="toggle-content-img" />
+ <strong><span class="toggle-content-text">Show volume table example</span>
+ <span class="toggle-content-text" style="display:none;">Hide volume table
+ example</span></strong>
+ </a></p>
+
+ <div class="toggle-content-toggleme">
+<p><pre>
+<?xml version="1.0" encoding="UTF-8"?>
+<volumes>
+ <reference name="FULL_SCALE_VOLUME_CURVE">
+ <point>0,0</point>
+ <point>100,0</point>
+ </reference>
+ <reference name="SILENT_VOLUME_CURVE">
+ <point>0,-9600</point>
+ <point>100,-9600</point>
+ </reference>
+ <reference name="DEFAULT_VOLUME_CURVE">
+ <point>1,-4950</point>
+ <point>33,-3350</point>
+ <point>66,-1700</point>
+ <point>100,0</point>
+ </reference>
+</volumes>
+</pre></p></div></div>
+
+<div class="toggle-content closed">
+ <p><a href="#" onclick="return toggleContent(this)">
+ <img src="{@docRoot}assets/images/triangle-closed.png" class="toggle-content-img" />
+ <strong><span class="toggle-content-text">Show volumes example</span>
+ <span class="toggle-content-text" style="display:none;">Hide volumes
+ example</span></strong>
+ </a></p>
+
+ <div class="toggle-content-toggleme">
+<p><pre>
+<?xml version="1.0" encoding="UTF-8"?>
+<volumes>
+ <volume stream="AUDIO_STREAM_VOICE_CALL" deviceCategory="DEVICE_CATEGORY_HEADSET" ref="DEFAULT_VOLUME_CURVE"/>
+ <volume stream="AUDIO_STREAM_VOICE_CALL" deviceCategory="DEVICE_CATEGORY_SPEAKER" ref="DEFAULT_VOLUME_CURVE"/>
+ <volume stream="AUDIO_STREAM_VOICE_CALL" deviceCategory="DEVICE_CATEGORY_EARPIECE" ref="DEFAULT_VOLUME_CURVE"/>
+ <volume stream="AUDIO_STREAM_VOICE_CALL" deviceCategory="DEVICE_CATEGORY_EXT_MEDIA" ref="DEFAULT_VOLUME_CURVE"/>
+
+ <volume stream="AUDIO_STREAM_SYSTEM" deviceCategory="DEVICE_CATEGORY_HEADSET" ref="DEFAULT_VOLUME_CURVE"/>
+ <volume stream="AUDIO_STREAM_SYSTEM" deviceCategory="DEVICE_CATEGORY_SPEAKER" ref="DEFAULT_VOLUME_CURVE"/>
+ <volume stream="AUDIO_STREAM_SYSTEM" deviceCategory="DEVICE_CATEGORY_EARPIECE" ref="DEFAULT_VOLUME_CURVE"/>
+ <volume stream="AUDIO_STREAM_SYSTEM" deviceCategory="DEVICE_CATEGORY_EXT_MEDIA" ref="DEFAULT_VOLUME_CURVE"/>
+
+ <volume stream="AUDIO_STREAM_RING" deviceCategory="DEVICE_CATEGORY_HEADSET" ref="DEFAULT_VOLUME_CURVE"/>
+ <volume stream="AUDIO_STREAM_RING" deviceCategory="DEVICE_CATEGORY_SPEAKER" ref="DEFAULT_VOLUME_CURVE"/>
+ <volume stream="AUDIO_STREAM_RING" deviceCategory="DEVICE_CATEGORY_EARPIECE" ref="DEFAULT_VOLUME_CURVE"/>
+ <volume stream="AUDIO_STREAM_RING" deviceCategory="DEVICE_CATEGORY_EXT_MEDIA"ref="DEFAULT_VOLUME_CURVE"/>
+
+ <volume stream="AUDIO_STREAM_MUSIC" deviceCategory="DEVICE_CATEGORY_HEADSET" ref="DEFAULT_VOLUME_CURVE"/>
+ <volume stream="AUDIO_STREAM_MUSIC" deviceCategory="DEVICE_CATEGORY_SPEAKER">
+ <point>1,-5500</point>
+ <point>20,-4300</point>
+ <point>86,-1200</point>
+ <point>100,0</point>
+ </volume>
+ <volume stream="AUDIO_STREAM_MUSIC" deviceCategory="DEVICE_CATEGORY_EARPIECE" ref="DEFAULT_VOLUME_CURVE"/>
+ <volume stream="AUDIO_STREAM_MUSIC" deviceCategory="DEVICE_CATEGORY_EXT_MEDIA" ref="DEFAULT_VOLUME_CURVE"/>
+
+ <volume stream="AUDIO_STREAM_ALARM" deviceCategory="DEVICE_CATEGORY_HEADSET" ref="DEFAULT_VOLUME_CURVE"/>
+ <volume stream="AUDIO_STREAM_ALARM" deviceCategory="DEVICE_CATEGORY_SPEAKER" ref="DEFAULT_VOLUME_CURVE"/>
+ <volume stream="AUDIO_STREAM_ALARM" deviceCategory="DEVICE_CATEGORY_EARPIECE" ref="DEFAULT_VOLUME_CURVE"/>
+ <volume stream="AUDIO_STREAM_ALARM" deviceCategory="DEVICE_CATEGORY_EXT_MEDIA" ref="DEFAULT_VOLUME_CURVE"/>
+
+ <volume stream="AUDIO_STREAM_NOTIFICATION" deviceCategory="DEVICE_CATEGORY_HEADSET" ref="DEFAULT_VOLUME_CURVE"/>
+ <volume stream="AUDIO_STREAM_NOTIFICATION" deviceCategory="DEVICE_CATEGORY_SPEAKER" ref="DEFAULT_VOLUME_CURVE"/>
+ <volume stream="AUDIO_STREAM_NOTIFICATION" deviceCategory="DEVICE_CATEGORY_EARPIECE" ref="DEFAULT_VOLUME_CURVE"/>
+ <volume stream="AUDIO_STREAM_NOTIFICATION" deviceCategory="DEVICE_CATEGORY_EXT_MEDIA" ref="DEFAULT_VOLUME_CURVE"/>
+
+ <volume stream="AUDIO_STREAM_BLUETOOTH_SCO" deviceCategory="DEVICE_CATEGORY_HEADSET" ref="DEFAULT_VOLUME_CURVE"/>
+ <volume stream="AUDIO_STREAM_BLUETOOTH_SCO" deviceCategory="DEVICE_CATEGORY_SPEAKER" ref="DEFAULT_VOLUME_CURVE"/>
+ <volume stream="AUDIO_STREAM_BLUETOOTH_SCO" deviceCategory="DEVICE_CATEGORY_EARPIECE" ref="DEFAULT_VOLUME_CURVE"/>
+ <volume stream="AUDIO_STREAM_BLUETOOTH_SCO" deviceCategory="DEVICE_CATEGORY_EXT_MEDIA" ref="DEFAULT_VOLUME_CURVE"/>
+
+ <volume stream="AUDIO_STREAM_ENFORCED_AUDIBLE" deviceCategory="DEVICE_CATEGORY_HEADSET" ref="DEFAULT_VOLUME_CURVE"/>
+ <volume stream="AUDIO_STREAM_ENFORCED_AUDIBLE" deviceCategory="DEVICE_CATEGORY_SPEAKER" ref="DEFAULT_VOLUME_CURVE"/>
+ <volume stream="AUDIO_STREAM_ENFORCED_AUDIBLE" deviceCategory="DEVICE_CATEGORY_EARPIECE" ref="DEFAULT_VOLUME_CURVE"/>
+ <volume stream="AUDIO_STREAM_ENFORCED_AUDIBLE" deviceCategory="DEVICE_CATEGORY_EXT_MEDIA" ref="DEFAULT_VOLUME_CURVE"/>
+
+ <volume stream="AUDIO_STREAM_DTMF" deviceCategory="DEVICE_CATEGORY_SPEAKER" ref="DEFAULT_VOLUME_CURVE"/>
+ <volume stream="AUDIO_STREAM_DTMF" deviceCategory="DEVICE_CATEGORY_SPEAKER" ref="DEFAULT_VOLUME_CURVE"/>
+ <volume stream="AUDIO_STREAM_DTMF" deviceCategory="DEVICE_CATEGORY_EARPIECE" ref="DEFAULT_VOLUME_CURVE"/>
+ <volume stream="AUDIO_STREAM_DTMF" deviceCategory="DEVICE_CATEGORY_EXT_MEDIA" ref="DEFAULT_VOLUME_CURVE"/>
+
+ <volume stream="AUDIO_STREAM_TTS" deviceCategory="DEVICE_CATEGORY_HEADSET" ref="SILENT_VOLUME_CURVE"/>
+ <volume stream="AUDIO_STREAM_TTS" deviceCategory="DEVICE_CATEGORY_SPEAKER" ref="FULL_SCALE_VOLUME_CURVE"/>
+ <volume stream="AUDIO_STREAM_TTS" deviceCategory="DEVICE_CATEGORY_EARPIECE" ref="SILENT_VOLUME_CURVE"/>
+ <volume stream="AUDIO_STREAM_TTS" deviceCategory="DEVICE_CATEGORY_EXT_MEDIA" ref="SILENT_VOLUME_CURVE"/>
+
+ <volume stream="AUDIO_STREAM_ACCESSIBILITY" deviceCategory="DEVICE_CATEGORY_HEADSET" ref="DEFAULT_VOLUME_CURVE"/>
+ <volume stream="AUDIO_STREAM_ACCESSIBILITY" deviceCategory="DEVICE_CATEGORY_SPEAKER" ref="DEFAULT_VOLUME_CURVE"/>
+ <volume stream="AUDIO_STREAM_ACCESSIBILITY" deviceCategory="DEVICE_CATEGORY_EARPIECE" ref="DEFAULT_VOLUME_CURVE"/>
+ <volume stream="AUDIO_STREAM_ACCESSIBILITY" deviceCategory="DEVICE_CATEGORY_EXT_MEDIA" ref="DEFAULT_VOLUME_CURVE"/>
+
+ <volume stream="AUDIO_STREAM_REROUTING" deviceCategory="DEVICE_CATEGORY_HEADSET" ref="FULL_SCALE_VOLUME_CURVE"/>
+ <volume stream="AUDIO_STREAM_REROUTING" deviceCategory="DEVICE_CATEGORY_SPEAKER" ref="FULL_SCALE_VOLUME_CURVE"/>
+ <volume stream="AUDIO_STREAM_REROUTING" deviceCategory="DEVICE_CATEGORY_EARPIECE" ref="FULL_SCALE_VOLUME_CURVE"/>
+ <volume stream="AUDIO_STREAM_REROUTING" deviceCategory="DEVICE_CATEGORY_EXT_MEDIA" ref="FULL_SCALE_VOLUME_CURVE"/>
+
+ <volume stream="AUDIO_STREAM_PATCH" deviceCategory="DEVICE_CATEGORY_HEADSET" ref="FULL_SCALE_VOLUME_CURVE"/>
+ <volume stream="AUDIO_STREAM_PATCH" deviceCategory="DEVICE_CATEGORY_SPEAKER" ref="FULL_SCALE_VOLUME_CURVE"/>
+ <volume stream="AUDIO_STREAM_PATCH" deviceCategory="DEVICE_CATEGORY_EARPIECE" ref="FULL_SCALE_VOLUME_CURVE"/>
+ <volume stream="AUDIO_STREAM_PATCH" deviceCategory="DEVICE_CATEGORY_EXT_MEDIA" ref="FULL_SCALE_VOLUME_CURVE"/>
+</volumes>
+</pre></p></div></div>
+
+<h2 id=file_inclusions>File inclusions</h2>
+<p>The XML Inclusions (XInclude) method can be used to include audio policy
+configuration information located in other XML files. All included files must
+follow the structure described above with the following restrictions:</p>
+<ul>
+<li>Files can contain only top-level elements.</li>
+<li>Files cannot contain XInclude elements.</li>
+</ul>
+<p>Use includes to avoid copying standard Android Open Source Project (AOSP)
+audio HAL module configuration information to all audio policy configuration
+files (which is prone to errors). A standard audio policy configuration XML file
+is provided for the following audio HALs:</p>
+<ul>
+<li><strong>A2DP:</strong> <code>a2dp_audio_policy_configuration.xml</code></li>
+<li><strong>Reroute submix:</strong> <code>rsubmix_audio_policy_configuration.xml</code></li>
+<li><strong>USB:</strong> <code>usb_audio_policy_configuration.xml</code></li>
+</ul>
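+
+<p>Per the restrictions above, an included file contains only a top-level
+element. The following sketch (not the verbatim AOSP file) shows the general
+shape of such a file for a USB module:</p>
+<pre>
+<!-- usb_audio_policy_configuration.xml (sketch) -->
+<module name="usb" halVersion="2.0">
+    <mixPorts>
+        <mixPort name="usb_accessory output" role="source">
+            <profile name="" format="AUDIO_FORMAT_PCM_16_BIT"
+                     samplingRates="44100" channelMasks="AUDIO_CHANNEL_OUT_STEREO"/>
+        </mixPort>
+    </mixPorts>
+    <devicePorts>
+        <devicePort tagName="USB Host Out" type="AUDIO_DEVICE_OUT_USB_ACCESSORY" role="sink">
+            <profile name="" format="AUDIO_FORMAT_PCM_16_BIT"
+                     samplingRates="44100" channelMasks="AUDIO_CHANNEL_OUT_STEREO"/>
+        </devicePort>
+    </devicePorts>
+    <routes>
+        <route type="mix" sink="USB Host Out" sources="usb_accessory output"/>
+    </routes>
+</module>
+</pre>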
+
+<h2 id=code_reorg>Audio policy code reorganization</h2>
+<p>Android 7.0 splits <code>AudioPolicyManager.cpp</code> into several modules
+to make it more maintainable and to highlight what is configurable. The new
+organization of <code>frameworks/av/services/audiopolicy</code> includes the
+following modules:</p>
+
+<table>
+<tr>
+<th>Module</th>
+<th>Description</th>
+</tr>
+
+<tr>
+<td><code>/managerdefault</code></td>
+<td>Includes the generic interfaces and behavior implementation common to all
+applications. Similar to <code>AudioPolicyManager.cpp</code> with engine
+functionality and common concepts abstracted away.</td>
+</tr>
+
+<tr>
+<td><code>/common</code></td>
+<td>Defines base classes (e.g., data structures for input/output audio stream
+profiles, audio device descriptors, audio patches, audio ports, etc.). Previously
+defined inside <code>AudioPolicyManager.cpp</code>.</td>
+</tr>
+
+<tr>
+<td><code>/engine</code></td>
+<td><p>Implements the rules that define which device and volumes should be used
+for a given use case. It implements a standard interface with the generic part,
+such as getting the appropriate device for a given playback or capture use case,
+or setting connected devices or external state (i.e., a call state or forced
+usage) that can alter the routing decision.</p>
+<p>Available in two versions, customized and default; use the build option
+<code>USE_CONFIGURABLE_AUDIO_POLICY</code> to select between them (see the
+snippet after this table).</p></td>
+</tr>
+
+<tr>
+<td><code>/engineconfigurable</code></td>
+<td>Policy engine implementation that relies on the parameter framework (see
+below), where configuration and the policy rules are defined by XML files.</td>
+</tr>
+
+<tr>
+<td><code>/enginedefault</code></td>
+<td>Policy engine implementation based on previous Android Audio Policy Manager
+implementations. This is the default and includes hardcoded rules that
+correspond to current Nexus and AOSP implementations.</td>
+</tr>
+
+<tr>
+<td><code>/service</code></td>
+<td>Includes binder interfaces and the threading and locking implementation,
+with interfaces to the rest of the framework.</td>
+</tr>
+
+</table>
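+
+<p>To select the configurable engine rather than the default, the build option
+mentioned above is set in the device makefile (the value shown is for
+illustration):</p>
+<pre>
+USE_CONFIGURABLE_AUDIO_POLICY := 1
+</pre>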
+
+<h2 id=policy_config>Configuration using parameter-framework</h2>
+<p>Android 7.0 reorganizes audio policy code to make it easier to understand and
+maintain while also supporting an audio policy defined entirely by configuration
+files. The reorganization and audio policy design are based on Intel's parameter
+framework, a plugin-based and rule-based framework for handling parameters.</p>
+
+<p>Using the new configurable audio policy enables vendors and OEMs to:</p>
+<ul>
+<li>Describe a system's structure and its parameters in XML.</li>
+<li>Write (in C++) or reuse a backend (plugin) for accessing described
+parameters.</li>
+<li>Define (in XML or in a domain-specific language) conditions/rules upon which
+a given parameter must take a given value.</li>
+</ul>
+
+<p>AOSP includes an example of an audio policy configuration file that uses the parameter-framework at
+<code>frameworks/av/services/audiopolicy/engineconfigurable/parameter-framework/example/Settings/PolicyConfigurableDomains.xml</code>. For
+details, refer to Intel documentation on the
+<a href="https://github.com/01org/parameter-framework">parameter-framework</a>
+and
+<a href="http://01org.github.io/parameter-framework/hosting/Android_M_Configurable_Audio_Policy.pdf">Android
+Configurable Audio Policy</a>.</p>
+
+<h2 id=policy_routing_apis>Audio policy routing APIs</h2>
+<p>Android 6.0 introduced a public Enumeration and Selection API that sits on
+top of the audio patch/audio port infrastructure and allows application
+developers to indicate a preference for a specific device output or input for
+connected audio records or tracks.</p>
+<p>In Android 7.0, the Enumeration and Selection API is verified by CTS tests
+and is extended to include routing for native C/C++ (OpenSL ES) audio streams.
+The routing of native streams continues to be done in Java, with the addition of
+an <code>AudioRouting</code> interface that supersedes, combines, and deprecates
+the explicit routing methods that were specific to <code>AudioTrack</code> and
+<code>AudioRecord</code> classes.</p>
+
+<p>For details on the Enumeration and Selection API, refer to
+<a href="https://developer.android.com/ndk/guides/audio/opensl-for-android.html?hl=fi#configuration-interface">Android
+configuration interfaces</a> and <code>OpenSLES_AndroidConfiguration.h</code>.
+For details on audio routing, refer to
+<a href="https://developer.android.com/reference/android/media/AudioRouting.html">AudioRouting</a>.
+</p>
+
+<h2 id=multichannel>Multi-channel support</h2>
+
+<p>If your hardware and driver support multichannel audio via HDMI, you can
+output the audio stream directly to the audio hardware (this bypasses the
+AudioFlinger mixer, so the stream is not downmixed to two channels). The audio HAL
+must expose whether an output stream profile supports multichannel audio
+capabilities. If the HAL exposes its capabilities, the default policy manager
+allows multichannel playback over HDMI. For implementation details, see
+<code>device/samsung/tuna/audio/audio_hw.c</code>.</p>
+
+<p>To specify that your product contains a multichannel audio output, edit the
+audio policy configuration file to describe the multichannel output for your
+product. The following example from a Galaxy Nexus shows a <em>dynamic</em>
+channel mask, which means the audio policy manager queries the actual channel
+masks supported by the HDMI sink after connection.</p>
+
+<pre>
+audio_hw_modules {
+ primary {
+ outputs {
+ ...
+ hdmi {
+ sampling_rates 44100|48000
+ channel_masks dynamic
+ formats AUDIO_FORMAT_PCM_16_BIT
+ devices AUDIO_DEVICE_OUT_AUX_DIGITAL
+ flags AUDIO_OUTPUT_FLAG_DIRECT
+ }
+ ...
+ }
+ ...
+ }
+ ...
+}
+</pre>
+
+<p>You can also specify a static channel mask such as
+<code>AUDIO_CHANNEL_OUT_5POINT1</code>. AudioFlinger's mixer downmixes the
+content to stereo automatically when sent to an audio device that does not
+support multichannel audio.</p>
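+
+<p>For example, the same output descriptor as above with a static 5.1 mask in
+place of the dynamic mask (illustrative only):</p>
+<pre>
+hdmi {
+    sampling_rates 44100|48000
+    channel_masks AUDIO_CHANNEL_OUT_5POINT1
+    formats AUDIO_FORMAT_PCM_16_BIT
+    devices AUDIO_DEVICE_OUT_AUX_DIGITAL
+    flags AUDIO_OUTPUT_FLAG_DIRECT
+}
+</pre>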
+
+<h2 id=codecs>Media codecs</h2>
+
+<p>Ensure the audio codecs your hardware and drivers support are properly
+declared for your product. For details, see
+<a href="{@docRoot}devices/media/index.html#expose">Exposing Codecs to the
+Framework</a>.</p>
diff --git a/src/devices/audio/implement-pre-processing.jd b/src/devices/audio/implement-pre-processing.jd
new file mode 100644
index 0000000..b7e39af
--- /dev/null
+++ b/src/devices/audio/implement-pre-processing.jd
@@ -0,0 +1,145 @@
+page.title=Configuring Pre-Processing Effects
+@jd:body
+
+<!--
+ Copyright 2016 The Android Open Source Project
+
+ Licensed under the Apache License, Version 2.0 (the "License");
+ you may not use this file except in compliance with the License.
+ You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
+-->
+<div id="qv-wrapper">
+ <div id="qv">
+ <h2>In this document</h2>
+ <ol id="auto-toc">
+ </ol>
+ </div>
+</div>
+
+<p>The Android platform provides audio effects on supported devices in the
+<a href="http://developer.android.com/reference/android/media/audiofx/package-summary.html">audiofx</a>
+package, which is available for developers to access. For example, the Nexus 10
+supports the following pre-processing effects:</p>
+
+<ul>
+<li>
+<a href="http://developer.android.com/reference/android/media/audiofx/AcousticEchoCanceler.html">Acoustic
+Echo Cancellation</a></li>
+<li>
+<a href="http://developer.android.com/reference/android/media/audiofx/AutomaticGainControl.html">Automatic Gain Control</a></li>
+<li>
+<a href="http://developer.android.com/reference/android/media/audiofx/NoiseSuppressor.html">Noise
+Suppression</a></li>
+</ul>
+
+<h2 id=audiosources>Pairing with AudioSources</h2>
+<p>Pre-processing effects are paired with the use case mode in which the
+pre-processing is requested. In Android app development, a use case is referred
+to as an <code>AudioSource</code>, and app developers request the
+<code>AudioSource</code> abstraction instead of the actual audio hardware
+device. The Android Audio Policy Manager maps an <code>AudioSource</code> to a
+given capture path configuration (device, gain, pre-processing, etc.) according
+to product-specific rules. The following sources are exposed to developers:</p>
+
+<ul>
+<li><code>android.media.MediaRecorder.AudioSource.CAMCORDER</code></li>
+<li><code>android.media.MediaRecorder.AudioSource.VOICE_COMMUNICATION</code></li>
+<li><code>android.media.MediaRecorder.AudioSource.VOICE_CALL</code></li>
+<li><code>android.media.MediaRecorder.AudioSource.VOICE_DOWNLINK</code></li>
+<li><code>android.media.MediaRecorder.AudioSource.VOICE_UPLINK</code></li>
+<li><code>android.media.MediaRecorder.AudioSource.VOICE_RECOGNITION</code></li>
+<li><code>android.media.MediaRecorder.AudioSource.MIC</code></li>
+<li><code>android.media.MediaRecorder.AudioSource.DEFAULT</code></li>
+</ul>
+
+<p>The default pre-processing effects applied for each <code>AudioSource</code>
+are specified in the <code>/system/etc/audio_effects.conf</code> file. To
+specify your own default effects for every <code>AudioSource</code>, create a
+<code>/system/vendor/etc/audio_effects.conf</code> file and specify the
+pre-processing effects to turn on. For an example, see the implementation for
+the Nexus 10 in <code>device/samsung/manta/audio_effects.conf</code>.
+AudioEffect instances acquire and release a session when created and destroyed,
+enabling the effects (such as the Loudness Enhancer) to persist throughout the
+duration of the session.</p>
+
+<p class="warning"><strong>Warning:</strong> For the
+<code>VOICE_RECOGNITION</code> use case, do not enable the noise suppression
+pre-processing effect. It should not be turned on by default when recording from
+this audio source, and you should not enable it in your own audio_effects.conf
+file. Turning on the effect by default will cause the device to fail the
+<a href="{@docRoot}compatibility/index.html"> compatibility requirement</a>
+regardless of whether this was on by default due to configuration file , or the
+audio HAL implementation's default behavior.</p>
+
+<p>The following example enables pre-processing for the VoIP
+<code>AudioSource</code> and Camcorder <code>AudioSource</code>. By declaring
+the <code>AudioSource</code> configuration in this manner, the framework
+automatically requests that the audio HAL use those effects.</p>
+
+<p><pre>
+pre_processing {
+ voice_communication {
+ aec {}
+ ns {}
+ }
+ camcorder {
+ agc {}
+ }
+}
+</pre></p>
+
+<h2 id="tuning">Source tuning</h2>
+
+<p><code>AudioSource</code> tuning does not have explicit requirements on audio
+gain or audio processing with the exception of voice recognition
+(<code>VOICE_RECOGNITION</code>). Requirements for voice recognition include:</p>
+
+<ul>
+<li>Flat frequency response (+/- 3dB) from 100Hz to 4kHz</li>
+<li>Close-talk config: 90dB SPL reads RMS of 2500 (16bit samples)</li>
+<li>Level tracks linearly from -18dB to +12dB relative to 90dB SPL</li>
+<li>THD < 1% (90dB SPL in 100 to 4000Hz range)</li>
+<li>8kHz sampling rate (anti-aliasing)</li>
+<li>Effects/pre-processing must be disabled by default</li>
+</ul>
+
+<p>Examples of tuning different effects for different sources are:</p>
+
+<ul>
+<li>Noise Suppressor
+<ul>
+<li>Tuned for wind noise suppression for <code>CAMCORDER</code></li>
+<li>Tuned for stationary noise suppression for <code>VOICE_COMMUNICATION</code></li>
+</ul>
+</li>
+<li>Automatic Gain Control
+<ul>
+<li>Tuned for close-talk for <code>VOICE_COMMUNICATION</code> and main phone mic</li>
+<li>Tuned for far-talk for <code>CAMCORDER</code></li>
+</ul>
+</li>
+</ul>
+
+<h2 id="more">Resources</h2>
+
+<p>For more information, refer to the following resources:</p>
+
+<ul>
+<li>Android documentation for
+<a href="http://developer.android.com/reference/android/media/audiofx/package-summary.html">
+audiofx package</a></li>
+
+<li>Android documentation for
+<a href="http://developer.android.com/reference/android/media/audiofx/NoiseSuppressor.html">
+Noise Suppression audio effect</a></li>
+
+<li><code>device/samsung/manta/audio_effects.conf</code> file for the Nexus 10</li>
+</ul>
diff --git a/src/devices/audio/implement-shared-library.jd b/src/devices/audio/implement-shared-library.jd
new file mode 100644
index 0000000..f9539a9
--- /dev/null
+++ b/src/devices/audio/implement-shared-library.jd
@@ -0,0 +1,95 @@
+page.title=Configuring a Shared Library
+@jd:body
+
+<!--
+ Copyright 2016 The Android Open Source Project
+
+ Licensed under the Apache License, Version 2.0 (the "License");
+ you may not use this file except in compliance with the License.
+ You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
+-->
+
+<p>After creating an
+<a href="{@docRoot}devices/audio/implement-policy.html">audio policy
+configuration</a>, you must package the HAL implementation into a shared library
+and copy it to the appropriate location:</p>
+
+<ol>
+<li>Create a <code>device/<company>/<device>/audio</code>
+directory to contain your library's source files.</li>
+<li>Create an <code>Android.mk</code> file to build the shared library. Ensure
+the Makefile contains the following line:
+<br>
+<pre>
+LOCAL_MODULE := audio.primary.<device>
+</pre>
+<br>
+<p>Your library must be named <code>audio.primary.<device>.so</code>
+so Android can correctly load the library. The <code>primary</code> portion of
+this filename indicates that this shared library is for the primary audio
+hardware located on the device. The module names
+<code>audio.a2dp.<device></code> and
+<code>audio.usb.<device></code> are also available for Bluetooth and
+USB audio interfaces. Here is an example of an <code>Android.mk</code> from the
+Galaxy Nexus audio hardware:</p>
+<p><pre>
+LOCAL_PATH := $(call my-dir)
+
+include $(CLEAR_VARS)
+
+LOCAL_MODULE := audio.primary.tuna
+LOCAL_MODULE_RELATIVE_PATH := hw
+LOCAL_SRC_FILES := audio_hw.c ril_interface.c
+LOCAL_C_INCLUDES += \
+ external/tinyalsa/include \
+ $(call include-path-for, audio-utils) \
+ $(call include-path-for, audio-effects)
+LOCAL_SHARED_LIBRARIES := liblog libcutils libtinyalsa libaudioutils libdl
+LOCAL_MODULE_TAGS := optional
+
+include $(BUILD_SHARED_LIBRARY)
+</pre></p>
+</li>
+<br>
+<li>If your product supports low latency audio as specified by the Android CDD,
+copy the corresponding XML feature file into your product. For example, in your
+product's <code>device/<company>/<device>/device.mk</code>
+Makefile:
+<p><pre>
+PRODUCT_COPY_FILES := ...
+
+PRODUCT_COPY_FILES += \
+frameworks/native/data/etc/android.hardware.audio.low_latency.xml:system/etc/permissions/android.hardware.audio.low_latency.xml \
+</pre></p>
+</li>
+<br>
+<li>Copy the audio policy configuration file you created earlier to the
+<code>system/etc/</code> directory in your product's
+<code>device/<company>/<device>/device.mk</code> Makefile.
+For example:
+<p><pre>
+PRODUCT_COPY_FILES += \
+ device/samsung/tuna/audio/audio_policy.conf:system/etc/audio_policy.conf
+</pre></p>
+</li>
+<br>
+<li>Declare the shared modules of your audio HAL that are required by your
+product in the product's
+<code>device/<company>/<device>/device.mk</code> Makefile.
+For example, the Galaxy Nexus requires the primary and Bluetooth audio HAL
+modules:
+<pre>
+PRODUCT_PACKAGES += \
+ audio.primary.tuna \
+ audio.a2dp.default
+</pre>
+</li>
+</ol>
diff --git a/src/devices/audio/implement.jd b/src/devices/audio/implement.jd
index 1e81136..31e795b 100644
--- a/src/devices/audio/implement.jd
+++ b/src/devices/audio/implement.jd
@@ -24,279 +24,46 @@
</div>
</div>
-<p>This page explains how to implement the audio Hardware Abstraction Layer (HAL) and configure the
-shared library.</p>
+<p>This section explains how to implement the audio Hardware Abstraction Layer
+(HAL), provides details about configuring an audio policy (file formats, code
+organization, pre-processing effects), and describes how to configure the shared
+library (creating the <code>Android.mk</code> file).</p>
-<h2 id="implementing">Implementing the HAL</h2>
+<h2 id=implementing>Implementing the audio HAL</h2>
-<p>The audio HAL is composed of three different interfaces that you must implement:</p>
+<p>The audio HAL is composed of the following interfaces:</p>
<ul>
-<li><code>hardware/libhardware/include/hardware/audio.h</code> - represents the main functions
-of an audio device.</li>
-<li><code>hardware/libhardware/include/hardware/audio_policy.h</code> - represents the audio policy
-manager, which handles things like audio routing and volume control policies.</li>
-<li><code>hardware/libhardware/include/hardware/audio_effect.h</code> - represents effects that can
-be applied to audio such as downmixing, echo, or noise suppression.</li>
+<li><code>hardware/libhardware/include/hardware/audio.h</code>. Represents the
+main functions of an audio device.</li>
+<li><code>hardware/libhardware/include/hardware/audio_effect.h</code>.
+Represents effects that can be applied to audio such as downmixing, echo, or
+noise suppression.</li>
+</ul>
+
+<p>You must implement all interfaces.</p>
+
+<h2 id=headers>Audio header files</h2>
+<p>For a reference of the properties you can define, refer to the audio header
+files:</p>
+
+<ul>
+<li>In Android 6.0 and higher, see
+<code>system/media/audio/include/system/audio.h</code>.</li>
+<li>In Android 5.1 and lower, see
+<code>system/core/include/system/audio.h</code>.</li>
</ul>
<p>For an example, refer to the implementation for the Galaxy Nexus at
<code>device/samsung/tuna/audio</code>.</p>
-<p>In addition to implementing the HAL, you need to create a
-<code>device/<company_name>/<device_name>/audio/audio_policy.conf</code> file that
-declares the audio devices present on your product. For an example, see the file for the Galaxy
-Nexus audio hardware in <code>device/samsung/tuna/audio/audio_policy.conf</code>. Also, see the
-audio header files for a reference of the properties that you can define.</p>
+<h2 id=next-steps>Next steps</h2>
-<p>In the Android M release and later, the paths are:<br />
-<code>system/media/audio/include/system/audio.h</code><br />
-<code>system/media/audio/include/system/audio_policy.h</code></p>
-
-<p>In Android 5.1 and earlier, the paths are:<br />
-<code>system/core/include/system/audio.h</code><br />
-<code>system/core/include/system/audio_policy.h</code></p>
-
-<h3 id="multichannel">Multi-channel support</h3>
-
-<p>If your hardware and driver supports multichannel audio via HDMI, you can output the audio
-stream directly to the audio hardware. This bypasses the AudioFlinger mixer so it doesn't get
-downmixed to two channels.</p>
-
-<p>The audio HAL must expose whether an output stream profile supports multichannel audio
-capabilities. If the HAL exposes its capabilities, the default policy manager allows multichannel
-playback over HDMI.</p>
-
-<p>For more implementation details, see the <code>device/samsung/tuna/audio/audio_hw.c</code> in
-the Android 4.1 release.</p>
-
-<p>To specify that your product contains a multichannel audio output, edit the
-<code>audio_policy.conf</code> file to describe the multichannel output for your product. The
-following is an example from the Galaxy Nexus that shows a "dynamic" channel mask, which means the
-audio policy manager queries the actual channel masks supported by the HDMI sink after connection.
-You can also specify a static channel mask like <code>AUDIO_CHANNEL_OUT_5POINT1</code>.</p>
-
-<pre>
-audio_hw_modules {
- primary {
- outputs {
- ...
- hdmi {
- sampling_rates 44100|48000
- channel_masks dynamic
- formats AUDIO_FORMAT_PCM_16_BIT
- devices AUDIO_DEVICE_OUT_AUX_DIGITAL
- flags AUDIO_OUTPUT_FLAG_DIRECT
- }
- ...
- }
- ...
- }
- ...
-}
-</pre>
-
-<p>AudioFlinger's mixer downmixes the content to stereo automatically when sent to an audio device
-that does not support multichannel audio.</p>
-
-<h3 id="codecs">Media codecs</h3>
-
-<p>Ensure the audio codecs your hardware and drivers support are properly declared for your
-product. For details on declaring supported codecs, see <a href="{@docRoot}devices/media.html#expose">Exposing Codecs
-to the Framework</a>.</p>
-
-<h2 id="configuring">Configuring the shared library</h2>
-
-<p>You need to package the HAL implementation into a shared library and copy it to the appropriate
-location by creating an <code>Android.mk</code> file:</p>
-
-<ol>
-<li>Create a <code>device/<company_name>/<device_name>/audio</code> directory to
-contain your library's source files.</li>
-<li>Create an <code>Android.mk</code> file to build the shared library. Ensure that the Makefile
-contains the following line:
-<pre>
-LOCAL_MODULE := audio.primary.<device_name>
-</pre>
-
-<p>Notice your library must be named <code>audio.primary.<device_name>.so</code> so
-that Android can correctly load the library. The "<code>primary</code>" portion of this filename
-indicates that this shared library is for the primary audio hardware located on the device. The
-module names <code>audio.a2dp.<device_name></code> and
-<code>audio.usb.<device_name></code> are also available for bluetooth and USB audio
-interfaces. Here is an example of an <code>Android.mk</code> from the Galaxy Nexus audio hardware:
-</p>
-
-<pre>
-LOCAL_PATH := $(call my-dir)
-
-include $(CLEAR_VARS)
-
-LOCAL_MODULE := audio.primary.tuna
-LOCAL_MODULE_RELATIVE_PATH := hw
-LOCAL_SRC_FILES := audio_hw.c ril_interface.c
-LOCAL_C_INCLUDES += \
- external/tinyalsa/include \
- $(call include-path-for, audio-utils) \
- $(call include-path-for, audio-effects)
-LOCAL_SHARED_LIBRARIES := liblog libcutils libtinyalsa libaudioutils libdl
-LOCAL_MODULE_TAGS := optional
-
-include $(BUILD_SHARED_LIBRARY)
-</pre>
-
-</li>
-
-<li>If your product supports low latency audio as specified by the Android CDD, copy the
-corresponding XML feature file into your product. For example, in your product's
-<code>device/<company_name>/<device_name>/device.mk</code> Makefile:
-
-<pre>
-PRODUCT_COPY_FILES := ...
-
-PRODUCT_COPY_FILES += \
-frameworks/native/data/etc/android.hardware.audio.low_latency.xml:system/etc/permissions/android.hardware.audio.low_latency.xml \
-</pre>
-
-</li>
-
-<li>Copy the <code>audio_policy.conf</code> file that you created earlier to the
-<code>system/etc/</code> directory in your product's
-<code>device/<company_name>/<device_name>/device.mk</code> Makefile. For example:
-
-<pre>
-PRODUCT_COPY_FILES += \
- device/samsung/tuna/audio/audio_policy.conf:system/etc/audio_policy.conf
-</pre>
-
-</li>
-
-<li>Declare the shared modules of your audio HAL that are required by your product in the
-product's <code>device/<company_name>/<device_name>/device.mk</code> Makefile. For
-example, the Galaxy Nexus requires the primary and bluetooth audio HAL modules:
-
-<pre>
-PRODUCT_PACKAGES += \
- audio.primary.tuna \
- audio.a2dp.default
-</pre>
-
-</li>
-</ol>
-
-<h2 id="preprocessing">Audio pre-processing effects</h2>
-
-<p>The Android platform provides audio effects on supported devices in the
-<a href="http://developer.android.com/reference/android/media/audiofx/package-summary.html">audiofx
-</a> package, which is available for developers to access. For example, on the Nexus 10, the
-following pre-processing effects are supported:</p>
-
-<ul>
-<li>
-<a href="http://developer.android.com/reference/android/media/audiofx/AcousticEchoCanceler.html">
-Acoustic Echo Cancellation</a></li>
-<li>
-<a href="http://developer.android.com/reference/android/media/audiofx/AutomaticGainControl.html">
-Automatic Gain Control</a></li>
-<li>
-<a href="http://developer.android.com/reference/android/media/audiofx/NoiseSuppressor.html">
-Noise Suppression</a></li>
-</ul>
-
-
-<p>Pre-processing effects are paired with the use case mode in which the pre-processing is requested
-. In Android app development, a use case is referred to as an <code>AudioSource</code>; and app
-developers request to use the <code>AudioSource</code> abstraction instead of the actual audio
-hardware device. The Android Audio Policy Manager maps an <code>AudioSource</code> to the actual
-hardware with <code>AudioPolicyManagerBase::getDeviceForInputSource(int inputSource)</code>. The
-following sources are exposed to developers:</p>
-
-<ul>
-<li><code>android.media.MediaRecorder.AudioSource.CAMCORDER</code></li>
-<li><code>android.media.MediaRecorder.AudioSource.VOICE_COMMUNICATION</code></li>
-<li><code>android.media.MediaRecorder.AudioSource.VOICE_CALL</code></li>
-<li><code>android.media.MediaRecorder.AudioSource.VOICE_DOWNLINK</code></li>
-<li><code>android.media.MediaRecorder.AudioSource.VOICE_UPLINK</code></li>
-<li><code>android.media.MediaRecorder.AudioSource.VOICE_RECOGNITION</code></li>
-<li><code>android.media.MediaRecorder.AudioSource.MIC</code></li>
-<li><code>android.media.MediaRecorder.AudioSource.DEFAULT</code></li> </ul>
-
-<p>The default pre-processing effects applied for each <code>AudioSource</code> are specified in
-the <code>/system/etc/audio_effects.conf</code> file. To specify your own default effects for every
-<code>AudioSource</code>, create a <code>/system/vendor/etc/audio_effects.conf</code> file and
-specify the pre-processing effects to turn on. For an example, see the implementation for the Nexus
-10 in <code>device/samsung/manta/audio_effects.conf</code>. AudioEffect instances acquire and
-release a session when created and destroyed, enabling the effects (such as the Loudness Enhancer)
-to persist throughout the duration of the session. </p>
-
-<p class="warning"><strong>Warning:</strong> For the <code>VOICE_RECOGNITION</code> use case, do
-not enable the noise suppression pre-processing effect. It should not be turned on by default when
-recording from this audio source, and you should not enable it in your own audio_effects.conf file.
-Turning on the effect by default will cause the device to fail the <a
-href="{@docRoot}compatibility/index.html"> compatibility requirement</a> regardless of whether this was on by
-default due to configuration file , or the audio HAL implementation's default behavior.</p>
-
-<p>The following example enables pre-processing for the VoIP <code>AudioSource</code> and Camcorder
-<code>AudioSource</code>. By declaring the <code>AudioSource</code> configuration in this manner,
-the framework will automatically request from the audio HAL the use of those effects.</p>
-
-<pre>
-pre_processing {
- voice_communication {
- aec {}
- ns {}
- }
- camcorder {
- agc {}
- }
-}
-</pre>
-
-<h3 id="tuning">Source tuning</h3>
-
-<p>For <code>AudioSource</code> tuning, there are no explicit requirements on audio gain or audio
-processing with the exception of voice recognition (<code>VOICE_RECOGNITION</code>).</p>
-
-<p>The requirements for voice recognition are:</p>
-
-<ul>
-<li>"flat" frequency response (+/- 3dB) from 100Hz to 4kHz</li>
-<li>close-talk config: 90dB SPL reads RMS of 2500 (16bit samples)</li>
-<li>level tracks linearly from -18dB to +12dB relative to 90dB SPL</li>
-<li>THD < 1% (90dB SPL in 100 to 4000Hz range)</li>
-<li>8kHz sampling rate (anti-aliasing)</li>
-<li>Effects/pre-processing must be disabled by default</li>
-</ul>
-
-<p>Examples of tuning different effects for different sources are:</p>
-
-<ul>
-<li>Noise Suppressor
-<ul>
-<li>Tuned for wind noise suppressor for <code>CAMCORDER</code></li>
-<li>Tuned for stationary noise suppressor for <code>VOICE_COMMUNICATION</code></li>
-</ul>
-</li>
-<li>Automatic Gain Control
-<ul>
-<li>Tuned for close-talk for <code>VOICE_COMMUNICATION</code> and main phone mic</li>
-<li>Tuned for far-talk for <code>CAMCORDER</code></li>
-</ul>
-</li>
-</ul>
-
-<h3 id="more">More information</h3>
-
-<p>For more information, see:</p>
-
-<ul>
-<li>Android documentation for
-<a href="http://developer.android.com/reference/android/media/audiofx/package-summary.html">
-audiofx package</a></li>
-
-<li>Android documentation for
-<a href="http://developer.android.com/reference/android/media/audiofx/NoiseSuppressor.html">
-Noise Suppression audio effect</a></li>
-
-<li><code>device/samsung/manta/audio_effects.conf</code> file for the Nexus 10</li>
-</ul>
+<p>In addition to implementing the audio HAL, you must also create an
+<a href="{@docRoot}devices/audio/implement-policy.html">audio policy
+configuration file</a> that describes your audio topology and package the HAL
+implementation into a
+<a href="{@docRoot}devices/audio/implement-shared-library.html">shared
+library</a>. You can also configure
+<a href="{@docRoot}devices/audio/implement-pre-processing.html">pre-processing
+effects</a> such as automatic gain control and noise suppression.</p>
diff --git a/src/devices/devices_toc.cs b/src/devices/devices_toc.cs
index 94e4031..7bff3fe 100644
--- a/src/devices/devices_toc.cs
+++ b/src/devices/devices_toc.cs
@@ -80,7 +80,18 @@
</div>
<ul>
<li><a href="<?cs var:toroot ?>devices/audio/terminology.html">Terminology</a></li>
- <li><a href="<?cs var:toroot ?>devices/audio/implement.html">Implementation</a></li>
+ <li class="nav-section">
+ <div class="nav-section-header">
+ <a href="<?cs var:toroot ?>devices/audio/implement.html">
+ <span class="en">Implementation</span>
+ </a>
+ </div>
+ <ul>
+ <li><a href="<?cs var:toroot ?>devices/audio/implement-policy.html">Policy Configuration</a></li>
+ <li><a href="<?cs var:toroot ?>devices/audio/implement-shared-library.html">Shared Library</a></li>
+ <li><a href="<?cs var:toroot ?>devices/audio/implement-pre-processing.html">Pre-processing Effects</a></li>
+ </ul>
+ </li>
<li><a href="<?cs var:toroot ?>devices/audio/data_formats.html">Data Formats</a></li>
<li><a href="<?cs var:toroot ?>devices/audio/attributes.html">Attributes</a></li>
<li><a href="<?cs var:toroot ?>devices/audio/warmup.html">Warmup</a></li>
@@ -217,6 +228,8 @@
</a>
</div>
<ul>
+ <li><a href="<?cs var:toroot ?>devices/media/framework-hardening.html">Framework
+ Hardening</a></li>
<li><a href="<?cs var:toroot ?>devices/media/soc.html">SoC Dependencies</a></li>
<li><a href="<?cs var:toroot ?>devices/media/oem.html">OEM Dependencies</a></li>
</ul>
@@ -286,6 +299,7 @@
<li><a href="<?cs var:toroot ?>devices/tech/dalvik/constraints.html">Constraints</a></li>
<li><a href="<?cs var:toroot ?>devices/tech/dalvik/configure.html">Configuration</a></li>
<li><a href="<?cs var:toroot ?>devices/tech/dalvik/gc-debug.html">Garbage Collection</a></li>
+ <li><a href="<?cs var:toroot ?>devices/tech/dalvik/jit-compiler.html">JIT Compilation</a></li>
</ul>
</li>
@@ -301,6 +315,7 @@
<li><a href="<?cs var:toroot ?>devices/tech/config/kernel.html">Kernel Configuration</a></li>
<li><a href="<?cs var:toroot ?>devices/tech/config/kernel_network_tests.html">Kernel Network Tests</a></li>
<li><a href="<?cs var:toroot ?>devices/tech/config/low-ram.html">Low RAM</a></li>
+ <li><a href="<?cs var:toroot ?>devices/tech/config/namespaces_libraries.html">Namespaces for Libraries</a></li>
<li><a href="<?cs var:toroot ?>devices/tech/config/renderer.html">OpenGLRenderer</a></li>
<li><a href="<?cs var:toroot ?>devices/tech/config/runtime_perms.html">Runtime Permissions</a></li>
<li><a href="<?cs var:toroot ?>devices/tech/config/uicc.html">UICC</a></li>
@@ -310,6 +325,18 @@
<li class="nav-section">
<div class="nav-section-header">
+ <a href="<?cs var:toroot ?>devices/tech/connect/index.html">
+ <span class="en">Connectivity</span>
+ </a>
+ </div>
+ <ul>
+ <li><a href="<?cs var:toroot ?>devices/tech/connect/block-numbers.html">Block Phone Numbers</a></li>
+ <li><a href="<?cs var:toroot ?>devices/tech/connect/felica.html">Host Card Emulation of FeliCa</a></li>
+ </ul>
+ </li>
+
+ <li class="nav-section">
+ <div class="nav-section-header">
<a href="<?cs var:toroot ?>devices/tech/datausage/index.html">
<span class="en">Data Usage</span>
</a>
diff --git a/src/devices/media/framework-hardening.jd b/src/devices/media/framework-hardening.jd
new file mode 100644
index 0000000..bcf4296
--- /dev/null
+++ b/src/devices/media/framework-hardening.jd
@@ -0,0 +1,213 @@
+page.title=Media Framework Hardening
+@jd:body
+
+<!--
+ Copyright 2016 The Android Open Source Project
+
+ Licensed under the Apache License, Version 2.0 (the "License");
+ you may not use this file except in compliance with the License.
+ You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
+-->
+<div id="qv-wrapper">
+ <div id="qv">
+ <h2>In this document</h2>
+ <ol id="auto-toc">
+ </ol>
+ </div>
+</div>
+
+<p>To improve device security, Android 7.0 breaks up the monolithic
+<code>mediaserver</code> process into multiple processes with permissions and
+capabilities restricted to only those required by each process. These changes
+mitigate media framework security vulnerabilities by:</p>
+<ul>
+<li>Splitting AV pipeline components into app-specific sandboxed processes.</li>
+<li>Enabling updatable media components (extractors, codecs, etc.).</li>
+</ul>
+
+<p>These changes also improve security for end users by significantly reducing
+the severity of most media-related security vulnerabilities, keeping end user
+devices and data safe.</p>
+
+<p>OEMs and SoC vendors need to update their HAL and framework code to make
+it compatible with the new architecture. Specifically, because vendor-provided
+Android code often assumes everything runs in the same process, vendors must
+update their code to pass around native handles (<code>native_handle</code>)
+that have meaning across processes. For a reference implementation of changes
+related to media hardening, refer to <code>frameworks/av</code> and
+<code>frameworks/native</code>.</p>
+
+<h2 id=arch_changes>Architectural changes</h2>
+<p>Previous versions of Android used a single, monolithic
+<code>mediaserver</code> process with a great many permissions (camera access,
+audio access, video driver access, file access, network access, etc.). Android
+7.0 splits the <code>mediaserver</code> process into several new processes that
+each require a much smaller set of permissions:</p>
+
+<p><img src="images/ape_media_split.png" alt="mediaserver hardening"></p>
+<p class="img-caption"><strong>Figure 1.</strong> Architecture changes for
+mediaserver hardening</p>
+
+<p>This new architecture ensures that even if a process is compromised,
+malicious code does not have access to the full set of permissions previously
+held by mediaserver. Processes are restricted by SELinux and seccomp policies.
+</p>
+
+<p class=note><strong>Note:</strong> Because of vendor dependencies, some codecs
+still run in the <code>mediaserver</code> and consequently grant
+<code>mediaserver</code> more permissions than necessary. Specifically, Widevine
+Classic continues to run in the <code>mediaserver</code> for Android 7.0.</p>
+
+<h3 id=mediaserver-changes>MediaServer changes</h3>
+<p>In Android 7.0, the <code>mediaserver</code> process exists for driving
+playback and recording, e.g. passing and synchronizing buffers between
+components and processes. Processes communicate through the standard Binder
+mechanism.</p>
+<p>In a standard local file playback session, the application passes a file
+descriptor (FD) to <code>mediaserver</code> (usually via the MediaPlayer Java
+API), and the <code>mediaserver</code>:</p>
+<ol>
+<li>Wraps the FD into a Binder DataSource object that is passed to the extractor
+process, which uses it to read from the file using Binder IPC. (The
+mediaextractor doesn't get the FD but instead makes Binder calls back to the
+<code>mediaserver</code> to get the data.)</li>
+<li>Examines the file, creates the appropriate extractor for the file type
+(e.g. MP3Extractor, or MPEG4Extractor), and returns a Binder interface for the
+extractor to the <code>mediaserver</code> process.</li>
+<li>Makes Binder IPC calls to the extractor to determine the type of data in the
+file (e.g. MP3 or H.264 data).</li>
+<li>Calls into the <code>mediacodec</code> process to create codecs of the
+required type; receives Binder interfaces for these codecs.</li>
+<li>Makes repeated Binder IPC calls to the extractor to read encoded samples,
+uses the Binder IPC to send encoded data to the <code>mediacodec</code> process
+for decoding, and receives decoded data.</li>
+</ol>
+<p>In some use cases, no codec is involved (such as offloaded playback, where
+encoded data is sent directly to the output device), or the codec may render the
+decoded data directly instead of returning a buffer of decoded data (video
+playback).</p>
+
+<h3 id=mediacodecservice_changes>MediaCodecService changes</h3>
+<p>The codec service is where encoders and decoders live. Due to vendor
+dependencies, not all codecs live in the codec process yet. In Android 7.0:</p>
+<ul>
+<li>Non-secure decoders and software encoders live in the codec process.</li>
+<li>Secure decoders and hardware encoders live in the <code>mediaserver</code>
+(unchanged).</li>
+</ul>
+
+<p>An application (or mediaserver) calls the codec process to create a codec of
+the required type, then calls that codec to pass in encoded data and retrieve
+decoded data (for decoding) or to pass in decoded data and retrieve encoded data
+(for encoding). Data transfer to and from codecs uses shared memory already, so
+that process is unchanged.</p>
+
+<h3 id=mediadrmserver_changes>MediaDrmServer changes</h3>
+<p>The DRM server is used when playing DRM-protected content, such as movies in
+Google Play Movies. It handles decrypting the encrypted data in a secure way,
+and as such has access to certificate and key storage and other sensitive
+components. Due to vendor dependencies, the DRM process is not used in all cases
+yet.</p>
+
+<h3 id=audioserver_changes>AudioServer changes</h3>
+<p>The AudioServer process hosts audio-related components such as audio input
+and output, the policy manager service that determines audio routing, and the
+FM radio service. For details on audio changes and implementation guidance, see
+<a href="{@docRoot}devices/audio/implement.html">Implementing Audio</a>.</p>
+
+<h3 id=cameraserver_changes>CameraServer changes</h3>
+<p>The CameraServer controls the camera and is used when recording video to
+obtain video frames from the camera and then pass them to
+<code>mediaserver</code> for further handling. For details on CameraServer
+changes and implementation guidance, refer to
+<a href="{@docRoot}devices/camera/versioning.html#hardening">Camera Framework
+Hardening</a>.</p>
+
+<h3 id=extractor_service_changes>ExtractorService changes</h3>
+<p>The extractor service hosts the <em>extractors</em>, components that parse
+the various file formats supported by the media framework. The extractor
+service is the least privileged of all the services: it can't read FDs, so it
+instead makes calls on a Binder interface (provided to it by the
+<code>mediaserver</code> for each playback session) to access files.</p>
+<p>An application (or <code>mediaserver</code>) makes a call to the extractor
+process to obtain an <code>IMediaExtractor</code>, calls that
+<code>IMediaExtractor</code> to obtain an <code>IMediaSource</code> for each
+track contained in the file, and then calls the <code>IMediaSource</code>
+interfaces to read data from them.</p>
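+
+<p>As a rough illustration of this call sequence, the following sketch assumes
+the N-era <code>IMediaExtractor</code>/<code>IMediaSource</code> Binder
+interfaces in <code>frameworks/av</code> (how the remote extractor is obtained
+is elided, and error handling is minimal):</p>
+
+<pre>
+#include <media/IMediaExtractor.h>
+#include <media/IMediaSource.h>
+#include <media/stagefright/MediaBuffer.h>
+
+using namespace android;
+
+// Sketch: read every sample of every track through Binder IPC.
+void readAllSamples(const sp<IMediaExtractor> &extractor) {
+    for (size_t i = 0; i < extractor->countTracks(); ++i) {
+        // One IMediaSource per track contained in the file.
+        sp<IMediaSource> track = extractor->getTrack(i);
+        if (track == NULL || track->start() != OK) continue;
+
+        MediaBuffer *buffer;
+        // Each read() is a Binder IPC into the extractor process.
+        while (track->read(&buffer) == OK) {
+            // ...consume buffer->data() / buffer->range_length()...
+            buffer->release();
+        }
+        track->stop();
+    }
+}
+</pre>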
+<p>To transfer the data between processes, it is either included in the reply
+Parcel as part of the Binder transaction or carried in shared memory:</p>
+
+<ul>
+<li>Using <strong>shared memory</strong> requires an extra Binder call to
+release the shared memory but is faster and uses less power for large buffers.
+</li>
+<li>Using <strong>in-Parcel</strong> requires extra copying but is faster and
+uses less power for buffers smaller than 64KB.</li>
+</ul>
+
+<h2 id=implementation>Implementation</h2>
+<p>To support the move of <code>MediaDrm</code> and <code>MediaCrypto</code>
+components into the new <code>mediadrmserver</code> process, vendors must change
+the allocation method for secure buffers to allow buffers to be shared between
+processes.</p>
+<p>In previous Android releases, secure buffers were allocated in
+<code>mediaserver</code> by <code>OMX::allocateBuffer</code> and used during
+decryption in the same process, as shown below:</p>
+
+<p><img src="images/ape_media_buffer_alloc_pren.png"></p>
+<p class="img-caption"><strong>Figure 2.</strong> Android 6.0 and lower buffer
+allocation in mediaserver.</p>
+
+<p>In Android 7.0, the buffer allocation process has changed to a new mechanism
+that provides flexibility while minimizing the impact on existing
+implementations. With <code>MediaDrm</code> and <code>MediaCrypto</code> stacks
+in the new <code>mediadrmserver</code> process, buffers are allocated
+differently and vendors must update the secure buffer handles so they can be
+transported across binder when <code>MediaCodec</code> invokes a decrypt
+operation on <code>MediaCrypto</code>.</p>
+
+<p><img src="images/ape_media_buffer_alloc_n.png"></p>
+<p class="img-caption"><strong>Figure 3.</strong> Android 7.0 and higher buffer
+allocation in mediaserver.</p>
+
+<h3 id=native_handles>Using native handles</h3>
+<p>The <code>OMX::allocateBuffer</code> method must return a pointer to a
+<code>native_handle</code> struct, which contains file descriptors (FDs) and
+additional integer data. A <code>native_handle</code> has all of the advantages
+of using FDs, including existing binder support for
+serialization/deserialization, while allowing more flexibility for vendors who
+don't currently use FDs.</p>
+<p>Use <code>native_handle_create()</code> to allocate the native handle.
+Framework code takes ownership of the allocated <code>native_handle</code>
+struct and is responsible for releasing resources in both the process where
+the <code>native_handle</code> is originally allocated and in the process where
+it is deserialized. The framework releases native handles with
+<code>native_handle_close()</code> followed by
+<code>native_handle_delete()</code> and serializes/deserializes the
+<code>native_handle</code> using
+<code>Parcel::writeNativeHandle()/readNativeHandle()</code>.
+</p>
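+<p>As a minimal sketch of this lifecycle (the FD and the integer payload below
+are placeholders for vendor-specific data):</p>
+
+<pre>
+#include <cutils/native_handle.h>
+#include <binder/Parcel.h>
+
+// Sketch: allocate a native_handle with one FD and two vendor ints,
+// serialize it into a reply Parcel, then release the local copy.
+android::status_t writeSecureBufferHandle(android::Parcel *reply,
+                                          int secureBufferFd) {
+    native_handle_t *handle = native_handle_create(1 /*numFds*/, 2 /*numInts*/);
+    if (handle == NULL) return android::NO_MEMORY;
+    handle->data[0] = secureBufferFd;  // the FD slot
+    handle->data[1] = 0;               // vendor-specific int (placeholder)
+    handle->data[2] = 0;               // vendor-specific int (placeholder)
+
+    // Binder serializes the handle for the receiving process, which must
+    // later call native_handle_close() and native_handle_delete() on the
+    // handle it obtains from Parcel::readNativeHandle().
+    android::status_t err = reply->writeNativeHandle(handle);
+
+    // Release the copy owned by this process.
+    native_handle_close(handle);
+    native_handle_delete(handle);
+    return err;
+}
+</pre>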
+<p>SoC vendors who use FDs to represent secure buffers can populate the FD in
+the <code>native_handle</code> with their FD. Vendors who don't use FDs can
+represent secure buffers using the additional integer fields in the
+<code>native_handle</code>.</p>
+
+<h3 id=decrypt_location>Setting decryption location</h3>
+<p>Vendors must update the OEMCrypto decrypt method that operates on the
+<code>native_handle</code> to perform any vendor-specific operations necessary
+to make the <code>native_handle</code> usable in the new process space (changes
+typically include updates to OEMCrypto libraries).</p>
+<p>As <code>allocateBuffer</code> is a standard OMX operation, Android 7.0
+includes a new OMX extension
+(<code>OMX.google.android.index.allocateNativeHandle</code>) to query for this
+support and an <code>OMX_SetParameter</code> call that notifies the OMX
+implementation it should use native handles.</p>
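+
+<p>A sketch of how an integration might query and enable the extension follows
+(this assumes the <code>EnableAndroidNativeBuffersParams</code> struct that
+AOSP declares in <code>media/hardware/HardwareAPI.h</code>; vendor code may
+differ):</p>
+
+<pre>
+#include <OMX_Core.h>
+#include <media/hardware/HardwareAPI.h>
+
+// Sketch: query the component for the extension index, then enable
+// allocation of buffers as native handles on the given port.
+OMX_ERRORTYPE enableNativeHandles(OMX_HANDLETYPE component, OMX_U32 portIndex) {
+    OMX_INDEXTYPE index;
+    OMX_ERRORTYPE err = OMX_GetExtensionIndex(
+            component,
+            (OMX_STRING)"OMX.google.android.index.allocateNativeHandle",
+            &index);
+    if (err != OMX_ErrorNone) {
+        return err;  // component does not support the extension
+    }
+
+    android::EnableAndroidNativeBuffersParams params;
+    params.nSize = sizeof(params);
+    params.nVersion.s.nVersionMajor = 1;  // OMX IL version fields
+    params.nVersion.s.nVersionMinor = 0;
+    params.nVersion.s.nRevision = 0;
+    params.nVersion.s.nStep = 0;
+    params.nPortIndex = portIndex;
+    params.enable = OMX_TRUE;
+    return OMX_SetParameter(component, index, &params);
+}
+</pre>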
diff --git a/src/devices/media/images/ape_media_buffer_alloc_n.png b/src/devices/media/images/ape_media_buffer_alloc_n.png
new file mode 100644
index 0000000..54f93a7
--- /dev/null
+++ b/src/devices/media/images/ape_media_buffer_alloc_n.png
Binary files differ
diff --git a/src/devices/media/images/ape_media_buffer_alloc_pren.png b/src/devices/media/images/ape_media_buffer_alloc_pren.png
new file mode 100644
index 0000000..e0e6e75
--- /dev/null
+++ b/src/devices/media/images/ape_media_buffer_alloc_pren.png
Binary files differ
diff --git a/src/devices/media/images/ape_media_split.png b/src/devices/media/images/ape_media_split.png
new file mode 100644
index 0000000..85b4a5d
--- /dev/null
+++ b/src/devices/media/images/ape_media_split.png
Binary files differ
diff --git a/src/devices/media/index.jd b/src/devices/media/index.jd
index 6d2359d..b7d2a8d 100644
--- a/src/devices/media/index.jd
+++ b/src/devices/media/index.jd
@@ -24,101 +24,107 @@
</div>
</div>
-<img style="float: right; margin: 0px 15px 15px 15px;" src="images/ape_fwk_hal_media.png" alt="Android Media HAL icon"/>
+<img style="float: right; margin: 0px 15px 15px 15px;"
+src="images/ape_fwk_hal_media.png" alt="Android Media HAL icon"/>
-<p>
- Android provides a media playback engine at the native level called
-Stagefright that comes built-in with software-based codecs for several popular
-media formats. Stagefright features for audio and video playback include
-integration with OpenMAX codecs, session management, time-synchronized
-rendering, transport control, and DRM.</p>
+<p>Android includes Stagefright, a media playback engine at the native level
+that has built-in software-based codecs for popular media formats.</p>
-<p class="note"><strong>Note:</strong> The Stagefright media playback engine
-had been updated through our <a
-href="{@docRoot}security/bulletin/index.html">monthly security update</a>
-process.</p>
+<p>Stagefright audio and video playback features include integration with
+OpenMAX codecs, session management, time-synchronized rendering, transport
+control, and DRM.</p>
- <p>In addition, Stagefright supports integration with custom hardware codecs
-that you provide. There actually isn't a HAL to implement for custom codecs,
-but to provide a hardware path to encode and decode media, you must implement
-your hardware-based codec as an OpenMax IL (Integration Layer) component.</p>
+<p>Stagefright also supports integration with custom hardware codecs that you
+provide. To provide a hardware path to encode and decode media, you must
+implement a hardware-based codec as an OpenMax IL (Integration Layer)
+component.</p>
+
+<p class="note"><strong>Note:</strong> Stagefright updates can occur through the
+Android <a href="{@docRoot}security/bulletin/index.html">monthly security
+update</a> process and as part of an Android OS release.</p>
<h2 id="architecture">Architecture</h2>
-<p>The following diagram shows how media applications interact with the Android native multimedia framework.</p>
- <img src="images/ape_fwk_media.png" alt="Android media architecture" id="figure1" />
-<p class="img-caption">
- <strong>Figure 1.</strong> Media architecture
-</p>
+<p>Media applications interact with the Android native multimedia framework
+according to the following architecture.</p>
+<img src="images/ape_fwk_media.png" alt="Android media architecture"
+id="figure1" /><p class="img-caption"><strong>Figure 1.</strong> Media
+architecture</p>
+
<dl>
<dt>Application Framework</dt>
- <dd>At the application framework level is the app's code, which utilizes the
- <a href="http://developer.android.com/reference/android/media/package-summary.html">android.media</a>
- APIs to interact with the multimedia hardware.</dd>
- <dt>Binder IPC</dt>
- <dd>The Binder IPC proxies facilitate communication over process boundaries. They are located in
- the <code>frameworks/av/media/libmedia</code> directory and begin with the letter "I".</dd>
- <dt>Native Multimedia Framework</dt>
- <dd>At the native level, Android provides a multimedia framework that utilizes the Stagefright engine for
- audio and video recording and playback. Stagefright comes with a default list of supported software codecs
- and you can implement your own hardware codec by using the OpenMax integration layer standard. For more
- implementation details, see the various MediaPlayer and Stagefright components located in
- <code>frameworks/av/media</code>.
- </dd>
- <dt>OpenMAX Integration Layer (IL)</dt>
- <dd>The OpenMAX IL provides a standardized way for Stagefright to recognize and use custom hardware-based
- multimedia codecs called components. You must provide an OpenMAX plugin in the form of a shared library
- named <code>libstagefrighthw.so</code>. This plugin links your custom codec components to Stagefright.
- Your custom codecs must be implemented according to the OpenMAX IL component standard.
- </dd>
+<dd>At the application framework level is application code that utilizes
+<a href="http://developer.android.com/reference/android/media/package-summary.html">android.media</a>
+APIs to interact with the multimedia hardware.</dd>
+
+<dt>Binder IPC</dt>
+<dd>The Binder IPC proxies facilitate communication over process boundaries.
+They are located in the <code>frameworks/av/media/libmedia</code> directory and
+begin with the letter "I".</dd>
+
+<dt>Native Multimedia Framework</dt>
+<dd>At the native level, Android provides a multimedia framework that utilizes
+the Stagefright engine for audio and video recording and playback. Stagefright
+comes with a default list of supported software codecs and you can implement
+your own hardware codec by using the OpenMax integration layer standard. For
+more implementation details, see the MediaPlayer and Stagefright components
+located in <code>frameworks/av/media</code>.</dd>
+
+<dt>OpenMAX Integration Layer (IL)</dt>
+<dd>The OpenMAX IL provides a standardized way for Stagefright to recognize and
+use custom hardware-based multimedia codecs called components. You must provide
+an OpenMAX plugin in the form of a shared library named
+<code>libstagefrighthw.so</code>. This plugin links Stagefright with your custom
+codec components, which must be implemented according to the OpenMAX IL
+component standard.</dd>
</dl>
+<h2 id="codecs">Implementing custom codecs</h2>
+<p>Stagefright comes with built-in software codecs for common media formats, but
+you can also add your own custom hardware codecs as OpenMAX components. To do
+this, you must create the OMX components and an OMX plugin that hooks together
+your custom codecs with the Stagefright framework. For example components, see
+<code>hardware/ti/omap4xxx/domx/</code>; for an example plugin for the
+Galaxy Nexus, see <code>hardware/ti/omap4xxx/libstagefrighthw</code>.</p>
-<h2 id="codecs">
-Implementing Custom Codecs
-</h2>
-<p>Stagefright comes with built-in software codecs for common media formats, but you can also add your
- own custom hardware codecs as OpenMAX components. To do this, you need to create OMX components and also an
- OMX plugin that hooks together your custom codecs with the Stagefright framework. For an example, see
- the <code>hardware/ti/omap4xxx/domx/</code> for example components and <code>hardware/ti/omap4xx/libstagefrighthw</code>
- for an example plugin for the Galaxy Nexus.
-</p>
- <p>To add your own codecs:</p>
+<p>To add your own codecs:</p>
<ol>
-<li>Create your components according to the OpenMAX IL component standard. The component interface is located in the
- <code>frameworks/native/include/media/OpenMAX/OMX_Component.h</code> file. To learn more about the
- OpenMAX IL specification, see the <a href="http://www.khronos.org/openmax/">OpenMAX website</a>.</li>
-<li>Create a OpenMAX plugin that links your components with the Stagefright service.
- See the <code>frameworks/native/include/media/hardware/OMXPluginBase.h</code> and <code>HardwareAPI.h</code> header
- files for the interfaces to create the plugin.
-</li>
-<li>Build your plugin as a shared library with the name <code>libstagefrighthw.so</code> in your product Makefile. For example:
-<pre>LOCAL_MODULE := libstagefrighthw</pre>
-
-<p>In your device's Makefile, ensure that you declare the module as a product package:</p>
+<li>Create your components according to the OpenMAX IL component standard. The
+component interface is located in the
+<code>frameworks/native/include/media/OpenMAX/OMX_Component.h</code> file. To
+learn more about the OpenMAX IL specification, refer to the
+<a href="http://www.khronos.org/openmax/">OpenMAX website</a>.</li>
+<li>Create an OpenMAX plugin that links your components with the Stagefright
+service. For the interfaces to create the plugin, see
+<code>frameworks/native/include/media/hardware/OMXPluginBase.h</code> and
+<code>HardwareAPI.h</code> header files.</li>
+<li>Build your plugin as a shared library with the name
+<code>libstagefrighthw.so</code> in your product Makefile. For example:
+<pre>LOCAL_MODULE := libstagefrighthw</pre>
+<p>In your device's Makefile, ensure you declare the module as a product
+package:</p>
<pre>
PRODUCT_PACKAGES += \
libstagefrighthw \
...
-</pre>
-</li>
-</ol>
+</pre></li></ol>
-<h2 id="expose">Exposing Codecs to the Framework</h2>
-<p>The Stagefright service parses the <code>system/etc/media_codecs.xml</code> and <code>system/etc/media_profiles.xml</code>
- to expose the supported codecs and profiles on the device to app developers via the <code>android.media.MediaCodecList</code> and
- <code>android.media.CamcorderProfile</code> classes. You need to create both files in the
- <code>device/<company_name>/<device_name>/</code> directory
- and copy this over to the system image's <code>system/etc</code> directory in your device's Makefile.
- For example:</p>
-
- <pre>
+<h2 id="expose">Exposing codecs to the framework</h2>
+<p>The Stagefright service parses the <code>system/etc/media_codecs.xml</code>
+and <code>system/etc/media_profiles.xml</code> to expose the supported codecs
+and profiles on the device to app developers via the
+<code>android.media.MediaCodecList</code> and
+<code>android.media.CamcorderProfile</code> classes. You must create both files
+in the <code>device/<company>/<device>/</code> directory
+and copy them over to the system image's <code>system/etc</code> directory in
+your device's Makefile. For example:</p>
+<pre>
PRODUCT_COPY_FILES += \
device/samsung/tuna/media_profiles.xml:system/etc/media_profiles.xml \
device/samsung/tuna/media_codecs.xml:system/etc/media_codecs.xml \
</pre>
-<p>See the <code>device/samsung/tuna/media_codecs.xml</code> and
- <code>device/samsung/tuna/media_profiles.xml</code> file for complete examples.</p>
+<p>For complete examples, see <code>device/samsung/tuna/media_codecs.xml</code>
+and <code>device/samsung/tuna/media_profiles.xml</code>.</p>
-<p class="note"><strong>Note:</strong> The <code><Quirk></code> element for media codecs is no longer supported
- by Android starting in Jelly Bean.</p>
+<p class="note"><strong>Note:</strong> As of Android 4.1, the
+<code><Quirk></code> element for media codecs is no longer supported.</p>
diff --git a/src/devices/sensors/images/axis_auto.png b/src/devices/sensors/images/axis_auto.png
new file mode 100644
index 0000000..dd6b187
--- /dev/null
+++ b/src/devices/sensors/images/axis_auto.png
Binary files differ
diff --git a/src/devices/sensors/sensor-types.jd b/src/devices/sensors/sensor-types.jd
index add3796..e7a742f 100644
--- a/src/devices/sensors/sensor-types.jd
+++ b/src/devices/sensors/sensor-types.jd
@@ -24,71 +24,93 @@
</div>
</div>
-<h2 id="sensor_axis_definition">Sensor axis definition</h2>
-<p>Sensor event values from many sensors are expressed in a specific frame that is
- static relative to the phone. This API is relative only to the NATURAL
- orientation of the screen. In other words, the axes are not swapped when the
- device's screen orientation changes.</p>
+<p>This section describes sensor axes, base sensors, and composite sensors
+(activity, attitude, uncalibrated, and interaction).</p>
-<div class="figure" style="width:269px">
- <img src="http://developer.android.com/images/axis_device.png"
-alt="Coordinate system of sensor API" height="225" />
- <p class="img-caption">
- <strong>Figure 1.</strong> Coordinate system (relative to a device) that's
- used by the Sensor API.
- </p>
-</div>
+<h2 id="sensor_axis_definition">Sensor axes</h2>
+<p>Sensor event values from many sensors are expressed in a specific frame that
+is static relative to the device.</p>
+
+<h3 id=phone_axes>Mobile device axes</h3>
+<p>The Sensor API is relative only to the natural orientation of the screen
+(axes are not swapped when the device's screen orientation changes).</p>
+
+<img src="http://developer.android.com/images/axis_device.png" alt="Coordinate
+system of sensor API for mobile devices"/>
+<p class="img-caption"><strong>Figure 1.</strong> Coordinate system (relative to
+a mobile device) used by the Sensor API.</p>
+
+<h3 id=auto_axes>Automotive axes</h3>
+<p>In Android Automotive implementations, axes are defined with respect to the
+vehicle body frame:</p>
+
+<img src="images/axis_auto.png" alt="Coordinate system of sensor API for
+automotive devices"/>
+<p class="img-caption"><strong>Figure 2.</strong> Coordinate system (relative to
+an automotive device) used by the Sensor API.</p>
+
+<ul>
+<li>X increases towards the right of the vehicle</li>
+<li>Y increases towards the nose of the body frame</li>
+<li>Z increases towards the roof of the body frame</li>
+</ul>
+
+<p>When looking from the positive direction of an axis, positive rotations are
+counterclockwise. Thus, when a vehicle is making a left turn, the z-axis
+gyroscope rate of turn is expected to be a positive value.</p>
<h2 id="base_sensors">Base sensors</h2>
-<p>Some sensor types are named directly after the physical sensors they represent.
- Sensors with such types are called “base” sensors, referring to the fact they
- relay data from a single physical sensor, contrary to “composite” sensors, for
- which the data is generated out of other sensors.</p>
-<p>Examples of base sensor types:</p>
+<p>Base sensor types are named after the physical sensors they represent. These
+sensors relay data from a single physical sensor (as opposed to composite
+sensors that generate data out of other sensors). Examples of base sensor types
+include:</p>
<ul>
<li><code>SENSOR_TYPE_ACCELEROMETER</code></li>
<li><code>SENSOR_TYPE_GYROSCOPE</code></li>
<li><code>SENSOR_TYPE_MAGNETOMETER</code></li>
</ul>
- <p> See the list of Android sensor types below for more details on each
-<h3 id="base_sensors_=_not_equal_to_physical_sensors">Base sensors != (not equal to) physical sensors</h3>
-<p>Base sensors are not to be confused with their underlying physical sensor. The
- data from a base sensor is not the raw output of the physical sensor:
- corrections are be applied, such as bias compensation and temperature
- compensation.</p>
-<p>The characteristics of a base sensor might be different from the
- characteristics of its underlying physical sensor.</p>
+
+<p class='note'><strong>Note:</strong> For details on each Android sensor type,
+review the following sections.</p>
+
+<p>However, base sensors are not equal to and should not be confused with their
+underlying physical sensor. The data from a base sensor is <strong>not</strong>
+the raw output of the physical sensor because corrections (such as bias
+compensation and temperature compensation) are applied.</p>
+
+<p>For example, the characteristics of a base sensor might be different from the
+characteristics of its underlying physical sensor in the following use cases:</p>
<ul>
- <li> For example, a gyroscope chip might be rated to have a bias range of 1 deg/sec.
- <ul>
- <li> After factory calibration, temperature compensation and bias compensation are
- applied, the actual bias of the Android sensor will be reduced, may be to a
- point where the bias is guaranteed to be below 0.01deg/sec. </li>
- <li> In this situation, we say that the Android sensor has a bias below 0.01
- deg/sec, even though the data sheet of the underlying sensor said 1 deg/sec. </li>
- </ul>
- </li>
- <li> As another example, a barometer might have a power consumption of 100uW.
- <ul>
- <li> Because the generated data needs to be transported from the chip to the SoC,
- the actual power cost to gather data from the barometer Android sensor might be
- much higher, for example 1000uW. </li>
- <li> In this situation, we say that the Android sensor has a power consumption of
- 1000uW, even though the power consumption measured at the barometer chip leads
- is 100uW. </li>
- </ul>
- </li>
- <li> As a third example, a magnetometer might consume 100uW when calibrated, but
- consume more when calibrating.
- <ul>
- <li> Its calibration routine might require activating the gyroscope, consuming
- 5000uW, and running some algorithm, costing another 900uW. </li>
- <li> In this situation, we say that the maximum power consumption of the
- (magnetometer) Android sensor is 6000uW. </li>
- <li> In this case, the average power consumption is the more useful measure, and it
- is what is reported in the sensor static characteristics through the HAL. </li>
- </ul>
- </li>
+<li>A gyroscope chip rated to have a bias range of 1 deg/sec.
+ <ul>
+  <li>After factory calibration, temperature compensation, and bias
+  compensation are applied, the actual bias of the Android sensor is reduced,
+  possibly to a point where the bias is guaranteed to be below 0.01
+  deg/sec.</li>
+  <li>In this situation, we say the Android sensor has a bias below 0.01
+  deg/sec, even though the data sheet of the underlying sensor says 1
+  deg/sec.</li>
+ </ul>
+</li>
+<li>A barometer with a power consumption of 100uW.
+ <ul>
+ <li>Because the generated data needs to be transported from the chip to the SoC,
+ the actual power cost to gather data from the barometer Android sensor might be
+ much higher, for example 1000uW.</li>
+ <li>In this situation, we say that the Android sensor has a power consumption of
+ 1000uW, even though the power consumption measured at the barometer chip leads
+ is 100uW.</li>
+ </ul>
+</li>
+<li>A magnetometer that consumes 100uW when calibrated, but consumes more when
+calibrating.
+ <ul>
+ <li>Its calibration routine might require activating the gyroscope, consuming
+ 5000uW, and running some algorithm, costing another 900uW.</li>
+ <li> In this situation, we say that the maximum power consumption of the
+ (magnetometer) Android sensor is 6000uW.</li>
+ <li>In this case, the average power consumption is the more useful measure, and it
+ is what is reported in the sensor static characteristics through the HAL.</li>
+ </ul>
+</li>
</ul>
<h3 id="accelerometer">Accelerometer</h3>
<p>Reporting-mode: <em><a href="report-modes.html#continuous">Continuous</a></em></p>
@@ -227,45 +249,44 @@
<p><code>getDefaultSensor(SENSOR_TYPE_RELATIVE_HUMIDITY)</code> <em>returns a non-wake-up sensor</em></p>
<p>A relative humidity sensor measures relative ambient air humidity and returns a
value in percent.</p>
+
<h2 id="composite_sensor_types">Composite sensor types</h2>
-<p>Any sensor that is not a base sensor is called a composite sensor. Composite
- sensors generate their data by processing and/or fusing data from one or
- several physical sensors.</p>
-<p>Examples of composite sensor types:</p>
+<p>A composite sensor generates data by processing and/or fusing data from one
+or several physical sensors. (Any sensor that is not a base sensor is called a
+composite sensor.) Examples of composite sensors include:</p>
<ul>
- <li><a href="#step_detector">Step detector</a> and <a href="#significant_motion">Significant motion</a>, which are usually based on an accelerometer, but could be based on other
- sensors as well, if the power consumption and accuracy was acceptable. </li>
- <li><a href="#game_rotation_vector">Game rotation vector</a>, based on an
- accelerometer and a gyroscope. </li>
- <li><a href="#gyroscope_uncalibrated">Uncalibrated gyroscope</a>, which is
- similar to the gyroscope base sensor, but with
- the bias calibration being reported separately instead of being corrected in
- the measurement. </li>
+<li><a href="#step_detector">Step detector</a> and
+<a href="#significant_motion">Significant motion</a>, which are usually based on
+an accelerometer, but could be based on other sensors as well, if the power
+consumption and accuracy are acceptable.</li>
+<li><a href="#game_rotation_vector">Game rotation vector</a>, based on an
+accelerometer and a gyroscope.</li>
+<li><a href="#gyroscope_uncalibrated">Uncalibrated gyroscope</a>, which is
+similar to the gyroscope base sensor, but with the bias calibration being
+reported separately instead of being corrected in the measurement.</li>
</ul>
-<p>Just like base sensors, the characteristics of the composite sensors come from
- the characteristics of their final data.</p>
-<ul>
- <li> For example, the power consumption of a game rotation vector is probably equal
- to the sum of the power consumptions of: the accelerometer chip, the gyroscope
- chip, the chip processing the data, and the buses transporting the data. </li>
- <li> As another example, the drift of a game rotation vector will depend as much on
- the quality of the calibration algorithm as on the physical sensor
- characteristics. </li>
-</ul>
-<h2 id="composite_sensor_type_summary">Composite sensor type summary</h2>
-<p>The following table lists the composite sensor types. Each composite sensor
- relies on data from one or several physical sensors. Choosing other underlying
- physical sensors to approximate results should be avoided as they will provide
- a poor user experience.</p>
-<p>When there is no gyroscope on the device, and only when there is no gyroscope,
- you may implement the rotation vector, linear acceleration and gravity sensors
- without using the gyroscope.</p>
+<p>As with base sensors, the characteristics of the composite sensors come from
+the characteristics of their final data. For example, the power consumption of a
+game rotation vector is probably equal to the sum of the power consumptions of
+the accelerometer chip, the gyroscope chip, the chip processing the data, and
+the buses transporting the data. As another example, the drift of a game
+rotation vector depends as much on the quality of the calibration algorithm as
+on the physical sensor characteristics.</p>
+
+<p>The following table lists available composite sensor types. Each composite
+sensor relies on data from one or several physical sensors. Avoid choosing other
+underlying physical sensors to approximate results as they provide a poor user
+experience.</p>
+<p class="note"><strong>Note:</strong> When there is no gyroscope on the device
+(and only when there is no gyroscope), you may implement the rotation vector,
+linear acceleration, and gravity sensors without using the gyroscope.</p>
+
<table>
<tr>
- <th><p>Sensor type</p></th>
- <th><p>Category</p></th>
- <th><p>Underlying physical sensors</p></th>
- <th><p>Reporting mode</p></th>
+ <th width=34%>Sensor type</th>
+ <th width=10%>Category</th>
+ <th width=34%>Underlying physical sensors</th>
+ <th width=19%>Reporting mode</th>
</tr>
<tr>
<td><p><a href="#game_rotation_vector">Game rotation vector</a></p></td>
@@ -631,7 +652,7 @@
<img src="images/axis_positive_roll.png" alt="Depiction of orientation
relative to a device" height="253" />
<p class="img-caption">
- <strong>Figure 2.</strong> Orientation relative to a device.
+ <strong>Figure 3.</strong> Orientation relative to a device.
</p>
</div>
<p>This definition is different from yaw, pitch and roll used in aviation where
diff --git a/src/devices/tech/config/images/namespace-libraries.png b/src/devices/tech/config/images/namespace-libraries.png
new file mode 100644
index 0000000..9152fa1
--- /dev/null
+++ b/src/devices/tech/config/images/namespace-libraries.png
Binary files differ
diff --git a/src/devices/tech/config/namespaces_libraries.jd b/src/devices/tech/config/namespaces_libraries.jd
new file mode 100644
index 0000000..49c74e4
--- /dev/null
+++ b/src/devices/tech/config/namespaces_libraries.jd
@@ -0,0 +1,79 @@
+page.title=Namespaces for Native Libraries
+@jd:body
+
+<!--
+ Copyright 2016 The Android Open Source Project
+
+ Licensed under the Apache License, Version 2.0 (the "License");
+ you may not use this file except in compliance with the License.
+ You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
+-->
+
+<div id="qv-wrapper">
+ <div id="qv">
+ <h2>In this document</h2>
+ <ol id="auto-toc">
+ </ol>
+ </div>
+</div>
+
+<p>
+Android N introduces namespaces for native libraries to limit internal API
+visibility and resolve situations when apps accidentally end up using platform
+libraries instead of their own. See the <a
+href="http://android-developers.blogspot.com/2016/06/improving-stability-with-private-cc.html">Improving
+Stability with Private C/C++ Symbol Restrictions in Android N</a> Android
+Developers blog post for application-specific changes.
+</p>
+
+<h2 id="architecture">Architecture</h2>
+
+<p>
+The change separates system libraries from application libraries and makes it
+hard to use internal system libraries by accident (and vice versa).
+</p>
+
+<img src="images/namespace-libraries.png" alt="Namespaces for native libraries" width="466" id="namespace-libraries" />
+<p class="img-caption">
+ <strong>Figure 1.</strong> Namespaces for native libraries
+</p>
+
+<p>
+Namespaces for native libraries prevent apps from using private platform native
+APIs (as was done with OpenSSL) and remove situations where apps accidentally
+end up using platform libraries instead of their own (as witnessed with
+<code>libpng</code>).
+</p>
+
+<h2 id="adding-additional-native-libraries">Adding additional native
+libraries</h2>
+
+<p>
+In addition to standard public native libraries, vendors may choose to provide
+additional native libraries accessible to apps by putting them under the
+<code>/vendor</code> library folder (<code>/vendor/lib</code> for 32-bit
+libraries and <code>/vendor/lib64</code> for 64-bit) and listing them in
+<code>/vendor/etc/public.libraries.txt</code>.
+</p>
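+
+<p>For example, the contents of <code>/vendor/etc/public.libraries.txt</code>
+for a vendor exposing two extra libraries (the names here are hypothetical)
+would simply list one library per line:</p>
+
+<pre>
+libvendorfoo.so
+libvendorbar.so
+</pre>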
+
+<h2 id="updating-app-non-public">Updating apps to not use non-public native libraries</h2>
+
+<p>
+This feature is enabled only for applications targeting SDK version 24 or later;
+for backward compatibility, see <a
+href="http://android-developers.blogspot.com/2016/06/improving-stability-with-private-cc.html">Table
+1. What to expect if your app is linking against private native libraries</a>.
+The list of Android native libraries accessible to apps (also known as
+public native libraries) is given in CDD section 3.1.1. Apps targeting SDK
+version 24 or later and using any non-public libraries should be updated. For
+more details, see <a
+href="https://developer.android.com/preview/behavior-changes.html#ndk">NDK Apps
+Linking to Platform Libraries</a>.
+</p>
diff --git a/src/devices/tech/connect/block-numbers.jd b/src/devices/tech/connect/block-numbers.jd
new file mode 100644
index 0000000..d9c96c1
--- /dev/null
+++ b/src/devices/tech/connect/block-numbers.jd
@@ -0,0 +1,254 @@
+page.title=Implementing Block Phone Numbers
+@jd:body
+
+<!--
+ Copyright 2016 The Android Open Source Project
+
+ Licensed under the Apache License, Version 2.0 (the "License");
+ you may not use this file except in compliance with the License.
+ You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
+-->
+
+<div id="qv-wrapper">
+ <div id="qv">
+ <h2>In this document</h2>
+ <ol id="auto-toc">
+ </ol>
+ </div>
+</div>
+
+<p>
+Because telephony is an open communications channel (anyone may call or text
+any number at any time), Android users need the ability to easily block
+unwanted calls and texts.
+</p>
+
+<p>
+Before N, Android users had to rely on downloaded apps to restrict calls and
+texts from bothersome phone numbers. Many of those apps either do not work as
+desired or provide a less-than-ideal experience because there are no proper
+APIs for blocking calls and messages.
+</p>
+
+<p>
+Some manufacturers might ship their own blocking solutions out-of-the-box, but
+if users switch devices, they may lose the blocked list completely due to lack
+of interoperability. Finally, even if users are employing dialing apps and
+messaging clients that provide such functionality, they likely still have to
+perform the block action in each app for the block to take effect for both
+calling and texting.
+</p>
+
+<h2 id="features">Features</h2>
+
+<p>
+The Android N release introduces a <code>BlockedNumberProvider</code> content
+provider that stores a list of phone numbers the user has specified should not
+be able to contact them via telephony communications (calls, SMS, MMS). The
+system will respect the numbers in the blocked list by restricting calls and
+texts from those numbers. Android N displays the list of blocked numbers and
+allows the user to add and remove numbers.
+</p>
+
+<p>
+Further, the number-blocking feature enables the system and the relevant apps on
+the platform to work together to help protect the user and to simplify the
+experience. The default dialer, default messaging client, UICC-privileged app,
+and apps with the same signature as the system can all directly read from and
+write to the blocked list. Because the blocked numbers are stored on the system,
+no matter what dialing or messaging apps the user employs, the numbers stay
+blocked. Finally, the blocked numbers list may be restored on any new device,
+regardless of the manufacturer.
+</p>
+
+<p>This design provides the following benefits:</p>
+
+<ul>
+<li>Users are guaranteed a blocking feature that works out-of-the-box and will
+not lose their block list when they switch apps or get a new phone. All
+relevant apps on the system can share the same list to provide the user with
+the most streamlined experience.
+<li>App developers do not need to develop their own way to manage a block list
+and the calls and messages that come in; they can simply use the
+platform-provided feature.
+<li>Dialer/messenger apps selected as the default by the user can read and
+write to the provider. Other apps can launch the block list management user
+interface by using <code>createManageBlockedNumbersIntent()</code>.
+<li>OEMs can use the platform-provided feature to ship a blocking feature
+out-of-the-box, assured that when users switch from another OEM’s device they
+have a better onboarding experience because the block list is transferred as
+well.
+<li>If a carrier has its own dialer or messenger app, it can reuse the platform
+feature to let users maintain a block list, assured that the block list stays
+with users even when they get a new device. Finally, all carrier-privileged
+apps can read the block list, so if the carrier wants to provide additional,
+more powerful blocking for the user, that is now possible with this
+feature.</li></ul>
+
+<h2 id="data-flow">Data flow</h2>
+
+<img src="images/block-numbers-flow.png" alt="block numbers data flow" width="642" id="block-numbers-flow" />
+<p class="img-caption">
+ <strong>Figure 1.</strong> Block phone numbers data flow
+</p>
+
+<h2 id="examples-and-source">Examples and source</h2>
+
+<p>
+Here are example calls using the new number-blocking feature:
+</p>
+
+<h3 id="launch-from-app">Launch blocked number manager from app</h3>
+
+<pre>
+Context.startActivity(telecomManager.createManageBlockedNumbersIntent(), null);
+</pre>
+
+<h3 id="query-blocked-numbers">Query blocked numbers</h3>
+
+<pre>
+Cursor c = getContentResolver().query(BlockedNumbers.CONTENT_URI,
+ new String[]{BlockedNumbers.COLUMN_ID,
+ BlockedNumbers.COLUMN_ORIGINAL_NUMBER,
+ BlockedNumbers.COLUMN_E164_NUMBER}, null, null, null);
+</pre>
+
+<h3 id="put-blocked-number">Put blocked number</h3>
+
+<pre>
+ContentValues values = new ContentValues();
+values.put(BlockedNumbers.COLUMN_ORIGINAL_NUMBER, "1234567890");
+Uri uri = getContentResolver().insert(BlockedNumbers.CONTENT_URI, values);
+</pre>
+
+<h3 id="delete-blocked-number">Delete blocked number</h3>
+
+<pre>
+ContentValues values = new ContentValues();
+values.put(BlockedNumbers.COLUMN_ORIGINAL_NUMBER, "1234567890");
+Uri uri = getContentResolver().insert(BlockedNumbers.CONTENT_URI, values);
+getContentResolver().delete(uri, null, null);
+</pre>
+
+<h2 id="implementation">Implementation</h2>
+
+<p>
+These are the high-level tasks that must be completed to put the number-blocking
+feature to use:
+</p>
+
+<ul>
+<li>OEMs implement call/message-restriction features on their devices by using
+<code>BlockedNumberProvider</code>.
+<li>If a carrier has a dialer or messenger application, it implements
+call/message-restriction features by using <code>BlockedNumberProvider</code>.
+<li>Third-party dialer and messenger app vendors use
+<code>BlockedNumberProvider</code> for their blocking features.</li>
+</ul>
+
+<h3 id="recommendations-for-oems">Recommendations for OEMs</h3>
+
+<p>
+If the device has never shipped with additional call/message restriction
+features, use the number-blocking feature in the Android Open Source
+Project (AOSP) on all such devices. It is recommended that reasonable entry
+points for blocking be supported, such as blocking a number directly from the
+call log or within a message thread.
+</p>
+
+<p>
+If the device has previously shipped with call/message restriction features,
+adapt the features so all <em>strict-match phone numbers</em> that are blocked
+are stored in the <code>BlockedNumberProvider</code> and the behavior
+around the provider satisfies the requirements for this feature outlined in the
+Android Compatibility Definition Document (CDD).
+</p>
+
+<p>
+Any other advanced feature can be implemented via custom providers and custom
+UI/controls, as long as the CDD requirements are satisfied with regard to
+blocking strict-match phone numbers. It is recommended that those other
+features be labeled as “advanced” features to avoid confusion with the basic
+number-blocking feature.
+</p>
+
+<h3 id="apis">APIs</h3>
+
+<p>
+Here are the APIs in use:
+</p>
+<ul>
+<li><a
+href="http://developer.android.com/reference/android/telecom/TelecomManager.html"><code>TelecomManager</code></a>
+ <ul>
+ <li><code>Intent createManageBlockedNumbersIntent()</code></li>
+ </ul>
+</li>
+<li><a
+href="http://developer.android.com/reference/android/telephony/CarrierConfigManager.html"><code>CarrierConfigManager</code></a>
+ <ul>
+ <li><code>KEY_DURATION_BLOCKING_DISABLED_AFTER_EMERGENCY_INT</code></li>
+ </ul>
+</li>
+<li><a
+href="https://developer.android.com/reference/android/provider/BlockedNumberContract.html"><code>BlockedNumberContract</code></a>
+ <ul>
+ <li><code>boolean isBlocked(Context context, String phoneNumber)</code></li>
+ <li><code>int unblock(Context context, String phoneNumber)</code></li>
+ <li><code>boolean canCurrentUserBlockNumbers(Context context)</code></li>
+ </ul>
+</li>
+</ul>
+
+<h3 id="user-interface">User interface</h3>
+<p>
+The <code>BlockedNumbersActivity.java</code> user interface provided in AOSP
+can be used as is. Partners may also implement their own version of the UI, as
+long as it satisfies related CDD requirements.
+</p>
+
+<p>
+Note that the partner’s PC application for backup and restore may need to use
+<code>BlockedNumberProvider</code> to implement restoration of the block list.
+See the images below for the blocked numbers interface supplied in AOSP.
+</p>
+
+<img src="images/block-numbers-ui.png" alt="block numbers user interface" width="665" id="block-numbers-ui" />
+<p class="img-caption">
+ <strong>Figure 2.</strong> Block phone numbers user interface
+</p>
+
+<h2 id="validation">Validation</h2>
+
+<p>
+Implementers can ensure their version of the feature works as intended by
+running the following CTS tests:
+</p>
+
+<pre>
+android.provider.cts.BlockedNumberContractTest
+com.android.cts.numberblocking.hostside.NumberBlockingTest
+android.telecom.cts.ExtendedInCallServiceTest#testIncomingCallFromBlockedNumber_IsRejected
+android.telephony.cts.SmsManagerTest#testSmsBlocking
+</pre>
+
+<p>
+The <code>BlockedNumberProvider</code> can be manipulated using <code>adb</code> commands
+after running <code>$ adb root</code>. For example:
+</p>
+<pre>
+$ adb root
+$ adb shell content query --uri content://com.android.blockednumber/blocked
+$ adb shell content insert --uri content://com.android.blockednumber/blocked --bind original_number:s:'6501002000'
+$ adb shell content delete --uri content://com.android.blockednumber/blocked/1
+</pre>
diff --git a/src/devices/tech/connect/felica.jd b/src/devices/tech/connect/felica.jd
new file mode 100644
index 0000000..d44a6a1
--- /dev/null
+++ b/src/devices/tech/connect/felica.jd
@@ -0,0 +1,63 @@
+page.title=Host Card Emulation of FeliCa
+@jd:body
+
+<!--
+ Copyright 2016 The Android Open Source Project
+
+ Licensed under the Apache License, Version 2.0 (the "License");
+ you may not use this file except in compliance with the License.
+ You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
+-->
+<div id="qv-wrapper">
+ <div id="qv">
+ <h2>In this document</h2>
+ <ol id="auto-toc">
+ </ol>
+ </div>
+</div>
+
+<p>Felicity Card, or FeliCa, an RFID smart card system, is the NFC standard in
+Japan, Hong Kong, and other markets in the Asia-Pacific (APAC) region. Its
+adoption has been expanding in that region, and it is widely used in transit,
+retail, and loyalty services. Adding support for FeliCa in Android devices
+destined for that region improves their usefulness.</p>
+
+<h2 id="implementation">Implementation</h2>
+
+<p>HCE FeliCa requires NFC hardware that supports the NFC-F (JIS X 6319-4)
+standard.</p>
+
+<p>Host Card Emulation (HCE) of FeliCa is essentially a parallel implementation to
+the existing HCE implementation on Android; it creates new classes for FeliCa
+where it makes sense and merges with the existing HCE implementation where
+possible.</p>
+
+<p>The following Android components are included in the Android Open Source Project
+(AOSP):</p>
+
+<ul>
+ <li>Framework classes
+ <ul>
+ <li>Public HostNfcFService (convenience service class)</li>
+ <li>@hide NfcFServiceInfo</li>
+ </ul>
+ </li>
+ <li>Modifications to the core NFC framework</li>
+</ul>
+
+<p>As with most Android platform features, manufacturers write the drivers to
+make the hardware work with the API.</p>
+
+<h2 id="validation">Validation</h2>
+
+<p>Use the <a href="{@docRoot}compatibility/cts/index.html">Android Compatibility
+Test Suite</a> to ensure this feature works as intended. CTS Verifier
+(NfcTestActivity) tests this implementation for devices reporting the
+<code>android.hardware.nfc.hcef</code> feature constant.</p>
diff --git a/src/devices/tech/connect/images/block-numbers-flow.png b/src/devices/tech/connect/images/block-numbers-flow.png
new file mode 100644
index 0000000..a5eb265
--- /dev/null
+++ b/src/devices/tech/connect/images/block-numbers-flow.png
Binary files differ
diff --git a/src/devices/tech/connect/images/block-numbers-ui.png b/src/devices/tech/connect/images/block-numbers-ui.png
new file mode 100644
index 0000000..093d299
--- /dev/null
+++ b/src/devices/tech/connect/images/block-numbers-ui.png
Binary files differ
diff --git a/src/devices/tech/connect/images/host_card.png b/src/devices/tech/connect/images/host_card.png
new file mode 100755
index 0000000..315c5f5
--- /dev/null
+++ b/src/devices/tech/connect/images/host_card.png
Binary files differ
diff --git a/src/devices/tech/connect/index.jd b/src/devices/tech/connect/index.jd
new file mode 100644
index 0000000..7e9fbb1
--- /dev/null
+++ b/src/devices/tech/connect/index.jd
@@ -0,0 +1,21 @@
+page.title=Ensuring Network Connectivity
+@jd:body
+
+<!--
+ Copyright 2016 The Android Open Source Project
+
+ Licensed under the Apache License, Version 2.0 (the "License");
+ you may not use this file except in compliance with the License.
+ You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
+-->
+
+<p>Follow the instructions in this section to ensure your Android devices are
+connected properly.</p>
diff --git a/src/devices/tech/dalvik/images/jit-arch.png b/src/devices/tech/dalvik/images/jit-arch.png
new file mode 100644
index 0000000..de6177b
--- /dev/null
+++ b/src/devices/tech/dalvik/images/jit-arch.png
Binary files differ
diff --git a/src/devices/tech/dalvik/images/jit-daemon.png b/src/devices/tech/dalvik/images/jit-daemon.png
new file mode 100644
index 0000000..60098b9
--- /dev/null
+++ b/src/devices/tech/dalvik/images/jit-daemon.png
Binary files differ
diff --git a/src/devices/tech/dalvik/images/jit-profile-comp.png b/src/devices/tech/dalvik/images/jit-profile-comp.png
new file mode 100644
index 0000000..0001bdc
--- /dev/null
+++ b/src/devices/tech/dalvik/images/jit-profile-comp.png
Binary files differ
diff --git a/src/devices/tech/dalvik/images/jit-workflow.png b/src/devices/tech/dalvik/images/jit-workflow.png
new file mode 100644
index 0000000..57365eb
--- /dev/null
+++ b/src/devices/tech/dalvik/images/jit-workflow.png
Binary files differ
diff --git a/src/devices/tech/dalvik/jit-compiler.jd b/src/devices/tech/dalvik/jit-compiler.jd
new file mode 100644
index 0000000..00f26e4
--- /dev/null
+++ b/src/devices/tech/dalvik/jit-compiler.jd
@@ -0,0 +1,267 @@
+page.title=Implementing ART Just-In-Time (JIT) Compiler
+@jd:body
+
+<!--
+ Copyright 2016 The Android Open Source Project
+
+ Licensed under the Apache License, Version 2.0 (the "License");
+ you may not use this file except in compliance with the License.
+ You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
+-->
+
+
+<div id="qv-wrapper">
+<div id="qv">
+ <h2 id="Contents">In this document</h2>
+ <ol id="auto-toc">
+ </ol>
+</div>
+</div>
+
+<p>
+Android N adds a just-in-time (JIT) compiler with code profiling to Android
+runtime (ART) that constantly improves the performance of Android apps as they
+run. The JIT compiler complements ART's current ahead-of-time (AOT) compiler and
+improves runtime performance, saves storage space, and speeds app updates and
+system updates.
+</p>
+
+<p>
+The JIT compiler also improves upon the AOT compiler by avoiding system slowdown
+during automatic application updates or recompilation of applications during
+OTAs. This feature should require minimal device integration on the part of
+manufacturers.
+</p>
+
+<p>
+JIT and AOT use the same compiler with an almost identical set of optimizations.
+The generated code might not be the same: JIT makes use of runtime type
+information and can do better inlining, and JIT sometimes performs on-stack
+replacement (OSR) compilation, which again generates slightly different
+code.
+</p>
+
+<p>
+See <a
+href="https://developer.android.com/preview/api-overview.html#jit_aot">Profile-guided
+JIT/AOT Compilation</a> on developer.android.com for a more thorough overview.
+</p>
+
+<h2 id="architectural-overview">Architectural Overview</h2>
+
+<img src="images/jit-arch.png" alt="JIT architecture" width="633" id="JIT-architecture" />
+<p class="img-caption">
+ <strong>Figure 1.</strong> JIT architecture - how it works
+</p>
+
+<h2 id="flow">Flow</h2>
+
+<p>
+JIT compilation works in this manner:
+</p>
+
+<ol>
+<li>The user runs the app, which then triggers ART to load the .dex file.
+<li>If the .oat file (the AOT binary for the .dex file) is available, ART uses
+it directly. Note that .oat files are generated regularly; however, that does
+not imply they contain compiled code (an AOT binary).
+<li>If no .oat file is available, ART runs through either JIT or an interpreter
+to execute the .dex file. ART always uses the .oat files if available.
+Otherwise, it uses the APK and extracts it in memory to get to the .dex file,
+incurring a large memory overhead (equal to the size of the dex files).
+<li>JIT is enabled for any application that is not compiled according to the
+"speed" compilation filter (which says: compile as much of the app as
+possible).
+<li>The JIT profile data is dumped to a file in a system directory. Only the
+application has access to the directory.
+<li>The AOT compilation (dex2oat) daemon parses that file to drive its
+compilation.</li>
+</ol>
+
+<img src="images/jit-profile-comp.png" alt="Profile-guided comp" width="452" id="JIT-profile-comp" />
+<p class="img-caption">
+ <strong>Figure 2.</strong> Profile-guided compilation
+</p>
+
+<img src="images/jit-daemon.png" alt="JIT daemon" width="718" id="JIT-daemon" />
+<p class="img-caption">
+ <strong>Figure 3.</strong> How the daemon works
+</p>
+
+<p>
+Google Play services is an example of an app used by other apps. Such
+applications tend to behave more like shared libraries.
+</p>
+
+<h2 id="jit-workflow">JIT Workflow</h2>
+<p>
+The following diagram shows a high-level overview of how JIT works.
+</p>
+
+<img src="images/jit-workflow.png" alt="JIT architecture" width="707" id="JIT-workflow" />
+<p class="img-caption">
+ <strong>Figure 4.</strong> JIT data flow
+</p>
+
+<p>
+This means:
+</p>
+
+<ul>
+<li>Profiling information is stored in the code cache and subjected to garbage
+collection under memory pressure.
+<li>As a result, there’s no guarantee the snapshot taken when the application is
+in the background will contain the complete data (i.e. everything that was
+JITed).
+<li>There is no attempt to make sure we record everything, as that would impact
+runtime performance.
+<li>Methods can be in three different states:
+ <ul>
+ <li>interpreted (dex code)</li>
+ <li>JIT compiled</li>
+ <li>AOT compiled</li>
+ </ul>
+</li>
+<li>If both JIT and AOT code exist (e.g. due to repeated de-optimizations), the
+JITed code is preferred.
+<li>The memory requirement to run JIT without impacting foreground app
+performance depends upon the app in question. Large apps require more memory
+than small apps; in general, big apps stabilize around 4 MB.</li>
+</ul>
+
+<h2 id="system-properties">System Properties</h2>
+
+<p>
+These system properties control JIT behavior:
+</p>
+<ul>
+<li><code>dalvik.vm.usejit <true|false></code> - Whether or not the JIT is
+enabled.
+<li><code>dalvik.vm.jitinitialsize</code> (default 64K) - The initial capacity
+of the code cache. The code cache will regularly GC and increase if needed. You
+can view the size of the code cache for your app with:<br>
+<code>$ adb shell dumpsys meminfo -d <pid></code>
+<li><code>dalvik.vm.jitmaxsize</code> (default 64M) - The maximum capacity of
+the code cache.
+<li><code>dalvik.vm.jitthreshold <integer></code> (default 10000) - The
+threshold that the "hotness" counter of a method needs to pass in order for the
+method to be JIT compiled. The "hotness" counter is a metric internal to the
+runtime; it includes the number of calls, backward branches, and other factors.
+<li><code>dalvik.vm.usejitprofiles <true|false></code> - Whether or not JIT
+profiles are enabled; this may be used even if usejit is false.
+<li><code>dalvik.vm.jitprithreadweight <integer></code> (default
+<code>dalvik.vm.jitthreshold</code> / 20) - The weight of the JIT "samples"
+(see jitthreshold) for the application UI thread. Use to speed up compilation
+of methods that directly affect the user's experience when interacting with
+the app.
+<li><code>dalvik.vm.jittransitionweight <integer></code> (default
+<code>dalvik.vm.jitthreshold</code> / 10) - The weight of the method
+invocation that transitions between compiled code and the interpreter. This
+helps make sure the methods involved are compiled to minimize transitions
+(which are expensive).
+</li>
+</ul>
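+
+<p>For example, to experiment with a lower JIT threshold on a rooted device
+(the value shown is arbitrary):</p>
+
+<pre>
+$ adb root
+$ adb shell stop
+$ adb shell setprop dalvik.vm.jitthreshold 5000
+$ adb shell start
+</pre>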
+
+<h2 id="tuning">Tuning</h2>
+
+<p>
+Partners may precompile (some of) the system apps if they want. Initial JIT
+performance versus precompiled performance depends on the app, but in general
+they are quite close. Note that precompiled apps are not profiled and as such
+take more space and may miss out on other optimizations.
+</p>
+
+<p>
+In Android N, there's a generic way to specify the level of
+compilation/verification based on the different use cases. For example, the
+default option for install time is to do only verification (and postpone
+compilation to a later stage). The compilation levels can be configured via
+system properties with the defaults being:
+</p>
+
+<pre>
+pm.dexopt.install=interpret-only
+pm.dexopt.bg-dexopt=speed-profile
+pm.dexopt.ab-ota=speed-profile
+pm.dexopt.nsys-library=speed
+pm.dexopt.shared-apk=speed
+pm.dexopt.forced-dexopt=speed
+pm.dexopt.core-app=speed
+pm.dexopt.first-boot=interpret-only
+pm.dexopt.boot=verify-profile
+</pre>
+
+<p>
+Note the reference to A/B over-the-air (OTA) updates here.
+</p>
+
+<p>
+Check <code>$ adb shell cmd package compile</code> for usage. Note all commands
+are preceded by a dollar ($) sign that should be excluded when copying and
+pasting. A few common use cases:
+</p>
+
+<h3 id="turn-on-jit-logging">Turn on JIT logging</h3>
+
+<pre>
+$ adb root
+$ adb shell stop
+$ adb shell setprop dalvik.vm.extra-opts -verbose:jit
+$ adb shell start
+</pre>
+
+<h3 id="disable-jit-and-run-applications-in-interpreter">Disable JIT</h3>
+
+<pre>
+$ adb root
+$ adb shell stop
+$ adb shell setprop dalvik.vm.usejit false
+$ adb shell start
+</pre>
+
+<h3 id="force-compilation-of-a-specific-package">Force compilation of a specific
+package</h3>
+
+<ul>
+<li>Profile-based:
+<code>$ adb shell cmd package compile -m speed-profile -f
+my-package</code>
+<li>Full:
+<code>$ adb shell cmd package compile -m speed -f
+my-package</code></li>
+</ul>
+
+<h3 id="force-compilation-of-all-packages">Force compilation of all
+packages</h3>
+
+<ul>
+<li>Profile-based:
+<code>$ adb shell cmd package compile -m speed-profile -f
+-a</code>
+<li>Full:
+<code>$ adb shell cmd package compile -m speed -f -a</code></li></ul>
+
+<h3 id="clear-profile-data-and-remove-compiled-code">Clear profile data and
+remove compiled code</h3>
+
+<ul>
+<li>One package:
+<code>$ adb shell cmd package compile --reset my-package</code>
+<li>All packages:
+<code>$ adb shell cmd package compile --reset
+-a</code></li>
+</ul>
+
+<h2 id="validation">Validation</h2>
+
+<p>
+To ensure their version of the feature works as intended, partners should run
+the ART tests in <code>android/art/test</code>. Also, see the CTS test
+<code>hostsidetests/compilation</code> for userdebug builds.
+</p>