am 55f9dea8: Merge "Docs: Updating the encryption page for L" into lmp-dev

* commit '55f9dea8336d82714d6b4f88498e37e48736c38b':
  Docs: Updating the encryption page for L
diff --git a/src/compatibility/4.4/versions.jd b/src/compatibility/4.4/versions.jd
index d2118df..75da784 100644
--- a/src/compatibility/4.4/versions.jd
+++ b/src/compatibility/4.4/versions.jd
@@ -16,4 +16,7 @@
 <ul>
 <li>4.4</li>
 <li>4.4.1</li>
+<li>4.4.2</li>
+<li>4.4.3</li>
+<li>4.4.4</li>
 </ul>
diff --git a/src/compatibility/android-cts-manual.pdf b/src/compatibility/android-cts-manual.pdf
index c996b71..ac6a125 100644
--- a/src/compatibility/android-cts-manual.pdf
+++ b/src/compatibility/android-cts-manual.pdf
Binary files differ
diff --git a/src/devices/audio_avoiding_pi.jd b/src/devices/audio_avoiding_pi.jd
index a8cd208..49b901e 100644
--- a/src/devices/audio_avoiding_pi.jd
+++ b/src/devices/audio_avoiding_pi.jd
@@ -42,14 +42,14 @@
 avoiding artifacts due to underruns.
 </p>
 
-<h2 id="priorityInversion">Priority Inversion</h2>
+<h2 id="priorityInversion">Priority inversion</h2>
 
 <p>
 <a href="http://en.wikipedia.org/wiki/Priority_inversion">Priority inversion</a>
 is a classic failure mode of real-time systems,
 where a higher-priority task is blocked for an unbounded time waiting
-for a lower-priority task to release a resource such as [shared
-state protected by] a
+for a lower-priority task to release a resource such as (shared
+state protected by) a
 <a href="http://en.wikipedia.org/wiki/Mutual_exclusion">mutex</a>.
 </p>
 
@@ -64,7 +64,7 @@
 
 <p>
 In the Android audio implementation, priority inversion is most
-likely to occur in these places. And so we focus attention here:
+likely to occur in these places, so you should focus your attention here:
 </p>
 
 <ul>
@@ -99,10 +99,10 @@
 similar to those for AudioTrack.
 </p>
 
-<h2 id="commonSolutions">Common Solutions</h2>
+<h2 id="commonSolutions">Common solutions</h2>
 
 <p>
-The typical solutions listed in the Wikipedia article include:
+The typical solutions include:
 </p>
 
 <ul>
@@ -130,18 +130,17 @@
 in Linux kernel, but are not currently exposed by the Android C
 runtime library
 <a href="http://en.wikipedia.org/wiki/Bionic_(software)">Bionic</a>.
-We chose not to use them in the audio system
-because they are relatively heavyweight, and because they rely on
-a trusted client.
+They are not used in the audio system because they are relatively heavyweight,
+and because they rely on a trusted client.
 </p>
 
 <h2 id="androidTechniques">Techniques used by Android</h2>
 
 <p>
-We started with "try lock" and lock with timeout. These are
+Experiments started with "try lock" and lock with timeout. These are
 non-blocking and bounded blocking variants of the mutex lock
-operation. Try lock and lock with timeout worked fairly well for
-us, but were susceptible to a couple of obscure failure modes: the
+operation. Try lock and lock with timeout worked fairly well but were
+susceptible to a couple of obscure failure modes: the
 server was not guaranteed to be able to access the shared state if
 the client happened to be busy, and the cumulative timeout could
 be too long if there was a long sequence of unrelated locks that
@@ -167,10 +166,9 @@
 In practice, we've found that the retries are not a problem.
 </p>
 
-<p>
-<strong>Note</strong>: Atomic operations and their interactions with memory barriers
-are notoriously badly misunderstood and used incorrectly. We include
-these methods here for completeness but recommend you also read the article
+<p class="note"><strong>Note:</strong> Atomic operations and their interactions with memory barriers
+are notoriously badly misunderstood and used incorrectly. These methods are
+included here for completeness, but we recommend you also read the article
 <a href="https://developer.android.com/training/articles/smp.html">
 SMP Primer for Android</a>
 for further information.
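+
+<p>For illustration only, here is a minimal sketch (not from the platform sources) of
+the "try lock" and lock-with-timeout variants described earlier, using the POSIX
+calls available in Bionic:</p>
+
+<pre>
+#include &lt;pthread.h&gt;
+#include &lt;time.h&gt;
+
+pthread_mutex_t gStateLock = PTHREAD_MUTEX_INITIALIZER;
+
+// "Try lock": never blocks; the caller must cope with not getting the lock.
+bool tryUpdateSharedState() {
+    if (pthread_mutex_trylock(&amp;gStateLock) != 0) {
+        return false;   // lock is busy; skip this cycle rather than block
+    }
+    // touch the shared state briefly here
+    pthread_mutex_unlock(&amp;gStateLock);
+    return true;
+}
+
+// Lock with timeout: blocks, but only for a bounded time.
+bool updateSharedStateWithTimeout() {
+    struct timespec deadline;
+    clock_gettime(CLOCK_REALTIME, &amp;deadline);
+    deadline.tv_nsec += 3000000;                // allow at most ~3 ms
+    if (deadline.tv_nsec >= 1000000000) {       // normalize the timespec
+        deadline.tv_nsec -= 1000000000;
+        deadline.tv_sec += 1;
+    }
+    if (pthread_mutex_timedlock(&amp;gStateLock, &amp;deadline) != 0) {
+        return false;   // typically ETIMEDOUT: give up instead of waiting forever
+    }
+    // touch the shared state briefly here
+    pthread_mutex_unlock(&amp;gStateLock);
+    return true;
+}
+</pre>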
@@ -234,7 +232,7 @@
 
 </ul>
 
-<h2 id="nonBlockingAlgorithms">Non-Blocking Algorithms</h2>
+<h2 id="nonBlockingAlgorithms">Non-blocking algorithms</h2>
 
 <p>
 <a href="http://en.wikipedia.org/wiki/Non-blocking_algorithm">Non-blocking algorithms</a>
@@ -273,9 +271,8 @@
 </p>
 
 <p>
-For developers, we may update some of the sample OpenSL ES application
-code to use non-blocking algorithms or reference a non-Android open source
-library.
+For developers, some of the sample OpenSL ES application code may be updated to
+use non-blocking algorithms or reference a non-Android open source library.
 </p>
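+
+<p>To make the idea concrete, here is a minimal sketch (not the platform's
+implementation) of a single-writer, single-reader non-blocking FIFO built on
+C++11 atomics with acquire/release ordering:</p>
+
+<pre>
+#include &lt;atomic&gt;
+#include &lt;cstddef&gt;
+
+// Sketch only: one writer thread, one reader thread, capacity a power of two.
+// One slot is kept empty so that "full" and "empty" can be distinguished.
+template &lt;typename T, size_t N&gt;
+class SpscFifo {
+public:
+    bool push(const T&amp; item) {       // writer thread only; never blocks
+        size_t head = mHead.load(std::memory_order_relaxed);
+        size_t next = (head + 1) &amp; (N - 1);
+        if (next == mTail.load(std::memory_order_acquire)) {
+            return false;             // full: the caller decides what to drop
+        }
+        mBuffer[head] = item;
+        mHead.store(next, std::memory_order_release);
+        return true;
+    }
+    bool pop(T* item) {               // reader thread only; never blocks
+        size_t tail = mTail.load(std::memory_order_relaxed);
+        if (tail == mHead.load(std::memory_order_acquire)) {
+            return false;             // empty
+        }
+        *item = mBuffer[tail];
+        mTail.store((tail + 1) &amp; (N - 1), std::memory_order_release);
+        return true;
+    }
+private:
+    static_assert((N &amp; (N - 1)) == 0, "capacity must be a power of two");
+    T mBuffer[N];
+    std::atomic&lt;size_t&gt; mHead{0};   // index of the next slot to write
+    std::atomic&lt;size_t&gt; mTail{0};   // index of the next slot to read
+};
+</pre>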
 
 <h2 id="tools">Tools</h2>
@@ -297,7 +294,7 @@
 not tell you in advance.
 </p>
 
-<h2 id="aFinalWord">A Final Word</h2>
+<h2 id="aFinalWord">A final word</h2>
 
 <p>
 After all of this discussion, don't be afraid of mutexes. Mutexes
diff --git a/src/devices/audio_debugging.jd b/src/devices/audio_debugging.jd
index 31d61d5..7ac3a53 100644
--- a/src/devices/audio_debugging.jd
+++ b/src/devices/audio_debugging.jd
@@ -39,12 +39,12 @@
 
 <ol>
 <li><code>cd frameworks/av/services/audioflinger</code></li>
-<li>edit <code>Configuration.h</code></li>
-<li>uncomment <code>#define TEE_SINK</code></li>
-<li>re-build <code>libaudioflinger.so</code></li>
+<li>Edit <code>Configuration.h</code>.</li>
+<li>Uncomment <code>#define TEE_SINK</code> (see the sketch after this list).</li>
+<li>Re-build <code>libaudioflinger.so</code>.</li>
 <li><code>adb root</code></li>
 <li><code>adb remount</code></li>
-<li>push or sync the new <code>libaudioflinger.so</code> to the device's <code>/system/lib</code></li>
+<li>Push or sync the new <code>libaudioflinger.so</code> to the device's <code>/system/lib</code>.</li>
 </ol>
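+
+<p>For reference, the compile-time change in steps 2 and 3 is simply to uncomment an
+existing line (sketch only; the surrounding contents of <code>Configuration.h</code>
+are omitted):</p>
+
+<pre>
+// frameworks/av/services/audioflinger/Configuration.h
+// Uncommenting this line enables the tee sink debugging feature.
+#define TEE_SINK
+</pre>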
 
 <h3>Run-time setup</h3>
@@ -72,7 +72,7 @@
 </code>
 </li>
 <li><code>echo af.tee=# &gt; /data/local.prop</code>
-<br />where the <code>af.tee</code> value is a number described below
+<br />The <code>af.tee</code> value is a number described below.
 </li>
 <li><code>chmod 644 /data/local.prop</code></li>
 <li><code>reboot</code></li>
@@ -100,17 +100,17 @@
 <h3>Test and acquire data</h3>
 
 <ol>
-<li>Run your audio test</li>
+<li>Run your audio test.</li>
 <li><code>adb shell dumpsys media.audio_flinger</code></li>
 <li>Look for a line in dumpsys output such as this:<br />
 <code>tee copied to /data/misc/media/20131010101147_2.wav</code>
-<br />This is a PCM .wav file</br>
+<br />This is a PCM .wav file.
 </li>
 <li><code>adb pull</code> any <code>/data/misc/media/*.wav</code> files of interest;
 note that track-specific dump filenames do not appear in the dumpsys output,
-but are still saved to <code>/data/misc/media</code> upon track closure
+but are still saved to <code>/data/misc/media</code> upon track closure.
 </li>
-<li>Review the dump files for privacy concerns before sharing with others</li>
+<li>Review the dump files for privacy concerns before sharing with others.</li>
 </ol>
 
 <h4>Suggestions</h4>
@@ -118,15 +118,15 @@
 <p>Try these ideas for more useful results:</p>
 
 <ul>
-<li>Disable touch sounds and key clicks</li>
-<li>Maximize all volumes</li>
+<li>Disable touch sounds and key clicks.</li>
+<li>Maximize all volumes.</li>
 <li>Disable apps that make sound or record from microphone,
-if they are not of interest to your test
+if they are not of interest to your test.
 </li>
 <li>Track-specific dumps are only saved when the track is closed;
 you may need to force close an app in order to dump its track-specific data
 <li>Do the <code>dumpsys</code> immediately after test;
-there is a limited amount of recording space available</li>
+there is a limited amount of recording space available.</li>
 <li>To make sure you don't lose your dump files,
 upload them to your host periodically.
 Only a limited number of dump files are preserved;
@@ -140,10 +140,10 @@
 Restore your build and device as follows:
 </p>
 <ol>
-<li>Revert the source code changes to <code>Configuration.h</code></li>
-<li>Re-build <code>libaudioflinger.so</code></li>
+<li>Revert the source code changes to <code>Configuration.h</code>.</li>
+<li>Re-build <code>libaudioflinger.so</code>.</li>
 <li>Push or sync the restored <code>libaudioflinger.so</code>
-to the device's <code>/system/lib</code>
+to the device's <code>/system/lib</code>.
 </li>
 <li><code>adb shell</code></li>
 <li><code>rm /data/local.prop</code></li>
@@ -157,7 +157,7 @@
 
 <p>
 The standard Java language logging API in Android SDK is
-<a class="external-link" href="http://developer.android.com/reference/android/util/Log.html" target="_android">android.util.Log</a>.
+<a href="http://developer.android.com/reference/android/util/Log.html">android.util.Log</a>.
 </p>
 
 <p>
@@ -228,15 +228,14 @@
 <h3>Benefits</h3>
 
 <p>
-The benefits of the <code>media.log</code> system include:
+The benefits of the <code>media.log</code> system are that it:
 </p>
 <ul>
-<li>doesn't spam the main log unless and until it is needed</li>
-<li>can be examined even when <code>mediaserver</code> crashes or hangs</li>
-<li>is non-blocking per timeline</li>
-<li>
-less disturbance to performance
-(of course no form of logging is completely non-intrusive)
+<li>Doesn't spam the main log unless and until it is needed.</li>
+<li>Can be examined even when <code>mediaserver</code> crashes or hangs.</li>
+<li>Is non-blocking per timeline.</li>
+<li>Offers less disturbance to performance.
+(Of course no form of logging is completely non-intrusive.)
 </li>
 </ul>
 
@@ -251,9 +250,9 @@
 Notable points:
 </p>
 <ul>
-<li><code>init</code> forks and execs <code>mediaserver</code></li>
-<li><code>init</code> detects the death of <code>mediaserver</code>, and re-forks as necessary</li>
-<li><code>ALOGx</code> logging is not shown
+<li><code>init</code> forks and execs <code>mediaserver</code>.</li>
+<li><code>init</code> detects the death of <code>mediaserver</code>, and re-forks as necessary.</li>
+<li><code>ALOGx</code> logging is not shown.</li>
 </ul>
 
 <p>
@@ -348,8 +347,7 @@
 After you have added the logs, re-build AudioFlinger.
 </p>
 
-<b>Caution:</b>
-<p>
+<p class="caution"><strong>Caution:</strong>
 A separate <code>NBLog::Writer</code> timeline is required per thread,
 to ensure thread safety, since timelines omit mutexes by design.  If you
 want more than one thread to use the same timeline, you can protect with an
diff --git a/src/devices/audio_implement.jd b/src/devices/audio_implement.jd
index 26aa5f5..5d04074 100644
--- a/src/devices/audio_implement.jd
+++ b/src/devices/audio_implement.jd
@@ -244,7 +244,7 @@
 <p>For <code>AudioSource</code> tuning, there are no explicit requirements on audio gain or audio processing
 with the exception of voice recognition (<code>VOICE_RECOGNITION</code>).</p>
 
-<p>The following are the requirements for voice recognition:</p>
+<p>The requirements for voice recognition are:</p>
 
 <ul>
 <li>"flat" frequency response (+/- 3dB) from 100Hz to 4kHz</li>
diff --git a/src/devices/audio_src.jd b/src/devices/audio_src.jd
index b57717c..ffacba6 100644
--- a/src/devices/audio_src.jd
+++ b/src/devices/audio_src.jd
@@ -9,9 +9,11 @@
   </div>
 </div>
 
+<h2 id="srcIntro">Introduction</h2>
+
 <p>
 See the Wikipedia article
-<a class="external-link" href="http://en.wikipedia.org/wiki/Resampling_(audio)" target="_android">Resampling (audio)</a>
+<a href="http://en.wikipedia.org/wiki/Resampling_(audio)">Resampling (audio)</a>
 for a generic definition of sample rate conversion, also known as "resampling."
 The remainder of this article describes resampling within Android.
 </p>
@@ -69,49 +71,6 @@
 and identifies where they should typically be used.
 </p>
 
-<h2 id="srcTerms">Terminology</h2>
-
-<dl>
-
-<dt>downsample</dt>
-<dd>to resample, where sink sample rate &lt; source sample rate</dd>
-
-<dt>Nyquist frequency</dt>
-<dd>
-The Nyquist frequency, equal to 1/2 of a given sample rate, is the
-maximum frequency component that can be represented by a discretized
-signal at that sample rate.  For example, the human hearing range is
-typically assumed to extend up to approximately 20 kHz, and so a digital
-audio signal must have a sample rate of at least 40 kHz to represent that
-range.  In practice, sample rates of 44.1 kHz and 48 kHz are commonly
-used, with Nyquist frequencies of 22.05 kHz and 24 kHz respectively.
-See the Wikipedia articles
-<a class="external-link" href="http://en.wikipedia.org/wiki/Nyquist_frequency" target="_android">Nyquist frequency</a>
-and
-<a class="external-link" href="http://en.wikipedia.org/wiki/Hearing_range" target="_android">Hearing range</a>
-for more information.
-</dd>
-
-<dt>resampler</dt>
-<dd>synonym for sample rate converter</dd>
-
-<dt>resampling</dt>
-<dd>the process of converting sample rate</dd>
-
-<dt>sample rate converter</dt>
-<dd>a module that resamples</dd>
-
-<dt>sink</dt>
-<dd>the output of a resampler</dd>
-
-<dt>source</dt>
-<dd>the input to a resampler</dd>
-
-<dt>upsample</dt>
-<dd>to resample, where sink sample rate &gt; source sample rate</dd>
-
-</dl>
-
 <h2 id="srcResamplers">Resampler implementations</h2>
 
 <p>
diff --git a/src/devices/audio_terminology.jd b/src/devices/audio_terminology.jd
index a27703b..0b876a7 100644
--- a/src/devices/audio_terminology.jd
+++ b/src/devices/audio_terminology.jd
@@ -201,24 +201,18 @@
 <dd>
 A short range wireless technology.
 The major audio-related
-<a class="external-link" href="http://en.wikipedia.org/wiki/Bluetooth_profile"
-target="_android">Bluetooth profiles</a>
+<a href="http://en.wikipedia.org/wiki/Bluetooth_profile">Bluetooth profiles</a>
 and
-<a class="external-link" href="http://en.wikipedia.org/wiki/Bluetooth_protocols"
-target="_android">Bluetooth protocols</a>
+<a href="http://en.wikipedia.org/wiki/Bluetooth_protocols">Bluetooth protocols</a>
 are described at these Wikipedia articles:
 
 <ul>
 
-<li><a class="external-link"
-href="http://en.wikipedia.org/wiki/Bluetooth_profile#Advanced_Audio_Distribution_Profile_.28A2DP.29"
-target="_android">A2DP</a>
+<li><a href="http://en.wikipedia.org/wiki/Bluetooth_profile#Advanced_Audio_Distribution_Profile_.28A2DP.29">A2DP</a>
 for music
 </li>
 
-<li><a class="external-link"
-href="http://en.wikipedia.org/wiki/Bluetooth_protocols#Synchronous_connection-oriented_.28SCO.29_link"
-target="_android">SCO</a>
+<li><a href="http://en.wikipedia.org/wiki/Bluetooth_protocols#Synchronous_connection-oriented_.28SCO.29_link">SCO</a>
 for telephony
 </li>
 
@@ -257,14 +251,13 @@
 <dt>S/PDIF</dt>
 <dd>
 Sony/Philips Digital Interface Format is an interconnect for uncompressed PCM.
-See Wikipedia article <a class="external-link" href="http://en.wikipedia.org/wiki/S/PDIF"
-target="_android">S/PDIF</a>.
+See Wikipedia article <a href="http://en.wikipedia.org/wiki/S/PDIF">S/PDIF</a>.
 </dd>
 
 <dt>USB</dt>
 <dd>
 Universal Serial Bus.
-See Wikipedia article <a class="external-link" href="http://en.wikipedia.org/wiki/USB" target="_android">USB</a>.
+See Wikipedia article <a href="http://en.wikipedia.org/wiki/USB">USB</a>.
 </dd>
 
 </dl>
@@ -279,14 +272,12 @@
 
 See these Wikipedia articles:
 <ul>
-<li><a class="external-link" href="http://en.wikipedia.org/wiki/General-purpose_input/output"
-target="_android">GPIO</a></li>
-<li><a class="external-link" href="http://en.wikipedia.org/wiki/I%C2%B2C" target="_android">I²C</a></li>
-<li><a class="external-link" href="http://en.wikipedia.org/wiki/I%C2%B2S" target="_android">I²S</a></li>
-<li><a class="external-link" href="http://en.wikipedia.org/wiki/McASP" target="_android">McASP</a></li>
-<li><a class="external-link" href="http://en.wikipedia.org/wiki/SLIMbus" target="_android">SLIMbus</a></li>
-<li><a class="external-link" href="http://en.wikipedia.org/wiki/Serial_Peripheral_Interface_Bus"
-target="_android">SPI</a></li>
+<li><a href="http://en.wikipedia.org/wiki/General-purpose_input/output">GPIO</a></li>
+<li><a href="http://en.wikipedia.org/wiki/I%C2%B2C">I²C</a></li>
+<li><a href="http://en.wikipedia.org/wiki/I%C2%B2S">I²S</a></li>
+<li><a href="http://en.wikipedia.org/wiki/McASP">McASP</a></li>
+<li><a href="http://en.wikipedia.org/wiki/SLIMbus">SLIMbus</a></li>
+<li><a href="http://en.wikipedia.org/wiki/Serial_Peripheral_Interface_Bus">SPI</a></li>
 </ul>
 
 <h3 id="signalTerms">Audio Signal Path</h3>
@@ -307,8 +298,7 @@
 be implemented that way.  An ADC is usually preceded by a low-pass filter
 to remove any high frequency components that are not representable using
 the desired sample rate.  See Wikipedia article
-<a class="external-link" href="http://en.wikipedia.org/wiki/Analog-to-digital_converter"
-target="_android">Analog-to-digital_converter</a>.
+<a href="http://en.wikipedia.org/wiki/Analog-to-digital_converter">Analog-to-digital_converter</a>.
 </dd>
 
 <dt>AP</dt>
@@ -323,7 +313,7 @@
 Strictly, the term "codec" is reserved for modules that both encode and decode,
 however it can also more loosely refer to only one of these.
 See Wikipedia article
-<a class="external-link" href="http://en.wikipedia.org/wiki/Audio_codec" target="_android">Audio codec</a>.
+<a href="http://en.wikipedia.org/wiki/Audio_codec">Audio codec</a>.
 </dd>
 
 <dt>DAC</dt>
@@ -334,8 +324,7 @@
 a low-pass filter to remove any high frequency components introduced
 by digital quantization.
 See Wikipedia article
-<a class="external-link" href="http://en.wikipedia.org/wiki/Digital-to-analog_converter"
-target="_android">Digital-to-analog converter</a>.
+<a href="http://en.wikipedia.org/wiki/Digital-to-analog_converter">Digital-to-analog converter</a>.
 </dd>
 
 <dt>DSP</dt>
@@ -353,8 +342,7 @@
 where the relative density of 1s versus 0s indicates the signal level.
 It is commonly used by digital to analog converters.
 See Wikipedia article
-<a class="external-link" href="http://en.wikipedia.org/wiki/Pulse-density_modulation"
-target="_android">Pulse-density modulation</a>.
+<a href="http://en.wikipedia.org/wiki/Pulse-density_modulation">Pulse-density modulation</a>.
 </dd>
 
 <dt>PWM</dt>
@@ -364,8 +352,7 @@
 where the relative width of a digital pulse indicates the signal level.
 It is commonly used by analog to digital converters.
 See Wikipedia article
-<a class="external-link" href="http://en.wikipedia.org/wiki/Pulse-width_modulation"
-target="_android">Pulse-width modulation</a>.
+<a href="http://en.wikipedia.org/wiki/Pulse-width_modulation">Pulse-width modulation</a>.
 </dd>
 
 </dl>
@@ -384,7 +371,7 @@
 Advanced Linux Sound Architecture.  As the name suggests, it is an audio
 framework primarily for Linux, but it has influenced other systems.
 See Wikipedia article
-<a class="external-link" href="http://en.wikipedia.org/wiki/Advanced_Linux_Sound_Architecture" target="_android">ALSA</a>
+<a href="http://en.wikipedia.org/wiki/Advanced_Linux_Sound_Architecture">ALSA</a>
 for the general definition. As used within Android, it refers primarily
 to the kernel audio framework and drivers, not to the user-mode API. See
 tinyalsa.
@@ -394,14 +381,14 @@
 <dd>
 An API and implementation framework for output (post-processing) effects
 and input (pre-processing) effects.  The API is defined at
-<a href="http://developer.android.com/reference/android/media/audiofx/AudioEffect.html" target="_android">android.media.audiofx.AudioEffect</a>.
+<a href="http://developer.android.com/reference/android/media/audiofx/AudioEffect.html">android.media.audiofx.AudioEffect</a>.
 </dd>
 
 <dt>AudioFlinger</dt>
 <dd>
 The sound server implementation for Android. AudioFlinger
 runs within the mediaserver process. See Wikipedia article
-<a class="external-link" href="http://en.wikipedia.org/wiki/Sound_server" target="_android">Sound server</a>
+<a href="http://en.wikipedia.org/wiki/Sound_server">Sound server</a>
 for the generic definition.
 </dd>
 
@@ -418,7 +405,7 @@
 The module within AudioFlinger responsible for
 combining multiple tracks and applying attenuation
 (volume) and certain effects. The Wikipedia article
-<a class="external-link" href="http://en.wikipedia.org/wiki/Audio_mixing_(recorded_music)" target="_android">Audio mixing (recorded music)</a>
+<a href="http://en.wikipedia.org/wiki/Audio_mixing_(recorded_music)">Audio mixing (recorded music)</a>
 may be useful for understanding the generic
 concept. But that article describes a mixer more as a hardware device
 or a software application, rather than a software module within a system.
@@ -437,8 +424,7 @@
 input device such as microphone.  The data is usually in pulse-code modulation
 (PCM) format.
 The API is defined at
-<a href="http://developer.android.com/reference/android/media/AudioRecord.html"
-target="_android">android.media.AudioRecord</a>.
+<a href="http://developer.android.com/reference/android/media/AudioRecord.html">android.media.AudioRecord</a>.
 </dd>
 
 <dt>AudioResampler</dt>
@@ -452,8 +438,7 @@
 The primary low-level client API for sending data to an audio output
 device such as a speaker.  The data is usually in PCM format.
 The API is defined at
-<a href="http://developer.android.com/reference/android/media/AudioTrack.html"
-target="_android">android.media.AudioTrack</a>.
+<a href="http://developer.android.com/reference/android/media/AudioTrack.html">android.media.AudioTrack</a>.
 </dd>
 
 <dt>client</dt>
@@ -534,8 +519,7 @@
 A higher-level client API than AudioTrack, used for playing sampled
 audio clips. It is useful for triggering UI feedback, game sounds, etc.
 The API is defined at
-<a href="http://developer.android.com/reference/android/media/SoundPool.html"
-target="_android">android.media.SoundPool</a>.
+<a href="http://developer.android.com/reference/android/media/SoundPool.html">android.media.SoundPool</a>.
 </dd>
 
 <dt>Stagefright</dt>
@@ -580,11 +564,9 @@
 <dd>
 A higher-level client API than AudioTrack, used for playing DTMF signals.
 See the Wikipedia article
-<a class="external-link" href="http://en.wikipedia.org/wiki/Dual-tone_multi-frequency_signaling"
-target="_android">Dual-tone multi-frequency signaling</a>,
+<a href="http://en.wikipedia.org/wiki/Dual-tone_multi-frequency_signaling">Dual-tone multi-frequency signaling</a>,
 and the API definition at
-<a href="http://developer.android.com/reference/android/media/ToneGenerator.html"
-target="_android">android.media.ToneGenerator</a>.
+<a href="http://developer.android.com/reference/android/media/ToneGenerator.html">android.media.ToneGenerator</a>.
 </dd>
 
 <dt>track</dt>
@@ -610,8 +592,44 @@
 
 <h2 id="srcTerms">Sample Rate Conversion</h2>
 
-<p>
-For terms related to sample rate conversion, see the separate article
-<a href="audio_src.html">Sample Rate Conversion</a>.
-</p>
+<dl>
+
+<dt>downsample</dt>
+<dd>To resample, where sink sample rate &lt; source sample rate.</dd>
+
+<dt>Nyquist frequency</dt>
+<dd>
+The Nyquist frequency, equal to 1/2 of a given sample rate, is the
+maximum frequency component that can be represented by a discretized
+signal at that sample rate.  For example, the human hearing range is
+typically assumed to extend up to approximately 20 kHz, and so a digital
+audio signal must have a sample rate of at least 40 kHz to represent that
+range.  In practice, sample rates of 44.1 kHz and 48 kHz are commonly
+used, with Nyquist frequencies of 22.05 kHz and 24 kHz respectively.
+See 
+<a href="http://en.wikipedia.org/wiki/Nyquist_frequency">Nyquist frequency</a>
+and
+<a href="http://en.wikipedia.org/wiki/Hearing_range">Hearing range</a>
+for more information.
+</dd>
+
+<dt>resampler</dt>
+<dd>Synonym for sample rate converter.</dd>
+
+<dt>resampling</dt>
+<dd>The process of converting sample rate.</dd>
+
+<dt>sample rate converter</dt>
+<dd>A module that resamples.</dd>
+
+<dt>sink</dt>
+<dd>The output of a resampler.</dd>
+
+<dt>source</dt>
+<dd>The input to a resampler.</dd>
+
+<dt>upsample</dt>
+<dd>To resample, where sink sample rate &gt; source sample rate.</dd>
+
+</dl>
 
diff --git a/src/devices/audio_warmup.jd b/src/devices/audio_warmup.jd
index 0a0ec04..777650b 100644
--- a/src/devices/audio_warmup.jd
+++ b/src/devices/audio_warmup.jd
@@ -24,7 +24,7 @@
   </div>
 </div>
 
-<p>Audio warmup is the time for the audio amplifier circuit in your device to
+<p>Audio warmup is the time it takes for the audio amplifier circuit in your device to
 be fully powered and reach its normal operation state. The major contributors
 to audio warmup time are power management and any "de-pop" logic to stabilize
 the circuit.
diff --git a/src/devices/devices_toc.cs b/src/devices/devices_toc.cs
index c5075d9..c46a418 100644
--- a/src/devices/devices_toc.cs
+++ b/src/devices/devices_toc.cs
@@ -31,6 +31,7 @@
         </a>
       </div>
         <ul>
+          <li><a href="<?cs var:toroot ?>devices/audio_terminology.html">Terminology</a></li>
           <li><a href="<?cs var:toroot ?>devices/audio_implement.html">Implementation</a></li>
           <li><a href="<?cs var:toroot ?>devices/audio_warmup.html">Warmup</a></li>
           <li class="nav-section">
@@ -47,7 +48,6 @@
           </li>
           <li><a href="<?cs var:toroot ?>devices/audio_avoiding_pi.html">Priority Inversion</a></li>
           <li><a href="<?cs var:toroot ?>devices/audio_src.html">Sample Rate Conversion</a></li>
-          <li><a href="<?cs var:toroot ?>devices/audio_terminology.html">Terminology</a></li>
           <li><a href="<?cs var:toroot ?>devices/audio_debugging.html">Debugging</a></li>
           <li><a href="<?cs var:toroot ?>devices/audio_usb.html">USB Digital Audio</a></li>
         </ul>
@@ -89,7 +89,8 @@
           </a>
         </div>
         <ul>
-          <li><a href="<?cs var:toroot ?>devices/graphics/architecture.html">System-Level Architecture</a></li>
+          <li><a href="<?cs var:toroot ?>devices/graphics/architecture.html">Architecture</a></li>
+          <li><a href="<?cs var:toroot ?>devices/graphics/implement.html">Implementation</a></li>
         </ul>
       </li>
       <li class="nav-section">
diff --git a/src/devices/drm.jd b/src/devices/drm.jd
index 828f41b..9a7c673 100644
--- a/src/devices/drm.jd
+++ b/src/devices/drm.jd
@@ -2,19 +2,19 @@
 @jd:body
 
 <!--
-    Copyright 2013 The Android Open Source Project     
+    Copyright 2014 The Android Open Source Project
 
-    Licensed under the Apache License, Version 2.0 (the "License");    
-    you may not use this file except in compliance with the License.   
-    You may obtain a copy of the License at    
+    Licensed under the Apache License, Version 2.0 (the "License");
+    you may not use this file except in compliance with the License.
+    You may obtain a copy of the License at
 
         http://www.apache.org/licenses/LICENSE-2.0
 
-    Unless required by applicable law or agreed to in writing, software    
-    distributed under the License is distributed on an "AS IS" BASIS,    
-    WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.   
-    See the License for the specific language governing permissions and    
-    limitations under the License.   
+    Unless required by applicable law or agreed to in writing, software
+    distributed under the License is distributed on an "AS IS" BASIS,
+    WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+    See the License for the specific language governing permissions and
+    limitations under the License.
 -->
 
 <div id="qv-wrapper">
@@ -25,154 +25,234 @@
   </div>
 </div>
 
-<p>This document introduces Widevine DRM security levels
-  and certification requirements. It explains how to integrate and distribute Widevine DRM
-  for your product. Android provides the Widevine DRM solution with a royalty-free
-  license and we recommend that you use it for
-  your protected playback solution. </p>
+<p>This document provides an overview of the Android DRM framework, and
+introduces the interfaces a DRM plug-in must implement. This document does not
+describe robustness rules or compliance rules that may be defined by a DRM
+scheme.</p>
 
-<h2 id="overview">Overview</h2>
+<h2 id="introduction">Introduction</h2>
+
+<p>The Android platform provides an extensible DRM framework that lets
+applications manage rights-protected content according to the license
+constraints associated with the content. The DRM framework supports many DRM
+schemes; which DRM schemes a device supports is up to the device manufacturer.
+The DRM framework introduced in Android 3.0 provides a unified interface for
+application developers and hides the complexity of DRM operations. The DRM
+framework provides a consistent operation mode for protected and non-protected
+content. DRM schemes can define very complex usage models through license metadata.
+The DRM framework provides the association between DRM content and license, and
+handles the rights management. This enables the media player to be abstracted
+from DRM-protected or non-protected content. See <a
+href="https://developer.android.com/reference/android/media/MediaDrm.html">MediaDrm</a>
+for the class to obtain keys for decrypting protected media streams.</p>
+
+ <img src="images/drm_hal.png" alt="Android DRM HAL" />
+
+<p class="img-caption"><strong>Figure 1.</strong> DRM Hardware Abastraction
+Layer</p>
+
 <p>
-Availability of rich digital content is important to users on mobile devices. To make their content widely available,
-Android developers and digital content publishers need a consistent DRM implementation supported across the Android
-ecosystem. In order to make that digital content available on Android devices and to ensure that there is at least
-one consistent DRM available across all devices, Google provides Widevine DRM for free on compatible Android devices.
-On Android 3.0 and higher platforms, the Widevine DRM plugin is integrated with the Android DRM framework and uses
-hardware-backed protection to secure movie content and user credentials.
+Availability of rich digital content is important to users on mobile devices. To
+make their content widely available, Android developers and digital content
+publishers need a consistent DRM implementation supported across the Android
+ecosystem. In order to make that digital content available on Android devices
+and to ensure that there is at least one consistent DRM available across all
+devices, Google provides DRM without any license fees on compatible Android devices.
+On Android 3.0 and higher platforms, the DRM plug-in is integrated with the
+Android DRM framework and can use hardware-backed protection to secure premium
+content and user credentials.
 </p>
 
 <p>
-The content protection provided by the Widevine DRM plugin depends on the security and content protection capabilities of the underlying hardware platform. The hardware capabilities of the device include hardware secure boot to establish a chain of trust of security and protection of cryptographic keys. Content protection capabilities of the device include protection of decrypted frames in the device and content output protection via a trusted output protection mechanism. Not all hardware platforms support all the above security and content protection features. Security is never implemented in a single place in the stack, but instead relies on the integration of hardware, software, and services. The combination of hardware security functions, a trusted boot mechanism, and an isolated secure OS for handling security functions is critical to provide a secure device.</p>
+The content protection provided by the DRM plug-in depends on the security and
+content protection capabilities of the underlying hardware platform. The
+hardware capabilities of the device include hardware secure boot to establish a
+chain of trust of security and protection of cryptographic keys. Content
+protection capabilities of the device include protection of decrypted frames in
+the device and content protection via a trusted output protection mechanism. Not
+all hardware platforms support all of the above security and content protection
+features. Security is never implemented in a single place in the
+stack, but instead relies on the integration of hardware, software, and
+services. The combination of hardware security functions, a trusted boot
+mechanism, and an isolated secure OS for handling security functions is critical
+to providing a secure device.</p>
 
 
-<h3 id="framework">Android DRM Framework</h3>
-<p>Android 3.0 and higher platforms provide an extensible DRM framework that lets applications manage protected content using a
-    choice of DRM mechanisms. For application developers, the framework offers an
-    abstract, unified API that simplifies the management of protected content.
-    The API hides the complexity of DRM operations and allows a consistent operation mode for both protected and unprotected
-    content across a variety of DRM schemes. For device manufacturers, content owners, and Internet digital media providers
-    the DRM framework plugin API provides a means of adding support for a DRM scheme of choice into the Android system, for
-    secure enforcement of content protection.
+<h2 id="architecture">Architecture</h2>
+<p>The DRM framework is designed to be implementation agnostic and
+abstracts the details of the specific DRM scheme implementation in a
+scheme-specific DRM plug-in. The DRM framework includes simple APIs to handle
+complex DRM operations, register users and devices to online DRM services,
+extract constraint information from the license, associate DRM content and its
+license, and finally decrypt DRM content.</p>
 
-    <p><strong>Note:</strong> We recommend that you integrate the Widevine
-    solution as it is already implemented and ready for you to use. </p>
-</p>
-
-<h3 id="plugin">Widevine DRM Plugin</h3>
-
-<p>
-Built on top of the Android DRM framework, the Widevine DRM plugin offers DRM and advanced copy protection features on Android devices. Widevine DRM is available in binary form under a royalty free license from Widevine. The Widevine DRM plugin provides the capability to license, securely distribute, and protect playback of multimedia content. Protected content is secured using an encryption scheme based on the open AES (Advanced Encryption Standard). An application can decrypt the content only if it obtains a license from the Widevine DRM licensing server for the current user. Widevine DRM functions on Android in the same way as it does on other platforms. Figure 1 shows how the WideVine Crypto Plugin fits into the Android stack:</p>
-
-
- <img src="images/drm_hal.png" alt="" />
-
- <p class="img-caption"><strong>Figure 1.</strong> Widevine Crypto Plugin</p>
-
-
-<h2 id="integrating">Integrating Widevine into Your Product</h2>
-
-<p>The following sections go over the different security levels that Widevine supports and the requirements that your product must meet to
-support Widevine. After reading the information, you need to determine the security level for your target hardware, integration, and Widevine keybox provisioning requirements.
-</p>
-<p >
-To integrate and distribute Widevine DRM on Android devices, contact your Android technical account manager to begin Widevine DRM integration.
-We recommend you engage early in your device development process with the Widevine team to provide the highest level of content protection on the device. 
-Certify devices using the Widevine test player and submit results to your Android technical account manager for approval.
-</p>
-
-<h3 id="security">
-Widevine DRM security levels
-</h3>
-
-<p>Security is never implemented in a single place in the stack, but instead relies on the integration of hardware, software, and services. The combination of hardware security functions, a trusted boot mechanism, and an isolated secure OS for handling security functions is critical to provide a secure device.</p>
-
-<p>
-At the system level, Android offers the core security features of the Linux kernel, extended and customized for mobile devices. In the application framework, Android provides an extensible DRM framework and system architecture for checking and enforcing digital rights. The Widevine DRM plugin integrates with the hardware platform to leverage the available security capabilities. The level of security offered is determined by a combination of the security capabilities of the hardware platform and the integration with Android and the Widevine DRM plugin. Widevine DRM security supports the three levels of security shown in the table below. 
-</p>
-
-<table>
-
-<tr>
-<th>Security Level</th>
-<th>Secure Bootloader</th>
-<th>Widevine Key Provisioning</th>
-<th>Security Hardware or ARM Trust Zone</th>
-<th>Widevine Keybox and Video Key Processing</th>
-<th>Hardware Video Path</th>
-</tr>
-<tr>
-  <td>Level 1</td>
-  <td>Yes</td>
-  <td>Factory provisioned Widevine Keys</td>
-  <td>Yes</td>
-  <td>Keys never exposed in clear to host CPU</td>
-  <td>Hardware protected video path</td>
-<tr>
-
-<tr>
-  <td>Level 2</td>
-  <td>Yes</td>
-  <td>Factory provisioned Widevine Keys</td>
-  <td>Yes</td>
-  <td>Keys never exposed in clear to host CPU</td>
-  <td>Hardware protected video path</td>
-<tr>
-
-<tr>
-  <td>Level 3</td>
-  <td>Yes*</td>
-  <td>Field provisioned Widevine Keys</td>
-  <td>No</td>
-  <td>Clear keys exposed to host CPU</td>
-  <td>Clear video streams delivered to video decoder</td>
-<tr>
-
-</table>
-
-<p><superscript>*</superscript>Device implementations may use a trusted bootloader, where in the bootloader is authenticated via an OEM key stored on a system partition.</p>
-
-<h3 id="security-details">
-Security level details
-</h3>
-<h4>
-Level 1
-</h4>
-<p>In this implementation Widevine DRM keys and decrypted content are never exposed to the host CPU. Only security hardware or a protected security co-processor uses clear key values and the media content is decrypted by the secure hardware. This level of security requires factory provisioning of the Widevine key-box or requires the Widevine key-box to be protected by a device key installed at the time of manufacturing. The following describes some key points to this security level:
-</p>
-
+<p>The Android DRM framework is implemented in two architectural layers:</p>
 <ul>
-  <li>Device manufacturers must provide a secure bootloader. The chain of trust from the bootloader must extend through any software or firmware components involved in the security implementation, such as the ARM TrustZone protected application and any components involved in the enforcement of the secure video path. </li>
-  <li>The Widevine key-box must be encrypted with a device-unique secret key that is not visible to software or probing methods outside of the TrustZone.</li>
-  <li>The Widevine key-box must be installed in the factory or delivered to the device using an approved secure delivery mechanism.</li>
-  <li>Device manufacturers must provide an implementation of the Widevine Level 1 OEMCrypto API that performs all key processing and decryption in a trusted environment.</li>
+<li>A DRM framework API, which is exposed to applications through the Android
+  application framework and runs through the Dalvik VM for standard
+  applications.</li>
+<li>A native code DRM manager, which implements the DRM framework and exposes an
+  interface for DRM plug-ins (agents) to handle rights management and decryption
+  for various DRM schemes.</li>
 </ul>
 
-<h4>Level 2</h4>
-<p>
-  In this security level, the Widevine keys are never exposed to the host CPU. Only security hardware or a protected security co-processor uses clear key values. An AES crypto block performs the high throughput AES decryption of the media stream.  The resulting clear media buffers are returned to the CPU for delivery to the video decoder. This level of security requires factory provisioning of the Widevine key-box or requires the Widevine key box to be protected by a key-box installed at the time of manufacturing.
-  The following list describes some key requirements of this security level:
-</p>
+ <img src="images/drm_framework.png" alt="Android DRM Framework" />
+
+<p class="img-caption"><strong>Figure 2.</strong> DRM framework</p>
+
+<p>See the <a
+href="http://developer.android.com/reference/android/drm/package-summary.html">Android
+DRM package reference</a> for additional details.</p>
+
+<h2 id="plug-ins">Plug-ins</h2>
+<p>As shown in the figure below, the DRM framework uses a plug-in architecture
+to support various DRM schemes. The DRM manager service runs in an independent
+process to ensure isolated execution of DRM plug-ins. Each API call from
+DrmManagerClient to DrmManagerService goes across process boundaries by using
+the binder IPC mechanism. The DrmManagerClient provides a Java programming
+language implementation as a common interface to runtime applications; it
+also provides a DrmManagerClient-native implementation as the interface to
+native modules. The caller of the DRM framework accesses only the DrmManagerClient
+and does not have to be aware of each DRM scheme. </p>
+
+ <img src="images/drm_plugin.png" alt="Android DRM Plug-in" />
+
+<p class="img-caption"><strong>Figure 3.</strong> DRM framework with plug-ins</p>
+
+<p>Plug-ins are loaded automatically when DrmManagerService is launched. As
+shown in the figure below, the DRM plug-in manager loads/unloads all the
+available plug-ins. The DRM framework loads plug-ins automatically by finding
+them under:<br/>
+<code>/system/lib/drm/plugins/native/</code></p>
+ 
+<img src="images/drm_plugin_lifecycle.png" alt="Android DRM Plug-in Lifecycle" />
+
+<p class="img-caption"><strong>Figure 4.</strong> DRM plug-in lifecycle</p>
+
+<p>The plug-in developer should ensure the plug-in is located in the DRM
+framework plug-in discovery directory. See implementation instructions below for details.</p>
+
+<h2 id="implementation">Implementation</h2>
+
+<h3 id="IDrmEngine">IDrmEngine</h3>
+
+<p>IDrmEngine is an interface with a set of APIs to suit DRM use cases. Plug-in
+developers must implement the interfaces specified in IDrmEngine and the
+listener interfaces specified below. This document assumes the plug-in developer
+has access to the Android source tree. The interface definition is available in
+the source tree at:<br/>
+<code>
+&lt;platform_root&gt;/frameworks/base/drm/libdrmframework/plugins/common/include
+</code></p>
+
+<h3 id="DrmInfo">DRM Info</h3>
+<p>DrmInfo is a wrapper class for the protocol used to communicate with the
+DRM server. Server registration, deregistration, license acquisition, or any other
+server-related transaction can be achieved by processing an instance of DrmInfo.
+The protocol should be described by the plug-in in XML format. Each DRM plug-in
+accomplishes the transaction by interpreting the protocol. The DRM framework
+defines an API, acquireDrmInfo(), to retrieve an instance of DrmInfo.</p>
+
+<code>DrmInfo* acquireDrmInfo(int uniqueId, const DrmInfoRequest* drmInfoRequest);</code>
+<p>Retrieves the information necessary for registration, deregistration, or rights
+acquisition. See <a
+href="http://developer.android.com/reference/android/drm/DrmInfoRequest.html">DrmInfoRequest</a> for more information.</p>
+
+<code>DrmInfoStatus* processDrmInfo(int uniqueId, const DrmInfo* drmInfo);</code>
+<p>processDrmInfo() behaves asynchronously and the results of the transaction can
+be retrieved either from OnEventListener or OnErrorListener.</p>
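+
+<p>As a rough sketch of that flow (illustrative only: it assumes the two calls above
+are members of an IDrmEngine instance, and it omits error handling and memory
+management):</p>
+
+<pre>
+// Hypothetical helper showing the acquire/process sequence described above.
+// The types come from the plug-in headers listed in the IDrmEngine section;
+// "request" describes, for example, a registration transaction.
+void runDrmTransaction(IDrmEngine&amp; engine, int uniqueId,
+                       const DrmInfoRequest&amp; request) {
+    // Ask the plug-in to build the scheme-specific protocol message.
+    DrmInfo* info = engine.acquireDrmInfo(uniqueId, &amp;request);
+
+    // Hand the message back for processing; the transaction completes
+    // asynchronously and is reported through OnEventListener or OnErrorListener.
+    DrmInfoStatus* status = engine.processDrmInfo(uniqueId, info);
+
+    // The returned DrmInfoStatus and the listener callbacks carry the result.
+    (void) status;
+}
+</pre>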
+
+<h3 id="drm-rights">DRM rights</h3>
+
+<p>The association of DRM content and the license is required to allow playback
+of DRM content. Once the association has been made, the license will be handled in
+the DRM framework so the media player application is abstracted from the existence
+of the license.</p>
+
+<code>int checkRightsStatus(int uniqueId, const String8&amp; path, int
+action);</code>
+<p>Checks whether the given content has valid rights. The input parameters are the
+unique identifier for the session, the path of the protected content, and the
+action to be performed (for example, Action::PLAY).</p>
+
+<code>status_t saveRights(int uniqueId, const DrmRights&amp; drmRights,
+            const String8&amp; rightsPath, const String8&amp;
+contentPath);</code>
+<p>Saves DRM rights to the specified rights path and associates them with the
+content path. The input parameters are the DrmRights to be saved, the rights file
+path where the rights are to be saved, and the content file path where the content
+was saved.</p>
+
+<h3 id="metadata">License Metadata</h3>
+<p>License metadata such as the license expiry time and repeat count may be
+embedded inside the rights of the protected content. The Android DRM framework
+provides APIs to return constraints associated with input content. See <a
+href="http://developer.android.com/reference/android/drm/DrmManagerClient.html">DrmManagerClient</a>
+for more information.</p>
+
+<code>DrmConstraints* getConstraints(int uniqueId, const String path, int
+action);</code>
+<p>The getConstraints function call returns key-value pairs of constraints
+embedded in the protected content. To retrieve the constraints, the unique
+identifier for the session and the path of the protected content are required,
+along with the action (defined as Action::DEFAULT, Action::PLAY, and so on).</p>
+
+ <img src="images/drm_license_metadata.png" alt="Android DRM License Metadata" />
+
+<p class="img-caption"><strong>Figure 5.</strong> Retrieve license metadata</p>
+
+<code>DrmMetadata* getMetadata(int uniqueId, const String path);</code>
+<p>Gets the metadata associated with the input content. Given the path of the
+protected content, it returns key-value pairs of metadata.</p>
+
+<h3 id="metadata">Decrypt session</h3>
+<p>To maintain the decryption session, the caller of the DRM framework has to
+invoke openDecryptSession() at the beginning of the decryption sequence.
+openDecryptSession() asks each DRM plug-in if it can handle input DRM
+content.</p>
+<code>
+status_t openDecryptSession(
+   int uniqueId, DecryptHandle* decryptHandle, int fd, off64_t offset, off64_t length);
+</code>
+
+<p>The above call opens a decrypt session for the protected content referenced by
+the file descriptor, offset, and length, and initializes the DecryptHandle used by
+subsequent decryption calls.</p>
+
+<h3 id="listeners">DRM plug-in Listeners</h3>
+
+<p>Some APIs in the DRM framework behave asynchronously in a DRM transaction. An
+application can register three listener classes with the DRM framework:</p>
 
 <ul>
-  <li>Device manufacturers must provide a secure bootloader. The chain of trust from the bootloader must extend through any software or firmware components involved in the security implementation, such as the TrustZone protected application. </li>
-  <li>The Widevine key-box must be encrypted with a device-unique secret key that is not visible to software or probing methods outside of the TrustZone.</li>
-  <li>The Widevine key-box must be installed in the factory or delivered to the device using an approved secure delivery mechanism.</li>
-  <li>Device manufacturers must provide an implementation of the Widevine Level 2 OEMCrypto API that performs all key processing and decryption in a trusted environment.</li>
-  <li>Device manufacturers must provide a bootloader that loads signed system images only. For devices that allow users to load a custom operating system or gain root privileges on the device by unlocking the bootloader, device manufacturers must support the following:
-    <ul>
-      <li>Device manufacturers must provide a bootloader that allows a Widevine key-box to be written only when the bootloader is in a locked state.</li>
-      <li>The Widevine key-box must be stored in a region of memory that is erased or is inaccessible when the device bootloader is in an unlocked state.</li>
-    </ul>
-  </li>
+<li>OnEventListener for results of asynchronous APIs</li>
+<li>OnErrorListener for receiving errors of asynchronous APIs</li>
+<li>OnInfoListener for any supplementary information during DRM
+transactions</li>
 </ul>
 
-<h4>Level 3</h4>
-<p>
-This security level relies on the secure bootloader to verify the system image. An AES crypto block performs the AES decryption of the media stream and the resulting clear media buffers are returned to the CPU for delivery to the video decoder.
-</p>
+<h3 id="source">Source</h3>
 
-<p>Device manufacturers must provide a bootloader that loads signed system images only. For devices that allow users to load a custom operating system or gain root privileges on the device by unlocking the bootloader, device manufacturers must support the following:</p>
-    <ul>
-      <li>Device manufacturers must provide a bootloader that allows a Widevine key-box to be written only when the bootloader is in a locked state.</li>
-      <li>The Widevine key-box must be stored in a region of memory that is erased or is inaccessible when the device bootloader is in an unlocked state.</li>
-    </ul>
+<p>The Android DRM framework includes a passthru plug-in as a sample plug-in.
+The implementation for passthru plug-in can be found in the Android source tree
+at:<br/>
+<code>
+&lt;platform_root&gt;/frameworks/base/drm/libdrmframework/plugins/passthru
+</code></p>
+
+<h3 id="build">Build and Integration</h3>
+
+<p>Add the following to the Android.mk of the plug-in implementation. The
+passthru plug-in is used as a sample.</p>
+
+<code>
+PRODUCT_COPY_FILES +=
+$(TARGET_OUT_SHARED_LIBRARIES)/&lt;plugin_library&gt;:system/lib/drm/plugins/native/&lt;plugin_library&gt;<br/>
+For example:<br/>
+PRODUCT_COPY_FILES +=
+$(TARGET_OUT_SHARED_LIBRARIES)/libdrmpassthruplugin.so:system/lib/drm/plugins/native/libdrmpassthruplugin.so
+</code>
+<br/>
+<br/>
+<p>Plug-in developers must locate their respective plug-ins under this
+directory like so:<br/>
+<code>/system/lib/drm/plugins/native/libdrmpassthruplugin.so</code></p>
diff --git a/src/devices/graphics.jd b/src/devices/graphics.jd
index 45ebfae..c8f11e8 100644
--- a/src/devices/graphics.jd
+++ b/src/devices/graphics.jd
@@ -2,7 +2,7 @@
 @jd:body
 
 <!--
-    Copyright 2013 The Android Open Source Project
+    Copyright 2014 The Android Open Source Project
 
     Licensed under the Apache License, Version 2.0 (the "License");
     you may not use this file except in compliance with the License.
@@ -16,6 +16,7 @@
     See the License for the specific language governing permissions and
     limitations under the License.
 -->
+
 <div id="qv-wrapper">
   <div id="qv">
     <h2>In this document</h2>
@@ -24,354 +25,203 @@
   </div>
 </div>
 
-<p>
-  The Android framework has a variety of graphics rendering APIs for 2D and 3D that interact with
-  your HAL implementations and graphics drivers, so it is important to have a good understanding of
-  how they work at a higher level. There are two general ways that app developers can draw things
-  to the screen: with Canvas or OpenGL.
-</p>
-<p>
-  <a href="http://developer.android.com/reference/android/graphics/Canvas.html">android.graphics.Canvas</a>
-  is a 2D graphics API and is the most widely used graphics API by
-  developers. Canvas operations draw all the stock <a href="http://developer.android.com/reference/android/view/View.html">android.view.View</a>s
-  and custom <a href="http://developer.android.com/reference/android/view/View.html">android.view.View</a>s in Android. Prior to Android 3.0, Canvas always
-  used the non-hardware accelerated Skia 2D drawing library to draw.
-</p>
-<p>
-  Introduced in Android 3.0, hardware acceleration for Canvas APIs uses a new drawing library
-  called OpenGLRenderer that translates Canvas operations to OpenGL operations so that they can
-  execute on the GPU. Developers had to opt-in to this feature previously, but beginning in Android
-  4.0, hardware-accelerated Canvas is enabled by default. Consequently, a hardware GPU that
-  supports OpenGL ES 2.0 is mandatory for Android 4.0 devices.
-</p>
-<p>
-  Additionally, the <a href="https://developer.android.com/guide/topics/graphics/hardware-accel.html">Hardware Acceleration guide</a>
-  explains how the hardware-accelerated drawing path works and identifies the differences in behavior from the software drawing path.
-</p>
-<p>
-  The other main way that developers render graphics is by using OpenGL ES 1.x or 2.0 to directly
-  render to a surface.  Android provides OpenGL ES interfaces in the
-  <a href="http://developer.android.com/reference/android/opengl/package-summary.html">android.opengl</a> package
-  that a developer can use to call into your GL implementation with the SDK or with native APIs
-  provided in the Android NDK. 
+<p>The Android framework offers a variety of graphics rendering APIs for 2D and
+3D that interact with manufacturer implementations of graphics drivers, so it
+is important to have a good understanding of how those APIs work at a higher
+level. This page introduces the graphics hardware abstraction layer (HAL) upon
+which those drivers are built.</p>
 
-  <p class="note"><strong>Note:</strong>A third option, Renderscript, was introduced in Android 3.0 to
-  serve as a platform-agnostic graphics rendering API (it used OpenGL ES 2.0 under the hood), but
-  will be deprecated starting in the Android 4.1 release.
-</p>
-<h2 id="render">
-  How Android Renders Graphics
-</h2>
-<p>
-  No matter what rendering API developers use, everything is rendered onto a buffer of pixel data
-  called a "surface." Every window that is created on the Android platform is backed by a surface.
-  All of the visible surfaces that are rendered to are composited onto the display
-  by the SurfaceFlinger, Android's system service that manages composition of surfaces.
-  Of course, there are more components that are involved in graphics rendering, and the
-  main ones are described below:
-</p>
+<p>Application developers draw images to the screen in two ways: with Canvas or
+OpenGL. See <a
+href="{@docRoot}devices/graphics/architecture.html">System-level graphics
+architecture</a> for a detailed description of Android graphics
+components.</p>
 
-<dl>
-  <dt>
-    <strong>Image Stream Producers</strong>
-  </dt>
-    <dd>Image stream producers can be things such as an OpenGL ES game, video buffers from the media server,
-      a Canvas 2D application, or basically anything that produces graphic buffers for consumption.
-    </dd>
+<p><a
+href="http://developer.android.com/reference/android/graphics/Canvas.html">android.graphics.Canvas</a>
+is a 2D graphics API and is the most popular graphics API among developers.
+Canvas operations draw all the stock and custom <a
+href="http://developer.android.com/reference/android/view/View.html">android.view.View</a>s
+in Android. In Android, hardware acceleration for Canvas APIs is accomplished
+with a drawing library called OpenGLRenderer that translates Canvas operations
+to OpenGL operations so they can execute on the GPU.</p>
 
-  <dt>
-    <strong>Image Stream Consumers</strong>
-  </dt>
-  <dd>The most common consumer of image streams is SurfaceFlinger, the system service that consumes
-    the currently visible surfaces and composites them onto the display using
-    information provided by the Window Manager. SurfaceFlinger is the only service that can
-    modify the content of the display. SurfaceFlinger uses OpenGL and the
-    hardware composer to compose a group of surfaces. Other OpenGL ES apps can consume image
-    streams as well, such as the camera app consuming a camera preview image stream.
-  </dd>
-  <dt>
-    <strong>SurfaceTexture</strong>
-  </dt>
-  <dd>SurfaceTexture contains the logic that ties image stream producers and image stream consumers together
-    and is made of three parts: <code>SurfaceTextureClient</code>, <code>ISurfaceTexture</code>, and
-    <code>SurfaceTexture</code> (in this case, <code>SurfaceTexture</code> is the actual C++ class and not
-    the name of the overall component). These three parts facilitate the producer (<code>SurfaceTextureClient</code>),
-    binder (<code>ISurfaceTexture</code>), and consumer (<code>SurfaceTexture</code>)
-    components of SurfaceTexture in processes such as requesting memory from Gralloc,
-    sharing memory across process boundaries, synchronizing access to buffers, and pairing the appropriate consumer with the producer.
-    SurfaceTexture can operate in both asynchronous (producer never blocks waiting for consumer and drops frames) and
-    synchronous (producer waits for consumer to process textures) modes. Some examples of image
-    producers are the camera preview produced by the camera HAL or an OpenGL ES game. Some examples
-    of image consumers are SurfaceFlinger or another app that wants to display an OpenGL ES stream
-    such as the camera app displaying the camera viewfinder.
-  </dd>
+<p>Beginning in Android 4.0, hardware-accelerated Canvas is enabled by default.
+Consequently, a hardware GPU that supports OpenGL ES 2.0 is mandatory for
+Android 4.0 and later devices. See the <a
+href="https://developer.android.com/guide/topics/graphics/hardware-accel.html">Hardware
+Acceleration guide</a> for an explanation of how the hardware-accelerated
+drawing path works and the differences in its behavior from that of the
+software drawing path.</p>
 
- <dt>
-    <strong>Window Manager</strong>
-  </dt>
-  <dd>
-    The Android system service that controls window lifecycles, input and focus events, screen
-    orientation, transitions, animations, position, transforms, z-order, and many other aspects of
-    a window (a container for views). A window is always backed by a surface. The Window Manager
-    sends all of the window metadata to SurfaceFlinger, so SurfaceFlinger can use that data
-    to figure out how to composite surfaces on the display.
-  </dd>
-  
-  <dt>
-    <strong>Hardware Composer</strong>
-  </dt>
-  <dd>
-    The hardware abstraction for the display subsystem. SurfaceFlinger can delegate certain
-    composition work to the hardware composer to offload work from the OpenGL and the GPU. This makes
-    compositing faster than having SurfaceFlinger do all the work. Starting with Jellybean MR1,
-    new versions of the hardware composer have been introduced. See the <code>hardware/libhardware/include/hardware/gralloc.h</code> <a href="#hwc">Hardware composer</a> section
-    for more information.
-  </dd>
+<p>In addition to Canvas, the other main way that developers render graphics is
+by using OpenGL ES to directly render to a surface. Android provides OpenGL ES
+interfaces in the <a
+href="http://developer.android.com/reference/android/opengl/package-summary.html">android.opengl</a>
+package that developers can use to call into their GL implementations with the
+SDK or with native APIs provided in the <a
+href="https://developer.android.com/tools/sdk/ndk/index.html">Android
+NDK</a>.</p>
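+
+<p>For illustration only, the following is a minimal sketch of the EGL setup a
+native renderer performs to target a surface with OpenGL ES 2.0. It assumes an
+<code>ANativeWindow</code> obtained elsewhere (for example, from a Java Surface
+via the NDK) and omits error handling; it is one possible setup, not the only
+one.</p>
+
+<pre class=prettyprint>
+#include &lt;EGL/egl.h&gt;
+#include &lt;GLES2/gl2.h&gt;
+#include &lt;android/native_window.h&gt;
+
+/* Sketch: bind OpenGL ES 2.0 rendering to an ANativeWindow. "win" is assumed
+ * to have been obtained elsewhere, e.g. with ANativeWindow_fromSurface(). */
+void render_one_frame(ANativeWindow *win) {
+    EGLDisplay dpy = eglGetDisplay(EGL_DEFAULT_DISPLAY);
+    eglInitialize(dpy, NULL, NULL);
+
+    const EGLint cfg_attribs[] = {
+        EGL_RENDERABLE_TYPE, EGL_OPENGL_ES2_BIT,
+        EGL_RED_SIZE, 8, EGL_GREEN_SIZE, 8, EGL_BLUE_SIZE, 8,
+        EGL_NONE
+    };
+    EGLConfig cfg;
+    EGLint num_cfg;
+    eglChooseConfig(dpy, cfg_attribs, &cfg, 1, &num_cfg);
+
+    /* The window surface is the producer side of the window's buffer queue. */
+    EGLSurface surf = eglCreateWindowSurface(dpy, cfg, win, NULL);
+
+    const EGLint ctx_attribs[] = { EGL_CONTEXT_CLIENT_VERSION, 2, EGL_NONE };
+    EGLContext ctx = eglCreateContext(dpy, cfg, EGL_NO_CONTEXT, ctx_attribs);
+    eglMakeCurrent(dpy, surf, surf, ctx);
+
+    glClearColor(0.0f, 0.0f, 1.0f, 1.0f);
+    glClear(GL_COLOR_BUFFER_BIT);
+
+    /* Queues the rendered buffer to the consumer (usually SurfaceFlinger). */
+    eglSwapBuffers(dpy, surf);
+}
+</pre>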
 
-    <dt>
-    <strong>Gralloc</strong>
-  </dt>
-  <dd>Allocates memory for graphics buffers. See the  If you
-    are using version 1.1 or later of the <a href="#hwc">hardware composer</a>, this HAL is no longer needed.</dd>
-  
- 
-</dl>
-<p>
-  The following diagram shows how these components work together:
-</p><img src="images/graphics_surface.png">
-<p class="img-caption">
-  <strong>Figure 1.</strong> How surfaces are rendered
-</p>
+<h2 id=android_graphics_components>Android graphics components</h2>
 
-</p>
-<h2 id="provide">
-  What You Need to Provide
-</h2>
-<p>
- The following list and sections describe what you need to provide to support graphics in your product:
-</p>
-<ul>
-  <li>OpenGL ES 1.x Driver
-  </li>
-  <li>OpenGL ES 2.0 Driver
-  </li>
-  <li>EGL Driver
-  </li>
-  <li>Gralloc HAL implementation
-  </li>
-  <li>Hardware Composer HAL implementation
-  </li>
-  <li>Framebuffer HAL implementation
-  </li>
-</ul>
-<h3 id="gl">
-  OpenGL and EGL drivers
-</h3>
-<p>
-  You must provide drivers for OpenGL ES 1.x, OpenGL ES 2.0, and EGL. Some key things to keep in
-  mind are:
-</p>
-<ul>
-  <li>The GL driver needs to be robust and conformant to OpenGL ES standards.
-  </li>
-  <li>Do not limit the number of GL contexts. Because Android allows apps in the background and
-  tries to keep GL contexts alive, you should not limit the number of contexts in your driver. It
-  is not uncommon to have 20-30 active GL contexts at once, so you should also be careful with the
-  amount of memory allocated for each context.
-  </li>
-  <li>Support the YV12 image format and any other YUV image formats that come from other
-    components in the system such as media codecs or the camera.
-  </li>
-  <li>Support the mandatory extensions: <code>GL_OES_texture_external</code>,
-  <code>EGL_ANDROID_image_native_buffer</code>, and <code>EGL_ANDROID_recordable</code>. We highly
-  recommend supporting <code>EGL_ANDROID_blob_cache</code> and <code>EGL_KHR_fence_sync</code> as
-  well.</li>
-</ul>
+<p>No matter what rendering API developers use, everything is rendered onto a
+"surface." The surface represents the producer side of a buffer queue that is
+often consumed by SurfaceFlinger. Every window that is created on the Android
+platform is backed by a surface. All of the visible surfaces rendered are
+composited onto the display by SurfaceFlinger.</p>
 
-<p>
-  Note that the OpenGL API exposed to app developers is different from the OpenGL interface that
-  you are implementing. Apps do not have access to the GL driver layer, and must go through the
-  interface provided by the APIs.
-</p>
-<h4>
-  Pre-rotation
-</h4>
-<p>Many times, hardware overlays do not support rotation, so the solution is to pre-transform the buffer before
-  it reaches SurfaceFlinger. A query hint in ANativeWindow was added (<code>NATIVE_WINDOW_TRANSFORM_HINT</code>)
-  that represents the most likely transform to be be applied to the buffer by SurfaceFlinger.
+<p>The following diagram shows how the key components work together:</p>
 
-  Your GL driver can use this hint to pre-transform the buffer before it reaches SurfaceFlinger, so when the buffer
-  actually reaches SurfaceFlinger, it is correctly transformed. See the ANativeWindow
-  interface defined in <code>system/core/include/system/window.h</code> for more details. The following
-  is some pseudo-code that implements this in the hardware composer:
-</p>
+<img src="graphics/images/graphics_surface.png" alt="image-rendering components">
 
-<pre>
-ANativeWindow->query(ANativeWindow, NATIVE_WINDOW_DEFAULT_WIDTH, &w);
-ANativeWindow->query(ANativeWindow, NATIVE_WINDOW_DEFAULT_HEIGHT, &h);
-ANativeWindow->query(ANativeWindow, NATIVE_WINDOW_TRANSFORM_HINT, &hintTransform);
-if (hintTransform & HAL_TRANSFORM_ROT_90)
-swap(w, h);
+<p class="img-caption"><strong>Figure 1.</strong> How surfaces are rendered</p>
 
-native_window_set_buffers_dimensions(anw, w, h);
-ANativeWindow->dequeueBuffer(...);
+<p>The main components are described below:</p>
 
-// here GL driver renders content transformed by " hintTransform "
+<h3 id=image_stream_producers>Image Stream Producers</h3>
 
-int inverseTransform;
-inverseTransform = hintTransform;
-if (hintTransform & HAL_TRANSFORM_ROT_90)
-   inverseTransform ^= HAL_TRANSFORM_ROT_180;
+<p>An image stream producer can be anything that produces graphic buffers for
+consumption. Examples include OpenGL ES, Canvas 2D, and mediaserver video
+decoders.</p>
 
-native_window_set_buffers_transform(anw, inverseTransform);
+<h3 id=image_stream_consumers>Image Stream Consumers</h3>
 
-ANativeWindow->queueBuffer(...);
-</pre>
+<p>The most common consumer of image streams is SurfaceFlinger, the system
+service that consumes the currently visible surfaces and composites them onto
+the display using information provided by the Window Manager. SurfaceFlinger is
+the only service that can modify the content of the display. SurfaceFlinger
+uses OpenGL and the Hardware Composer to compose a group of surfaces.</p>
 
-<h3 id="gralloc">
-  Gralloc HAL
-</h3>
-<p>
-  The graphics memory allocator is needed to allocate memory that is requested by
-  SurfaceTextureClient in image producers. You can find a stub implementation of the HAL at
-  <code>hardware/libhardware/modules/gralloc.h</code>
-</p>
-<h4>
-  Protected buffers
-</h4>
-<p>
-  There is a gralloc usage flag <code>GRALLOC_USAGE_PROTECTED</code> that allows
-  the graphics buffer to be displayed only through a hardware protected path.
-</p>
-<h3 id="hwc">
-  Hardware Composer HAL
-</h3>
-<p>
-  The hardware composer is used by SurfaceFlinger to composite surfaces to the screen. The hardware
-  composer abstracts things like overlays and 2D blitters and helps offload some things that would
-  normally be done with OpenGL. 
-</p>
+<p>Other OpenGL ES apps can consume image streams as well, such as the camera
+app consuming a camera preview image stream. Non-GL applications can be
+consumers, too; for example, the ImageReader class.</p>
 
-<p>Jellybean MR1 introduces a new version of the HAL. We recommend that you start using version 1.1 of the hardware
-  composer HAL as it will provide support for the newest features (explicit synchronization, external displays, etc).
-  Keep in mind that in addition to 1.1 version, there is also a 1.0 version of the HAL that we used for internal
-  compatibility reasons and a 1.2 draft mode of the hardware composer HAL. We recommend that you implement
-  version 1.1 until 1.2 is out of draft mode.
-</p>
+<h3 id=window_manager>Window Manager</h3>
 
- <p>Because the physical display hardware behind the hardware composer
-  abstraction layer can vary from device to device, it is difficult to define recommended features, but
-  here is some guidance:</p>
+<p>The Android system service that controls a window, which is a container for
+views. A window is always backed by a surface. This service oversees
+lifecycles, input and focus events, screen orientation, transitions,
+animations, position, transforms, z-order, and many other aspects of a window.
+The Window Manager sends all of the window metadata to SurfaceFlinger so
+SurfaceFlinger can use that data to composite surfaces on the display.</p>
 
-<ul>
-  <li>The hardware composer should support at least 4 overlays (status bar, system bar, application,
-  and live wallpaper) for phones and 3 overlays for tablets (no status bar).</li>
-  <li>Layers can be bigger than the screen, so the hardware composer should be able to handle layers
-    that are larger than the display (For example, a wallpaper).</li>
-  <li>Pre-multiplied per-pixel alpha blending and per-plane alpha blending should be supported at the same time.</li>
-  <li>The hardware composer should be able to consume the same buffers that the GPU, camera, video decoder, and Skia buffers are producing,
-    so supporting some of the following properties is helpful:
-   <ul>
-     <li>RGBA packing order</li>
-     <li>YUV formats</li>
-     <li>Tiling, swizzling, and stride properties</li>
-   </ul>
-  </li>
-  <li>A hardware path for protected video playback must be present if you want to support protected content.</li>
-</ul>
-<p>
-  The general recommendation when implementing your hardware composer is to implement a no-op
-  hardware composer first. Once you have the structure done, implement a simple algorithm to
-  delegate composition to the hardware composer. For example, just delegate the first three or four
-  surfaces to the overlay hardware of the hardware composer. After that focus on common use cases,
-  such as:
-</p>
-<ul>
-  <li>Full-screen games in portrait and landscape mode
-  </li>
-  <li>Full-screen video with closed captioning and playback control
-  </li>
-  <li>The home screen (compositing the status bar, system bar, application window, and live
-  wallpapers)
-  </li>
-  <li>Protected video playback
-  </li>
-  <li>Multiple display support
-  </li>
-</ul>
-<p>
-  After implementing the common use cases, you can focus on optimizations such as intelligently
-  selecting the surfaces to send to the overlay hardware that maximizes the load taken off of the
-  GPU. Another optimization is to detect whether the screen is updating. If not, delegate composition
-  to OpenGL instead of the hardware composer to save power. When the screen updates again, contin`ue to
-  offload composition to the hardware composer.
-</p>
+<h3 id=hardware_composer>Hardware Composer</h3>
 
-<p>
-  You can find the HAL for the hardware composer in the
-  <code>hardware/libhardware/include/hardware/hwcomposer.h</code> and <code>hardware/libhardware/include/hardware/hwcomposer_defs.h</code>
-  files. A stub implementation is available in the <code>hardware/libhardware/modules/hwcomposer</code> directory.
-</p>
+<p>The hardware abstraction for the display subsystem. SurfaceFlinger can
+delegate certain composition work to the Hardware Composer to offload work from
+OpenGL and the GPU. SurfaceFlinger acts as just another OpenGL ES client, so
+when it is actively compositing one or two buffers into a third, for instance,
+it is using OpenGL ES. Offloading composition to the Hardware Composer uses
+less power than having the GPU conduct all of the computation.</p>
 
-<h4>
-  VSYNC
-</h4>
-<p>
-  VSYNC synchronizes certain events to the refresh cycle of the display. Applications always
-  start drawing on a VSYNC boundary and SurfaceFlinger always composites on a VSYNC boundary.
-  This eliminates stutters and improves visual performance of graphics.
-  The hardware composer has a function pointer</p>
+<p>The Hardware Composer HAL conducts the other half of the work. This HAL is
+the central point for all Android graphics rendering. Hardware Composer must
+support events, one of which is VSYNC. Another is hotplug for plug-and-play
+HDMI support.</p>
 
-    <pre>int (waitForVsync*) (int64_t *timestamp)</pre>
+<p>See the <a href="{@docRoot}devices/graphics.html#hardware_composer_hal">Hardware Composer
+HAL</a> section for more information.</p>
 
-  <p>that points to a function you must implement for VSYNC. This function blocks until
-    a VSYNC happens and returns the timestamp of the actual VSYNC.
-    A client can receive a VSYNC timestamps once, at specified intervals, or continously (interval of 1). 
-    You must implement VSYNC to have no more than a 1ms lag at the maximum (1/2ms or less is recommended), and
-    the timestamps returned must be extremely accurate.
-</p>
+<h3 id=gralloc>Gralloc</h3>
 
-<h4>Explicit synchronization</h4>
-<p>Explicit synchronization is required in Jellybean MR1 and later and provides a mechanism
-for Gralloc buffers to be acquired and released in a synchronized way.
-Explicit synchronization allows producers and consumers of graphics buffers to signal when
-they are done with a buffer. This allows the Android system to asynchronously queue buffers
-to be read or written with the certainty that another consumer or producer does not currently need them.</p>
-<p>
-This communication is facilitated with the use of synchronization fences, which are now required when requesting
-a buffer for consuming or producing. The
- synchronization framework consists of three main parts:</p>
-<ul>
-  <li><code>sync_timeline</code>: a monotonically increasing timeline that should be implemented
-    for each driver instance. This basically is a counter of jobs submitted to the kernel for a particular piece of hardware.</li>
-    <li><code>sync_pt</code>: a single value or point on a <code>sync_timeline</code>. A point
-      has three states: active, signaled, and error. Points start in the active state and transition
-      to the signaled or error states. For instance, when a buffer is no longer needed by an image
-      consumer, this <code>sync_point</code> is signaled so that image producers
-      know that it is okay to write into the buffer again.</li>
-    <li><code>sync_fence</code>: a collection of <code>sync_pt</code>s that often have different
-      <code>sync_timeline</code> parents (such as for the display controller and GPU). This allows
-      multiple consumers or producers to signal that
-      they are using a buffer and to allow this information to be communicated with one function parameter.
-      Fences are backed by a file descriptor and can be passed from kernel-space to user-space.
-      For instance, a fence can contain two <code>sync_point</code>s that signify when two separate
-      image consumers are done reading a buffer. When the fence is signaled,
-      the image producers now know that both consumers are done consuming.</li>
-    </ul>
+<p>The graphics memory allocator is needed to allocate memory that is requested
+by image producers. See the <a
+href="{@docRoot}devices/graphics.html#gralloc">Gralloc HAL</a> section for more
+information.</p>
 
-<p>To implement explicit synchronization, you need to do provide the following:
+<h2 id=data_flow>Data flow</h2>
 
-<ul>
-  <li>A kernel-space driver that implements a synchronization timeline for a particular piece of hardware. Drivers that
-    need to be fence-aware are generally anything that accesses or communicates with the hardware composer.
-    See the <code>system/core/include/sync/sync.h</code> file for more implementation details. The
-    <code>system/core/libsync</code> directory includes a library to communicate with the kernel-space </li>
-  <li>A hardware composer HAL module (version 1.1 or later) that supports the new synchronization functionality. You will need to provide
-  the appropriate synchronization fences as parameters to the <code>set()</code> and <code>prepare()</code> functions in the HAL. As a last resort,
-you can pass in -1 for the file descriptor parameters if you cannot support explicit synchronization for some reason. This
-is not recommended, however.</li>
-  <li>Two GL specific extensions related to fences, <code>EGL_ANDROID_native_fence_sync</code> and <code>EGL_ANDROID_wait_sync</code>,
-    along with incorporating fence support into your graphics drivers.</ul>
+<p>See the following diagram for a depiction of the Android graphics
+pipeline:</p>
 
+<img src="graphics/images/graphics_pipeline.png" alt="graphics data flow">
 
+<p class="img-caption"><strong>Figure 2.</strong> How graphic data flow through
+Android</p>
 
+<p>The objects on the left are renderers producing graphics buffers, such as
+the home screen, status bar, and system UI. SurfaceFlinger is the compositor
+and Hardware Composer is the composer.</p>
+
+<h3 id=bufferqueue>BufferQueue</h3>
+
+<p>BufferQueues provide the glue between the Android graphics components. These
+are a pair of queues that mediate the constant cycle of buffers from the
+producer to the consumer. Once the producers hand off their buffers,
+SurfaceFlinger is responsible for compositing everything onto the display.</p>
+
+<p>See the following diagram for the BufferQueue communication process.</p>
+
+<img src="graphics/images/bufferqueue.png" alt="BufferQueue communication process">
+
+<p class="img-caption"><strong>Figure 3.</strong> BufferQueue communication
+process</p>
+
+<p>BufferQueue contains the logic that ties image stream producers and image
+stream consumers together. Some examples of image producers are the camera
+previews produced by the camera HAL or OpenGL ES games. Some examples of image
+consumers are SurfaceFlinger or another app that displays an OpenGL ES stream,
+such as the camera app displaying the camera viewfinder.</p>
+
+<p>BufferQueue is a data structure that combines a buffer pool with a queue and
+uses Binder IPC to pass buffers between processes. The producer interface, or
+what you pass to somebody who wants to generate graphic buffers, is
+IGraphicBufferProducer (part of <a
+href="http://developer.android.com/reference/android/graphics/SurfaceTexture.html">SurfaceTexture</a>).
+BufferQueue is often used to render to a Surface and consume with a GL
+Consumer, among other tasks.</p>
+
+<p>BufferQueue can operate in three different modes:</p>
+
+<p><em>Synchronous-like mode</em> - BufferQueue by default operates in a
+synchronous-like mode, in which every buffer that comes in from the producer
+goes out at the consumer. No buffer is ever discarded in this mode. And if the
+producer is too fast and creates buffers faster than they are being drained, it
+will block and wait for free buffers.</p>
+
+<p><em>Non-blocking mode</em> - BufferQueue can also operate in a non-blocking
+mode where it generates an error rather than waiting for a buffer in those
+cases. No buffer is ever discarded in this mode either. This is useful for
+avoiding potential deadlocks in application software that may not understand
+the complex dependencies of the graphics framework.</p>
+
+<p><em>Discard mode</em> - Finally, BufferQueue may be configured to discard
+old buffers rather than generate errors or wait. For instance, if conducting GL
+rendering to a texture view and drawing as quickly as possible, buffers must be
+dropped.</p>
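+
+<p>As a rough sketch of the producer side, the NDK <code>ANativeWindow</code>
+calls below render one frame into a window whose BufferQueue is consumed
+elsewhere (for example, by SurfaceFlinger). The function name is illustrative
+and error handling is omitted.</p>
+
+<pre class=prettyprint>
+#include &lt;string.h&gt;
+#include &lt;android/native_window.h&gt;
+
+/* Sketch: CPU producer filling one buffer of a window's BufferQueue. */
+void produce_one_frame(ANativeWindow *win) {
+    /* Ask for 32-bit RGBA buffers so the memset below is sized correctly. */
+    ANativeWindow_setBuffersGeometry(win, 0, 0, WINDOW_FORMAT_RGBA_8888);
+
+    /* Dequeues a buffer from the BufferQueue and maps it for CPU access. */
+    ANativeWindow_Buffer buffer;
+    if (ANativeWindow_lock(win, &buffer, NULL) != 0)
+        return;
+
+    /* Fill the buffer; here, clear it to zero. The stride is in pixels, and
+     * RGBA_8888 uses 4 bytes per pixel. */
+    memset(buffer.bits, 0, (size_t)buffer.stride * buffer.height * 4);
+
+    /* Queues the buffer back so the consumer can latch and composite it. */
+    ANativeWindow_unlockAndPost(win);
+}
+</pre>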
+
+<p>To conduct most of this work, SurfaceFlinger acts as just another OpenGL ES
+client. So when SurfaceFlinger is actively compositing one buffer or two into a
+third, for instance, it is using OpenGL ES.</p>
+
+<p>The Hardware Composer HAL conducts the other half of the work. This HAL acts
+as the central point for all Android graphics rendering.</p>
+
+<h3 id=synchronization_framework>Synchronization framework</h3>
+
+<p>Since Android graphics offer no explicit parallelism, vendors have long
+implemented their own implicit synchronization within their own drivers. This
+is no longer required with the Android graphics synchronization framework. See
+the <a href="#explicit_synchronization">Explicit synchronization</a> section
+for implementation instructions.</p>
+
+<p>The synchronization framework explicitly describes dependencies between
+different asynchronous operations in the system. The framework provides a
+simple API that lets components signal when buffers are released. It also
+allows synchronization primitives to be passed between drivers from the kernel
+to userspace and between userspace processes themselves.</p>
+
+<p>For example, an application may queue up work to be carried out in the GPU.
+The GPU then starts drawing that image. Although the image hasn’t been drawn
+into memory yet, the buffer pointer can still be passed to the window
+compositor along with a fence that indicates when the GPU work will be
+finished. The window compositor may then start processing ahead of time and
+hand off the work to the display controller. In this manner, the CPU work can
+be done ahead of time. Once the GPU finishes, the display controller can
+immediately display the image.</p>
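+
+<p>To make the buffer/fence hand-off concrete, here is a small sketch using the
+userspace helpers in <code>system/core/libsync</code>. It assumes the caller
+received two fence file descriptors along with a buffer (the function and fence
+names are illustrative) and must wait for both before reading.</p>
+
+<pre class=prettyprint>
+#include &lt;unistd.h&gt;
+#include &lt;sync/sync.h&gt;
+
+/* Sketch: merge two fences (e.g. one from the GPU and one from the display
+ * controller) and wait until both have signaled before touching the buffer. */
+int wait_for_buffer(int gpu_fence_fd, int display_fence_fd) {
+    /* sync_merge() creates a new fence that signals only when every sync_pt
+     * in both input fences has signaled. */
+    int merged = sync_merge("buffer-ready", gpu_fence_fd, display_fence_fd);
+    if (merged &lt; 0)
+        return -1;
+
+    /* Block for up to 3000 ms waiting for the merged fence. */
+    int err = sync_wait(merged, 3000);
+
+    /* Convention: the recipient of a fence file descriptor closes it. */
+    close(merged);
+    close(gpu_fence_fd);
+    close(display_fence_fd);
+    return err;
+}
+</pre>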
+
+<p>The synchronization framework also allows implementers to leverage
+synchronization resources in their own hardware components. Finally, the
+framework provides visibility into the graphics pipeline to aid in
+debugging.</p>
diff --git a/src/devices/graphics/architecture.jd b/src/devices/graphics/architecture.jd
index 6842dd7..75623cc 100644
--- a/src/devices/graphics/architecture.jd
+++ b/src/devices/graphics/architecture.jd
@@ -1,4 +1,4 @@
-page.title=Architecture
+page.title=Graphics architecture
 @jd:body
 
 <!--
diff --git a/src/devices/graphics/images/bufferqueue.png b/src/devices/graphics/images/bufferqueue.png
new file mode 100644
index 0000000..1951f46
--- /dev/null
+++ b/src/devices/graphics/images/bufferqueue.png
Binary files differ
diff --git a/src/devices/graphics/images/dispsync.png b/src/devices/graphics/images/dispsync.png
new file mode 100644
index 0000000..d97765c
--- /dev/null
+++ b/src/devices/graphics/images/dispsync.png
Binary files differ
diff --git a/src/devices/graphics/images/graphics_pipeline.png b/src/devices/graphics/images/graphics_pipeline.png
new file mode 100644
index 0000000..983a517
--- /dev/null
+++ b/src/devices/graphics/images/graphics_pipeline.png
Binary files differ
diff --git a/src/devices/graphics/images/graphics_surface.png b/src/devices/graphics/images/graphics_surface.png
new file mode 100644
index 0000000..6cd86ef
--- /dev/null
+++ b/src/devices/graphics/images/graphics_surface.png
Binary files differ
diff --git a/src/devices/graphics/implement.jd b/src/devices/graphics/implement.jd
new file mode 100644
index 0000000..59aca16
--- /dev/null
+++ b/src/devices/graphics/implement.jd
@@ -0,0 +1,605 @@
+page.title=Implementing graphics
+@jd:body
+
+<!--
+    Copyright 2014 The Android Open Source Project
+
+    Licensed under the Apache License, Version 2.0 (the "License");
+    you may not use this file except in compliance with the License.
+    You may obtain a copy of the License at
+
+        http://www.apache.org/licenses/LICENSE-2.0
+
+    Unless required by applicable law or agreed to in writing, software
+    distributed under the License is distributed on an "AS IS" BASIS,
+    WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+    See the License for the specific language governing permissions and
+    limitations under the License.
+-->
+
+<div id="qv-wrapper">
+  <div id="qv">
+    <h2>In this document</h2>
+    <ol id="auto-toc">
+    </ol>
+  </div>
+</div>
+
+
+<p>Follow the instructions here to implement the Android graphics HAL.</p>
+
+<h2 id=requirements>Requirements</h2>
+
+<p>The following list and sections describe what you need to provide to support
+graphics in your product:</p>
+
+<ul>
+  <li>OpenGL ES 1.x Driver</li>
+  <li>OpenGL ES 2.0 Driver</li>
+  <li>OpenGL ES 3.0 Driver (optional)</li>
+  <li>EGL Driver</li>
+  <li>Gralloc HAL implementation</li>
+  <li>Hardware Composer HAL implementation</li>
+  <li>Framebuffer HAL implementation</li>
+</ul>
+
+<h2 id=implementation>Implementation</h2>
+
+<h3 id=opengl_and_egl_drivers>OpenGL and EGL drivers</h3>
+
+<p>You must provide drivers for OpenGL ES 1.x, OpenGL ES 2.0, and EGL. Here are
+some key considerations:</p>
+
+<ul>
+  <li>The GL driver needs to be robust and conformant to OpenGL ES
+  standards.</li>
+  <li>Do not limit the number of GL contexts. Because Android allows apps in
+  the background and tries to keep GL contexts alive, you should not limit the
+  number of contexts in your driver.</li>
+  <li>It is not uncommon to have 20-30 active GL contexts at once, so you
+  should also be careful with the amount of memory allocated for each
+  context.</li>
+  <li>Support the YV12 image format and any other YUV image formats that come
+  from other components in the system such as media codecs or the camera.</li>
+  <li>Support the mandatory extensions: <code>GL_OES_texture_external</code>,
+  <code>EGL_ANDROID_image_native_buffer</code>, and
+  <code>EGL_ANDROID_recordable</code>. The
+  <code>EGL_ANDROID_framebuffer_target</code> extension is required for
+  Hardware Composer 1.1 and higher, as well.</li>
+  <li>We highly recommend also supporting
+  <code>EGL_ANDROID_blob_cache</code>, <code>EGL_KHR_fence_sync</code>,
+  <code>EGL_KHR_wait_sync</code>, and
+  <code>EGL_ANDROID_native_fence_sync</code>.</li>
+</ul>
+
+<p>Note the OpenGL API exposed to app developers is different from the OpenGL
+interface that you are implementing. Apps do not have access to the GL driver
+layer and must go through the interface provided by the APIs.</p>
+
+<h3 id=pre-rotation>Pre-rotation</h3>
+
+<p>Many hardware overlays do not support rotation, and even if they do it costs
+processing power. So the solution is to pre-transform the buffer before it
+reaches SurfaceFlinger. A query hint in <code>ANativeWindow</code> was added
+(<code>NATIVE_WINDOW_TRANSFORM_HINT</code>) that represents the most likely
+transform to be applied to the buffer by SurfaceFlinger. Your GL driver can use
+this hint to pre-transform the buffer before it reaches SurfaceFlinger so when
+the buffer arrives, it is correctly transformed.</p>
+
+<p>For example, you may receive a hint to rotate 90 degrees. You must generate
+a matrix and apply it to the buffer to prevent it from running off the end of
+the page. To save power, this should be done in pre-rotation. See the
+<code>ANativeWindow</code> interface defined in
+<code>system/core/include/system/window.h</code> for more details.</p>
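+
+<p>The following pseudo-code sketches the query-and-pre-transform sequence;
+treat it as an outline rather than a drop-in implementation:</p>
+
+<pre class=prettyprint>
+ANativeWindow->query(ANativeWindow, NATIVE_WINDOW_DEFAULT_WIDTH, &w);
+ANativeWindow->query(ANativeWindow, NATIVE_WINDOW_DEFAULT_HEIGHT, &h);
+ANativeWindow->query(ANativeWindow, NATIVE_WINDOW_TRANSFORM_HINT, &hintTransform);
+if (hintTransform & HAL_TRANSFORM_ROT_90)
+    swap(w, h);
+
+native_window_set_buffers_dimensions(anw, w, h);
+ANativeWindow->dequeueBuffer(...);
+
+// Here the GL driver renders content pre-transformed by hintTransform.
+
+int inverseTransform = hintTransform;
+if (hintTransform & HAL_TRANSFORM_ROT_90)
+    inverseTransform ^= HAL_TRANSFORM_ROT_180;
+
+native_window_set_buffers_transform(anw, inverseTransform);
+ANativeWindow->queueBuffer(...);
+</pre>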
+
+<h3 id=gralloc_hal>Gralloc HAL</h3>
+
+<p>The graphics memory allocator is needed to allocate memory that is requested
+by image producers. You can find the interface definition of the HAL at
+<code>hardware/libhardware/include/hardware/gralloc.h</code>, and a stub
+implementation in the <code>hardware/libhardware/modules/gralloc</code>
+directory.</p>
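+
+<p>For orientation only, the sketch below shows how a client of the gralloc
+module might allocate one buffer through the <code>alloc_device_t</code>
+interface defined in that header. The width, height, and usage values are
+illustrative, and error handling is omitted.</p>
+
+<pre class=prettyprint>
+#include &lt;hardware/hardware.h&gt;
+#include &lt;hardware/gralloc.h&gt;
+
+/* Sketch: load the gralloc module and allocate one RGBA buffer. */
+int allocate_buffer(int width, int height, buffer_handle_t *out_handle) {
+    const hw_module_t *module;
+    alloc_device_t *alloc_dev;
+    int stride;
+
+    hw_get_module(GRALLOC_HARDWARE_MODULE_ID, &module);
+    gralloc_open(module, &alloc_dev);
+
+    /* Usage flags tell the allocator how the buffer will be accessed
+     * (CPU, GPU texture, Hardware Composer, protected path, and so on). */
+    return alloc_dev->alloc(alloc_dev, width, height,
+                            HAL_PIXEL_FORMAT_RGBA_8888,
+                            GRALLOC_USAGE_SW_WRITE_OFTEN | GRALLOC_USAGE_HW_TEXTURE,
+                            out_handle, &stride);
+}
+</pre>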
+
+<h3 id=protected_buffers>Protected buffers</h3>
+
+<p>The gralloc usage flag <code>GRALLOC_USAGE_PROTECTED</code> allows the
+graphics buffer to be displayed only through a hardware-protected path. These
+overlay planes are the only way to display DRM content. DRM-protected buffers
+cannot be accessed by SurfaceFlinger or the OpenGL ES driver.</p>
+
+<p>DRM-protected video can be presented only on an overlay plane. Video players
+that support protected content must be implemented with SurfaceView. Software
+running on unprotected hardware cannot read or write the buffer.
+Hardware-protected paths must appear on the Hardware Composer overlay. For
+instance, protected videos will disappear from the display if Hardware Composer
+switches to OpenGL ES composition.</p>
+
+<p>See the <a href="{@docRoot}devices/drm.html">DRM</a> page for a description
+of protected content.</p>
+
+<h3 id=hardware_composer_hal>Hardware Composer HAL</h3>
+
+<p>The Hardware Composer HAL is used by SurfaceFlinger to composite surfaces to
+the screen. The Hardware Composer abstracts objects like overlays and 2D
+blitters and helps offload some work that would normally be done with
+OpenGL.</p>
+
+<p>We recommend you start using version 1.3 of the Hardware Composer HAL as it
+will provide support for the newest features (explicit synchronization,
+external displays, and more). Because the physical display hardware behind the
+Hardware Composer abstraction layer can vary from device to device, it is
+difficult to define recommended features. But here is some guidance:</p>
+
+<ul>
+  <li>The Hardware Composer should support at least four overlays (status bar,
+  system bar, application, and wallpaper/background).</li>
+  <li>Layers can be bigger than the screen, so the Hardware Composer should be
+  able to handle layers that are larger than the display (for example, a
+  wallpaper).</li>
+  <li>Pre-multiplied per-pixel alpha blending and per-plane alpha blending
+  should be supported at the same time.</li>
+  <li>The Hardware Composer should be able to consume the same buffers that
+  the GPU, camera, video decoder, and Skia are producing, so supporting some
+  of the following properties is helpful:
+    <ul>
+      <li>RGBA packing order</li>
+      <li>YUV formats</li>
+      <li>Tiling, swizzling, and stride properties</li>
+    </ul>
+  </li>
+  <li>A hardware path for protected video playback must be present if you want
+  to support protected content.</li>
+</ul>
+
+<p>The general recommendation when implementing your Hardware Composer is to
+implement a non-operational Hardware Composer first. Once you have the
+structure done, implement a simple algorithm to delegate composition to the
+Hardware Composer. For example, just delegate the first three or four surfaces
+to the overlay hardware of the Hardware Composer.</p>
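+
+<p>As a sketch of what that first pass might look like with the Hardware
+Composer 1.x structures, the <code>prepare()</code> hook below sends the first
+few layers to overlay hardware and leaves the rest to OpenGL ES composition.
+The <code>MAX_OVERLAYS</code> value is a placeholder for a device-specific
+limit; a true no-op composer would simply mark every layer
+<code>HWC_FRAMEBUFFER</code>.</p>
+
+<pre class=prettyprint>
+#include &lt;hardware/hwcomposer.h&gt;
+
+#define MAX_OVERLAYS 4   /* placeholder for a device-specific limit */
+
+/* Sketch: mark the first few layers as overlays; everything else falls back
+ * to GLES composition by SurfaceFlinger. */
+static int simple_prepare(hwc_composer_device_1_t *dev, size_t numDisplays,
+                          hwc_display_contents_1_t **displays) {
+    hwc_display_contents_1_t *contents = displays[0];  /* primary display */
+    size_t overlays_used = 0;
+
+    for (size_t i = 0; i != contents->numHwLayers; i++) {
+        hwc_layer_1_t *layer = &contents->hwLayers[i];
+
+        /* The target buffer for GLES composition is left untouched. */
+        if (layer->compositionType == HWC_FRAMEBUFFER_TARGET)
+            continue;
+
+        if (overlays_used != MAX_OVERLAYS) {
+            layer->compositionType = HWC_OVERLAY;     /* overlay hardware */
+            overlays_used++;
+        } else {
+            layer->compositionType = HWC_FRAMEBUFFER; /* composite with GLES */
+        }
+    }
+    return 0;
+}
+</pre>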
+
+<p>Focus on optimization, such as intelligently selecting the surfaces to send
+to the overlay hardware that maximizes the load taken off of the GPU. Another
+optimization is to detect whether the screen is updating. If not, delegate
+composition to OpenGL instead of the Hardware Composer to save power. When the
+screen updates again, continue to offload composition to the Hardware
+Composer.</p>
+
+<p>Devices must report the display mode (or resolution). Android uses the first
+mode reported by the device. To support televisions, have the TV device report
+the mode selected for it by the manufacturer to Hardware Composer. See
+hwcomposer.h for more details.</p>
+
+<p>Prepare for common use cases, such as:</p>
+
+<ul>
+  <li>Full-screen games in portrait and landscape mode</li>
+  <li>Full-screen video with closed captioning and playback control</li>
+  <li>The home screen (compositing the status bar, system bar, application
+  window, and live wallpapers)</li>
+  <li>Protected video playback</li>
+  <li>Multiple display support</li>
+</ul>
+
+<p>These use cases should address regular, predictable uses rather than edge
+cases that are rarely encountered. Otherwise, any optimization will have little
+benefit. Implementations must balance two competing goals: animation smoothness
+and interaction latency.</p>
+
+<p>Further, to make best use of Android graphics, you must develop a robust
+clocking strategy. Performance matters little if clocks have been turned down
+to make every operation slow. You need a clocking strategy that puts the clocks
+at high speed when needed, such as to make animations seamless, and then slows
+the clocks whenever the increased speed is no longer needed.</p>
+
+<p>Use the <code>adb shell dumpsys SurfaceFlinger</code> command to see
+precisely what SurfaceFlinger is doing. See the <a
+href="{@docRoot}devices/graphics/architecture.html#hwcomposer">Hardware
+Composer</a> section of the Architecture page for example output and a
+description of relevant fields.</p>
+
+<p>You can find the HAL for the Hardware Composer and additional documentation
+in <code>hardware/libhardware/include/hardware/hwcomposer.h</code> and
+<code>hardware/libhardware/include/hardware/hwcomposer_defs.h</code>.</p>
+
+<p>A stub implementation is available in the
+<code>hardware/libhardware/modules/hwcomposer</code> directory.</p>
+
+<h3 id=vsync>VSYNC</h3>
+
+<p>VSYNC synchronizes certain events to the refresh cycle of the display.
+Applications always start drawing on a VSYNC boundary, and SurfaceFlinger
+always composites on a VSYNC boundary. This eliminates stutters and improves
+visual performance of graphics. The Hardware Composer has a function
+pointer:</p>
+
+<pre class=prettyprint>int (*waitForVsync)(int64_t *timestamp);</pre>
+
+
+<p>This points to a function you must implement for VSYNC. This function blocks
+until a VSYNC occurs and returns the timestamp of the actual VSYNC. A message
+must be sent every time VSYNC occurs. A client can receive a VSYNC timestamp
+once, at specified intervals, or continuously (interval of 1). You must
+implement VSYNC to have no more than a 1ms lag at the maximum (0.5ms or less is
+recommended), and the timestamps returned must be extremely accurate.</p>
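+
+<p>In Hardware Composer HAL versions 1.0 and later, VSYNC is normally reported
+through the <code>vsync</code> callback that SurfaceFlinger registers with
+<code>registerProcs()</code> and enables through <code>eventControl()</code>.
+The sketch below shows one way a device's VSYNC thread might invoke that
+callback; <code>wait_for_hw_vsync()</code> is a hypothetical, device-specific
+helper that blocks on the display interrupt.</p>
+
+<pre class=prettyprint>
+#include &lt;stdint.h&gt;
+#include &lt;hardware/hwcomposer.h&gt;
+
+/* Hypothetical device-specific helper: blocks until the next hardware VSYNC
+ * and returns its timestamp in nanoseconds. */
+extern int64_t wait_for_hw_vsync(void);
+
+struct my_hwc_device {
+    hwc_composer_device_1_t base;
+    hwc_procs_t const *procs;   /* saved in the registerProcs() hook */
+    int vsync_enabled;          /* toggled by eventControl(HWC_EVENT_VSYNC) */
+};
+
+/* Sketch: report each hardware VSYNC to SurfaceFlinger via procs->vsync(). */
+static void *vsync_thread(void *arg) {
+    struct my_hwc_device *hwc = (struct my_hwc_device *)arg;
+
+    for (;;) {
+        int64_t timestamp = wait_for_hw_vsync();
+        if (hwc->vsync_enabled && hwc->procs && hwc->procs->vsync) {
+            /* Display 0 is the primary (built-in) display. */
+            hwc->procs->vsync(hwc->procs, 0, timestamp);
+        }
+    }
+    return NULL;
+}
+</pre>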
+
+<h4 id=explicit_synchronization>Explicit synchronization</h4>
+
+<p>Explicit synchronization is required and provides a mechanism for Gralloc
+buffers to be acquired and released in a synchronized way. Explicit
+synchronization allows producers and consumers of graphics buffers to signal
+when they are done with a buffer. This allows the Android system to
+asynchronously queue buffers to be read or written with the certainty that
+another consumer or producer does not currently need them. See the <a
+href="#synchronization_framework">Synchronization framework</a> section for an overview of
+this mechanism.</p>
+
+<p>The benefits of explicit synchronization include less behavior variation
+between devices, better debugging support, and improved testing metrics. For
+instance, the sync framework output readily identifies problem areas and root
+causes. And centralized SurfaceFlinger presentation timestamps show when events
+occur in the normal flow of the system.</p>
+
+<p>This communication is facilitated by the use of synchronization fences,
+which are now required when requesting a buffer for consuming or producing. The
+synchronization framework consists of three main building blocks:
+sync_timeline, sync_pt, and sync_fence.</p>
+
+<h5 id=sync_timeline>sync_timeline</h5>
+
+<p>A sync_timeline is a monotonically increasing timeline that should be
+implemented for each driver instance, such as a GL context, display controller,
+or 2D blitter. This is essentially a counter of jobs submitted to the kernel
+for a particular piece of hardware. It provides guarantees about the order of
+operations and allows hardware-specific implementations.</p>
+
+<p>Please note, a CPU-only reference implementation of sync_timeline, called
+sw_sync (which stands for software sync), is provided. If possible, use sw_sync
+instead of implementing your own sync_timeline to save resources and avoid
+complexity. If you’re not employing a hardware resource, sw_sync should be
+sufficient.</p>
+
+<p>If you must implement a sync_timeline, use the sw_sync driver as a starting
+point. Follow these guidelines:</p>
+
+<ul>
+  <li>Provide useful names for all drivers, timelines, and fences. This
+  simplifies debugging.</li>
+  <li>Implement <code>timeline_value_str</code> and <code>pt_value_str</code>
+  operators in your timelines, as they make debugging output much more
+  readable.</li>
+  <li>If you want your userspace libraries (such as the GL library) to have
+  access to the private data of your timelines, implement the
+  <code>fill_driver_data</code> operator. This lets you get information about
+  the immutable sync_fence and sync_pts so you might build command lines based
+  upon them.</li>
+</ul>
+
+<p>When implementing a sync_timeline, <strong>don’t</strong>:</p>
+
+<ul>
+  <li>Base it on any real view of time, such as when a wall clock or other
+  piece of work might finish. It is better to create an abstract timeline that
+  you can control.</li>
+  <li>Allow userspace to explicitly create or signal a fence. This can result
+  in one piece of the user pipeline creating a denial-of-service attack that
+  halts all functionality. This is because the userspace cannot make promises
+  on behalf of the kernel.</li>
+  <li>Access sync_timeline, sync_pt, or sync_fence elements explicitly, as the
+  API should provide all required functions.</li>
+</ul>
+
+<h5 id=sync_pt>sync_pt</h5>
+
+<p>A sync_pt is a single value or point on a sync_timeline. A point has three
+states: active, signaled, and error. Points start in the active state and
+transition to the signaled or error states. For instance, when a buffer is no
+longer needed by an image consumer, this sync_point is signaled so that image
+producers know it is okay to write into the buffer again.</p>
+
+<h5 id=sync_fence>sync_fence</h5>
+
+<p>A sync_fence is a collection of sync_pts that often have different
+sync_timeline parents (such as for the display controller and GPU). These are
+the main primitives over which drivers and userspace communicate their
+dependencies. A fence is a promise from the kernel that it gives upon accepting
+work that has been queued and assures completion in a finite amount of
+time.</p>
+
+<p>This allows multiple consumers or producers to signal they are using a
+buffer and to allow this information to be communicated with one function
+parameter. Fences are backed by a file descriptor and can be passed from
+kernel-space to user-space. For instance, a fence can contain two sync_points
+that signify when two separate image consumers are done reading a buffer. When
+the fence is signaled, the image producers know both consumers are done
+consuming.</p>
+
+<p>Fences, like sync_pts, start active and then change state based upon the
+state of their points. If all sync_pts become signaled, the sync_fence becomes
+signaled. If one sync_pt falls into an error state, the entire sync_fence has
+an error state.</p>
+
+<p>Membership in the sync_fence is immutable once the fence is created. And
+since a sync_pt can be in only one fence, it is included as a copy. Even if two
+points have the same value, there will be two copies of the sync_pt in the
+fence.</p>
+
+<p>To get more than one point in a fence, a merge operation is conducted. In the
+merge, the points from two distinct fences are added to a third fence. If one
+of those points was signaled in the originating fence, and the other was not,
+the third fence will also not be in a signaled state.</p>
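+
+<p>The snippet below is a small illustration (not production code) of how a
+timeline, a point, and a fence relate, using the sw_sync reference
+implementation through the libsync helpers; the timeline value and fence name
+are arbitrary.</p>
+
+<pre class=prettyprint>
+#include &lt;unistd.h&gt;
+#include &lt;sync/sync.h&gt;
+
+/* Sketch using sw_sync (software sync) to show the timeline/point/fence
+ * relationship. */
+void sw_sync_example(void) {
+    /* A timeline is a monotonically increasing counter of submitted work. */
+    int timeline = sw_sync_timeline_create();
+
+    /* Create a fence containing a single sync_pt that signals when the
+     * timeline reaches the value 1. */
+    int fence = sw_sync_fence_create(timeline, "job-1-done", 1);
+
+    /* ... hand "fence" to a consumer; it would call sync_wait(fence, ms)
+     * before touching the associated buffer ... */
+
+    /* Advancing the timeline to the point's value signals it, and therefore
+     * signals the fence. */
+    sw_sync_timeline_inc(timeline, 1);
+
+    close(fence);
+    close(timeline);
+}
+</pre>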
+
+<p>To implement explicit synchronization, you need to provide the
+following:</p>
+
+<ul>
+  <li>A kernel-space driver that implements a synchronization timeline for a
+  particular piece of hardware. Drivers that need to be fence-aware are
+  generally anything that accesses or communicates with the Hardware Composer.
+  Here are the key files (found in the android-3.4 kernel branch):
+    <ul>
+      <li>Core implementation:
+        <ul>
+          <li><code>kernel/common/include/linux/sync.h</code></li>
+          <li><code>kernel/common/drivers/base/sync.c</code></li>
+        </ul>
+      </li>
+      <li>sw_sync:
+        <ul>
+          <li><code>kernel/common/include/linux/sw_sync.h</code></li>
+          <li><code>kernel/common/drivers/base/sw_sync.c</code></li>
+        </ul>
+      </li>
+      <li>Documentation:
+      <code>kernel/common/Documentation/sync.txt</code></li>
+    </ul>
+  Finally, the <code>platform/system/core/libsync</code> directory includes a
+  library to communicate with the kernel-space.</li>
+  <li>A Hardware Composer HAL module (version 1.3 or later) that supports the
+  new synchronization functionality. You will need to provide the appropriate
+  synchronization fences as parameters to the <code>set()</code> and
+  <code>prepare()</code> functions in the HAL.</li>
+  <li>Two GL-specific extensions related to fences,
+  <code>EGL_ANDROID_native_fence_sync</code> and
+  <code>EGL_ANDROID_wait_sync</code>, along with incorporating fence support
+  into your graphics drivers.</li>
+</ul>
+
+<p>For example, to use the API supporting the synchronization function, you
+might develop a display driver that has a display buffer function. Before the
+synchronization framework existed, this function would receive dma-bufs, put
+those buffers on the display, and block while the buffer is visible, like
+so:</p>
+
+<pre class=prettyprint>
+/*
+ * assumes buf is ready to be displayed.  returns when buffer is no longer on
+ * screen.
+ */
+void display_buffer(struct dma_buf *buf); </pre>
+
+
+<p>With the synchronization framework, the API call is slightly more complex.
+While putting a buffer on display, you associate it with a fence that says when
+the buffer will be ready. So you queue up the work, which you will initiate
+once the fence clears.</p>
+
+<p>In this manner, you are not blocking anything. You immediately return your
+own fence, which is a guarantee of when the buffer will be off of the display.
+As you queue up buffers, the kernel will list dependencies. With the
+synchronization framework:</p>
+
+<pre class=prettyprint>
+/*
+ * will display buf when fence is signaled.  returns immediately with a fence
+ * that will signal when buf is no longer displayed.
+ */
+struct sync_fence* display_buffer(struct dma_buf *buf, struct sync_fence
+*fence); </pre>
+
+
+<h4 id=sync_integration>Sync integration</h4>
+
+<h5 id=integration_conventions>Integration conventions</h5>
+
+<p>This section explains how to integrate the low-level sync framework with
+different parts of the Android framework and the drivers that need to
+communicate with one another.</p>
+
+<p>The Android HAL interfaces for graphics follow consistent conventions so
+when file descriptors are passed across a HAL interface, ownership of the file
+descriptor is always transferred. This means:</p>
+
+<ul>
+  <li>If you receive a fence file descriptor from the sync framework, you must
+  close it.</li>
+  <li>If you return a fence file descriptor to the sync framework, the
+  framework will close it.</li>
+  <li>If you want to continue using the fence file descriptor, you must
+  duplicate the descriptor.</li>
+</ul>
+
+<p>Every time a fence is passed through BufferQueue - such as for a window that
+passes a fence to BufferQueue saying when its new contents will be ready - the
+fence object is renamed. Since kernel fence support allows fences to have
+strings for names, the sync framework uses the window name and buffer index
+that is being queued to name the fence, for example:
+<code>SurfaceView:0</code></p>
+
+<p>This is helpful in debugging to identify the source of a deadlock. Those
+names appear in the output of <code>/d/sync</code> and bug reports when
+taken.</p>
+
+<h5 id=anativewindow_integration>ANativeWindow integration</h5>
+
+<p>ANativeWindow is fence aware. <code>dequeueBuffer</code>,
+<code>queueBuffer</code>, and <code>cancelBuffer</code> have fence
+parameters.</p>
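+
+<p>For illustration, here is a rough sketch of that fence-aware dequeue/queue
+cycle using the <code>ANativeWindow</code> interface from
+<code>system/core/include/system/window.h</code>; it shows a CPU producer and
+omits error handling.</p>
+
+<pre class=prettyprint>
+#include &lt;unistd.h&gt;
+#include &lt;sync/sync.h&gt;
+#include &lt;system/window.h&gt;
+
+/* Sketch: dequeue a buffer with its fence, wait for the fence before writing,
+ * then queue the buffer back with a "ready now" fence of -1. */
+void produce_with_fences(struct ANativeWindow *win) {
+    struct ANativeWindowBuffer *buf;
+    int fence_fd = -1;
+
+    win->dequeueBuffer(win, &buf, &fence_fd);
+
+    /* A CPU producer must wait until the previous consumer is done with the
+     * buffer; a GPU producer could pass the fence into its driver instead. */
+    if (fence_fd >= 0) {
+        sync_wait(fence_fd, -1);   /* -1 timeout: wait forever */
+        close(fence_fd);
+    }
+
+    /* ... write into the buffer ... */
+
+    /* -1 means the contents are ready now; a GPU producer would instead pass
+     * a fence that signals when its rendering completes. */
+    win->queueBuffer(win, buf, -1);
+}
+</pre>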
+
+<h5 id=opengl_es_integration>OpenGL ES integration</h5>
+
+<p>OpenGL ES sync integration relies upon these two EGL extensions:</p>
+
+<ul>
+  <li><code>EGL_ANDROID_native_fence_sync</code> - provides a way to either
+  wrap or create native Android fence file descriptors in EGLSyncKHR
+  objects.</li>
+  <li><code>EGL_ANDROID_wait_sync</code> - allows GPU-side stalls rather than
+  CPU-side, making the GPU wait for an EGLSyncKHR. This is essentially the
+  same as the <code>EGL_KHR_wait_sync</code> extension. See the
+  <code>EGL_KHR_wait_sync</code> specification for details.</li>
+</ul>
+
+<p>These extensions can be used independently and are controlled by a compile
+flag in libgui. To use them, first implement the
+<code>EGL_ANDROID_native_fence_sync</code> extension along with the associated
+kernel support. Next, add ANativeWindow support for fences to your driver and
+then turn on support in libgui to make use of the
+<code>EGL_ANDROID_native_fence_sync</code> extension.</p>
+
+<p>Then, as a second pass, enable the <code>EGL_ANDROID_wait_sync</code>
+extension in your driver and turn it on separately. The
+<code>EGL_ANDROID_native_fence_sync</code> extension consists of a distinct
+native fence EGLSync object type so extensions that apply to existing EGLSync
+object types don’t necessarily apply to <code>EGL_ANDROID_native_fence</code>
+objects to avoid unwanted interactions.</p>
+
+<p>The EGL_ANDROID_native_fence_sync extension employs a corresponding native
+fence file descriptor attribute that can be set only at creation time and
+cannot be directly queried onward from an existing sync object. This attribute
+can be set to one of two modes:</p>
+
+<ul>
+  <li>A valid fence file descriptor - wraps an existing native Android fence
+  file descriptor in an EGLSyncKHR object.</li>
+  <li>-1 - creates a native Android fence file descriptor from an EGLSyncKHR
+  object.</li>
+</ul>
+
+<p>The DupNativeFenceFD function call is used to extract the native Android
+fence file descriptor from the EGLSyncKHR object. This has the same result as
+querying the attribute that was set but adheres to the convention that the
+recipient closes the fence (hence the duplicate operation). Finally, destroying
+the EGLSync object should close the internal fence attribute.</p>
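+
+<p>Under those conventions, a driver or middleware author might extract a fence
+for in-flight GL work roughly as follows; this is a sketch only, the extension
+entry points are fetched with <code>eglGetProcAddress()</code>, and error
+handling and destruction of the EGLSync object are omitted.</p>
+
+<pre class=prettyprint>
+#include &lt;EGL/egl.h&gt;
+#include &lt;EGL/eglext.h&gt;
+#include &lt;GLES2/gl2.h&gt;
+
+/* Sketch: create a native-fence EGLSync for the GL work submitted so far and
+ * return an Android fence fd that signals when that work completes. The
+ * caller owns (and must close) the returned descriptor. */
+int fence_fd_for_current_gl_work(EGLDisplay dpy) {
+    PFNEGLCREATESYNCKHRPROC createSync =
+        (PFNEGLCREATESYNCKHRPROC)eglGetProcAddress("eglCreateSyncKHR");
+    PFNEGLDUPNATIVEFENCEFDANDROIDPROC dupNativeFenceFD =
+        (PFNEGLDUPNATIVEFENCEFDANDROIDPROC)eglGetProcAddress("eglDupNativeFenceFDANDROID");
+
+    EGLSyncKHR sync = createSync(dpy, EGL_SYNC_NATIVE_FENCE_ANDROID, NULL);
+
+    /* Flush so the fence is actually submitted behind the queued GL work. */
+    glFlush();
+
+    return dupNativeFenceFD(dpy, sync);
+}
+</pre>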
+
+<h5 id=hardware_composer_integration>Hardware Composer integration</h5>
+
+<p>Hardware Composer handles three types of sync fences:</p>
+
+<ul>
+  <li><em>Acquire fence</em> - one per layer, this is set before calling
+  HWC::set. It signals when Hardware Composer may read the buffer.</li>
+  <li><em>Release fence</em> - one per layer, this is filled in by the driver
+  in HWC::set. It signals when Hardware Composer is done reading the buffer so
+  the framework can start using that buffer again for that particular
+  layer.</li>
+  <li><em>Retire fence</em> - one for the entire frame, this is filled in by
+  the driver each time HWC::set is called. It covers all of the layers for the
+  set operation and signals to the framework when all of the effects of this
+  set operation have completed. The retire fence signals when the next set
+  operation takes place on the screen.</li>
+</ul>
+
+<p>The retire fence can be used to determine how long each frame appears on the
+screen. This is useful in identifying the location and source of delays, such
+as a stuttering animation. </p>
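+
+<p>A sketch of how a Hardware Composer 1.x <code>set()</code> implementation
+might fill in these fences is shown below. The
+<code>commit_frame_to_display()</code> helper is hypothetical and
+device-specific, and a real implementation would usually return distinct
+per-layer release fences.</p>
+
+<pre class=prettyprint>
+#include &lt;unistd.h&gt;
+#include &lt;hardware/hwcomposer.h&gt;
+
+/* Hypothetical device-specific helper: commits the frame and returns a fence
+ * fd that signals when the hardware is done reading the layer buffers. */
+extern int commit_frame_to_display(hwc_display_contents_1_t *contents);
+
+/* Sketch: consume acquire fences, then hand back release and retire fences. */
+static int simple_set(hwc_composer_device_1_t *dev, size_t numDisplays,
+                      hwc_display_contents_1_t **displays) {
+    hwc_display_contents_1_t *contents = displays[0];  /* primary display */
+    int done_fence = commit_frame_to_display(contents);
+
+    for (size_t i = 0; i != contents->numHwLayers; i++) {
+        hwc_layer_1_t *layer = &contents->hwLayers[i];
+
+        /* The acquire fence was ours to consume; close it once the commit
+         * helper has waited on (or programmed) it. */
+        if (layer->acquireFenceFd >= 0) {
+            close(layer->acquireFenceFd);
+            layer->acquireFenceFd = -1;
+        }
+
+        /* Signals when the hardware is done reading this layer's buffer. */
+        layer->releaseFenceFd = dup(done_fence);
+    }
+
+    /* Signals when this set operation's effects are replaced on screen. */
+    contents->retireFenceFd = done_fence;
+    return 0;
+}
+</pre>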
+
+<h4 id=vsync_offset>VSYNC Offset</h4>
+
+<p>Application and SurfaceFlinger render loops should be synchronized to the
+hardware VSYNC. On a VSYNC event, the display begins showing frame N while
+SurfaceFlinger begins compositing windows for frame N+1. The app handles
+pending input and generates frame N+2.</p>
+
+<p>Synchronizing with VSYNC delivers consistent latency. It reduces errors in
+apps and SurfaceFlinger and the drifting of displays in and out of phase with
+each other. This, however, does assume application and SurfaceFlinger per-frame
+times don’t vary widely. Nevertheless, the latency is at least two frames.</p>
+
+<p>To remedy this, you may employ VSYNC offsets to reduce the input-to-display
+latency by making application and composition signal relative to hardware
+VSYNC. This is possible because application plus composition usually takes less
+than 33 ms.</p>
+
+<p>The result of VSYNC offset is three signals with the same period but offset
+phases:</p>
+
+<ul>
+  <li><em>HW_VSYNC_0</em> - Display begins showing next frame</li>
+  <li><em>VSYNC</em> - App reads input and generates next frame</li>
+  <li><em>SF VSYNC</em> - SurfaceFlinger begins compositing for next frame</li>
+</ul>
+
+<p>With VSYNC offset, SurfaceFlinger receives the buffer and composites the
+frame, while the application processes the input and renders the frame, all
+within a single frame of time.</p>
+
+<p>Please note, VSYNC offsets reduce the time available for app and composition
+and therefore provide a greater chance for error.</p>
+
+<h5 id=dispsync>DispSync</h5>
+
+<p>DispSync maintains a model of the periodic hardware-based VSYNC events of a
+display and uses that model to execute periodic callbacks at specific phase
+offsets from the hardware VSYNC events.</p>
+
+<p>DispSync is essentially a software phase-locked loop (PLL) that generates the
+VSYNC and SF VSYNC signals used by Choreographer and SurfaceFlinger, even if
+not offset from hardware VSYNC.</p>
+
+<img src="images/dispsync.png" alt="DispSync flow">
+
+<p class="img-caption"><strong>Figure 4.</strong> DispSync flow</p>
+
+<p>DispSync has these qualities:</p>
+
+<ul>
+  <li><em>Reference</em> - HW_VSYNC_0</li>
+  <li><em>Output</em> - VSYNC and SF VSYNC</li>
+  <li><em>Feedback</em> - Retire fence signal timestamps from Hardware
+  Composer</li>
+</ul>
+
+<h5 id=vsync_retire_offset>VSYNC/Retire Offset</h5>
+
+<p>The signal timestamp of retire fences must match HW VSYNC even on devices
+that don’t use the offset phase. Otherwise, errors appear to have greater
+severity than reality.</p>
+
+<p>“Smart” panels often have a delta. Retire fence is the end of direct memory
+access (DMA) to display memory. The actual display switch and HW VSYNC is some
+time later.</p>
+
+<p><code>PRESENT_TIME_OFFSET_FROM_VSYNC_NS</code> is set in the device’s
+BoardConfig.mk make file. It is based upon the display controller and panel
+characteristics. Time from retire fence timestamp to HW Vsync signal is
+measured in nanoseconds.</p>
+
+<h5 id=vsync_and_sf_vsync_offsets>VSYNC and SF_VSYNC Offsets</h5>
+
+<p>The <code>VSYNC_EVENT_PHASE_OFFSET_NS</code> and
+<code>SF_VSYNC_EVENT_PHASE_OFFSET_NS</code> are set conservatively based on
+high-load use cases, such as partial GPU composition during window transition
+or Chrome scrolling through a webpage containing animations. These offsets
+allow for long application render time and long GPU composition time.</p>
+
+<p>More than a millisecond or two of latency is noticeable. We recommend
+integrating thorough automated error testing to minimize latency without
+significantly increasing error counts.</p>
+
+<p>Note these offsets are also set in the device’s BoardConfig.mk make file.
+The default if not set is zero offset. Both settings are offset in nanoseconds
+after HW_VSYNC_0. Either can be negative.</p>
+
+<h3 id=virtual_displays>Virtual displays</h3>
+
+<p>Android added support for virtual displays to Hardware Composer in version
+1.3. This support was implemented in the Android platform and can be used by
+Miracast.</p>
+
+<p>The virtual display composition is similar to the physical display: Input
+layers are described in prepare(), SurfaceFlinger conducts GPU composition, and
+layers and the GPU framebuffer are provided to Hardware Composer in set().</p>
+
+<p>Instead of the output going to the screen, it is sent to a gralloc buffer.
+Hardware Composer writes output to a buffer and provides the completion fence.
+The buffer is sent to an arbitrary consumer: video encoder, GPU, CPU, etc.
+Virtual displays can use 2D/blitter or overlays if the display pipeline can
+write to memory.</p>
+
+<h4 id=modes>Modes</h4>
+
+<p>Each frame is in one of three modes after prepare():</p>
+
+<ul>
+  <li><em>GLES</em> - All layers composited by the GPU. The GPU writes directly
+  to the output buffer while Hardware Composer does nothing. This is equivalent
+  to virtual display composition with Hardware Composer versions earlier than
+  1.3.</li>
+  <li><em>MIXED</em> - The GPU composites some layers to the framebuffer, and
+  Hardware Composer composites the framebuffer and remaining layers. The GPU
+  writes to a scratch buffer (the framebuffer). Hardware Composer reads the
+  scratch buffer and writes to the output buffer. Buffers may have different
+  formats, e.g. RGBA and YCbCr.</li>
+  <li><em>HWC</em> - All layers composited by Hardware Composer, which writes
+  directly to the output buffer.</li>
+</ul>
+
+<h4 id=output_format>Output format</h4>
+
+<p><em>MIXED and HWC modes</em>: If the consumer needs CPU access, the consumer
+chooses the format. Otherwise, the format is IMPLEMENTATION_DEFINED. Gralloc
+can choose best format based on usage flags. For example, choose a YCbCr format
+if the consumer is video encoder, and Hardware Composer can write the format
+efficiently.</p>
+
+<p><em>GLES mode</em>: EGL driver chooses output buffer format in
+dequeueBuffer(), typically RGBA8888. The consumer must be able to accept this
+format.</p>
+
+<h4 id=egl_requirement>EGL requirement</h4>
+
+<p>Hardware Composer 1.3 virtual displays require that eglSwapBuffers() does
+not dequeue the next buffer immediately. Instead, it should defer dequeueing
+the buffer until rendering begins. Otherwise, EGL always owns the “next” output
+buffer. SurfaceFlinger can’t get the output buffer for Hardware Composer in
+MIXED/HWC mode. </p>
+
+<p>If Hardware Composer always sends all virtual display layers to GPU, all
+frames will be in GLES mode. Although it is not recommended, you may use this
+method if you need to support Hardware Composer 1.3 for some other reason but
+can’t conduct virtual display composition.</p>
+
+<h2 id=testing>Testing</h2>
+
+<p>For benchmarking, we suggest following this flow by phase:</p>
+
+<ul>
+  <li><em>Specification</em> - When initially specifying the device, such as
+  when using immature drivers, you should use predefined (fixed) clocks and
+  workloads to measure the frames per second rendered. This gives a clear view
+  of what the hardware is capable of doing.</li>
+  <li><em>Development</em> - In the development phase as drivers mature, you
+  should use a fixed set of user actions to measure the number of visible
+  stutters (janks) in animations.</li>
+  <li><em>Production</em> - Once the device is ready for production and you
+  want to compare against competitors, you should increase the workload until
+  stutters increase. Determine if the current clock settings can keep up with
+  the load. This can help you identify where you might be able to slow the
+  clocks and reduce power use.</li>
+</ul>
+
+<p>For the specification phase, Android offers the Flatland tool to help derive
+device capabilities. It can be found at:
+<code>platform/frameworks/native/cmds/flatland/</code></p>
+
+<p>Flatland relies upon fixed clocks and shows the throughput that can be
+achieved with composition-based workloads. It uses gralloc buffers to simulate
+multiple window scenarios, filling in the window with GL and then measuring the
+compositing. Please note, Flatland uses the synchronization framework to
+measure time. So you must support the synchronization framework to readily use
+Flatland.</p>
diff --git a/src/devices/images/drm_framework.png b/src/devices/images/drm_framework.png
new file mode 100644
index 0000000..06afe05
--- /dev/null
+++ b/src/devices/images/drm_framework.png
Binary files differ
diff --git a/src/devices/images/drm_hal.png b/src/devices/images/drm_hal.png
index ef6379b..6c43422 100644
--- a/src/devices/images/drm_hal.png
+++ b/src/devices/images/drm_hal.png
Binary files differ
diff --git a/src/devices/images/drm_license_metadata.png b/src/devices/images/drm_license_metadata.png
new file mode 100644
index 0000000..2076866
--- /dev/null
+++ b/src/devices/images/drm_license_metadata.png
Binary files differ
diff --git a/src/devices/images/drm_plugin.png b/src/devices/images/drm_plugin.png
new file mode 100644
index 0000000..d332ce6
--- /dev/null
+++ b/src/devices/images/drm_plugin.png
Binary files differ
diff --git a/src/devices/images/drm_plugin_lifecycle.png b/src/devices/images/drm_plugin_lifecycle.png
new file mode 100644
index 0000000..b04acb5
--- /dev/null
+++ b/src/devices/images/drm_plugin_lifecycle.png
Binary files differ
diff --git a/src/devices/images/graphics_surface.png b/src/devices/images/graphics_surface.png
deleted file mode 100644
index e32792d..0000000
--- a/src/devices/images/graphics_surface.png
+++ /dev/null
Binary files differ
diff --git a/src/devices/index.jd b/src/devices/index.jd
index f0b4e42..da9438d 100644
--- a/src/devices/index.jd
+++ b/src/devices/index.jd
@@ -33,7 +33,7 @@
   <p>To ensure that your devices maintain a high level of quality and offers a consistent
   experience for your users, they must must also
   pass the tests in the compatibility test suite (CTS). CTS ensures that anyone
-  building a device meets a quality standard that ensures apps run reliabaly well
+  building a device meets a quality standard that ensures apps run reliably well
   and gives users a good experience. For more information, see the
   <a href="{@docRoot}compatibility/index.html">Compatibility</a> section.</p>
 
diff --git a/src/devices/latency_design.jd b/src/devices/latency_design.jd
index eb503f3..15485a5 100644
--- a/src/devices/latency_design.jd
+++ b/src/devices/latency_design.jd
@@ -53,11 +53,11 @@
 </p>
 
 <ul>
-<li>presence of a fast mixer thread for this output (see below)</li>
-<li>track sample rate</li>
-<li>presence of a client thread to execute callback handlers for this track</li>
-<li>track buffer size</li>
-<li>available fast track slots (see below)</li>
+<li>Presence of a fast mixer thread for this output (see below)</li>
+<li>Track sample rate</li>
+<li>Presence of a client thread to execute callback handlers for this track</li>
+<li>Track buffer size</li>
+<li>Available fast track slots (see below)</li>
 </ul>
 
 <p>
@@ -85,7 +85,7 @@
 </p>
 
 <ul>
-<li>mixing of the normal mixer's sub-mix and up to 7 client fast tracks</li>
+<li>Mixing of the normal mixer's sub-mix and up to 7 client fast tracks</li>
 <li>Per track attenuation</li>
 </ul>
 
diff --git a/src/devices/tech/security/acknowledgements.jd b/src/devices/tech/security/acknowledgements.jd
index 4fc7f80..485af30 100644
--- a/src/devices/tech/security/acknowledgements.jd
+++ b/src/devices/tech/security/acknowledgements.jd
@@ -16,7 +16,6 @@
     See the License for the specific language governing permissions and
     limitations under the License.
 -->
-
 <p>The Android Security Team would like to thank the following people and
 parties for helping to improve Android security. They have done this either by
 finding and responsibly reporting security vulnerabilities to <a
@@ -27,21 +26,21 @@
 
 <h2>2014</h2>
 
+<div style="LINE-HEIGHT:25px;">
 <p>Jeff Forristal of <a href="http://www.bluebox.com/blog/">Bluebox
 Security</a></p>
 
-<p>Aaron Mangeli of <a href="https://banno.com/">Banno</a> (<a
+<p>Aaron Mangel of <a href="https://banno.com/">Banno</a> (<a
 href="mailto:amangel@gmail.com">amangel@gmail.com</a>)</p>
 
 <p><a href="http://www.linkedin.com/in/tonytrummer/">Tony Trummer</a> of <a
-href="http://www.themeninthemiddle.com">The Men in the Middle</a> (<a
+href="http://www.themeninthemiddle.com">The Men in the Middle</a> <br>(<a
 href="https://twitter.com/SecBro1">@SecBro1</a>)</p>
 
 <p><a href="http://www.samsung.com">Samsung Mobile</a></p>
 
 <p>Henry Hoggard of <a href="https://labs.mwrinfosecurity.com/">MWR Labs</a> (<a
 href="https://twitter.com/henryhoggard">@HenryHoggard</a>)</p>
-<p></p>
 
 <p><a href="http://www.androbugs.com">Yu-Cheng Lin 林禹成</a> (<a
 href="https://twitter.com/AndroBugs">@AndroBugs</a>)</p>
@@ -52,12 +51,12 @@
 Engineering Group</a>, EC SPRIDE Technische Universität Darmstadt (<a
 href="mailto:siegfried.rasthofer@gmail.com">siegfried.rasthofer@gmail.com</a>)</p>
 
-<p>Steven Artz of <a href="http://sseblog.ec-spride.de/">Secure Software
+<p>Steven Arzt of <a href="http://sseblog.ec-spride.de/">Secure Software
 Engineering Group</a>, EC SPRIDE Technische Universität Darmstadt (<a
 href="mailto:Steven.Arzt@ec-spride.de">Steven.Arzt@ec-spride.de</a>)</p>
 
 <p><a href="http://blog.redfern.me/">Joseph Redfern</a> of <a
-href="https://labs.mwrinfosecurity.com/">MWR Labs</a> (<a
+href="https://labs.mwrinfosecurity.com/">MWR Labs</a> <br>(<a
 href="https://twitter.com/JosephRedfern">@JosephRedfern</a>)</p>
 
 <p><a href="https://plus.google.com/u/0/109528607786970714118">Valera
@@ -70,7 +69,8 @@
 
 <p>Stephan Huber of Testlab Mobile Security, <a
 href="https://www.sit.fraunhofer.de/">Fraunhofer SIT</a> (<a
-href="mailto:Stephan.Huber@sit.fraunhofer.de">Stephan.Huber@sit.fraunhofer.de</a>)</p>
+href="mailto:Stephan.Huber@sit.fraunhofer.de">Stephan.Huber@sit.fraunhofer.de</a>)
+</p>
 
 <p><a href="http://www.corkami.com">Ange Albertini</a> (<a
 href="https://twitter.com/angealbertini">@angealbertini</a>)</p>
@@ -84,7 +84,7 @@
 href="mailto:litongxin1991@gmail.com">litongxin1991@gmail.com</a>)</p>
 
 <p><a href="https://www.facebook.com/zhou.xiaoyong">Xiaoyong Zhou</a> of <a
-href="http://www.cs.indiana.edu/~zhou/">Indiana University Bloomington</a> (<a
+href="http://www.cs.indiana.edu/~zhou/">Indiana University Bloomington</a> <br>(<a
 href="https://twitter.com/xzhou">@xzhou</a>, <a
 href="mailto:zhou.xiaoyong@gmail.com">zhou.xiaoyong@gmail.com</a>)</p>
 
@@ -102,5 +102,163 @@
 <p>Xinhui Han of Peking University (<a
 href="mailto:hanxinhui@pku.edu.cn">hanxinhui@pku.edu.cn</a>)</p>
 
+<p><a href="http://thejh.net/">Jann Horn</a> <a href="https://android-review.googlesource.com/#/c/98197/">
+<img style="vertical-align:middle;" src="images/tiny-robot.png"
+alt="Green Droid Patch Symbol"
+title="This person contributed code that improved Android security">
+</a></p>
+
+<p>Robert Craig of <a href="http://www.nsa.gov/research/ia_research/">
+Trusted Systems Research Group</a>, US National Security Agency
+<a href="https://android-review.googlesource.com/#/q/owner:%22Robert+Craig+%253Crpcraig%2540tycho.ncsc.mil%253E%22+status:merged">
+<img style="vertical-align:middle" src="images/tiny-robot.png" alt="Patch Symbol"
+title="This person contributed code that improved Android security"></a></p>
+
+<p>Stephen Smalley of <a href="http://www.nsa.gov/research/ia_research/">
+Trusted Systems Research Group</a>, US National Security Agency
+<a href=
+"https://android-review.googlesource.com/#/q/owner:%22Stephen+Smalley+%253Csds%2540tycho.nsa.gov%253E%22+status:merged">
+<img style="vertical-align:middle" src="images/tiny-robot.png"
+alt="Patch Symbol" title="This person contributed code that improved Android security"></a></p>
+
+<p><a href="http://www.linkedin.com/in/billcroberts">
+William Roberts</a> (<a href="mailto:bill.c.roberts@gmail.com">bill.c.roberts@gmail.com</a>)
+<a href=
+"https://android-review.googlesource.com/#/q/owner:bill.c.roberts%2540gmail.com+status:merged">
+<img style="vertical-align:middle" src="images/tiny-robot.png"
+alt="Patch Symbol" title="This person contributed code that improved Android security"></a></p>
+
+<p>Scotty Bauer of University of Utah (<a href="mailto:sbauer@eng.utah.edu">sbauer@eng.utah.edu</a>)</p>
+
+<p><a href="http://www.cs.utah.edu/~rsas/">Raimondas Sasnauskas</a> of University of Utah</p>
+
+<p><a href="http://www.subodh.io">Subodh Iyengar</a> of <a href="https://www.facebook.com">Facebook</a></p>
+
+<p><a href="http://www.shackleton.io/">Will Shackleton</a> of <a href="https://www.facebook.com">Facebook</a></p>
+
+</div>
+
+<h2>2013</h2>
+
+<div style="LINE-HEIGHT:25px;">
+
+<p>Jon Sawyer of <a href="http://appliedcybersecurity.com/">Applied Cybersecurity LLC
+</a> (<a href="mailto:jon@cunninglogic.com">jon@cunninglogic.com</a>)</p>
+
+<p>Joshua J. Drake of <a href="http://www.accuvant.com/">Accuvant LABS
+</a> (<a href="https://twitter.com/jduck">@jduck</a>)
+<a href="https://android-review.googlesource.com/#/q/change:72228+OR+change:72229">
+<img style="vertical-align:middle" src="images/patchreward.png"
+alt="Patch Rewards Symbol" title="This person qualified for the Patch Rewards program!"></a></p>
+
+<p>Ruben Santamarta of IOActive
+(<a href="https://twitter.com/reversemode">@reversemode</a>)</p>
+
+<p>Lucas Yang (amadoh4ck) of
+<a href="http://raonsecurity.com/">RaonSecurity</a>
+(<a href="mailto:amadoh4ck@gmail.com">amadoh4ck@gmail.com</a>)</p>
+
+<p><a href="https://tsarstva.bg/sh/">Ivaylo Marinkov</a>
+of <a href="http://www.ecommera.com/">eCommera</a> <br>
+(<a href="mailto:ivo@tsarstva.bg">ivo@tsarstva.bg</a>)</p>
+
+<p><a href="http://roeehay.blogspot.com/">Roee Hay</a>
+<br>(<a href="https://twitter.com/roeehay">@roeehay</a>,
+<a href="mailto:roeehay@gmail.com">roeehay@gmail.com</a>)</p>
+
+<p>Qualcomm Product Security Initiative</p>
+
+<p><a href="https://lacklustre.net/">Mike Ryan</a> of
+<a href="https://isecpartners.com/">iSEC Partners</a>
+<br>(<a href="https://twitter.com/mpeg4codec">@mpeg4codec</a>,
+<a href="mailto:mikeryan@isecpartners.com">mikeryan@isecpartners.com
+</a>)</p>
+
+<p><a href="http://cryptoonline.com/">Muhammad Naveed</a>
+of <a href="http://illinois.edu/">University of Illinois
+at Urbana-Champaign</a>
+<br>(<a href="mailto:naveed2@illinois.edu">naveed2@illinois.edu</a>)</p>
+
+<p>Robert Craig of <a href="http://www.nsa.gov/research/ia_research/">
+Trusted Systems Research Group</a>, US National Security Agency
+<a href="https://android-review.googlesource.com/#/q/owner:%22Robert+Craig+%253Crpcraig%2540tycho.ncsc.mil%253E%22+status:merged">
+<img style="vertical-align:middle" src="images/tiny-robot.png" alt="Patch Symbol"
+title="This person contributed code that improved Android security"></a></p>
+
+<p>Stephen Smalley of <a href="http://www.nsa.gov/research/ia_research/">
+Trusted Systems Research Group</a>, US National Security Agency
+<a href=
+"https://android-review.googlesource.com/#/q/owner:%22Stephen+Smalley+%253Csds%2540tycho.nsa.gov%253E%22+status:merged">
+<img style="vertical-align:middle" src="images/tiny-robot.png"
+alt="Patch Symbol" title="This person contributed code that improved Android security"></a></p>
+
+<p><a href="http://www.linkedin.com/in/billcroberts">
+William Roberts</a> (<a href="mailto:bill.c.roberts@gmail.com">bill.c.roberts@gmail.com</a>)
+<a href=
+"https://android-review.googlesource.com/#/q/owner:bill.c.roberts%2540gmail.com+status:merged">
+<img style="vertical-align:middle" src="images/tiny-robot.png"
+alt="Patch Symbol" title="This person contributed code that improved Android security"></a></p>
+
+<p><a href="http://roeehay.blogspot.com/">Roee Hay</a>
+<br>(<a href="https://twitter.com/roeehay">@roeehay</a>,
+<a href="mailto:roeehay@gmail.com">roeehay@gmail.com</a>)</p>
+
+</div>
+<h2>2012</h2>
+
+<div style="LINE-HEIGHT:25px;">
+
+<p>Robert Craig of <a href="http://www.nsa.gov/research/ia_research/">
+Trusted Systems Research Group</a>, US National Security Agency
+<a href="https://android-review.googlesource.com/#/q/owner:%22Robert+Craig+%253Crpcraig%2540tycho.ncsc.mil%253E%22+status:merged">
+<img style="vertical-align:middle" src="images/tiny-robot.png" alt="Patch Symbol"
+title="This person contributed code that improved Android security"></a></p>
+
+<p>Stephen Smalley of <a href="http://www.nsa.gov/research/ia_research/">
+Trusted Systems Research Group</a>, US National Security Agency
+<a href=
+"https://android-review.googlesource.com/#/q/owner:%22Stephen+Smalley+%253Csds%2540tycho.nsa.gov%253E%22+status:merged">
+<img style="vertical-align:middle" src="images/tiny-robot.png"
+alt="Patch Symbol" title="This person contributed code that improved Android security"></a></p>
+
+<p><a href="http://www.linkedin.com/in/billcroberts">
+William Roberts</a> (<a href="mailto:bill.c.roberts@gmail.com">bill.c.roberts@gmail.com</a>)
+<a href=
+"https://android-review.googlesource.com/#/q/owner:bill.c.roberts%2540gmail.com+status:merged">
+<img style="vertical-align:middle" src="images/tiny-robot.png"
+alt="Patch Symbol" title="This person contributed code that improved Android security"></a></p>
+
+
+<p><a href="http://thejh.net/">Jann Horn</a></p>
+
+<p><a href="http://web.ict.kth.se/~rbbo/ussdvul.html">Ravishankar
+Borgaonkar</a> of TU Berlin
+(<a href="https://twitter.com/raviborgaonkar">@raviborgaonkar</a>)</p>
+
+<p><a href="http://roeehay.blogspot.com/">Roee Hay</a>
+<br>(<a href="https://twitter.com/roeehay">@roeehay</a>,
+<a href="mailto:roeehay@gmail.com">roeehay@gmail.com</a>)</p>
+
+</div>
+
+<h2>2011</h2>
+
+<div style="LINE-HEIGHT:25px;">
+
+<p>Collin Mulliner of <a href="http://www.mulliner.org/collin/academic">MUlliNER.ORG</a> (<a href="https://twitter.com/collinrm">@collinrm</a>)</p>
+
+</div>
+
+<h2>2009</h2>
+
+<div style="LINE-HEIGHT:25px;">
+
+<p>Collin Mulliner of <a href="http://www.mulliner.org/collin/academic">MUlliNER.ORG</a> (<a href="https://twitter.com/collinrm">@collinrm</a>)</p>
+
+<p>Charlie Miller (<a href="https://twitter.com/0xcharlie">@0xcharlie</a>)</p>
+
+</div>
+
 <br>
-<p><small>If you have reported a vulnerability prior to 2014 and want to be included on this list, or to report a vulnerability in Android, contact <a href="mailto:security@android.com">security@android.com</a></small></p>
+<p><small>If you have reported a vulnerability prior to 2014 and want to be
+included on this list, or to report a vulnerability in Android, contact <a href="mailto:security@android.com">security@android.com</a></small></p>
diff --git a/src/devices/tech/security/images/patchreward.png b/src/devices/tech/security/images/patchreward.png
new file mode 100644
index 0000000..496fe64
--- /dev/null
+++ b/src/devices/tech/security/images/patchreward.png
Binary files differ
diff --git a/src/devices/tech/security/images/tiny-robot.png b/src/devices/tech/security/images/tiny-robot.png
new file mode 100644
index 0000000..2cb88ca
--- /dev/null
+++ b/src/devices/tech/security/images/tiny-robot.png
Binary files differ
diff --git a/src/devices/tech/security/index.jd b/src/devices/tech/security/index.jd
index 57962c9..5570844 100644
--- a/src/devices/tech/security/index.jd
+++ b/src/devices/tech/security/index.jd
@@ -124,7 +124,7 @@
 cloud capabilities such as (<a href="https://developer.android.com/guide/topics/data/backup.html">backing
 up</a>) application
 data and settings and cloud-to-device messaging
-(<a href="https://code.google.com/android/c2dm/index.html">C2DM</a>)
+(<a href="https://developers.google.com/android/c2dm/">C2DM</a>)
 for push messaging.</p>
 </li>
 </ul>
diff --git a/src/index.jd b/src/index.jd
index f96e721..bcf28c3 100644
--- a/src/index.jd
+++ b/src/index.jd
@@ -42,41 +42,39 @@
     <div class="col-8">
     <h3>What's New</h3>
 
-<a href="{@docRoot}source/build-numbers.html">
-        <h4>Android 4.4.4 released</h4></a>
-        <p>Builds for Android 4.4.4 have been released. See the <strong><a
-href="{@docRoot}source/build-numbers.html#source-code-tags-and-builds">Source
-Code Tags and Builds</a></strong> table on <strong>Codenames, Tags, and Build
-Numbers</strong> for the new builds, tags, and devices supported.</p>
-
-<a href="{@docRoot}compatibility/downloads.html">
-        <h4>Android 4.4 CTS packages updated</h4></a>
-        <p>Revision 3 of the Android 4.4 Compatibility Test Suite (CTS) and
-Android 4.4 CTS Verifier have been added to Compatibility
-<strong><a
-href="{@docRoot}compatibility/downloads.html">Downloads</a></strong>. Packages
-for x86 architectures are included for the first time.</p>
+<a href="{@docRoot}devices/graphics.html">
+        <h4>Graphics introduction and implementation rewritten</h4></a>
+        <p>The <strong><a
+        href="{@docRoot}devices/graphics.html">Graphics introduction</a></strong> and <strong><a
+        href="{@docRoot}devices/graphics/implement.html">implementation
+        instructions</a></strong> have been revised in their entirety.</p>
 
 <img border="0" src="images/Android_Robot_100.png" alt="Android Partner icon" style="display:inline;float:right;margin:5px 10px">
 
-<a href="{@docRoot}devices/tech/dalvik/art.html">
-        <h4>ART introduction completely revised</h4></a>
-        <p><strong><a
-href="{@docRoot}devices/tech/dalvik/art.html">Introducing ART</a></strong> has been
-rewritten to reflect forthcoming changes and prepare developers and
-manufacturers for the runtime's adoption.</p>
+<a href="{@docRoot}devices/drm.html">
+        <h4>DRM page rewritten</h4></a>
+        <p>The description of the Android <strong><a
+        href="{@docRoot}devices/drm.html">DRM</a></strong> framework has been
+        completely rewritten and expanded.</p>
 
-<a href="{@docRoot}source/brands.html">
-        <h4>Brand guidelines published</h4></a>
-        <p>Manufacturers have their own set of <strong><a
-href="{@docRoot}source/brands.html">guidelines for Android brand use</a></strong>.</p>
+<a href="{@docRoot}devices/tuning.html">
+        <h4>Performance tuning property removed</h4></a>
+        <p>The <code>ro.hwui.fbo_cache_size</code> property has been removed
+        from Android and the corresponding <strong><a
+href="{@docRoot}devices/tuning.html">Performance tuning</a></strong> list.</p>
 
-<a href="{@docRoot}devices/graphics/architecture.html">
- <h4>Graphics architecture document published</h4></a>
-        <p>Android engineering describes the system-level <strong><a
-href="{@docRoot}devices/graphics/architecture.html">Graphics
-Architecture</a></strong> in great detail.</p>
+<a href="{@docRoot}compatibility/cts-intro.html">
+        <h4>Compatibility Test Suite manual revised</h4></a>
+        <p>The <strong><a
+        href="{@docRoot}compatibility/android-cts-manual.pdf">Compatibility Test Suite
+        (CTS) manual</a></strong> has been revised with additional details regarding location
+        settings and more.</p>
 
+<a href="{@docRoot}source/known-issues.html">
+        <h4>Known issues refined</h4></a>
+        <p>The list of <strong><a
+        href="{@docRoot}source/known-issues.html">Known Issues</a></strong>
+        has been updated and split into categories.</p>
     </div>
 
     <div class="col-8">
@@ -84,11 +82,12 @@
       <a href="{@docRoot}source/index.html">
         <h4>Explore the Source</h4></a>
         <p>Get the complete Android platform and modify and build it to suit your needs. You can
-	also <strong><a
-href="https://android-review.googlesource.com/#/q/status:open">contribute to</a></strong> the <strong><a
-href="https://android.googlesource.com/">Android Open Source Project
-repository</a></strong> to make your changes available to everyone else in
-        the Android ecosystem.</p>
+        also <strong><a
+        href="https://android-review.googlesource.com/#/q/status:open">contribute
+        to</a></strong> the <strong><a
+        href="https://android.googlesource.com/">Android Open Source Project
+        repository</a></strong> to make your changes available to everyone else
+        in the Android ecosystem.</p>
       <a href="{@docRoot}devices/index.html">
         <h4>Port Android to Devices</h4></a>
         <p>Port the latest Android platform and
diff --git a/src/license.jd b/src/license.jd
index eb278a0..fb390b9 100644
--- a/src/license.jd
+++ b/src/license.jd
@@ -13,7 +13,7 @@
 </ul>
 
 <p>The documentation content on this site is made available to
-you as part of the <a href="http://source.android.com">Android Open
+you as part of the <a href="https://android.googlesource.com/">Android Open
 Source Project</a>. This documentation, including any code shown in it,
 is licensed under the <a
 href="http://www.apache.org/licenses/LICENSE-2.0">Apache 2.0
@@ -107,7 +107,7 @@
 </p>
 <p style="margin-left:20px;font-style:italic">
  Portions of this page are reproduced from work created and <a
- href="http://code.google.com/policies.html">shared by the Android Open Source Project</a>
+ href="https://code.google.com/p/android/">shared by the Android Open Source Project</a>
  and used according to terms described in the <a
  href="http://creativecommons.org/licenses/by/2.5/">Creative Commons
  2.5 Attribution License</a>.
@@ -125,7 +125,7 @@
 </p>
 <p style="margin-left:20px;font-style:italic">
  Portions of this page are modifications based on work created and <a
- href="http://code.google.com/policies.html">shared by the Android Open
+ href="https://code.google.com/p/android/">shared by the Android Open
  Source Project</a> and used according to terms described in the <a
  href="http://creativecommons.org/licenses/by/2.5/">Creative Commons
  2.5 Attribution License</a>.
diff --git a/src/source/build-numbers.jd b/src/source/build-numbers.jd
index 89cf0c4..c14ae13 100644
--- a/src/source/build-numbers.jd
+++ b/src/source/build-numbers.jd
@@ -168,10 +168,16 @@
 <th>Supported devices</th>
 </tr>
 <tr>
+  <td>KTU84Q</td>
+  <td>android-4.4.4_r2</td>
+  <td>KitKat</td>
+  <td>Nexus 5 (hammerhead) (For 2Degrees/NZ, Telstra/AUS and India ONLY)</td>
+</tr>
+<tr>
   <td>KTU84P</td>
   <td>android-4.4.4_r1</td>
   <td>KitKat</td>
-  <td>Nexus 5, Nexus 7 (flo/grouper/tilapia), Nexus 4, Nexus 10</td>
+  <td>Nexus 5, Nexus 7 (flo/deb/grouper/tilapia), Nexus 4, Nexus 10</td>
 </tr>
 <tr>
   <td>KTU84M</td>
@@ -211,7 +217,7 @@
 </tr>
 <tr>
   <td>KRT16M</td>
-  <td>android-4.4.2_r1</td>
+  <td>android-4.4_r1</td>
   <td>KitKat</td>
   <td>Nexus 5 (hammerhead)</td>
 </tr>
diff --git a/src/source/building-running.jd b/src/source/building-running.jd
index 905b94e..ed8c4b7 100644
--- a/src/source/building-running.jd
+++ b/src/source/building-running.jd
@@ -162,15 +162,13 @@
     https://source.android.com/source/download.html
 ************************************************************
 </code></pre>
-<p>This may be caused by</p>
+<p>This may be caused by:</p>
 <ul>
 <li>
-<p>failing to install the correct JDK as specified in <a href="initializing.html">Initializing the Build Environment</a>.</p>
+<p>Failing to install the correct JDK as specified in <a href="initializing.html">Initializing the Build Environment</a>.</p>
 </li>
 <li>
-<p>another JDK that you previously installed appearing in your path.  You can remove the offending JDK from your path with:</p>
-<pre><code>$ export PATH=${PATH/\/path\/to\/jdk\/dir:/}
-</code></pre>
+<p>A previously installed JDK appearing in your PATH. Prepend the correct JDK to your PATH or remove the problematic JDK.</p>
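+<p>As a minimal sketch (the JDK location below is illustrative; substitute the
+path of the JDK you actually installed):</p>
+<pre><code>$ export PATH=/usr/lib/jvm/java-7-openjdk-amd64/bin:$PATH
+$ which java    # should now resolve to the intended JDK
+</code></pre>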
 </li>
 </ul>
 <h3 id="python-version-3">Python Version 3</h3>
diff --git a/src/source/code-style.jd b/src/source/code-style.jd
index 9ec3c99..ee65c27 100644
--- a/src/source/code-style.jd
+++ b/src/source/code-style.jd
@@ -277,8 +277,7 @@
 <h3 id="define-fields-in-standard-places">Define Fields in Standard Places</h3>
 <p>Fields should be defined either at the top of the file, or immediately before the methods that use them.</p>
 <h3 id="limit-variable-scope">Limit Variable Scope</h3>
-<p>The scope of local variables should be kept to a minimum (<em>Effective
-Java</em> Item 29). By doing so, you increase the readability and
+<p>The scope of local variables should be kept to a minimum. By doing so, you increase the readability and
 maintainability of your code and reduce the likelihood of error. Each variable
 should be declared in the innermost block that encloses all uses of the
 variable.</p>
@@ -537,8 +536,7 @@
 <p>Both the JDK and the Android code bases are very inconsistent with regards
 to acronyms, therefore, it is virtually impossible to be consistent with the
 code around you. Bite the bullet, and treat acronyms as words.</p>
-<p>For further justifications of this style rule, see <em>Effective Java</em>
-Item 38 and <em>Java Puzzlers</em> Number 68.</p>
+
 <h3 id="use-todo-comments">Use TODO Comments</h3>
 <p>Use TODO comments for code that is temporary, a short-term solution, or
 good-enough but not perfect.</p>
@@ -553,10 +551,9 @@
 specific event ("Remove this code after all production mixers understand
 protocol V7.").</p>
 <h3 id="log-sparingly">Log Sparingly</h3>
-<p>While logging is necessary it has a significantly negative impact on
+<p>While logging is necessary, it has a significantly negative impact on
 performance and quickly loses its usefulness if it's not kept reasonably
-terse. The logging facilities provides five different levels of logging. Below
-are the different levels and when and how they should be used.</p>
+terse. The logging facilities provides five different levels of logging:</p>
 <ul>
 <li>
 <p><code>ERROR</code>: 
diff --git a/src/source/community/index.jd b/src/source/community/index.jd
index 22aa73c..31361ca 100644
--- a/src/source/community/index.jd
+++ b/src/source/community/index.jd
@@ -91,8 +91,7 @@
 <p><em>Use a clear, relevant message subject.</em> This helps everyone, both those trying to answer your question as well as those who may be looking for information in the future.</p>
 </li>
 <li>
-<p><em>Give plenty of details in your post.</em> Code or log snippets, pointers to screenshots, and similar details will get better results and make for better discussions. For a great guide to phrasing your questions, read <a href="http://www.catb.org/%7Eesr/faqs/smart-questions.html">How to Ask Questions the Smart Way</a>.
-<img src="{@docRoot}images/external-link.png"></p>
+<p><em>Give plenty of details in your post.</em> Code or log snippets, pointers to screenshots, and similar details will get better results and make for better discussions. For a great guide to phrasing your questions, read <a href="http://www.catb.org/%7Eesr/faqs/smart-questions.html">How to Ask Questions the Smart Way</a>.</p>
 </li>
 </ul>
 
diff --git a/src/source/developing.jd b/src/source/developing.jd
index 46a51a7..e6a97b5 100644
--- a/src/source/developing.jd
+++ b/src/source/developing.jd
@@ -72,18 +72,19 @@
 <pre><code>$ repo sync PROJECT0 PROJECT1 PROJECT2 ...
 </code></pre>
 <h2 id="creating-topic-branches">Creating topic branches</h2>
-<p>Start a topic branch in your local work environment whenever you begin a change, for example when you begin work on a bug or new feature. A topic branch is not a copy of the original files; it is a pointer to a particular commit. This makes creating local branches and switching among them a light-weight operation. By using branches, you can isolate one aspect of your work from the others. For an interesting article about using topic branches, see <a href="http://www.kernel.org/pub/software/scm/git/docs/howto/separating-topic-branches.txt">Separating topic branches</a>.
-<img src="{@docRoot}images/external-link.png" alt=""></p>
-<p>To start a topic branch using Repo: </p>
-<pre><code>$ repo start BRANCH_NAME
+<p>Start a topic branch in your local work environment whenever you begin a change, for example when you begin work on a bug or new feature. A topic branch is not a copy of the original files; it is a pointer to a particular commit. This makes creating local branches and switching among them a light-weight operation. By using branches, you can isolate one aspect of your work from the others. For an interesting article about using topic branches, see <a href="http://www.kernel.org/pub/software/scm/git/docs/howto/separating-topic-branches.txt">Separating topic branches</a>.</p>
+<p>To start a topic branch using Repo, navigate into the project to be modified and issue: </p>
+<pre><code>$ repo start BRANCH_NAME .
 </code></pre>
-<p>To verify that your new branch was created:</p>
-<pre><code>$ repo status
+<p>Note that the period (<code>.</code>) represents the project in the current working directory. To verify that your new branch was created:</p>
+<pre><code>$ repo status .
 </code></pre>
 <h2 id="using-topic-branches">Using topic branches</h2>
 <p>To assign the branch to a particular project:</p>
-<pre><code>$ repo start BRANCH_NAME PROJECT
+<pre><code>$ repo start BRANCH_NAME PROJECT_NAME
 </code></pre>
+<p>See <a href="https://android.googlesource.com/">android.googlesource.com</a> for a list of all projects. Again, if you've already navigated into a particular project directory, you may simply pass a period to represent the current project.</p>
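+<p>For example, a hypothetical change to the build system could be started with
+(the branch and project names here are purely illustrative):</p>
+<pre><code>$ repo start my-build-fix platform/build
+</code></pre>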
+
 <p>To switch to another branch that you have created in your local work environment:</p>
 <pre><code>$ git checkout BRANCH_NAME
 </code></pre>
diff --git a/src/source/faqs.jd b/src/source/faqs.jd
index f08a896..58a9a21 100644
--- a/src/source/faqs.jd
+++ b/src/source/faqs.jd
@@ -321,5 +321,7 @@
 implement the 'adb' debugging utility. This means that any compatible device
 -- including ones available at retail -- must be able to run the CTS
 tests.</p>
+<h3 id="are-codecs-verified">Are codecs verified by CTS?</h3>
+<p>Yes. All mandatory codecs are verified by CTS.</p>
 
 <a href="#top">Back to top</a>
diff --git a/src/source/initializing.jd b/src/source/initializing.jd
index 73e5545..3283575 100644
--- a/src/source/initializing.jd
+++ b/src/source/initializing.jd
@@ -95,6 +95,11 @@
 $ sudo ln -s /usr/lib/i386-linux-gnu/mesa/libGL.so.1 /usr/lib/i386-linux-gnu/libGL.so
 </code></pre>
 
+<h3 id="installing-required-packages-ubuntu-1404">Installing required packages (Ubuntu 14.04)</h3>
+<p>Building on Ubuntu 14.04 is experimental at the moment but will eventually become the recommended
+environment.</p>
+<pre><code>$ sudo apt-get install bison g++-multilib git gperf libxml2-utils</code></pre>
+
 <h3 id="installing-required-packages-ubuntu-1004-1110">Installing required packages (Ubuntu 10.04 -- 11.10)</h3>
 <p>Building on Ubuntu 10.04-11.10 is no longer supported, but may be useful for building older
 releases of AOSP.</p>
diff --git a/src/source/known-issues.jd b/src/source/known-issues.jd
index f87a34f..9a6d9fc 100644
--- a/src/source/known-issues.jd
+++ b/src/source/known-issues.jd
@@ -27,7 +27,9 @@
 <p>Even with our best care, small problems sometimes slip in. This page keeps
 track of the known issues around using the Android source code.</p>
 
-<h2 id="missing-cellbroadcastreceiver">Missing CellBroadcastReceiver in toro builds</h2>
+<h2 id="build-issues">Build issues</h2>
+
+<h3 id="missing-cellbroadcastreceiver">Missing CellBroadcastReceiver in toro builds</h3>
 <p><strong>Symptom</strong></p>On AOSP builds for toro (up to Jelly Bean 4.2.1),
 CellBroadcastReceiver doesn't get included in the system.</p>
 
@@ -35,14 +37,14 @@
 where <code>PRODUCT_PACKAGES</code> has the K replaced by an H.
 <p><strong>Fix</strong>: Use the latest packages for 4.2.2, or manually fix the typo.</p>
 
-<h2 id="missing-cts-native-xml-generator">Missing CTS Native XML Generator</h2>
+<h3 id="missing-cts-native-xml-generator">Missing CTS Native XML Generator</h3>
 <p><strong>Symptom</strong>: On some builds of IceCreamSandwich and later, the following
 warning is printed early during the build:
 <code>/bin/bash: line 0: cd: cts/tools/cts-native-xml-generator/src/res: No
 such file or directory</code></p>
 <p><strong>Cause</strong>: Some makefile references that path, which doesn't exist.</p>
 <p><strong>Fix</strong>: None. This is a harmless warning.</p>
-<h2 id="black-gingerbread-emulator">Black Gingerbread Emulator</h2>
+<h3 id="black-gingerbread-emulator">Black Gingerbread Emulator</h3>
 <p><strong>Symptom</strong>: The emulator built directly from the gingerbread branch
 doesn't start and stays stuck on a black screen.</p>
 <p><strong>Cause</strong>: The gingerbread branch uses version R7 of the emulator,
@@ -54,14 +56,87 @@
 $ make
 $ emulator -kernel prebuilt/android-arm/kernel/kernel-qemu-armv7
 </code></pre>
-<h2 id="emulator-built-on-macos-107-lion-doesnt-work">Emulator built on MacOS 10.7 Lion doesn't work.</h2>
+<h3 id="emulator-built-on-macos-107-lion-doesnt-work">Emulator built on MacOS 10.7 Lion doesn't work.</h3>
 <p><strong>Symptom</strong>: The emulator (any version) built on MacOS 10.7 Lion
 and/or on XCode 4.x doesn't start.</p>
 <p><strong>Cause</strong>: Some change in the development environment causes
 the emulator to be compiled in a way that prevents it from working.</p>
 <p><strong>Fix</strong>: Use an emulator binary from the SDK, which is built on
 MacOS 10.6 with XCode 3 and works on MacOS 10.7.</p>
-<h2 id="difficulties-syncing-the-source-code-proxy-issues">Difficulties syncing the source code (proxy issues).</h2>
+
+<h3 id="partial-and-emulator-builds"><code>WITH_DEXPREOPT=true</code> and emulator builds.</h3>
+<p><strong>Symptom</strong>: When conducting partial builds or syncs (such as
+<code>make snod</code>, "make system, no dependencies") on emulator builds, the
+resulting build doesn't work.</p>
+<p><strong>Cause</strong>: All emulator builds now run Dex optimization at build
+time by default, which requires to follow all dependencies to
+re-optimize the applications each time the framework changes.</p>
+<p><strong>Fix</strong>: Locally disable Dex optimizations with
+<code>export WITH_DEXPREOPT=false</code>, delete the existing optimized
+versions with <code>make installclean</code> and run a full build to
+re-generate non-optimized versions. After that, partial builds
+will work.</p>
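+<p>As a sketch, the full sequence might look like this (the <code>-j</code>
+value is illustrative):</p>
+<pre><code>$ export WITH_DEXPREOPT=false
+$ make installclean
+$ make -j8
+</code></pre>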
+<h3 id="permission-denied-during-builds">"Permission Denied" during builds.</h3>
+<p><strong>Symptom</strong>: All builds fail with "Permission Denied", possibly
+along with anti-virus warnings.</p>
+<p><strong>Cause</strong>: Some anti-virus programs mistakenly recognize some
+source files in the Android source tree as if they contained
+viruses.</p>
+<p><strong>Fix</strong>: After verifying that there are no actual viruses
+involved, disable anti-virus on the Android tree. This has
+the added benefit of reducing build times.</p>
+<h3 id="build-errors-related-to-using-the-wrong-compiler">Build errors related to using the wrong compiler.</h3>
+<p><strong>Symptom</strong>: The build fails with various symptoms. One
+such symptom is <code>cc1: error: unrecognized command line option "-m32"</code></p>
+<p><strong>Cause</strong>: The Android build system uses the default compiler
+in the PATH, assuming it's a suitable compiler to generate
+binaries that run on the host. Other situations (e.g. using
+the Android NDK or building the kernel) cause the default
+compiler to not be a host compiler.</p>
+<p><strong>Fix</strong>: Use a "clean" shell, in which no previous
+actions could have swapped the default compiler.</p>
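+<p>As a quick, non-exhaustive check, you can verify which compiler your current
+shell would hand to the build:</p>
+<pre><code>$ which gcc
+$ gcc --version
+</code></pre>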
+<h3 id="build-errors-caused-by-non-default-tool-settings">Build errors caused by non-default tool settings.</h3>
+<p><strong>Symptom</strong>: The build fails with various symptoms, possibly
+complaining about missing files or files that have the
+wrong format. One such symptom is <code>member [...] in archive is not an object</code>.</p>
+<p><strong>Cause</strong>: The Android build system tends to use many host tools
+and to rely on their default behaviors. Some settings change
+those tools' behaviors and make them behave in ways that
+confuse the build system. Variables known to cause such
+issues are <code>CDPATH</code> and <code>GREP_OPTIONS</code>.</p>
+<p><strong>Fix</strong>: Build Android in an environment that has as few
+customizations as possible.</p>
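+<p>As a minimal sketch, clear the known offenders in the shell you build from
+(other local customizations may also need to be removed):</p>
+<pre><code>$ unset CDPATH
+$ unset GREP_OPTIONS
+</code></pre>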
+<h3 id="build-error-with-40x-and-earlier-on-macos-107">Build error with 4.0.x and earlier on MacOS 10.7.</h3>
+<p><strong>Symptom</strong>: Building IceCreamSandwich 4.0.x (and older
+versions) fails on MacOS 10.7 with errors similar to this:
+<code>Undefined symbols for architecture i386: "_SDL_Init"</code></p>
+<p><strong>Cause</strong>: 4.0.x is not compatible with MacOS 10.7.</p>
+<p><strong>Fix</strong>: Either downgrade to MacOS 10.6, or use the master
+branch, which can be built on MacOS 10.7.</p>
+<pre><code>$ repo init -b master
+$ repo sync
+</code></pre>
+<h3 id="build-error-on-macos-with-xcode-43">Build error on MacOS with XCode 4.3.</h3>
+<p><strong>Symptom</strong>: All builds fail when using XCode 4.3.</p>
+<p><strong>Cause</strong>: XCode 4.3 switched the default compiler from
+gcc to llvm, and llvm rejects code that used to be
+accepted by gcc.</p>
+<p><strong>Fix</strong>: Use XCode 4.2.</p>
+<h3 id="build-error-with-40x-and-earlier-on-ubuntu-1110">Build error with 4.0.x and earlier on Ubuntu 11.10.</h3>
+<p><strong>Symptom</strong>: Building IceCreamSandwich 4.0.x (and older
+versions) on Ubuntu 11.10 and newer fails with errors similar to this:
+<code>&lt;command-line&gt;:0:0: warning: "_FORTIFY_SOURCE" redefined [enabled by default]</code></p>
+<p><strong>Cause</strong>: Ubuntu 11.10 uses a version of gcc where that symbol
+is defined by default, and Android also defines that symbol,
+which causes a conflict.</p>
+<p><strong>Fix</strong>: Either downgrade to Ubuntu 10.04, or use the master
+branch, which can be compiled on Ubuntu 11.10 and newer.</p>
+<pre><code>$ repo init -b master
+$ repo sync
+</code></pre>
+
+<h2 id="source-sync">Source sync issues</h2>
+
+<h3 id="difficulties-syncing-the-source-code-proxy-issues">Difficulties syncing the source code (proxy issues).</h3>
 <p><strong>Symptom</strong>: <code>repo init</code> or <code>repo sync</code> fail with http errors,
 typically 403 or 500.</p>
 <p><strong>Cause</strong>: There are quite a few possible causes, most often
@@ -70,7 +145,7 @@
 <p><strong>Fix</strong>: While there's no general solution, using python 2.7
 and explicitly using <code>repo sync -j1</code> have been reported to
 improve the situation for some users.</p>
-<h2 id="difficulties-syncing-the-source-tree-virtualbox-ethernet-issues">Difficulties syncing the source tree (VirtualBox Ethernet issues).</h2>
+<h3 id="difficulties-syncing-the-source-tree-virtualbox-ethernet-issues">Difficulties syncing the source tree (VirtualBox Ethernet issues).</h3>
 <p><strong>Symptom</strong>: When running <code>repo sync</code> in some VirtualBox installations,
 the process hangs or fails with a variety of possible symptoms.
 One such symptom is
@@ -80,7 +155,7 @@
 the network. The heavy network activity of repo sync triggers some
 corner cases in the NAT code.</p>
 <p><strong>Fix</strong>: Configure VirtualBox to use bridged network instead of NAT.</p>
-<h2 id="difficulties-syncing-the-source-tree-dns-issues">Difficulties syncing the source tree (DNS issues).</h2>
+<h3 id="difficulties-syncing-the-source-tree-dns-issues">Difficulties syncing the source tree (DNS issues).</h3>
 <p><strong>Symptom</strong>: When running <code>repo sync</code>, the process fails with
 various errors related to not recognizing the hostname. One such
 error is <code>&lt;urlopen error [Errno -2] Name or service not known&gt;</code>.</p>
@@ -103,7 +178,7 @@
 <p>Note that this will only work as long as the servers' addresses
 don't change, and if they do and you can't connect you'll have
 to resolve those hostnames again and edit <code>etc/hosts</code> accordingly.</p>
-<h2 id="difficulties-syncing-the-source-tree-tcp-issues">Difficulties syncing the source tree (TCP issues).</h2>
+<h3 id="difficulties-syncing-the-source-tree-tcp-issues">Difficulties syncing the source tree (TCP issues).</h3>
 <p><strong>Symptom</strong>: <code>repo sync</code> hangs while syncing, often when it's
 completed 99% of the sync.</p>
 <p><strong>Cause</strong>: Some settings in the TCP/IP stack cause difficulties
@@ -111,27 +186,10 @@
 nor fails.</p>
 <p><strong>Fix</strong>: On linux, <code>sysctl -w net.ipv4.tcp_window_scaling=0</code>. On
 MacOS, disable the rfc1323 extension in the network settings.</p>
-<h2 id="make-snod-and-emulator-builds"><code>make snod</code> and emulator builds.</h2>
-<p><strong>Symptom</strong>: When using <code>make snod</code> (make system no dependencies)
-on emulator builds, the resulting build doesn't work.</p>
-<p><strong>Cause</strong>: All emulator builds now run Dex optimization at build
-time by default, which requires to follow all dependencies to
-re-optimize the applications each time the framework changes.</p>
-<p><strong>Fix</strong>: Locally disable Dex optimizations with
-<code>export WITH_DEXPREOPT=false</code>, delete the existing optimized
-versions with <code>make installclean</code> and run a full build to
-re-generate non-optimized versions. After that, <code>make snod</code>
-will work.</p>
-<h2 id="permission-denied-during-builds">"Permission Denied" during builds.</h2>
-<p><strong>Symptom</strong>: All builds fail with "Permission Denied", possibly
-along with anti-virus warnings.</p>
-<p><strong>Cause</strong>: Some anti-virus programs mistakenly recognize some
-source files in the Android source tree as if they contained
-viruses.</p>
-<p><strong>Fix</strong>: After verifying that there are no actual viruses
-involved, disable anti-virus on the Android tree. This has
-the added benefit of reducing build times.</p>
-<h2 id="camera-and-gps-dont-work-on-galaxy-nexus">Camera and GPS don't work on Galaxy Nexus.</h2>
+
+
+<h2 id="runtime-issues">Runtime issues</h2>
+<h3 id="camera-and-gps-dont-work-on-galaxy-nexus">Camera and GPS don't work on Galaxy Nexus.</h3>
 <p><strong>Symptom</strong>: Camera and GPS don't work on Galaxy Nexus.
 As an example, the Camera application crashes as soon as it's
 launched.</p>
@@ -139,52 +197,3 @@
 libraries that aren't available in the Android Open Source
 Project.</p>
 <p><strong>Fix</strong>: None.</p>
-<h2 id="build-errors-related-to-using-the-wrong-compiler">Build errors related to using the wrong compiler.</h2>
-<p><strong>Symptom</strong>: The build fails with various symptoms. One
-such symptom is <code>cc1: error: unrecognized command line option "-m32"</code></p>
-<p><strong>Cause</strong>: The Android build system uses the default compiler
-in the PATH, assuming it's a suitable compiler to generate
-binaries that run on the host. Other situations (e.g. using
-the Android NDK or building the kernel) cause the default
-compiler to not be a host compiler.</p>
-<p><strong>Fix</strong>: Use a "clean" shell, in which no previous
-actions could have swapped the default compiler.</p>
-<h2 id="build-errors-caused-by-non-default-tool-settings">Build errors caused by non-default tool settings.</h2>
-<p><strong>Symptom</strong>: The build fails with various symptoms, possibly
-complinaing about missing files or files that have the
-wrong format. One such symptom is <code>member [...] in archive is not an object</code>.</p>
-<p><strong>Cause</strong>: The Android build system tends to use many host tools
-and to rely on their default behaviors. Some settings change
-those tools' behaviors and make them behave in ways that
-confuse the build system. Variables known to cause such
-issues are <code>CDPATH</code> and <code>GREP_OPTIONS</code>.</p>
-<p><strong>Fix</strong>: Build Android in an environment that has as few
-customizations as possible.</p>
-<h2 id="build-error-with-40x-and-earlier-on-macos-107">Build error with 4.0.x and earlier on MacOS 10.7.</h2>
-<p><strong>Symptom</strong>: Building IceCreamSandwich 4.0.x (and older
-versions) fails on MacOS 10.7 with errors similar to this:
-<code>Undefined symbols for architecture i386: "_SDL_Init"</code></p>
-<p><strong>Cause</strong>: 4.0.x is not compatible with MacOS 10.7.</p>
-<p><strong>Fix</strong>: Either downgrade to MacOS 10.6, or use the master
-branch, which can be built on MacOS 10.7.</p>
-<pre><code>$ repo init -b master
-$ repo sync
-</code></pre>
-<h2 id="build-error-on-macos-with-xcode-43">Build error on MacOS with XCode 4.3.</h2>
-<p><strong>Symptom</strong>: All builds fail when using XCode 4.3.</p>
-<p><strong>Cause</strong>: XCode 4.3 switched the default compiler from
-gcc to llvm, and llvm rejects code that used to be
-accepted by gcc.</p>
-<p><strong>Fix</strong>: Use XCode 4.2.</p>
-<h2 id="build-error-with-40x-and-earlier-on-ubuntu-1110">Build error with 4.0.x and earlier on Ubuntu 11.10.</h2>
-<p><strong>Symptom</strong>: Building IceCreamSandwich 4.0.x (and older
-versions) on Ubuntu 11.10 and newer fails with errors similar to this:
-<code>&lt;command-line&gt;:0:0: warning: "_FORTIFY_SOURCE" redefined [enabled by default]</code></p>
-<p><strong>Cause</strong>: Ubuntu 11.10 uses a version of gcc where that symbol
-is defined by default, and Android also defines that symbol,
-which causes a conflict.</p>
-<p><strong>Fix</strong>: Either downgrade to Ubuntu 10.04, or use the master
-branch, which can be compiled on Ubuntu 11.10 and newer.</p>
-<pre><code>$ repo init -b master
-$ repo sync
-</code></pre>
diff --git a/src/source/report-bugs.jd b/src/source/report-bugs.jd
index 027dcd2..300df54 100644
--- a/src/source/report-bugs.jd
+++ b/src/source/report-bugs.jd
@@ -78,7 +78,7 @@
 and a poor bug report.</p>
 
 <h2 id="a-poor-bug-report">A Poor Bug Report</h2>
-<pre>
+<blockquote>
 Title: Error message
 
 When running Eclipse I get an "Internal Error" that says "See the .log file for more details".
@@ -91,18 +91,18 @@
 
 Observed results:
 See above.
-</pre>
+</blockquote>
 <p>This is a poor bug report because it doesn't provide any context for the
 issue; is it a problem in the Dalvik virtual machine, the core framework, or
 something else? It also doesn't provide any code or hint on how to reproduce
 it. In other words, this bug report doesn't provide enough information for
 anyone to take action on, so it would be ignored.</p>
 <h2 id="a-good-bug-report">A Good Bug Report</h2>
-<pre>
+<blockquote>
 Title: Stepping over "Object o = null" causes Eclipse "Internal Error"
 
 Interesting bug, while using Eclipse 3.3.1.1 with m37a of android and the following code:
-
+<pre>
 package com.saville.android;
 
 import android.app.Activity;
@@ -125,7 +125,7 @@
 
     static final String TAG = "TestObjectNull";
 }
-
+</pre>
 Eclipse indicates an "Internal Error" with "See the .log file for more
 details" and then asks if I want to exit the workbench. This occurs when I
 place a break point on "setContentView(R.layout.main);" and then single
@@ -134,7 +134,7 @@
 If I change "Object o = null;" to "Object o" all is well.
 
 The last lines of the .log file are:
-
+<pre>
 !ENTRY org.eclipse.core.jobs 4 2 2008-01-01 13:04:15.825
 !MESSAGE An internal error occurred during: "has children update".
 !STACK 0
@@ -163,3 +163,4 @@
 org.eclipse.debug.internal.ui.model.elements.ElementContentProvider$3.run(ElementContentProvider.java:200)
         at org.eclipse.core.internal.jobs.Worker.run(Worker.java:55)
 </pre>
+</blockquote>
diff --git a/src/source/submit-patches.jd b/src/source/submit-patches.jd
index 1c2dd0d..c41de25 100644
--- a/src/source/submit-patches.jd
+++ b/src/source/submit-patches.jd
@@ -199,7 +199,7 @@
 
 <h2 id="mksh">mksh</h2>
 <p>All changes to the MirBSD Korn Shell project at <code>external/mksh</code> should be made upstream
-either by sending an email to miros-mksh on the mirbsd.o®g domain (no subscription
+either by sending an email to miros-mksh on the mirbsd.org domain (no subscription
 required to submit there) or (optionally) at <a href="https://launchpad.net/mksh">Launchpad</a>.
 </p>
 <h2 id="openssl">OpenSSL</h2>
diff --git a/src/source/using-repo.jd b/src/source/using-repo.jd
index 67ca7b7..ce86c43 100644
--- a/src/source/using-repo.jd
+++ b/src/source/using-repo.jd
@@ -147,7 +147,7 @@
 <p><code>REPO_PATH</code> is the path relative to the root of the client.</p>
 </li>
 <li>
-<p><code>REPO_REMOTE</code> is the name of the remote sstem from the manifest.</p>
+<p><code>REPO_REMOTE</code> is the name of the remote system from the manifest.</p>
 </li>
 <li>
 <p><code>REPO_LREV</code> is the name of the revision from the manifest, translated to a local tracking branch.  Used if you need to pass the manifest revision to a locally executed git command.</p>