Merge "Docs: Adding sensors HAL content to source.android Bug: 10134622"
diff --git a/src/devices/camera.jd b/src/devices/camera/camera.jd
similarity index 97%
rename from src/devices/camera.jd
rename to src/devices/camera/camera.jd
index e85a23d..4b4b22c 100644
--- a/src/devices/camera.jd
+++ b/src/devices/camera/camera.jd
@@ -1,8 +1,8 @@
-page.title=Camera Version 1
+page.title=Camera HAL overview
@jd:body
<!--
- Copyright 2010 The Android Open Source Project
+ Copyright 2013 The Android Open Source Project
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
@@ -138,7 +138,7 @@
<li>Declare your camera’s media codec, format, and resolution capabilities in
<code>device/<company_name>/<device_name>/media_profiles.xml</code> and
<code>device/<company_name>/<device_name>/media_codecs.xml</code> XML files.
- For more information, see <a href="media.html#expose"> Exposing
+ See <a href="{@docRoot}devices/media.html#expose">Exposing
Codecs and Profiles to the Framework</a> for information on how to do this.
</p></code>
diff --git a/src/devices/camera/camera3.jd b/src/devices/camera/camera3.jd
new file mode 100644
index 0000000..6fe9770
--- /dev/null
+++ b/src/devices/camera/camera3.jd
@@ -0,0 +1,184 @@
+page.title=Camera HAL v3 overview
+@jd:body
+
+<!--
+ Copyright 2013 The Android Open Source Project
+
+ Licensed under the Apache License, Version 2.0 (the "License");
+ you may not use this file except in compliance with the License.
+ You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
+-->
+<div id="qv-wrapper">
+ <div id="qv">
+ <h2>In this document</h2>
+ <ol id="auto-toc">
+ </ol>
+ </div>
+</div>
+
+<p>
+Android's camera Hardware Abstraction Layer (HAL) connects the higher level
+camera framework APIs in
+<a
+href="http://developer.android.com/reference/android/hardware/Camera.html">android.hardware.Camera</a>
+to your underlying camera driver and hardware. The latest version of Android
+introduces a new, underlying implementation of the camera stack. If you have
+previously developed a camera HAL module and driver for other versions of
+Android, be aware that there are significant changes in the camera pipeline.</p>
+<p>Version 1 of the camera HAL is still supported in future releases of Android
+ because many devices still rely on it. Implementing both HALs is also supported
+ by the Android camera service, which is useful when you want to support a less
+ capable front-facing camera with version 1 of the HAL and a more advanced
+ back-facing camera with version 3 of the HAL. Version 2 was a stepping stone to
+ version 3 and is not supported.</p>
+<p>
+There is only one camera HAL module (with its own version number, currently 1, 2,
+or 2.1), which lists multiple independent camera devices that each have
+their own version. Camera module v2 or newer is required to support camera
+devices of version 2 or newer, and such camera modules can have a mix of camera
+device versions. This is what we mean when we say Android supports implementing
+both HALs.
+</p>
+<p><strong>Note:</strong> The new camera HAL is in active development and can change at any
+ time. This document describes at a high level the design of the camera subsystem
+ and omits many details. Stay tuned for updates to the PDK repository and
+ for updates to the Camera HAL and its reference
+ implementation.</p>
+
+<h2 id="overview">Overview</h2>
+
+<p>
+Version 1 of the camera subsystem was designed as a black box with high-level
+controls. Roughly speaking, the old subsystem has three operating modes:</p>
+
+<ul>
+<li>Preview</li>
+<li>Video Record</li>
+<li>Still Capture</li>
+</ul>
+
+<p>Each mode has slightly different and overlapping capabilities. This made it
+hard to implement new features, such as burst mode, since such a feature would
+fall between two of these modes.<br/>
+<img src="images/camera_block.png" alt="Camera block diagram"/><br/>
+<strong>Figure 1.</strong> Camera components</p>
+
+<h2 id="v3-enhance">Version 3 enhancements</h2>
+
+<p>The aim of the Android Camera API redesign is to substantially increase the
+ability of applications to control the camera subsystem on Android devices while
+reorganizing the API to make it more efficient and maintainable.</p>
+
+<p>The additional control makes it easier to build high-quality camera applications
+on Android devices that can operate reliably across multiple products while
+still using device-specific algorithms whenever possible to maximize quality and
+performance.</p>
+
+<p>Version 3 of the camera subsystem structures the operation modes into a single
+unified view, which can be used to implement any of the previous modes and
+several others, such as burst mode. This results in better user control for
+focus and exposure and more control over post-processing, such as noise
+reduction, contrast enhancement, and sharpening. Further, this simplified view
+makes it easier for application
+developers to use the camera's various functions.<br/>
+The API models the camera subsystem as a pipeline that converts incoming
+requests for frame captures into frames, on a 1:1 basis. The requests
+encapsulate all configuration information about the capture and processing of a
+frame. This includes: resolution and pixel format; manual sensor, lens and flash
+control; 3A operating modes; RAW->YUV processing control; statistics generation;
+and so on.</p>
+
+<p>In simple terms, the application framework requests a frame from the camera
+subsystem, and the camera subsystem returns results to an output stream. In
+addition, metadata that contains information such as color spaces and lens
+shading is generated for each set of results. The following sections and
+diagrams give you more detail about each component.<br/>
+You can think of camera version 3 as a pipeline, in contrast to camera version
+1's one-way stream. It converts each capture request into one image captured by
+the sensor, which is processed into: </p>
+
+<ul>
+<li>A Result object with metadata about the capture.</li>
+<li>One to N buffers of image data, each into its own destination Surface.</li>
+</ul>
+
+<p>The set of possible output Surfaces is preconfigured:</p>
+
+<ul>
+<li>Each Surface is a destination for a stream of image buffers of a fixed
+resolution.</li>
+<li>Only a small number of Surfaces can be configured as outputs at once (~3).</li>
+</ul>
+
+<p>A request contains all desired capture settings and the list of output Surfaces
+to push image buffers into for this request (out of the total configured set). A
+request can be one-shot (with capture()), or it may be repeated indefinitely
+(with setRepeatingRequest()). Captures have priority over repeating
+requests.</p>
+<img src="images/camera_simple_model.png" alt="Camera data model"/>
+<p><strong>Figure 2.</strong> Camera core operation model</p>
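+<p>To make the request model concrete, here is a hedged C sketch of how a single
+ capture at the HAL level carries these pieces, based on the
+ camera3_capture_request_t definition in camera3.h. The helper name is
+ illustrative, not part of the API:</p>
+<pre>
+#include <hardware/camera3.h>
+
+/* Walk one incoming request: the settings buffer, plus the subset of
+ * preconfigured streams selected as destinations for this frame. */
+static void describe_request(const camera3_capture_request_t *request) {
+    const camera_metadata_t *settings = request->settings;
+    (void)settings;  /* all capture controls for this frame live here */
+
+    for (uint32_t i = 0; i < request->num_output_buffers; i++) {
+        const camera3_stream_buffer_t *buf = &request->output_buffers[i];
+        (void)buf;  /* buf->stream is the destination stream; buf->buffer is
+                       the gralloc handle the HAL fills for this frame */
+    }
+}
+</pre>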
+
+<h2 id="supported-version">Supported version</h2>
+
+<p>Camera devices that support this version of the HAL must return
+CAMERA_DEVICE_API_VERSION_3_1 in camera_device_t.common.version and in
+camera_info_t.device_version (from camera_module_t.get_camera_info).<br/>
+Camera modules that may contain version 3.1 devices must implement at least
+version 2.0 of the camera module interface (as defined by
+camera_module_t.common.module_api_version).<br/>
+See camera_common.h for more versioning details.</p>
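+<p>As a hedged sketch of this version reporting (names other than the constants
+ and struct fields are illustrative), a HAL might fill in get_camera_info() as
+ follows:</p>
+<pre>
+#include <hardware/camera_common.h>
+
+/* Each camera device reports its own version through get_camera_info(). */
+static int example_get_camera_info(int camera_id, struct camera_info *info) {
+    (void)camera_id;
+    info->facing = CAMERA_FACING_BACK;
+    info->orientation = 0;
+    /* A device implementing HAL v3.1 reports this value here and in
+     * camera_device_t.common.version when the device is opened. */
+    info->device_version = CAMERA_DEVICE_API_VERSION_3_1;
+    info->static_camera_characteristics = NULL;  /* real static metadata here */
+    return 0;
+}
+</pre>
+<p>The containing module would set camera_module_t.common.module_api_version to
+ at least CAMERA_MODULE_API_VERSION_2_0.</p>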
+
+<h2 id="version-history">Version history</h2>
+
+<h4><strong>1.0</strong></h4>
+
+<p>Initial Android camera HAL (Android 4.0) [camera.h]:</p>
+
+<ul>
+<li>Converted from C++ CameraHardwareInterface abstraction layer.</li>
+<li>Supports android.hardware.Camera API.</li>
+</ul>
+
+<h4><strong>2.0</strong></h4>
+
+<p>Initial release of expanded-capability HAL (Android 4.2) [camera2.h]:</p>
+
+<ul>
+<li>Sufficient for implementing existing android.hardware.Camera API.</li>
+<li>Allows for ZSL queue in camera service layer</li>
+<li>Not tested for any new features such as manual capture control, Bayer RAW
+capture, or reprocessing of RAW data.</li>
+</ul>
+
+<h4><strong>3.0</strong></h4>
+
+<p>First revision of expanded-capability HAL:</p>
+
+<ul>
+<li>Major version change since the ABI is completely different. No change to the
+required hardware capabilities or operational model from 2.0.</li>
+<li>Reworked input request and stream queue interfaces: Framework calls into HAL
+with next request and stream buffers already dequeued. Sync framework support
+is included, necessary for efficient implementations.</li>
+<li>Moved triggers into requests, most notifications into results.</li>
+<li>Consolidated all callbacks to the framework into one structure, and all setup
+methods into a single initialize() call.</li>
+<li>Made stream configuration into a single call to simplify stream management.
+Bidirectional streams replace STREAM_FROM_STREAM construct.</li>
+<li>Limited mode semantics for older/limited hardware devices.</li>
+</ul>
+
+<h4><strong>3.1</strong></h4>
+
+<p>Minor revision of expanded-capability HAL:</p>
+
+<ul>
+<li>configure_streams passes consumer usage flags to the HAL.</li>
+<li>flush call to drop all in-flight requests/buffers as fast as possible.</li>
+</ul>
diff --git a/src/devices/camera/camera3_3Amodes.jd b/src/devices/camera/camera3_3Amodes.jd
new file mode 100644
index 0000000..89d9841
--- /dev/null
+++ b/src/devices/camera/camera3_3Amodes.jd
@@ -0,0 +1,662 @@
+page.title=3A Modes and State Transition
+@jd:body
+
+<!--
+ Copyright 2013 The Android Open Source Project
+
+ Licensed under the Apache License, Version 2.0 (the "License");
+ you may not use this file except in compliance with the License.
+ You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
+-->
+<div id="qv-wrapper">
+ <div id="qv">
+ <h2>In this document</h2>
+ <ol id="auto-toc">
+ </ol>
+ </div>
+</div>
+
+<p>
+ While the actual 3A algorithms are up to the HAL implementation, a high-level
+ state machine description is defined by the HAL interface to allow the HAL
+ device and the framework to communicate about the current state of 3A and
+ trigger 3A events.</p>
+<p>When the device is opened, all the individual 3A states must be STATE_INACTIVE.
+ Stream configuration does not reset 3A. For example, locked focus must be
+ maintained across the configure() call.</p>
+<p>Triggering a 3A action involves simply setting the relevant trigger entry in the
+ settings for the next request to indicate start of trigger. For example, the
+ trigger for starting an autofocus scan is setting the entry
+ ANDROID_CONTROL_AF_TRIGGER to ANDROID_CONTROL_AF_TRIGGER_START for one request;
+ and cancelling an autofocus scan is triggered by setting
+ ANDROID_CONTROL_AF_TRIGGER to ANDROID_CONTROL_AF_TRIGGER_CANCEL. Otherwise, the
+ entry will not exist or be set to ANDROID_CONTROL_AF_TRIGGER_IDLE. Each request
+ with a trigger entry set to a non-IDLE value will be treated as an independent
+ triggering event.</p>
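+<p>As a minimal sketch of such a trigger (assuming the settings buffer has spare
+ capacity for one more entry; the helper name is illustrative), the framework
+ side could use the camera metadata C helpers:</p>
+<pre>
+#include <system/camera_metadata.h>
+
+/* Arm an AF scan for exactly one request by writing the trigger entry
+ * into that request's settings buffer. */
+static int arm_af_trigger(camera_metadata_t *request_settings) {
+    uint8_t trigger = ANDROID_CONTROL_AF_TRIGGER_START;
+    return add_camera_metadata_entry(request_settings,
+            ANDROID_CONTROL_AF_TRIGGER, &trigger, 1);
+}
+</pre>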
+<p>At the top level, 3A is controlled by the ANDROID_CONTROL_MODE setting. It
+ selects between no 3A (ANDROID_CONTROL_MODE_OFF), normal AUTO mode
+ (ANDROID_CONTROL_MODE_AUTO), and using the scene mode setting
+ (ANDROID_CONTROL_USE_SCENE_MODE):</p>
+<ul>
+ <li>In OFF mode, each of the individual Auto-focus(AF), auto-exposure (AE), and
+ auto-whitebalance (AWB) modes are effectively OFF, and none of the capture
+ controls may be overridden by the 3A routines.</li>
+ <li>In AUTO mode, AF, AE, and AWB modes all run their own independent algorithms,
+ and have their own mode, state, and trigger metadata entries, as listed in the
+ next section.</li>
+ <li>In USE_SCENE_MODE, the value of the ANDROID_CONTROL_SCENE_MODE entry must be
+ used to determine the behavior of 3A routines. In SCENE_MODEs other than
+ FACE_PRIORITY, the HAL must override the values of
+ ANDROID_CONTROL_AE/AWB/AF_MODE to be the mode it prefers for the selected
+ SCENE_MODE. For example, for SCENE_MODE_NIGHT the HAL may prefer to use the
+ CONTINUOUS_FOCUS AF mode. Any user selection of AE/AWB/AF_MODE while a scene
+ mode is active must be ignored for these scene modes.</li>
+ <li>For SCENE_MODE_FACE_PRIORITY, the AE/AWB/AF_MODE controls work as in
+ ANDROID_CONTROL_MODE_AUTO, but the 3A routines must bias toward metering and
+ focusing on any detected faces in the scene.</li>
+</ul>
+<h2 id="auto-focus">Auto-focus settings and result entries</h2>
+<p>Main metadata entries:<br/>
+ ANDROID_CONTROL_AF_MODE: Control for selecting the current autofocus mode. Set
+ by the framework in the request settings.<br/>
+ AF_MODE_OFF: AF is disabled; the framework/app directly controls lens position.<br/>
+ AF_MODE_AUTO: Single-sweep autofocus. No lens movement unless AF is triggered.<br/>
+ AF_MODE_MACRO: Single-sweep up-close autofocus. No lens movement unless AF is
+ triggered.<br/>
+ AF_MODE_CONTINUOUS_VIDEO: Smooth continuous focusing, for recording video.
+ Triggering immediately locks focus in current position. Canceling resumes
+ continuous focusing.<br/>
+ AF_MODE_CONTINUOUS_PICTURE: Fast continuous focusing, for zero-shutter-lag still
+ capture. Triggering locks focus once currently active sweep concludes. Canceling
+ resumes continuous focusing.<br/>
+ AF_MODE_EDOF: Advanced extended depth of field focusing. There is no autofocus
+ scan, so triggering one or canceling one has no effect. Images are focused
+ automatically by the HAL.<br/>
+ ANDROID_CONTROL_AF_STATE: Dynamic metadata describing the current AF algorithm
+ state, reported by the HAL in the result metadata.<br/>
+ AF_STATE_INACTIVE: No focusing has been done, or algorithm was reset. Lens is
+ not moving. Always the state for MODE_OFF or MODE_EDOF. When the device is
+ opened, it must start in this state.<br/>
+ AF_STATE_PASSIVE_SCAN: A continuous focus algorithm is currently scanning for
+ good focus. The lens is moving.<br/>
+ AF_STATE_PASSIVE_FOCUSED: A continuous focus algorithm believes it is well
+ focused. The lens is not moving. The HAL may spontaneously leave this state.<br/>
+ AF_STATE_PASSIVE_UNFOCUSED: A continuous focus algorithm believes it is not well
+ focused. The lens is not moving. The HAL may spontaneously leave this state.<br/>
+ AF_STATE_ACTIVE_SCAN: A scan triggered by the user is underway.<br/>
+ AF_STATE_FOCUSED_LOCKED: The AF algorithm believes it is focused. The lens is
+ not moving.<br/>
+ AF_STATE_NOT_FOCUSED_LOCKED: The AF algorithm has been unable to focus. The lens
+ is not moving.<br/>
+ ANDROID_CONTROL_AF_TRIGGER: Control for starting an autofocus scan, the meaning
+ of which depends on mode and state. Set by the framework in the request
+ settings.<br/>
+ AF_TRIGGER_IDLE: No current trigger.<br/>
+ AF_TRIGGER_START: Trigger start of AF scan. Effect depends on mode and state.<br/>
+ AF_TRIGGER_CANCEL: Cancel current AF scan if any, and reset algorithm to
+ default.<br/>
+ Additional metadata entries:<br/>
+ ANDROID_CONTROL_AF_REGIONS: Control for selecting the regions of the field of
+ view (FOV) that should be used to determine good focus. This applies to all AF
+ modes that scan for focus. Set by the framework in the request settings.</p>
+<h2 id="auto-exposure">Auto-exposure settings and result entries</h2>
+<p>Main metadata entries:<br/>
+ ANDROID_CONTROL_AE_MODE: Control for selecting the current auto-exposure mode.
+ Set by the framework in the request settings.<br/>
+ AE_MODE_OFF: Autoexposure is disabled; the user controls exposure, gain, frame
+ duration, and flash.<br/>
+ AE_MODE_ON: Standard autoexposure, with flash control disabled. User may set
+ flash to fire or to torch mode.<br/>
+ AE_MODE_ON_AUTO_FLASH: Standard autoexposure, with flash on at HAL's discretion
+ for precapture and still capture. User control of flash disabled.<br/>
+ AE_MODE_ON_ALWAYS_FLASH: Standard autoexposure, with flash always fired for
+ capture, and at HAL's discretion for precapture. User control of flash disabled.<br/>
+ AE_MODE_ON_AUTO_FLASH_REDEYE: Standard autoexposure, with flash on at HAL's
+ discretion for precapture and still capture. Use a flash burst at end of
+ precapture sequence to reduce redeye in the final picture. User control of flash
+ disabled.<br/>
+ ANDROID_CONTROL_AE_STATE: Dynamic metadata describing the current AE algorithm
+ state, reported by the HAL in the result metadata.<br/>
+ AE_STATE_INACTIVE: Initial AE state after mode switch. When the device is
+ opened, it must start in this state.<br/>
+ AE_STATE_SEARCHING: AE is not converged to a good value and is adjusting
+ exposure parameters.<br/>
+ AE_STATE_CONVERGED: AE has found good exposure values for the current scene, and
+ the exposure parameters are not changing. HAL may spontaneously leave this state
+ to search for a better solution.<br/>
+ AE_STATE_LOCKED: AE has been locked with the AE_LOCK control. Exposure values
+ are not changing.<br/>
+ AE_STATE_FLASH_REQUIRED: The HAL has converged exposure but believes flash is
+ required for a sufficiently bright picture. Used for determining if a
+ zero-shutter-lag frame can be used.<br/>
+ AE_STATE_PRECAPTURE: The HAL is in the middle of a precapture sequence.
+ Depending on AE mode, this mode may involve firing the flash for metering or a
+ burst of flash pulses for redeye reduction.<br/>
+ ANDROID_CONTROL_AE_PRECAPTURE_TRIGGER: Control for starting a metering sequence
+ before capturing a high-quality image. Set by the framework in the request
+ settings.<br/>
+ PRECAPTURE_TRIGGER_IDLE: No current trigger.<br/>
+ PRECAPTURE_TRIGGER_START: Start a precapture sequence. The HAL should use the
+ subsequent requests to measure good exposure/white balance for an upcoming
+ high-resolution capture.<br/>
+ Additional metadata entries:<br/>
+ ANDROID_CONTROL_AE_LOCK: Control for locking AE controls to their current
+ values.<br/>
+ ANDROID_CONTROL_AE_EXPOSURE_COMPENSATION: Control for adjusting AE algorithm
+ target brightness point.<br/>
+ ANDROID_CONTROL_AE_TARGET_FPS_RANGE: Control for selecting the target frame rate
+ range for the AE algorithm. The AE routine cannot change the frame rate to be
+ outside these bounds.<br/>
+ ANDROID_CONTROL_AE_REGIONS: Control for selecting the regions of the FOV that
+ should be used to determine good exposure levels. This applies to all AE modes
+ besides OFF.</p>
+<h2 id="auto-wb">Auto-whitebalance settings and result entries</h2>
+<p>Main metadata entries:<br/>
+ ANDROID_CONTROL_AWB_MODE: Control for selecting the current white-balance mode.<br/>
+ AWB_MODE_OFF: Auto-whitebalance is disabled. User controls color matrix.<br/>
+ AWB_MODE_AUTO: Automatic white balance is enabled; 3A controls color transform,
+ possibly using more complex transforms than a simple matrix.<br/>
+ AWB_MODE_INCANDESCENT: Fixed white balance settings good for indoor incandescent
+ (tungsten) lighting, roughly 2700K.<br/>
+ AWB_MODE_FLUORESCENT: Fixed white balance settings good for fluorescent
+ lighting, roughly 5000K.<br/>
+ AWB_MODE_WARM_FLUORESCENT: Fixed white balance settings good for fluorescent
+ lighting, roughly 3000K.<br/>
+ AWB_MODE_DAYLIGHT: Fixed white balance settings good for daylight, roughly
+ 5500K.<br/>
+ AWB_MODE_CLOUDY_DAYLIGHT: Fixed white balance settings good for clouded
+ daylight, roughly 6500K.<br/>
+ AWB_MODE_TWILIGHT: Fixed white balance settings good for near-sunset/sunrise,
+ roughly 15000K.<br/>
+ AWB_MODE_SHADE: Fixed white balance settings good for areas indirectly lit by
+ the sun, roughly 7500K.<br/>
+ ANDROID_CONTROL_AWB_STATE: Dynamic metadata describing the current AWB algorithm
+ state, reported by the HAL in the result metadata.<br/>
+ AWB_STATE_INACTIVE: Initial AWB state after mode switch. When the device is
+ opened, it must start in this state.<br/>
+ AWB_STATE_SEARCHING: AWB is not converged to a good value and is changing color
+ adjustment parameters.<br/>
+ AWB_STATE_CONVERGED: AWB has found good color adjustment values for the current
+ scene, and the parameters are not changing. HAL may spontaneously leave this
+ state to search for a better solution.<br/>
+ AWB_STATE_LOCKED: AWB has been locked with the AWB_LOCK control. Color
+ adjustment values are not changing.<br/>
+ Additional metadata entries:<br/>
+ ANDROID_CONTROL_AWB_LOCK: Control for locking AWB color adjustments to their
+ current values.<br/>
+ ANDROID_CONTROL_AWB_REGIONS: Control for selecting the regions of the FOV that
+ should be used to determine good color balance. This applies only to
+ auto-whitebalance mode.</p>
+<h2 id="state-transition">General state machine transition notes</h2>
+<p>Switching between AF, AE, or AWB modes always resets the algorithm's state to
+ INACTIVE. Similarly, switching between CONTROL_MODE values, or between
+ CONTROL_SCENE_MODE values if CONTROL_MODE == USE_SCENE_MODE, resets all the
+ algorithm states to INACTIVE.<br/>
+ The tables below are per-mode.</p>
+<h2 id="af-state">AF state machines</h2>
+<table>
+ <tr>
+ <td><strong>mode = AF_MODE_OFF or AF_MODE_EDOF</strong></td>
+ <td></td>
+ <td></td>
+ <td></td>
+ </tr>
+ <tr>
+ <th>State</th>
+ <th>Transition cause</th>
+ <th>New state</th>
+ <th>Notes</th>
+ </tr>
+ <tr>
+ <td>INACTIVE</td>
+ <td></td>
+ <td></td>
+ <td>AF is disabled</td>
+ </tr>
+ <tr>
+ <td><strong>mode = AF_MODE_AUTO or AF_MODE_MACRO</strong></td>
+ <td></td>
+ <td></td>
+ <td></td>
+ </tr>
+ <tr>
+ <th>State</th>
+ <th>Transition cause</th>
+ <th>New state</th>
+ <th>Notes</th>
+ </tr>
+ <tr>
+ <td>INACTIVE</td>
+ <td>AF_TRIGGER</td>
+ <td>ACTIVE_SCAN</td>
+ <td>Start AF sweep
+ Lens now moving</td>
+ </tr>
+ <tr>
+ <td>ACTIVE_SCAN</td>
+ <td>AF sweep done</td>
+ <td>FOCUSED_LOCKED</td>
+ <td>If AF successful
+ Lens now locked</td>
+ </tr>
+ <tr>
+ <td>ACTIVE_SCAN</td>
+ <td>AF sweep done</td>
+ <td>NOT_FOCUSED_LOCKED</td>
+ <td>If AF unsuccessful
+ Lens now locked</td>
+ </tr>
+ <tr>
+ <td>ACTIVE_SCAN</td>
+ <td>AF_CANCEL</td>
+ <td>INACTIVE</td>
+ <td>Cancel/reset AF
+ Lens now locked</td>
+ </tr>
+ <tr>
+ <td>FOCUSED_LOCKED</td>
+ <td>AF_CANCEL</td>
+ <td>INACTIVE</td>
+ <td>Cancel/reset AF</td>
+ </tr>
+ <tr>
+ <td>FOCUSED_LOCKED</td>
+ <td>AF_TRIGGER</td>
+ <td>ACTIVE_SCAN</td>
+ <td>Start new sweep
+ Lens now moving</td>
+ </tr>
+ <tr>
+ <td>NOT_FOCUSED_LOCKED</td>
+ <td>AF_CANCEL</td>
+ <td>INACTIVE</td>
+ <td>Cancel/reset AF</td>
+ </tr>
+ <tr>
+ <td>NOT_FOCUSED_LOCKED</td>
+ <td>AF_TRIGGER</td>
+ <td>ACTIVE_SCAN</td>
+ <td>Start new sweep
+ Lens now moving</td>
+ </tr>
+ <tr>
+ <td>All states</td>
+ <td>mode change</td>
+ <td>INACTIVE</td>
+ <td></td>
+ </tr>
+ <tr>
+ <td><strong>mode = AF_MODE_CONTINUOUS_VIDEO</strong></td>
+ <td></td>
+ <td></td>
+ <td></td>
+ </tr>
+ <tr>
+ <th>State</th>
+ <th>Transition cause</th>
+ <th>New state</th>
+ <th>Notes</th>
+ </tr>
+ <tr>
+ <td>INACTIVE</td>
+ <td>HAL initiates new scan</td>
+ <td>PASSIVE_SCAN</td>
+ <td>Start AF sweep
+ Lens now moving</td>
+ </tr>
+ <tr>
+ <td>INACTIVE</td>
+ <td>AF_TRIGGER</td>
+ <td>NOT_FOCUSED_LOCKED</td>
+ <td>AF state query
+ Lens now locked</td>
+ </tr>
+ <tr>
+ <td>PASSIVE_SCAN</td>
+ <td>HAL completes current scan</td>
+ <td>PASSIVE_FOCUSED</td>
+ <td>End AF scan
+ Lens now locked </td>
+ </tr>
+ <tr>
+ <td>PASSIVE_SCAN</td>
+ <td>AF_TRIGGER</td>
+ <td>FOCUSED_LOCKED</td>
+ <td>Immediate transition
+ if focus is good
+ Lens now locked</td>
+ </tr>
+ <tr>
+ <td>PASSIVE_SCAN</td>
+ <td>AF_TRIGGER</td>
+ <td>NOT_FOCUSED_LOCKED</td>
+ <td>Immediate transition
+ if focus is bad
+ Lens now locked</td>
+ </tr>
+ <tr>
+ <td>PASSIVE_SCAN</td>
+ <td>AF_CANCEL</td>
+ <td>INACTIVE</td>
+ <td>Reset lens position
+ Lens now locked</td>
+ </tr>
+ <tr>
+ <td>PASSIVE_FOCUSED</td>
+ <td>HAL initiates new scan</td>
+ <td>PASSIVE_SCAN</td>
+ <td>Start AF scan
+ Lens now moving</td>
+ </tr>
+ <tr>
+ <td>PASSIVE_FOCUSED</td>
+ <td>AF_TRIGGER</td>
+ <td>FOCUSED_LOCKED</td>
+ <td>Immediate transition
+ if focus is good
+ Lens now locked</td>
+ </tr>
+ <tr>
+ <td>PASSIVE_FOCUSED</td>
+ <td>AF_TRIGGER</td>
+ <td>NOT_FOCUSED_LOCKED</td>
+ <td>Immediate transition
+ if focus is bad
+ Lens now locked</td>
+ </tr>
+ <tr>
+ <td>FOCUSED_LOCKED</td>
+ <td>AF_TRIGGER</td>
+ <td>FOCUSED_LOCKED</td>
+ <td>No effect</td>
+ </tr>
+ <tr>
+ <td>FOCUSED_LOCKED</td>
+ <td>AF_CANCEL</td>
+ <td>INACTIVE</td>
+ <td>Restart AF scan</td>
+ </tr>
+ <tr>
+ <td>NOT_FOCUSED_LOCKED</td>
+ <td>AF_TRIGGER</td>
+ <td>NOT_FOCUSED_LOCKED</td>
+ <td>No effect</td>
+ </tr>
+ <tr>
+ <td>NOT_FOCUSED_LOCKED</td>
+ <td>AF_CANCEL</td>
+ <td>INACTIVE</td>
+ <td>Restart AF scan</td>
+ </tr>
+ <tr>
+ <td><strong>mode = AF_MODE_CONTINUOUS_PICTURE</strong></td>
+ <td></td>
+ <td></td>
+ <td></td>
+ </tr>
+ <tr>
+ <th>State</th>
+ <th>Transition cause</th>
+ <th>New state</th>
+ <th>Notes</th>
+ </tr>
+ <tr>
+ <td>INACTIVE</td>
+ <td>HAL initiates new scan</td>
+ <td>PASSIVE_SCAN</td>
+ <td>Start AF scan
+ Lens now moving</td>
+ </tr>
+ <tr>
+ <td>INACTIVE</td>
+ <td>AF_TRIGGER</td>
+ <td>NOT_FOCUSED_LOCKED</td>
+ <td>AF state query
+ Lens now locked</td>
+ </tr>
+ <tr>
+ <td>PASSIVE_SCAN</td>
+ <td>HAL completes current scan</td>
+ <td>PASSIVE_FOCUSED</td>
+ <td>End AF scan
+ Lens now locked</td>
+ </tr>
+ <tr>
+ <td>PASSIVE_SCAN</td>
+ <td>AF_TRIGGER</td>
+ <td>FOCUSED_LOCKED</td>
+ <td>Eventual transition once focus is good
+ Lens now locked</td>
+ </tr>
+ <tr>
+ <td>PASSIVE_SCAN</td>
+ <td>AF_TRIGGER</td>
+ <td>NOT_FOCUSED_LOCKED</td>
+ <td>Eventual transition if unable to focus
+ Lens now locked</td>
+ </tr>
+ <tr>
+ <td>PASSIVE_SCAN</td>
+ <td>AF_CANCEL</td>
+ <td>INACTIVE</td>
+ <td>Reset lens position
+ Lens now locked</td>
+ </tr>
+ <tr>
+ <td>PASSIVE_FOCUSED</td>
+ <td>HAL initiates new scan</td>
+ <td>PASSIVE_SCAN</td>
+ <td>Start AF scan
+ Lens now moving</td>
+ </tr>
+ <tr>
+ <td>PASSIVE_FOCUSED</td>
+ <td>AF_TRIGGER</td>
+ <td>FOCUSED_LOCKED</td>
+ <td>Immediate transition if focus is good
+ Lens now locked</td>
+ </tr>
+ <tr>
+ <td>PASSIVE_FOCUSED</td>
+ <td>AF_TRIGGER</td>
+ <td>NOT_FOCUSED_LOCKED</td>
+ <td>Immediate transition if focus is bad
+ Lens now locked</td>
+ </tr>
+ <tr>
+ <td>FOCUSED_LOCKED</td>
+ <td>AF_TRIGGER</td>
+ <td>FOCUSED_LOCKED</td>
+ <td>No effect</td>
+ </tr>
+ <tr>
+ <td>FOCUSED_LOCKED</td>
+ <td>AF_CANCEL</td>
+ <td>INACTIVE</td>
+ <td>Restart AF scan</td>
+ </tr>
+ <tr>
+ <td>NOT_FOCUSED_LOCKED</td>
+ <td>AF_TRIGGER</td>
+ <td>NOT_FOCUSED_LOCKED</td>
+ <td>No effect</td>
+ </tr>
+ <tr>
+ <td>NOT_FOCUSED_LOCKED</td>
+ <td>AF_CANCEL</td>
+ <td>INACTIVE</td>
+ <td>Restart AF scan</td>
+ </tr>
+</table>
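+<p>As an illustration, here is a hedged C sketch of the AF_MODE_AUTO /
+ AF_MODE_MACRO table above. The enum and helper names are illustrative; a real
+ HAL would fold this logic into its own 3A machinery:</p>
+<pre>
+typedef enum {
+    AF_STATE_INACTIVE,
+    AF_STATE_ACTIVE_SCAN,
+    AF_STATE_FOCUSED_LOCKED,
+    AF_STATE_NOT_FOCUSED_LOCKED,
+} af_state_t;
+
+static af_state_t af_auto_on_trigger(af_state_t state, int trigger_start) {
+    (void)state;  /* in AUTO/MACRO, the result is the same from any state */
+    if (!trigger_start)            /* AF_TRIGGER_CANCEL */
+        return AF_STATE_INACTIVE;  /* cancel/reset AF */
+    return AF_STATE_ACTIVE_SCAN;   /* start sweep; lens now moving */
+}
+
+static af_state_t af_auto_on_sweep_done(af_state_t state, int focused) {
+    if (state != AF_STATE_ACTIVE_SCAN)
+        return state;              /* no sweep was underway */
+    return focused ? AF_STATE_FOCUSED_LOCKED        /* lens now locked */
+                   : AF_STATE_NOT_FOCUSED_LOCKED;   /* lens now locked */
+}
+</pre>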
+<h2 id="ae-wb">AE and AWB state machines</h2>
+<p>The AE and AWB state machines are mostly identical. AE has additional
+ FLASH_REQUIRED and PRECAPTURE states, so any rows below that refer to those two
+ states should be ignored for the AWB state machine.</p>
+<table>
+ <tr>
+ <td><strong>mode = AE_MODE_OFF / AWB mode not AUTO</strong></td>
+ <td></td>
+ <td></td>
+ <td></td>
+ </tr>
+ <tr>
+ <th>State</th>
+ <th>Transition cause</th>
+ <th>New state</th>
+ <th>Notes</th>
+ </tr>
+ <tr>
+ <td>INACTIVE</td>
+ <td></td>
+ <td></td>
+ <td>AE/AWB disabled</td>
+ </tr>
+ <tr>
+ <td><strong>mode = AE_MODE_ON_* / AWB_MODE_AUTO</strong></td>
+ <td></td>
+ <td></td>
+ <td></td>
+ </tr>
+ <tr>
+ <th>State</th>
+ <th>Transition cause</th>
+ <th>New state</th>
+ <th>Notes</th>
+ </tr>
+ <tr>
+ <td>INACTIVE</td>
+ <td>HAL initiates AE/AWB scan</td>
+ <td>SEARCHING</td>
+ <td></td>
+ </tr>
+ <tr>
+ <td>INACTIVE</td>
+ <td>AE/AWB_LOCK on</td>
+ <td>LOCKED</td>
+ <td>Values locked</td>
+ </tr>
+ <tr>
+ <td>SEARCHING</td>
+ <td>HAL finishes AE/AWB scan</td>
+ <td>CONVERGED</td>
+ <td>Good values, not changing</td>
+ </tr>
+ <tr>
+ <td>SEARCHING</td>
+ <td>HAL finishes AE scan</td>
+ <td>FLASH_REQUIRED</td>
+ <td>Converged but too dark without flash</td>
+ </tr>
+ <tr>
+ <td>SEARCHING</td>
+ <td>AE/AWB_LOCK on</td>
+ <td>LOCKED</td>
+ <td>Values locked</td>
+ </tr>
+ <tr>
+ <td>CONVERGED</td>
+ <td>HAL initiates AE/AWB scan</td>
+ <td>SEARCHING</td>
+ <td>Values locked</td>
+ </tr>
+ <tr>
+ <td>CONVERGED</td>
+ <td>AE/AWB_LOCK on</td>
+ <td>LOCKED</td>
+ <td>Values locked</td>
+ </tr>
+ <tr>
+ <td>FLASH_REQUIRED</td>
+ <td>HAL initiates AE/AWB scan</td>
+ <td>SEARCHING</td>
+ <td>Values locked</td>
+ </tr>
+ <tr>
+ <td>FLASH_REQUIRED</td>
+ <td>AE/AWB_LOCK on</td>
+ <td>LOCKED</td>
+ <td>Values locked</td>
+ </tr>
+ <tr>
+ <td>LOCKED</td>
+ <td>AE/AWB_LOCK off</td>
+ <td>SEARCHING</td>
+ <td>Values not good after unlock</td>
+ </tr>
+ <tr>
+ <td>LOCKED</td>
+ <td>AE/AWB_LOCK off</td>
+ <td>CONVERGED</td>
+ <td>Values good after unlock</td>
+ </tr>
+ <tr>
+ <td>LOCKED</td>
+ <td>AE_LOCK off</td>
+ <td>FLASH_REQUIRED</td>
+ <td>Exposure good, but too dark</td>
+ </tr>
+ <tr>
+ <td>All AE states</td>
+ <td>PRECAPTURE_START</td>
+ <td>PRECAPTURE</td>
+ <td>Start precapture sequence</td>
+ </tr>
+ <tr>
+ <td>PRECAPTURE</td>
+ <td>Sequence done, AE_LOCK off</td>
+ <td>CONVERGED</td>
+ <td>Ready for high-quality capture</td>
+ </tr>
+ <tr>
+ <td>PRECAPTURE</td>
+ <td>Sequence done, AE_LOCK on</td>
+ <td>LOCKED</td>
+ <td>Ready for high-quality capture</td>
+ </tr>
+</table>
+<h2 id="manual-control">Enabling manual control</h2>
+<p>Several controls are also involved in configuring the device 3A blocks to allow
+ for direct application control.</p>
+<p>The HAL model for 3A control is that for each request, the HAL inspects the
+ state of the 3A control fields. If any 3A routine is enabled, then that routine
+ overrides the control variables that relate to that routine, and these override
+ values are then available in the result metadata for that capture. So for
+ example, if auto-exposure is enabled in a request, the HAL should overwrite the
+ exposure, gain, and frame duration fields (and potentially the flash fields,
+ depending on AE mode) of the request. The list of relevant controls is:</p>
+<table>
+ <tr>
+ <th>Control name</th>
+ <th>Unit</th>
+ <th>Notes</th>
+ </tr>
+ <tr>
+ <td>android.control.mode</td>
+ <td>enum: OFF, AUTO, USE_SCENE_MODE</td>
+ <td>High-level 3A control. When set to OFF, all 3A control by the HAL is disabled. The application must set the fields for capture parameters itself.
+ When set to AUTO, the individual algorithm controls in android.control.* are in effect, such as android.control.afMode.
+ When set to USE_SCENE_MODE, the individual controls in android.control.* are mostly disabled, and the HAL implements one of the scene mode settings (such as ACTION, SUNSET, or PARTY) as it wishes.</td>
+ </tr>
+ <tr>
+ <td>android.control.afMode</td>
+ <td>enum</td>
+ <td>OFF means manual control of lens focusing through android.lens.focusDistance.</td>
+ </tr>
+ <tr>
+ <td>android.control.aeMode</td>
+ <td>enum</td>
+ <td>OFF means manual control of exposure/gain/frame duration through android.sensor.exposureTime / .sensitivity / .frameDuration</td>
+ </tr>
+ <tr>
+ <td>android.control.awbMode</td>
+ <td>enum</td>
+ <td>OFF means manual control of white balance. </td>
+ </tr>
+</table>
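+<p>To illustrate the override model described above, here is a hedged C sketch of
+ a HAL reporting the sensor values it actually used when AE is active. The
+ helper name and parameters are illustrative; the tags are from the camera
+ metadata definitions:</p>
+<pre>
+#include <system/camera_metadata.h>
+
+/* When AE is enabled for a request, the HAL overwrites the manual sensor
+ * fields and reports the values actually used in the result metadata. */
+static void report_ae_overrides(camera_metadata_t *result,
+                                int64_t exposure_ns, int32_t sensitivity,
+                                int64_t frame_duration_ns) {
+    add_camera_metadata_entry(result, ANDROID_SENSOR_EXPOSURE_TIME,
+            &exposure_ns, 1);
+    add_camera_metadata_entry(result, ANDROID_SENSOR_SENSITIVITY,
+            &sensitivity, 1);
+    add_camera_metadata_entry(result, ANDROID_SENSOR_FRAME_DURATION,
+            &frame_duration_ns, 1);
+}
+</pre>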
diff --git a/src/devices/camera/camera3_crop_reprocess.jd b/src/devices/camera/camera3_crop_reprocess.jd
new file mode 100644
index 0000000..e617e1e
--- /dev/null
+++ b/src/devices/camera/camera3_crop_reprocess.jd
@@ -0,0 +1,125 @@
+page.title=Output streams and cropping
+@jd:body
+
+<!--
+ Copyright 2013 The Android Open Source Project
+
+ Licensed under the Apache License, Version 2.0 (the "License");
+ you may not use this file except in compliance with the License.
+ You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
+-->
+<div id="qv-wrapper">
+ <div id="qv">
+ <h2>In this document</h2>
+ <ol id="auto-toc">
+ </ol>
+ </div>
+</div>
+
+<h2 id="output-stream">Output streams</h2>
+<p> Unlike the old camera subsystem, which has 3-4 different ways of producing data
+ from the camera (ANativeWindow-based preview operations, preview callbacks,
+ video callbacks, and takePicture callbacks), the new subsystem operates solely
+ on the ANativeWindow-based pipeline for all resolutions and output formats.
+ Multiple such streams can be configured at once, to send a single frame to many
+ targets such as the GPU, the video encoder, RenderScript, or app-visible buffers
+ (RAW Bayer, processed YUV buffers, or JPEG-encoded buffers).</p>
+<p>As an optimization, these output streams must be configured ahead of time, and
+ only a limited number may exist at once. This allows for pre-allocation of
+ memory buffers and configuration of the camera hardware, so that when requests
+ are submitted with multiple or varying output pipelines listed, there won't be
+ delays or latency in fulfilling the request.</p>
+<p>To support backwards compatibility with the current camera API, at least 3
+ simultaneous YUV output streams must be supported, plus one JPEG stream. This is
+ required for video snapshot support with the application also receiving YUV
+ buffers:</p>
+<ul>
+ <li>One stream to the GPU/SurfaceView (opaque YUV format) for preview</li>
+ <li>One stream to the video encoder (opaque YUV format) for recording</li>
+ <li>One stream to the application (known YUV format) for preview frame callbacks</li>
+ <li>One stream to the application (JPEG) for video snapshots.</li>
+</ul>
+<p>The exact requirements are still being defined since the corresponding API
+isn't yet finalized.</p>
+<h2>Cropping</h2>
+<p>Cropping of the full pixel array (for digital zoom and other use cases where a
+ smaller FOV is desirable) is communicated through the ANDROID_SCALER_CROP_REGION
+ setting. This is a per-request setting, and can change on a per-request basis,
+ which is critical for implementing smooth digital zoom.</p>
+<p>The region is defined as a rectangle (x, y, width, height), with (x, y)
+ describing the top-left corner of the rectangle. The rectangle is defined on the
+ coordinate system of the sensor active pixel array, with (0,0) being the
+ top-left pixel of the active pixel array. Therefore, the width and height cannot
+ be larger than the dimensions reported in the ANDROID_SENSOR_ACTIVE_PIXEL_ARRAY
+ static info field. The minimum allowed width and height are reported by the HAL
+ through the ANDROID_SCALER_MAX_DIGITAL_ZOOM static info field, which describes
+ the maximum supported zoom factor. Therefore, the minimum crop region width and
+ height are:</p>
+<pre>
+ {width, height} =
+ { floor(ANDROID_SENSOR_ACTIVE_PIXEL_ARRAY[0] /
+ ANDROID_SCALER_MAX_DIGITAL_ZOOM),
+ floor(ANDROID_SENSOR_ACTIVE_PIXEL_ARRAY[1] /
+ ANDROID_SCALER_MAX_DIGITAL_ZOOM) }
+ </pre>
+<p>If the crop region needs to fulfill specific requirements (for example, it needs
+ to start on even coordinates, and its width/height needs to be even), the HAL
+ must do the necessary rounding and write out the final crop region used in the
+ output result metadata. Similarly, if the HAL implements video stabilization, it
+ must adjust the result crop region to describe the region actually included in
+ the output after video stabilization is applied. In general, a camera-using
+ application must be able to determine the field of view it is receiving based on
+ the crop region, the dimensions of the image sensor, and the lens focal length.</p>
+<p>Since the crop region applies to all streams, which may have different aspect
+ ratios than the crop region, the exact sensor region used for each stream may be
+ smaller than the crop region. Specifically, each stream should maintain square
+ pixels and its aspect ratio by minimally further cropping the defined crop
+ region. If the stream's aspect ratio is wider than the crop region, the stream
+ should be further cropped vertically, and if the stream's aspect ratio is
+ narrower than the crop region, the stream should be further cropped
+ horizontally.</p>
+<p>In all cases, the stream crop must be centered within the full crop region, and
+ each stream is only either cropped horizontally or vertically relative to the full
+ crop region, never both.</p>
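+<p>A minimal C sketch of this rule follows (all names illustrative; integer
+ truncation here is one possible rounding choice, and whatever rounding the HAL
+ applies must be reported in the result metadata). The worked examples below can
+ be used to check such an implementation:</p>
+<pre>
+typedef struct { int x, y, w, h; } rect_t;
+
+/* Keep square pixels, crop minimally in one direction only, and center
+ * the stream crop within the request's crop region. */
+static rect_t stream_crop(rect_t crop, int stream_w, int stream_h) {
+    rect_t out = crop;
+    /* Compare aspect ratios by cross-multiplying to stay in integers. */
+    if (stream_w * crop.h > crop.w * stream_h) {
+        /* Stream is wider than the crop region: crop vertically. */
+        out.h = crop.w * stream_h / stream_w;
+        out.y = crop.y + (crop.h - out.h) / 2;
+    } else if (stream_w * crop.h < crop.w * stream_h) {
+        /* Stream is narrower than the crop region: crop horizontally. */
+        out.w = crop.h * stream_w / stream_h;
+        out.x = crop.x + (crop.w - out.w) / 2;
+    }
+    return out;
+}
+</pre>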
+<p>For example, if two streams are defined, a 640x480 stream (4:3 aspect) and a
+ 1280x720 stream (16:9 aspect), the following demonstrates the expected output
+ regions for each stream for a few sample crop regions, on a hypothetical 3 MP
+ (2000 x 1500 pixel array) sensor.</p>
+<p>
+ Crop region: (500, 375, 1000, 750) (4:3 aspect ratio)<br/>
+ 640x480 stream crop: (500, 375, 1000, 750) (equal to crop region)<br/>
+ 1280x720 stream crop: (500, 469, 1000, 562)<br/>
+ <img src="images/crop-region-43-ratio.png" alt="crop-region-43-ratio"/>
+</p>
+<p>Crop region: (500, 375, 1333, 750) (16:9 aspect ratio)<br/>
+ 640x480 stream crop: (666, 375, 1000, 750)<br/>
+ 1280x720 stream crop: (500, 375, 1333, 750) (equal to crop region)<br/>
+ <img src="images/crop-region-169-ratio.png" alt="crop-region-169-ratio"/>
+ <!-- TODO: Fix alt text and URL -->
+</p>
+<p>Crop region: (500, 375, 750, 750) (1:1 aspect ratio)<br/>
+ 640x480 stream crop: (500, 469, 750, 562)<br/>
+ 1280x720 stream crop: (500, 543, 750, 414)<br/>
+ <img src="images/crop-region-11-ratio.png" alt="crop-region-11-ratio"/>
+ <br/>
+ As a final example, here is a 1024x1024 square aspect ratio stream in place of
+ the 480p stream:<br/>
+ Crop region: (500, 375, 1000, 750) (4:3 aspect ratio)<br/>
+ 1024x1024 stream crop: (625, 375, 750, 750)<br/>
+ 1280x720 stream crop: (500, 469, 1000, 562)<br/>
+ <img src="images/crop-region-43-square-ratio.png"
+alt="crop-region-43-square-ratio"/>
+</p>
+<h2 id="reprocessing">Reprocessing</h2>
+<p> Additional support for raw image files is provided through reprocessing of
+ RAW Bayer data. This support allows the camera pipeline to process a previously captured
+ RAW buffer and metadata (an entire frame that was recorded previously), to
+ produce a new rendered YUV or JPEG output.</p>
diff --git a/src/devices/camera/camera3_error_stream.jd b/src/devices/camera/camera3_error_stream.jd
new file mode 100644
index 0000000..c1a1610
--- /dev/null
+++ b/src/devices/camera/camera3_error_stream.jd
@@ -0,0 +1,160 @@
+page.title=Error and stream handling
+@jd:body
+
+<!--
+ Copyright 2013 The Android Open Source Project
+
+ Licensed under the Apache License, Version 2.0 (the "License");
+ you may not use this file except in compliance with the License.
+ You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
+-->
+<div id="qv-wrapper">
+ <div id="qv">
+ <h2>In this document</h2>
+ <ol id="auto-toc">
+ </ol>
+ </div>
+</div>
+
+<h2 id="error-mgmt">Error management</h2>
+<p>Camera HAL device ops functions that have a return value will all return -ENODEV
+ / NULL in case of a serious error. This means the device cannot continue
+ operation, and must be closed by the framework. Once this error is returned by
+ some method, or if notify() is called with ERROR_DEVICE, only the close() method
+ can be called successfully. All other methods will return -ENODEV / NULL.<br/>
+ If a device op is called in the wrong sequence, for example if the framework
+ calls configure_streams() before initialize(), the device must return
+ -ENOSYS from the call, and do nothing.<br/>
+ Transient errors in image capture must be reported through notify() as follows:</p>
+<ul>
+ <li>The failure of an entire capture to occur must be reported by the HAL by
+ calling notify() with ERROR_REQUEST. Individual errors for the result metadata
+ or the output buffers must not be reported in this case.</li>
+ <li>If the metadata for a capture cannot be produced, but some image buffers were
+ filled, the HAL must call notify() with ERROR_RESULT.</li>
+ <li>If an output image buffer could not be filled, but either the metadata was
+ produced or some other buffers were filled, the HAL must call notify() with
+ ERROR_BUFFER for each failed buffer.</li>
+</ul>
+<p>In each of these transient failure cases, the HAL must still call
+ process_capture_result() with valid output buffer handles. If the result
+ metadata could not be produced, it should be NULL. If some buffers could not be
+ filled, their sync fences must be set to the error state.<br/>
+ Invalid input arguments result in -EINVAL from the appropriate methods. In that
+ case, the framework must act as if that call had never been made.</p>
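+<p>As a hedged sketch of the per-buffer case above (the helper name is
+ illustrative; the structures and constants are from camera3.h), a HAL might
+ report a failed buffer like this:</p>
+<pre>
+#include <hardware/camera3.h>
+
+/* Report a transient failure to fill one output buffer. The buffer itself
+ * is still returned through process_capture_result(), with its release
+ * sync fence set to the error state. */
+static void notify_buffer_error(const camera3_callback_ops_t *cb,
+                                uint32_t frame_number,
+                                camera3_stream_t *failed_stream) {
+    camera3_notify_msg_t msg;
+    msg.type = CAMERA3_MSG_ERROR;
+    msg.message.error.frame_number = frame_number;
+    msg.message.error.error_stream = failed_stream;
+    msg.message.error.error_code = CAMERA3_MSG_ERROR_BUFFER;
+    cb->notify(cb, &msg);
+}
+</pre>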
+<h2 id="stream-mgmt">Stream management</h2>
+<h3 id="configure_streams">configure_streams</h3>
+<p>Reset the HAL camera device processing pipeline and set up new input and output
+ streams. This call replaces any existing stream configuration with the streams
+ defined in the stream_list. This method will be called at least once after
+ initialize() before a request is submitted with process_capture_request().<br/>
+ The stream_list must contain at least one output-capable stream, and may not
+ contain more than one input-capable stream.<br/>
+ The stream_list may contain streams that are also in the currently-active set of
+ streams (from the previous call to configure_streams()). These streams will
+ already have valid values for usage, max_buffers, and the private pointer. If
+ such a stream has already had its buffers registered, register_stream_buffers()
+ will not be called again for the stream, and buffers from the stream can be
+ immediately included in input requests.<br/>
+ If the HAL needs to change the stream configuration for an existing stream due
+ to the new configuration, it may rewrite the values of usage and/or max_buffers
+ during the configure call. The framework will detect such a change, and will
+ then reallocate the stream buffers, and call register_stream_buffers() again
+ before using buffers from that stream in a request.<br/>
+ If a currently-active stream is not included in stream_list, the HAL may safely
+ remove any references to that stream. It will not be reused in a later
+ configure() call by the framework, and all the gralloc buffers for it will be
+ freed after the configure_streams() call returns.<br/>
+ The stream_list structure is owned by the framework, and may not be accessed
+ once this call completes. The address of an individual camera3_stream_t
+ structure will remain valid for access by the HAL until the end of the first
+ configure_streams() call which no longer includes that camera3_stream_t in the
+ stream_list argument. The HAL may not change values in the stream structure
+ outside of the private pointer, except for the usage and max_buffers members
+ during the configure_streams() call itself.<br/>
+ If the stream is new, the usage, max_buffers, and private pointer fields of the
+ stream structure will all be set to 0. The HAL device must set these fields
+ before the configure_streams() call returns. These fields are then used by the
+ framework and the platform gralloc module to allocate the gralloc buffers for
+ each stream.<br/>
+ Before such a new stream can have its buffers included in a capture request, the
+ framework will call register_stream_buffers() with that stream. However, the
+ framework is not required to register buffers for all streams before
+ submitting a request. This allows for quick startup of (for example) a preview
+ stream, with allocation for other streams happening later or concurrently.</p>
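+<p>A skeletal, hedged C implementation of these rules might look as follows.
+ The validation is abbreviated, the usage flag and pipeline depth are
+ hypothetical choices, and a real HAL would also reconfigure its hardware here:</p>
+<pre>
+#include <errno.h>
+#include <hardware/camera3.h>
+#include <hardware/gralloc.h>
+
+static int example_configure_streams(const struct camera3_device *dev,
+                                     camera3_stream_configuration_t *list) {
+    (void)dev;
+    uint32_t inputs = 0, outputs = 0;
+    for (uint32_t i = 0; i < list->num_streams; i++) {
+        camera3_stream_t *s = list->streams[i];
+        if (s->stream_type == CAMERA3_STREAM_INPUT ||
+            s->stream_type == CAMERA3_STREAM_BIDIRECTIONAL)
+            inputs++;
+        if (s->stream_type == CAMERA3_STREAM_OUTPUT ||
+            s->stream_type == CAMERA3_STREAM_BIDIRECTIONAL)
+            outputs++;
+    }
+    if (outputs < 1 || inputs > 1)
+        return -EINVAL;  /* invalid stream configuration */
+
+    for (uint32_t i = 0; i < list->num_streams; i++) {
+        camera3_stream_t *s = list->streams[i];
+        if (s->max_buffers == 0) {  /* new streams arrive with these zeroed */
+            s->usage = GRALLOC_USAGE_HW_CAMERA_WRITE;  /* hypothetical */
+            s->max_buffers = 4;  /* hypothetical pipeline depth */
+        }
+    }
+    return 0;
+}
+</pre>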
+<h4><strong>Preconditions</strong></h4>
+<p>The framework will only call this method when no captures are being processed.
+ That is, all results have been returned to the framework, and all in-flight
+ input and output buffers have been returned and their release sync fences have
+ been signaled by the HAL. The framework will not submit new requests for capture
+ while the configure_streams() call is underway.</p>
+<h4><strong>Postconditions</strong></h4>
+<p>The HAL device must configure itself to provide the maximum possible output frame
+ rate given the sizes and formats of the output streams, as documented in the
+ camera device's static metadata.</p>
+<h4><strong>Performance expectations</strong></h4>
+<p>This call is expected to be heavyweight and possibly take several hundred
+ milliseconds to complete, since it may require resetting and reconfiguring the
+ image sensor and the camera processing pipeline. Nevertheless, the HAL device
+ should attempt to minimize the reconfiguration delay to minimize the
+ user-visible pauses during application operational mode changes (such as
+ switching from still capture to video recording).</p>
+<h4><strong>Return values</strong></h4>
+<ul>
+ <li>0: On successful stream configuration</li>
+ <li>-EINVAL: If the requested stream configuration is invalid. Some examples of
+ invalid stream configurations include:
+ <ul>
+ <li>Including more than 1 input-capable stream (INPUT or BIDIRECTIONAL)</li>
+ <li>Not including any output-capable streams (OUTPUT or BIDIRECTIONAL)</li>
+ <li>Including streams with unsupported formats, or an unsupported size for
+ that format.</li>
+ <li>Including too many output streams of a certain format.</li>
+ <li>Note that the framework submitting an invalid stream configuration is not
+ normal operation, since stream configurations are checked before
+ configure. An invalid configuration means that a bug exists in the
+ framework code, or there is a mismatch between the HAL's static metadata
+ and the requirements on streams.</li>
+ </ul>
+ </li>
+ <li>-ENODEV: If there has been a fatal error and the device is no longer
+ operational. Only close() can be called successfully by the framework after
+ this error is returned.</li>
+</ul>
+<h3 id="register-stream">register_stream_buffers</h3>
+<p>Register buffers for a given stream with the HAL device. This method is called
+ by the framework after a new stream is defined by configure_streams, and before
+ buffers from that stream are included in a capture request. If the same stream
+ is listed in a subsequent configure_streams() call, register_stream_buffers will
+ not be called again for that stream.<br/>
+ The framework does not need to register buffers for all configured streams
+ before it submits the first capture request. This allows quick startup for
+ preview (or similar use cases) while other streams are still being allocated.<br/>
+ This method is intended to allow the HAL device to map or otherwise prepare the
+ buffers for later use. The buffers passed in will already be locked for use. At
+ the end of the call, all the buffers must be ready to be returned to the stream.
+ The buffer_set argument is only valid for the duration of this call.<br/>
+ If the stream format was set to HAL_PIXEL_FORMAT_IMPLEMENTATION_DEFINED, the
+ camera HAL should inspect the passed-in buffers here to determine any
+ platform-private pixel format information.</p>
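+<p>As a hedged sketch (the mapping helper is hypothetical; the structures are
+ from camera3.h), a HAL might pre-map each buffer during registration:</p>
+<pre>
+#include <errno.h>
+#include <hardware/camera3.h>
+
+/* Hypothetical platform helper that maps one gralloc buffer for the HAL. */
+extern int my_map_buffer(camera3_stream_t *stream, buffer_handle_t *handle);
+
+static int example_register_stream_buffers(
+        const struct camera3_device *dev,
+        const camera3_stream_buffer_set_t *buffer_set) {
+    (void)dev;
+    for (uint32_t i = 0; i < buffer_set->num_buffers; i++) {
+        /* For HAL_PIXEL_FORMAT_IMPLEMENTATION_DEFINED streams, inspect the
+         * handle here to determine the platform-private pixel format. */
+        if (my_map_buffer(buffer_set->stream, buffer_set->buffers[i]) != 0)
+            return -ENOMEM;  /* all buffers then count as unregistered */
+    }
+    return 0;
+}
+</pre>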
+<h4><strong>Return values</strong></h4>
+<ul>
+ <li>0: On successful registration of the new stream buffers</li>
+ <li>-EINVAL: If the stream_buffer_set does not refer to a valid active stream, or
+ if the buffers array is invalid.</li>
+ <li>-ENOMEM: If there was a failure in registering the buffers. The framework must
+ consider all the stream buffers to be unregistered, and can try to register
+ again later.</li>
+ <li>-ENODEV: If there is a fatal error, and the device is no longer operational.
+ Only close() can be called successfully by the framework after this error is
+ returned.</li>
+</ul>
diff --git a/src/devices/camera/camera3_metadata.jd b/src/devices/camera/camera3_metadata.jd
new file mode 100644
index 0000000..9e43512
--- /dev/null
+++ b/src/devices/camera/camera3_metadata.jd
@@ -0,0 +1,65 @@
+page.title=Metadata and Controls
+@jd:body
+
+<!--
+ Copyright 2013 The Android Open Source Project
+
+ Licensed under the Apache License, Version 2.0 (the "License");
+ you may not use this file except in compliance with the License.
+ You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
+-->
+<div id="qv-wrapper">
+ <div id="qv">
+ <h2>In this document</h2>
+ <ol id="auto-toc">
+ </ol>
+ </div>
+</div>
+
+<h2 id="metadata">Metadata support</h2>
+<p> To support the saving of raw image files by the Android framework, substantial
+ metadata is required about the sensor's characteristics. This includes
+ information such as color spaces and lens shading functions.</p>
+<p>Most of this information is a static property of the camera subsystem and can
+ therefore be queried before configuring any output pipelines or submitting any
+ requests. The new camera APIs greatly expand the information provided by the
+ getCameraInfo() method to provide this information to the application.</p>
+<p>In addition, manual control of the camera subsystem requires feedback from the
+ assorted devices about their current state, and the actual parameters used in
+ capturing a given frame. The actual values of the controls (exposure time, frame
+ duration, and sensitivity) as actually used by the hardware must be included in
+ the output metadata. This is essential so that applications know when either
+ clamping or rounding took place, and so that the application can compensate for
+ the real settings used for image capture.</p>
+<p>For example, if an application sets frame duration to 0 in a request, the HAL
+ must clamp the frame duration to the real minimum frame duration for that
+ request, and report that clamped minimum duration in the output result metadata.</p>
+<p>So if an application needs to implement a custom 3A routine (for example, to
+ properly meter for an HDR burst), it needs to know the settings used to capture
+ the latest set of results it has received in order to update the settings for
+ the next request. Therefore, the new camera API adds a substantial amount of
+ dynamic metadata to each captured frame. This includes the requested and actual
+ parameters used for the capture, as well as additional per-frame metadata such
+ as timestamps and statistics generator output.</p>
+<h2 id="per-setting">Per-setting control</h2>
+<p> For most settings, the expectation is that they can be changed every frame,
+ without introducing significant stutter or delay to the output frame stream.
+ Ideally, the output frame rate should solely be controlled by the capture
+ request's frame duration field, and be independent of any changes to processing
+ blocks' configuration. In reality, some specific controls are known to be slow
+ to change; these include the output resolution and output format of the camera
+ pipeline, as well as controls that affect physical devices, such as lens focus
+ distance. The exact requirements for each control set are detailed later.</p>
+<h2 id="raw-sensor">Raw sensor data support</h2>
+<p>In addition to the pixel formats supported by
+ the old API, the new API adds a requirement for support for raw sensor data
+ (Bayer RAW), both for advanced camera applications as well as to support raw
+ image files.</p>
diff --git a/src/devices/camera/camera3_requests_hal.jd b/src/devices/camera/camera3_requests_hal.jd
new file mode 100644
index 0000000..9bd4f28
--- /dev/null
+++ b/src/devices/camera/camera3_requests_hal.jd
@@ -0,0 +1,428 @@
+page.title=HAL subsystem
+@jd:body
+
+<!--
+ Copyright 2013 The Android Open Source Project
+
+ Licensed under the Apache License, Version 2.0 (the "License");
+ you may not use this file except in compliance with the License.
+ You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
+-->
+<div id="qv-wrapper">
+ <div id="qv">
+ <h2>In this document</h2>
+ <ol id="auto-toc">
+ </ol>
+ </div>
+</div>
+
+<h2 id="requests">Requests</h2>
+<p> The app framework issues requests for captured results to the camera subsystem.
+ One request corresponds to one set of results. A request encapsulates all
+ configuration information about the capturing and processing of those results.
+ This includes things such as resolution and pixel format; manual sensor, lens,
+ and flash control; 3A operating modes; RAW to YUV processing control; and
+ statistics generation. This allows for much more control over the results'
+ output and processing. Multiple requests can be in flight at once, and
+ submitting requests is non-blocking. Requests are always processed in
+ the order they are received.<br/>
+ <img src="images/camera_model.png" alt="Camera request model"/>
+ <br/>
+ <strong>Figure 3.</strong> Camera model</p>
+<h2 id="hal-subsystem">The HAL and camera subsystem</h2>
+<p> The camera subsystem includes the implementations for components in the camera
+ pipeline such as the 3A algorithm and processing controls. The camera HAL
+ provides interfaces for you to implement your versions of these components. To
+ maintain cross-platform compatibility between multiple device manufacturers and
+ Image Signal Processor (ISP, or camera sensor) vendors, the camera pipeline
+ model is virtual and does not directly correspond to any real ISP. However, it
+ is similar enough to real processing pipelines so that you can map it to your
+ hardware efficiently. In addition, it is abstract enough to allow for multiple
+ different algorithms and orders of operation without compromising either
+ quality, efficiency, or cross-device compatibility.<br/>
+ The camera pipeline also supports triggers that the app framework can initiate
+ to turn on things such as auto-focus. It also sends notifications back to the
+ app framework, notifying apps of events such as an auto-focus lock or errors.<br/>
+ <img src="images/camera_hal.png" alt="Camera hardware abstraction layer"/>
+ <br/>
+ <strong>Figure 4.</strong> Camera pipeline<br/>
+ Please note that some image processing blocks shown in the diagram above are not
+ well-defined in the initial release.<br/>
+ The camera pipeline makes the following assumptions:</p>
+<ul>
+ <li>RAW Bayer output undergoes no processing inside the ISP.</li>
+ <li>Statistics are generated based on the raw sensor data.</li>
+ <li>The various processing blocks that convert raw sensor data to YUV are in an
+ arbitrary order.</li>
+ <li>While multiple scale and crop units are shown, all scaler units share the
+ output region controls (digital zoom). However, each unit may have a different
+ output resolution and pixel format.</li>
+</ul>
+<p><strong>Summary of API use</strong><br/>
+ This is a brief summary of the steps for using the Android camera API. See the
+ Startup and expected operation sequence section for a detailed breakdown of
+ these steps, including API calls.</p>
+<ol>
+ <li>Listen for and enumerate camera devices.</li>
+ <li>Open device and connect listeners.</li>
+ <li>Configure outputs for target use case (such as still capture, recording,
+ etc.).</li>
+ <li>Create request(s) for target use case.</li>
+ <li>Capture/repeat requests and bursts.</li>
+ <li>Receive result metadata and image data.</li>
+ <li>When switching use cases, return to step 3.</li>
+</ol>
+<p><strong>HAL operation summary</strong></p>
+<ul>
+ <li>Asynchronous requests for captures come from the framework.</li>
+ <li>The HAL device must process requests in order and, for each request, produce
+ output result metadata and one or more output image buffers.</li>
+ <li>First-in, first-out for requests and results, and for streams referenced by
+ subsequent requests. </li>
+ <li>Timestamps must be identical for all outputs from a given request, so that the
+ framework can match them together if needed. </li>
+ <li>All capture configuration and state (except for the 3A routines) is
+ encapsulated in the requests and results.</li>
+</ul>
+<p><img src="images/camera-hal-overview.png" alt="Camera HAL overview"/>
+ <br/>
+ <strong>Figure 5.</strong> Camera HAL overview</p>
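+<p>Results and notifications flow back through two callbacks that the framework
+ passes to the HAL at initialization, as defined in camera3.h (comments added
+ here):</p>
+<pre>
+typedef struct camera3_callback_ops {
+    // Returns a completed capture: result metadata plus filled buffers
+    void (*process_capture_result)(const struct camera3_callback_ops *,
+            const camera3_capture_result_t *result);
+    // Asynchronous events: shutter timestamps and error messages
+    void (*notify)(const struct camera3_callback_ops *,
+            const camera3_notify_msg_t *msg);
+} camera3_callback_ops_t;
+</pre>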
+<h2 id="startup">Startup and expected operation sequence</h2>
+<p>This section contains a detailed explanation of the steps expected when using
+ the camera API. Please see <a href="https://android.googlesource.com/platform/hardware/libhardware/+/master/include/hardware/camera3.h">platform/hardware/libhardware/include/hardware/camera3.h</a> for definitions of these structures and methods.</p>
+<ol>
+ <li>Framework calls camera_module_t->common.open(), which returns a
+ hardware_device_t structure.</li>
+ <li>Framework inspects the hardware_device_t->version field and instantiates the
+ appropriate handler for that version of the camera hardware device. If the
+ version is CAMERA_DEVICE_API_VERSION_3_0, the device is cast to a
+ camera3_device_t.</li>
+ <li>Framework calls camera3_device_t->ops->initialize() with the framework
+ callback function pointers. This is called only once after open(), before any
+ other functions in the ops structure are called.</li>
+ <li>The framework calls camera3_device_t->ops->configure_streams() with a list of
+ input/output streams to the HAL device.</li>
+ <li>The framework allocates gralloc buffers and calls
+ camera3_device_t->ops->register_stream_buffers() for at least one of the
+ output streams listed in configure_streams. The same stream is registered
+ only once.</li>
+ <li>The framework requests default settings for some number of use cases with
+ calls to camera3_device_t->ops->construct_default_request_settings(). This
+ may occur any time after step 3.</li>
+ <li>The framework constructs and sends the first capture request to the HAL with
+ settings based on one of the sets of default settings, and with at least one
+ output stream that has been registered earlier by the framework. This is sent
+ to the HAL with camera3_device_t->ops->process_capture_request(). The HAL
+ must block the return of this call until it is ready for the next request to
+ be sent.</li>
+ <li>The framework continues to submit requests, and possibly call
+ register_stream_buffers() for not-yet-registered streams, and call
+ construct_default_request_settings to get default settings buffers for other
+ use cases.</li>
+ <li>When the capture of a request begins (sensor starts exposing for the
+ capture), the HAL calls camera3_callback_ops_t->notify() with the SHUTTER
+ event, including the frame number and the timestamp for start of exposure.
+ This notify call must be made before the first call to
+ process_capture_result() for that frame number.</li>
+ <li>After some pipeline delay, the HAL begins to return completed captures to
+ the framework with camera3_callback_ops_t->process_capture_result(). These
+ are returned in the same order as the requests were submitted. Multiple
+ requests can be in flight at once, depending on the pipeline depth of the
+ camera HAL device.</li>
+ <li>After some time, the framework may stop submitting new requests, wait for
+ the existing captures to complete (all buffers filled, all results
+ returned), and then call configure_streams() again. This resets the camera
+ hardware and pipeline for a new set of input/output streams. Some streams
+ may be reused from the previous configuration; if these streams' buffers had
+ already been registered with the HAL, they will not be registered again. The
+ framework then continues from step 7, if at least one registered output
+ stream remains. (Otherwise, step 5 is required first.)</li>
+ <li>Alternatively, the framework may call camera3_device_t->common->close() to
+ end the camera session. This may be called at any time when no other calls
+ from the framework are active, although the call may block until all
+ in-flight captures have completed (all results returned, all buffers
+ filled). After the close call returns, no more calls to the
+ camera3_callback_ops_t functions are allowed from the HAL. Once the close()
+ call is underway, the framework may not call any other HAL device functions.</li>
+ <li>In case of an error or other asynchronous event, the HAL must call
+ camera3_callback_ops_t->notify() with the appropriate error/event message.
+ After returning from a fatal device-wide error notification, the HAL should
+ act as if close() had been called on it. However, the HAL must either cancel
+ or complete all outstanding captures before calling notify(), so that once
+ notify() is called with a fatal error, the framework will not receive
+ further callbacks from the device. Methods besides close() should return
+ -ENODEV or NULL after the notify() method returns from a fatal error
+ message.</li>
+</ol>
+<p><img src="images/camera-ops-flow.png" width="600" height="434" alt="Camera operations flow" />
+</p>
+<p><strong>Figure 6.</strong> Camera operational flow</p>
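+<p>Condensed into code, steps 1 through 6 of this sequence look roughly like the
+ following framework-side sketch. Error handling and the stream/buffer setup are
+ elided; camera_id is the string ID of the camera to open, and my_callback_ops,
+ stream_config, and buffer_set are placeholder names:</p>
+<pre>
+hw_device_t *device;
+module->common.methods->open(&module->common, camera_id, &device); // Step 1
+
+camera3_device_t *cam = (camera3_device_t *)device;                // Step 2
+cam->ops->initialize(cam, &my_callback_ops);                       // Step 3
+cam->ops->configure_streams(cam, &stream_config);                  // Step 4
+cam->ops->register_stream_buffers(cam, &buffer_set);               // Step 5
+
+const camera_metadata_t *preview_settings =                        // Step 6
+    cam->ops->construct_default_request_settings(cam, CAMERA3_TEMPLATE_PREVIEW);
+</pre>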
+<h2 id="ops-modes">Operational modes</h2>
+<p>The camera 3 HAL device can implement one of two possible operational modes:
+ limited and full. Full support is expected from new higher-end devices. Limited
+ mode has hardware requirements roughly in line with those for a camera HAL
+ device v1 implementation, and is expected from older or inexpensive devices.
+ Full is a strict superset of limited, and they share the same essential
+ operational flow, as documented above.</p>
+<p>The HAL must indicate its level of support with the
+ android.info.supportedHardwareLevel static metadata entry, with 0 indicating
+ limited mode, and 1 indicating full mode support.</p>
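+<p>For example, a limited-mode HAL might populate this entry in its static
+ metadata as follows. This is a sketch only; static_info is assumed to be an
+ already-allocated camera_metadata_t with spare capacity:</p>
+<pre>
+uint8_t level = ANDROID_INFO_SUPPORTED_HARDWARE_LEVEL_LIMITED;  // 0 = limited
+add_camera_metadata_entry(static_info,
+        ANDROID_INFO_SUPPORTED_HARDWARE_LEVEL, &level, 1);
+</pre>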
+<p>Roughly speaking, limited-mode devices do not allow for application control of
+ capture settings (3A control only), high-rate capture of high-resolution images,
+ raw sensor readout, or support for YUV output streams above maximum recording
+ resolution (JPEG only for large images).<br/>
+ Here are the details of limited-mode behavior:</p>
+<ul>
+ <li>Limited-mode devices do not need to implement accurate synchronization between
+ capture request settings and the actual image data captured. Instead, changes
+ to settings may take effect some time in the future, and possibly not for the
+ same output frame for each settings entry. Rapid changes in settings may
+ result in some settings never being used for a capture. However, captures that
+ include high-resolution output buffers (> 1080p) must use the settings as
+ specified (but see below for processing rate).</li>
+ <li>Captures in limited mode that include high-resolution (> 1080p) output buffers
+ may block in process_capture_request() until all the output buffers have been
+ filled. A full-mode HAL device must process sequences of high-resolution
+ requests at the rate indicated in the static metadata for that pixel format.
+ The HAL must still call process_capture_result() to provide the output; the
+ framework must simply be prepared for process_capture_request() to block until
+ after process_capture_result() for that request completes for high-resolution
+ captures for limited-mode devices.</li>
+ <li>Limited-mode devices do not need to support most of the settings/result/static
+ info metadata. Only the following settings are expected to be consumed or
+ produced by a limited-mode HAL device:
+ <ul>
+ <li>android.control.aeAntibandingMode (controls)</li>
+ <li>android.control.aeExposureCompensation (controls)</li>
+ <li>android.control.aeLock (controls)</li>
+ <li>android.control.aeMode (controls) [OFF means ON_FLASH_TORCH]</li>
+ <li>android.control.aeRegions (controls)</li>
+ <li>android.control.aeTargetFpsRange (controls)</li>
+ <li>android.control.afMode (controls) [OFF means infinity focus]</li>
+ <li>android.control.afRegions (controls)</li>
+ <li>android.control.awbLock (controls)</li>
+ <li>android.control.awbMode (controls) [OFF not supported]</li>
+ <li>android.control.awbRegions (controls)</li>
+ <li>android.control.captureIntent (controls)</li>
+ <li>android.control.effectMode (controls)</li>
+ <li>android.control.mode (controls) [OFF not supported]</li>
+ <li>android.control.sceneMode (controls)</li>
+ <li>android.control.videoStabilizationMode (controls)</li>
+ <li>android.control.aeAvailableAntibandingModes (static)</li>
+ <li>android.control.aeAvailableModes (static)</li>
+ <li>android.control.aeAvailableTargetFpsRanges (static)</li>
+ <li>android.control.aeCompensationRange (static)</li>
+ <li>android.control.aeCompensationStep (static)</li>
+ <li>android.control.afAvailableModes (static)</li>
+ <li>android.control.availableEffects (static)</li>
+ <li>android.control.availableSceneModes (static)</li>
+ <li>android.control.availableVideoStabilizationModes (static)</li>
+ <li>android.control.awbAvailableModes (static)</li>
+ <li>android.control.maxRegions (static)</li>
+ <li>android.control.sceneModeOverrides (static)</li>
+ <li>android.control.aeRegions (dynamic)</li>
+ <li>android.control.aeState (dynamic)</li>
+ <li>android.control.afMode (dynamic)</li>
+ <li>android.control.afRegions (dynamic)</li>
+ <li>android.control.afState (dynamic)</li>
+ <li>android.control.awbMode (dynamic)</li>
+ <li>android.control.awbRegions (dynamic)</li>
+ <li>android.control.awbState (dynamic)</li>
+ <li>android.control.mode (dynamic)</li>
+ <li>android.flash.info.available (static)</li>
+ <li>android.info.supportedHardwareLevel (static)</li>
+ <li>android.jpeg.gpsCoordinates (controls)</li>
+ <li>android.jpeg.gpsProcessingMethod (controls)</li>
+ <li>android.jpeg.gpsTimestamp (controls)</li>
+ <li>android.jpeg.orientation (controls)</li>
+ <li>android.jpeg.quality (controls)</li>
+ <li>android.jpeg.thumbnailQuality (controls)</li>
+ <li>android.jpeg.thumbnailSize (controls)</li>
+ <li>android.jpeg.availableThumbnailSizes (static)</li>
+ <li>android.jpeg.maxSize (static)</li>
+ <li>android.jpeg.gpsCoordinates (dynamic)</li>
+ <li>android.jpeg.gpsProcessingMethod (dynamic)</li>
+ <li>android.jpeg.gpsTimestamp (dynamic)</li>
+ <li>android.jpeg.orientation (dynamic)</li>
+ <li>android.jpeg.quality (dynamic)</li>
+ <li>android.jpeg.size (dynamic)</li>
+ <li>android.jpeg.thumbnailQuality (dynamic)</li>
+ <li>android.jpeg.thumbnailSize (dynamic)</li>
+ <li>android.lens.info.minimumFocusDistance (static)</li>
+ <li>android.request.id (controls)</li>
+ <li>android.request.id (dynamic)</li>
+ <li>android.scaler.cropRegion (controls) [ignores (x,y), assumes center-zoom]</li>
+ <li>android.scaler.availableFormats (static) [RAW not supported]</li>
+ <li>android.scaler.availableJpegMinDurations (static)</li>
+ <li>android.scaler.availableJpegSizes (static)</li>
+ <li>android.scaler.availableMaxDigitalZoom (static)</li>
+ <li>android.scaler.availableProcessedMinDurations (static)</li>
+ <li>android.scaler.availableProcessedSizes (static) [full resolution not supported]</li>
+ <li>android.scaler.maxDigitalZoom (static)</li>
+ <li>android.scaler.cropRegion (dynamic)</li>
+ <li>android.sensor.orientation (static)</li>
+ <li>android.sensor.timestamp (dynamic)</li>
+ <li>android.statistics.faceDetectMode (controls)</li>
+ <li>android.statistics.info.availableFaceDetectModes (static)</li>
+ <li>android.statistics.faceDetectMode (dynamic)</li>
+ <li>android.statistics.faceIds (dynamic)</li>
+ <li>android.statistics.faceLandmarks (dynamic)</li>
+ <li>android.statistics.faceRectangles (dynamic)</li>
+ <li>android.statistics.faceScores (dynamic)</li>
+ </ul>
+ </li>
+</ul>
+<h2 id="interaction">Interaction between the application capture request, 3A
+control, and the processing pipeline</h2>
+<p>Depending on the settings in the 3A control block, the camera pipeline ignores
+ some of the parameters in the application's capture request and uses the values
+ provided by the 3A control routines instead. For example, when auto-exposure is
+ active, the exposure time, frame duration, and sensitivity parameters of the
+ sensor are controlled by the platform 3A algorithm, and any app-specified values
+ are ignored. The values chosen for the frame by the 3A routines must be reported
+ in the output metadata. The following table describes the different modes of the
+ 3A control block and the properties that are controlled by these modes. See
+ the <a href="https://android.googlesource.com/platform/system/media/+/master/camera/docs/docs.html">platform/system/media/camera/docs/docs.html</a> file for definitions of these properties.</p>
+<table>
+ <tr>
+ <th>Parameter</th>
+ <th>State</th>
+ <th>Properties controlled</th>
+ </tr>
+ <tr>
+ <td>android.control.aeMode</td>
+ <td>OFF</td>
+ <td>None</td>
+ </tr>
+ <tr>
+ <td></td>
+ <td>ON</td>
+ <td>android.sensor.exposureTime<br/>
+ android.sensor.frameDuration<br/>
+ android.sensor.sensitivity<br/>
+ android.lens.aperture (if supported)<br/>
+ android.lens.filterDensity (if supported)</td>
+ </tr>
+ <tr>
+ <td></td>
+ <td>ON_AUTO_FLASH</td>
+ <td>Everything is ON, plus android.flash.firingPower, android.flash.firingTime, and android.flash.mode</td>
+ </tr>
+ <tr>
+ <td></td>
+ <td>ON_ALWAYS_FLASH</td>
+ <td>Same as ON_AUTO_FLASH</td>
+ </tr>
+ <tr>
+ <td></td>
+ <td>ON_AUTO_FLASH_RED_EYE</td>
+ <td>Same as ON_AUTO_FLASH</td>
+ </tr>
+ <tr>
+ <td>android.control.awbMode</td>
+ <td>OFF</td>
+ <td>None</td>
+ </tr>
+ <tr>
+ <td></td>
+ <td>WHITE_BALANCE_*</td>
+ <td>android.colorCorrection.transform. Platform-specific adjustments if android.colorCorrection.mode is FAST or HIGH_QUALITY.</td>
+ </tr>
+ <tr>
+ <td>android.control.afMode</td>
+ <td>OFF</td>
+ <td>None</td>
+ </tr>
+ <tr>
+ <td></td>
+ <td>FOCUS_MODE_*</td>
+ <td>android.lens.focusDistance</td>
+ </tr>
+ <tr>
+ <td>android.control.videoStabilizationMode</td>
+ <td>OFF</td>
+ <td>None</td>
+ </tr>
+ <tr>
+ <td></td>
+ <td>ON</td>
+ <td>Can adjust android.scaler.cropRegion to implement video stabilization</td>
+ </tr>
+ <tr>
+ <td>android.control.mode</td>
+ <td>OFF</td>
+ <td>AE, AWB, and AF are disabled</td>
+ </tr>
+ <tr>
+ <td></td>
+ <td>AUTO</td>
+ <td>Individual AE, AWB, and AF settings are used</td>
+ </tr>
+ <tr>
+ <td></td>
+ <td>SCENE_MODE_*</td>
+ <td>Can override all parameters listed above. Individual 3A controls are disabled.</td>
+ </tr>
+</table>
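+<p>For example, with auto-exposure active, the exposure time actually applied to
+ a frame can be read back from the result metadata. This sketch assumes
+ result_metadata is the metadata returned with a capture result:</p>
+<pre>
+camera_metadata_ro_entry_t e;
+if (find_camera_metadata_ro_entry(result_metadata,
+        ANDROID_SENSOR_EXPOSURE_TIME, &e) == 0) {
+    int64_t exposure_ns = e.data.i64[0];  // value chosen by the 3A routine
+}
+</pre>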
+<p>The controls exposed for the 3A algorithm mostly map 1:1 to the old API's
+ parameters (such as exposure compensation, scene mode, or white balance mode).<br/>
+ The controls in the Image Processing block of the camera pipeline (Figure 4)
+ all operate on a similar principle, and generally each block has three modes:</p>
+<ul>
+ <li>OFF: This processing block is disabled. The demosaic, color correction, and
+ tone curve adjustment blocks cannot be disabled.</li>
+ <li>FAST: In this mode, the processing block may not slow down the output frame
+ rate compared to OFF mode, but should otherwise produce the best-quality
+ output it can given that restriction. Typically, this would be used for
+ preview or video recording modes, or burst capture for still images. On some
+ devices, this may be equivalent to OFF mode (no processing can be done without
+ slowing down the frame rate), and on some devices, this may be equivalent to
+ HIGH_QUALITY mode (best quality still does not slow down frame rate).</li>
+ <li>HIGH_QUALITY: In this mode, the processing block should produce the best
+ quality result possible, slowing down the output frame rate as needed.
+ Typically, this would be used for high-quality still capture. Some blocks
+ include a manual control which can be optionally selected instead of FAST or
+ HIGH_QUALITY. For example, the color correction block supports a color
+ transform matrix, while the tone curve adjustment supports an arbitrary global
+ tone mapping curve.</li>
+</ul>
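+<p>In a capture request, these per-block modes are plain metadata entries. A
+ sketch of selecting FAST noise reduction, assuming request_settings is a
+ writable settings buffer:</p>
+<pre>
+uint8_t mode = ANDROID_NOISE_REDUCTION_MODE_FAST;
+add_camera_metadata_entry(request_settings,
+        ANDROID_NOISE_REDUCTION_MODE, &mode, 1);
+</pre>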
+ <p>The maximum frame rate that can be supported by a camera subsystem is a function
+ of many factors:</p>
+<ul>
+ <li>Requested resolutions of output image streams</li>
+ <li>Availability of binning / skipping modes on the imager</li>
+ <li>The bandwidth of the imager interface</li>
+ <li>The bandwidth of the various ISP processing blocks</li>
+</ul>
+<p>Since these factors can vary greatly between different ISPs and sensors, the
+ camera HAL interface tries to abstract the bandwidth restrictions into as
+ simple a model as possible. The model presented has the following characteristics:</p>
+<ul>
+ <li>The image sensor is always configured to output the smallest resolution
+ possible given the application's requested output stream sizes. The smallest
+ resolution is defined as being at least as large as the largest requested
+ output stream size.</li>
+ <li>Since any request may use any or all the currently configured output streams,
+ the sensor and ISP must be configured to support scaling a single capture to
+ all the streams at the same time. </li>
+ <li>JPEG streams act like processed YUV streams for requests that do not
+ include them; in requests that reference them directly, they act as JPEG
+ streams.</li>
+ <li>The JPEG processor can run concurrently with the rest of the camera pipeline
+ but cannot process more than one capture at a time.</li>
+</ul>
diff --git a/src/devices/camera/camera3_requests_methods.jd b/src/devices/camera/camera3_requests_methods.jd
new file mode 100644
index 0000000..bde2e44
--- /dev/null
+++ b/src/devices/camera/camera3_requests_methods.jd
@@ -0,0 +1,118 @@
+page.title=Request creation and submission
+@jd:body
+
+<!--
+ Copyright 2013 The Android Open Source Project
+
+ Licensed under the Apache License, Version 2.0 (the "License");
+ you may not use this file except in compliance with the License.
+ You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
+-->
+<div id="qv-wrapper">
+ <div id="qv">
+ <h2>In this document</h2>
+ <ol id="auto-toc">
+ </ol>
+ </div>
+</div>
+
+<h2 id="request-creation">Request creation and submission</h2>
+<h3 id="default-settings">construct_default_request_settings</h3>
+<p>Create capture settings for standard camera use cases. The device must return a
+ settings buffer that is configured to meet the requested use case, which must be
+ one of the CAMERA3_TEMPLATE_* enums. All request control fields must be
+ included.<br/>
+ The HAL retains ownership of this structure, but the pointer to the structure
+ must be valid until the device is closed. The framework and the HAL may not
+ modify the buffer once it is returned by this call. The same buffer may be
+ returned for subsequent calls for the same template, or for other templates.</p>
+<h4><strong>Return values</strong></h4>
+<ul>
+ <li>Valid metadata: On successful creation of a default settings buffer.</li>
+ <li>NULL: In case of a fatal error. After this is returned, only the close()
+ method can be called successfully by the framework.</li>
+</ul>
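+<p>A minimal framework-side sketch of fetching and checking a template, where
+ dev is an open camera3_device_t:</p>
+<pre>
+const camera_metadata_t *tmpl =
+    dev->ops->construct_default_request_settings(dev,
+            CAMERA3_TEMPLATE_STILL_CAPTURE);
+if (tmpl == NULL) {
+    // Fatal error: only close() may be called from here on
+}
+</pre>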
+<h3 id="process-request">process_capture_request</h3>
+<p>Send a new capture request to the HAL. The HAL should not return from this call
+ until it is ready to accept the next request to process. Only one call to
+ process_capture_request() will be made at a time by the framework, and the calls
+ will all be from the same thread. The next call to process_capture_request()
+ will be made as soon as a new request and its associated buffers are available.
+ In a normal preview scenario, this means the function will be called again by
+ the framework almost instantly.<br/>
+ The actual request processing is asynchronous, with the results of capture being
+ returned by the HAL through the process_capture_result() call. This call
+ requires the result metadata to be available, but output buffers may simply
+ provide sync fences to wait on. Multiple requests are expected to be in flight
+ at once, to maintain full output frame rate.<br/>
+ The framework retains ownership of the request structure. It is only guaranteed
+ to be valid during this call. The HAL device must make copies of the information
+ it needs to retain for the capture processing. The HAL is responsible for
+ waiting on and closing the buffers' fences and returning the buffer handles to
+ the framework.<br/>
+ The HAL must write the file descriptor for the input buffer's release sync fence
+ into input_buffer->release_fence, if input_buffer is not NULL. If the HAL
+ returns -1 for the input buffer release sync fence, the framework is free to
+ immediately reuse the input buffer. Otherwise, the framework will wait on the
+ sync fence before refilling and reusing the input buffer.</p>
+<h4><strong>Return values</strong></h4>
+<ul>
+ <li>0: On a successful start to processing the capture request</li>
+ <li>-EINVAL: If the input is malformed (the settings are NULL when not allowed,
+ there are 0 output buffers, etc.) and capture processing cannot start. Failures
+ during request processing should be handled by calling
+ camera3_callback_ops_t.notify(). In case of this error, the framework will
+ retain responsibility for the stream buffers' fences and the buffer handles;
+ the HAL should not close the fences or return these buffers with
+ process_capture_result.</li>
+ <li>-ENODEV: If the camera device has encountered a serious error. After this
+ error is returned, only the close() method can be successfully called by the
+ framework.</li>
+</ul>
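+<p>A sketch of submitting a single preview request; preview_template and
+ preview_buffer stand in for a settings buffer and a prepared
+ camera3_stream_buffer_t:</p>
+<pre>
+camera3_capture_request_t req = {
+    .frame_number       = next_frame_number++,
+    .settings           = preview_template,
+    .input_buffer       = NULL,              // no reprocessing
+    .num_output_buffers = 1,
+    .output_buffers     = &preview_buffer,
+};
+// Returns once the HAL is ready to accept the next request
+int res = dev->ops->process_capture_request(dev, &req);
+</pre>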
+<h2 id="misc-methods">Miscellaneous methods</h2>
+<h3 id="get-metadata">get_metadata_vendor_tag_ops</h3>
+<p>Get methods to query for vendor extension metadata tag information. The HAL
+ should fill in all the vendor tag operation methods, or leave ops unchanged if
+ no vendor tags are defined. The definition of vendor_tag_query_ops_t can be
+ found in system/media/camera/include/system/camera_metadata.h.</p>
+<h3 id="dump">dump</h3>
+<p>Print out debugging state for the camera device. This will be called by the
+ framework when the camera service is asked for a debug dump, which happens when
+ using the dumpsys tool, or when capturing a bugreport. The passed-in file
+ descriptor can be used to write debugging text using dprintf() or write(). The
+ text should be in ASCII encoding only.</p>
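+<p>A trivial sketch of a HAL-side dump implementation; in_flight_count is a
+ hypothetical internal counter:</p>
+<pre>
+static void my_camera_dump(const struct camera3_device *dev, int fd) {
+    dprintf(fd, "Camera HAL: %u requests in flight\n", in_flight_count);
+}
+</pre>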
+<h3 id="flush">flush</h3>
+<p>Flush all currently in-process captures and all buffers in the pipeline on the
+ given device. The framework will use this to dump all state as quickly as
+ possible in order to prepare for a configure_streams() call.<br/>
+ No buffers are required to be successfully returned, so every buffer held at the
+ time of flush() (whether successfully filled or not) may be returned with
+ CAMERA3_BUFFER_STATUS_ERROR. Note the HAL is still allowed to return valid
+ (STATUS_OK) buffers during this call, provided they are successfully filled.<br/>
+ All requests currently in the HAL are expected to be returned as soon as
+ possible. Not-in-process requests should return errors immediately. Any
+ interruptible hardware blocks should be stopped, and any uninterruptible blocks
+ should be waited on.<br/>
+ flush() should only return when there are no more outstanding buffers or
+ requests left in the HAL. The framework may call configure_streams (as the HAL
+ state is now quiesced) or may issue new requests.<br/>
+ A flush() call should take 100ms or less, and must take no more than 1
+ second.</p>
+<h4><strong>Version information</strong></h4>
+<p>This is available only if device version >= CAMERA_DEVICE_API_VERSION_3_1.</p>
+<h4><strong>Return values</strong></h4>
+<ul>
+ <li>0: On a successful flush of the camera HAL.</li>
+ <li>-EINVAL: If the input is malformed (the device is not valid).</li>
+ <li>-ENODEV: If the camera device has encountered a serious error. After this
+ error is returned, only the close() method can be successfully called by the
+ framework.</li>
+</ul>
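+<p>A framework-side sketch of draining the pipeline before reconfiguring; the
+ version check matters because flush() exists only from device version 3.1:</p>
+<pre>
+if (dev->common.version >= CAMERA_DEVICE_API_VERSION_3_1) {
+    if (dev->ops->flush(dev) == 0) {
+        // HAL is quiesced; safe to call configure_streams() again
+    }
+}
+</pre>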
diff --git a/src/devices/camera/images/camera-hal-overview.png b/src/devices/camera/images/camera-hal-overview.png
new file mode 100644
index 0000000..fed29e7
--- /dev/null
+++ b/src/devices/camera/images/camera-hal-overview.png
Binary files differ
diff --git a/src/devices/camera/images/camera-ops-flow.png b/src/devices/camera/images/camera-ops-flow.png
new file mode 100644
index 0000000..7326782
--- /dev/null
+++ b/src/devices/camera/images/camera-ops-flow.png
Binary files differ
diff --git a/src/devices/images/camera2_block.png b/src/devices/camera/images/camera2_block.png
similarity index 100%
rename from src/devices/images/camera2_block.png
rename to src/devices/camera/images/camera2_block.png
Binary files differ
diff --git a/src/devices/images/camera2_hal.png b/src/devices/camera/images/camera2_hal.png
similarity index 100%
rename from src/devices/images/camera2_hal.png
rename to src/devices/camera/images/camera2_hal.png
Binary files differ
diff --git a/src/devices/images/camera2_block.png b/src/devices/camera/images/camera_block.png
similarity index 100%
copy from src/devices/images/camera2_block.png
copy to src/devices/camera/images/camera_block.png
Binary files differ
diff --git a/src/devices/images/camera2_hal.png b/src/devices/camera/images/camera_hal.png
similarity index 100%
copy from src/devices/images/camera2_hal.png
copy to src/devices/camera/images/camera_hal.png
Binary files differ
diff --git a/src/devices/camera/images/camera_model.png b/src/devices/camera/images/camera_model.png
new file mode 100644
index 0000000..50cbabc
--- /dev/null
+++ b/src/devices/camera/images/camera_model.png
Binary files differ
diff --git a/src/devices/camera/images/camera_simple_model.png b/src/devices/camera/images/camera_simple_model.png
new file mode 100644
index 0000000..fd0fac0
--- /dev/null
+++ b/src/devices/camera/images/camera_simple_model.png
Binary files differ
diff --git a/src/devices/camera/images/crop-region-11-ratio.png b/src/devices/camera/images/crop-region-11-ratio.png
new file mode 100644
index 0000000..8e28230
--- /dev/null
+++ b/src/devices/camera/images/crop-region-11-ratio.png
Binary files differ
diff --git a/src/devices/camera/images/crop-region-169-ratio.png b/src/devices/camera/images/crop-region-169-ratio.png
new file mode 100644
index 0000000..62837e2
--- /dev/null
+++ b/src/devices/camera/images/crop-region-169-ratio.png
Binary files differ
diff --git a/src/devices/camera/images/crop-region-43-ratio.png b/src/devices/camera/images/crop-region-43-ratio.png
new file mode 100644
index 0000000..f48046b
--- /dev/null
+++ b/src/devices/camera/images/crop-region-43-ratio.png
Binary files differ
diff --git a/src/devices/camera/images/crop-region-43-square-ratio.png b/src/devices/camera/images/crop-region-43-square-ratio.png
new file mode 100644
index 0000000..3794dbe
--- /dev/null
+++ b/src/devices/camera/images/crop-region-43-square-ratio.png
Binary files differ
diff --git a/src/devices/camera3.jd b/src/devices/camera3.jd
deleted file mode 100644
index 6ebcfed..0000000
--- a/src/devices/camera3.jd
+++ /dev/null
@@ -1,1570 +0,0 @@
-page.title=Camera Version 3
-@jd:body
-
-<!--
- Copyright 2010 The Android Open Source Project
-
- Licensed under the Apache License, Version 2.0 (the "License");
- you may not use this file except in compliance with the License.
- You may obtain a copy of the License at
-
- http://www.apache.org/licenses/LICENSE-2.0
-
- Unless required by applicable law or agreed to in writing, software
- distributed under the License is distributed on an "AS IS" BASIS,
- WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- See the License for the specific language governing permissions and
- limitations under the License.
--->
-<div id="qv-wrapper">
- <div id="qv">
- <h2>In this document</h2>
- <ol id="auto-toc">
- </ol>
- </div>
-</div>
-
-<p>Android's camera HAL connects the higher level
-camera framework APIs in <a
-href="http://developer.android.com/reference/android/hardware/package-summary.html">android.hardware</a>
-to your underlying camera driver and hardware. The latest version of Android introduces a new, underlying
-implementation of the camera stack. If you have previously developed a camera HAL module and driver for
-other versions of Android, be aware that there are significant changes in the camera pipeline.</p>
-
-<p>Version 1 of the camera HAL is still supported for future releases of Android, because many devices
-still rely on it. Implementing both HALs is also supported by
-the Android camera service, which is useful when you want to support a
-less capable front-facing camera with version 1 of HAL and a more advanced
-back-facing camera with the version 3 of HAL. Version 2 was a stepping stone to
-version 3 and is not supported.</p>
-
-<p class="note"><strong>Note:</strong> The new camera HAL is in active development and can change
- at any time. This document describes at a high level the design of the camera subsystem and
- omits many details. Stay tuned for more updates to the PDK repository and look out for updates
- to the HAL and reference implementation of the HAL for more information.
-</p>
-
-
-<h2 id="overview">Overview</h2>
-<p>Version 1 of the camera subsystem was designed as a black box with high-level controls.
- Roughly speaking, the old subsystem has three operating modes:
-</p>
-
-<ul>
-<li>Preview</li>
-<li>Video Record</li>
-<li>Still Capture</li>
-</ul>
-
-<p>Each mode has slightly different capabilities and overlapping capabilities.
-This made it hard to implement new types of features, such as burst mode,
-since it would fall between two of these modes.
-</p>
-
-<p>
-Version 3 of the camera subsystem structures the operation modes into a single unified view,
-which can be used to implement any of the previous modes and several others, such as burst mode.
-In simple terms, the app framework requests a frame from the camera subsystem,
-and the camera subsystem returns results to an output stream.
-In addition, metadata that contains information such as
-color spaces and lens shading is generated for each set of results.
-The following sections and diagram give you more detail about each component.</p>
-
- <img src="images/camera2_block.png" />
-
- <p class="img-caption"><strong>Figure 1.</strong> Camera block diagram</p>
- <h3 id="supported-version">Supported version</h3>
- <p>Camera devices that support this version of the HAL must return
- CAMERA_DEVICE_API_VERSION_3_1 in camera_device_t.common.version and in
- camera_info_t.device_version (from camera_module_t.get_camera_info).</p>
-<p>Camera modules that may contain version 3.1 devices must implement at least
- version 2.0 of the camera module interface (as defined by
- camera_module_t.common.module_api_version).</p>
- <p>See camera_common.h for more versioning details. </p>
- <h3 id="version-history">Version history</h3>
-<h4>1.0</h4>
-<p>Initial Android camera HAL (Android 4.0) [camera.h]: </p>
- <ul>
- <li> Converted from C++ CameraHardwareInterface abstraction layer.</li>
- <li> Supports android.hardware.Camera API.</li>
-</ul>
- <h4>2.0</h4>
- <p>Initial release of expanded-capability HAL (Android 4.2) [camera2.h]:</p>
- <ul>
- <li> Sufficient for implementing existing android.hardware.Camera API.</li>
- <li> Allows for ZSL queue in camera service layer</li>
- <li> Not tested for any new features such manual capture control, Bayer RAW
- capture, reprocessing of RAW data.</li>
- </ul>
- <h4>3.0</h4>
- <p>First revision of expanded-capability HAL:</p>
- <ul>
- <li> Major version change since the ABI is completely different. No change to
- the required hardware capabilities or operational model from 2.0.</li>
- <li> Reworked input request and stream queue interfaces: Framework calls into
- HAL with next request and stream buffers already dequeued. Sync framework
- support is included, necessary for efficient implementations.</li>
- <li> Moved triggers into requests, most notifications into results.</li>
- <li> Consolidated all callbacks into framework into one structure, and all
- setup methods into a single initialize() call.</li>
- <li> Made stream configuration into a single call to simplify stream
- management. Bidirectional streams replace STREAM_FROM_STREAM construct.</li>
- <li> Limited mode semantics for older/limited hardware devices.</li>
- </ul>
- <h4>3.1</h4>
- <p>Minor revision of expanded-capability HAL:</p>
- <ul>
- <li> configure_streams passes consumer usage flags to the HAL.</li>
- <li> flush call to drop all in-flight requests/buffers as fast as possible.
- </li>
- </ul>
-<h2 id="requests">Requests</h2>
-<p>
-The app framework issues requests for captured results to the
-camera subsystem. One request corresponds to one set of results. A request encapsulates
-all configuration information about the capturing
-and processing of those results. This includes things such as resolution and pixel format; manual
-sensor, lens, and flash control; 3A operating modes; RAW to YUV processing control; and statistics
-generation. This allows for much more control over the results' output and processing. Multiple
-requests can be in flight at once and submitting requests is non-blocking. And the requests are always
-processed in the order they are received.
-</p>
-
-
-<h2 id="hal">The HAL and camera subsystem</h2>
-<p>
-The camera subsystem includes the implementations for components in the camera pipeline such as the 3A algorithm and processing controls. The camera HAL
-provides interfaces for you to implement your versions of these components. To maintain cross-platform compatibility between
-multiple device manufacturers and ISP vendors, the camera pipeline model is virtual and does not directly correspond to any real ISP.
-However, it is similar enough to real processing pipelines so that you can map it to your hardware efficiently.
-In addition, it is abstract enough to allow for multiple different algorithms and orders of operation
-without compromising either quality, efficiency, or cross-device compatibility.<p>
-
-<p>
- The camera pipeline also supports triggers
-that the app framework can initiate to turn on things such as auto-focus. It also sends notifications back
-to the app framework, notifying apps of events such as an auto-focus lock or errors. </p>
-
- <img id="figure2" src="images/camera2_hal.png" /> <p class="img-caption"><strong>Figure 2.</strong> Camera pipeline
-
-<p>
-Please note, some image processing blocks shown in the diagram above are not
-well-defined in the initial release.
-</p>
-
-<p>
-The camera pipeline makes the following assumptions:
-</p>
-
-<ul>
- <li>RAW Bayer output undergoes no processing inside the ISP.</li>
- <li>Statistics are generated based off the raw sensor data.</li>
- <li>The various processing blocks that convert raw sensor data to YUV are in
-an arbitrary order.</li>
- <li>While multiple scale and crop units are shown, all scaler units share the output region controls (digital zoom).
- However, each unit may have a different output resolution and pixel format.</li>
-</ul>
-
-<h3 id="startup">Startup and expected operation sequence</h3>
-<p>Please see <a
-href="https://android.googlesource.com/platform/hardware/libhardware/+/master/include/hardware/camera3.h">platform/hardware/libhardware/include/hardware/camera3.h</a>
-for definitions of these structures and methods.</p>
-<ol>
- <li>Framework calls camera_module_t->common.open(), which returns a
- hardware_device_t structure.</li>
- <li>Framework inspects the hardware_device_t->version field, and
-instantiates
- the appropriate handler for that version of the camera hardware device. In
- case the version is CAMERA_DEVICE_API_VERSION_3_0, the device is cast to
- a camera3_device_t.</li>
- <li>Framework calls camera3_device_t->ops->initialize() with the
-framework
- callback function pointers. This will only be called this one time after
- open(), before any other functions in the ops structure are called.</li>
- <li>The framework calls camera3_device_t->ops->configure_streams() with
-a list
- of input/output streams to the HAL device.</li>
- <li>The framework allocates gralloc buffers and calls
- camera3_device_t->ops->register_stream_buffers() for at least one of
-the
- output streams listed in configure_streams. The same stream is registered
- only once.</li>
- <li>The framework requests default settings for some number of use cases with
- calls to camera3_device_t->ops->construct_default_request_settings().
-This
- may occur any time after step 3.</li>
- <li>The framework constructs and sends the first capture request to the HAL
- with settings based on one of the sets of default settings, and with at
- least one output stream that has been registered earlier by the
- framework. This is sent to the HAL with
- camera3_device_t->ops->process_capture_request(). The HAL must block
-the
- return of this call until it is ready for the next request to be sent.</li>
- <li>The framework continues to submit requests, and possibly call
- register_stream_buffers() for not-yet-registered streams, and call
- construct_default_request_settings to get default settings buffers for
- other use cases.</li>
- <li>When the capture of a request begins (sensor starts exposing for the
- capture), the HAL calls camera3_callback_ops_t->notify() with the SHUTTER
- event, including the frame number and the timestamp for start of exposure.
- This notify call must be made before the first call to
- process_capture_result() for that frame number.</li>
- <li>After some pipeline delay, the HAL begins to return completed captures to
- the framework with camera3_callback_ops_t->process_capture_result().
-These
- are returned in the same order as the requests were submitted. Multiple
- requests can be in flight at once, depending on the pipeline depth of the
- camera HAL device.</li>
- <li>After some time, the framework may stop submitting new requests, wait for
- the existing captures to complete (all buffers filled, all results
- returned), and then call configure_streams() again. This resets the camera
- hardware and pipeline for a new set of input/output streams. Some streams
- may be reused from the previous configuration; if these streams' buffers
- had already been registered with the HAL, they will not be registered
- again. The framework then continues from step 7, if at least one
- registered output stream remains. (Otherwise, step 5 is required
-first.)</li>
- <li>Alternatively, the framework may call
-camera3_device_t->common->close()
- to end the camera session. This may be called at any time when no other
- calls from the framework are active, although the call may block until all
- in-flight captures have completed (all results returned, all buffers
- filled). After the close call returns, no more calls to the
- camera3_callback_ops_t functions are allowed from the HAL. Once the
- close() call is underway, the framework may not call any other HAL device
- functions.</li>
- <li>In case of an error or other asynchronous event, the HAL must call
- camera3_callback_ops_t->notify() with the appropriate error/event
- message. After returning from a fatal device-wide error notification, the
- HAL should act as if close() had been called on it. However, the HAL must
- either cancel or complete all outstanding captures before calling
- notify(), so that once notify() is called with a fatal error, the
- framework will not receive further callbacks from the device. Methods
- besides close() should return -ENODEV or NULL after the notify() method
- returns from a fatal error message.
- </li>
-</ol>
-<h3>Operational modes</h3>
-<p>The camera 3 HAL device can implement one of two possible operational modes:
- limited and full. Full support is expected from new higher-end
- devices. Limited mode has hardware requirements roughly in line with those
- for a camera HAL device v1 implementation, and is expected from older or
- inexpensive devices. Full is a strict superset of limited, and they share the
- same essential operational flow, as documented above.</p>
-<p>The HAL must indicate its level of support with the
- android.info.supportedHardwareLevel static metadata entry, with 0 indicating
- limited mode, and 1 indicating full mode support.</p>
-<p>Roughly speaking, limited-mode devices do not allow for application control
- of capture settings (3A control only), high-rate capture of high-resolution
- images, raw sensor readout, or support for YUV output streams above maximum
- recording resolution (JPEG only for large images).</p>
-<p>Here are the details of limited-mode behavior:</p>
-<ul>
- <li>Limited-mode devices do not need to implement accurate synchronization
- between capture request settings and the actual image data
- captured. Instead, changes to settings may take effect some time in the
- future, and possibly not for the same output frame for each settings
- entry. Rapid changes in settings may result in some settings never being
- used for a capture. However, captures that include high-resolution output
- buffers ( > 1080p ) have to use the settings as specified (but see below
- for processing rate).<br />
- <br />
- </li>
- <li>(TODO: Is this reference properly located? It was after the settings list below.) Captures in limited mode that include high-resolution (> 1080p) output
- buffers may block in process_capture_request() until all the output buffers
- have been filled. A full-mode HAL device must process sequences of
- high-resolution requests at the rate indicated in the static metadata for
- that pixel format. The HAL must still call process_capture_result() to
- provide the output; the framework must simply be prepared for
- process_capture_request() to block until after process_capture_result() for
- that request completes for high-resolution captures for limited-mode
- devices.<br />
- <br />
- </li>
- <li>Limited-mode devices do not need to support most of the
- settings/result/static info metadata. Full-mode devices must support all
- metadata fields listed in TODO. Specifically, only the following settings
- are expected to be consumed or produced by a limited-mode HAL device:
- <blockquote>
- <p> android.control.aeAntibandingMode (controls)<br />
-android.control.aeExposureCompensation (controls)<br />
-android.control.aeLock (controls)<br />
-android.control.aeMode (controls)<br />
- [OFF means ON_FLASH_TORCH - TODO]<br />
-android.control.aeRegions (controls)<br />
-android.control.aeTargetFpsRange (controls)<br />
-android.control.afMode (controls)<br />
- [OFF means infinity focus]<br />
-android.control.afRegions (controls)<br />
-android.control.awbLock (controls)<br />
-android.control.awbMode (controls)<br />
- [OFF not supported]<br />
-android.control.awbRegions (controls)<br />
-android.control.captureIntent (controls)<br />
-android.control.effectMode (controls)<br />
-android.control.mode (controls)<br />
- [OFF not supported]<br />
-android.control.sceneMode (controls)<br />
-android.control.videoStabilizationMode (controls)<br />
-android.control.aeAvailableAntibandingModes (static)<br />
-android.control.aeAvailableModes (static)<br />
-android.control.aeAvailableTargetFpsRanges (static)<br />
-android.control.aeCompensationRange (static)<br />
-android.control.aeCompensationStep (static)<br />
-android.control.afAvailableModes (static)<br />
-android.control.availableEffects (static)<br />
-android.control.availableSceneModes (static)<br />
-android.control.availableVideoStabilizationModes (static)<br />
-android.control.awbAvailableModes (static)<br />
-android.control.maxRegions (static)<br />
-android.control.sceneModeOverrides (static)<br />
-android.control.aeRegions (dynamic)<br />
-android.control.aeState (dynamic)<br />
-android.control.afMode (dynamic)<br />
-android.control.afRegions (dynamic)<br />
-android.control.afState (dynamic)<br />
-android.control.awbMode (dynamic)<br />
-android.control.awbRegions (dynamic)<br />
-android.control.awbState (dynamic)<br />
-android.control.mode (dynamic)</p>
- <p> android.flash.info.available (static)</p>
- <p> android.info.supportedHardwareLevel (static)</p>
- <p> android.jpeg.gpsCoordinates (controls)<br />
- android.jpeg.gpsProcessingMethod (controls)<br />
- android.jpeg.gpsTimestamp (controls)<br />
- android.jpeg.orientation (controls)<br />
- android.jpeg.quality (controls)<br />
- android.jpeg.thumbnailQuality (controls)<br />
- android.jpeg.thumbnailSize (controls)<br />
- android.jpeg.availableThumbnailSizes (static)<br />
- android.jpeg.maxSize (static)<br />
- android.jpeg.gpsCoordinates (dynamic)<br />
- android.jpeg.gpsProcessingMethod (dynamic)<br />
- android.jpeg.gpsTimestamp (dynamic)<br />
- android.jpeg.orientation (dynamic)<br />
- android.jpeg.quality (dynamic)<br />
- android.jpeg.size (dynamic)<br />
- android.jpeg.thumbnailQuality (dynamic)<br />
- android.jpeg.thumbnailSize (dynamic)</p>
- <p> android.lens.info.minimumFocusDistance (static)</p>
- <p> android.request.id (controls)<br />
- android.request.id (dynamic)</p>
- <p> android.scaler.cropRegion (controls)<br />
- [ignores (x,y), assumes center-zoom]<br />
- android.scaler.availableFormats (static)<br />
- [RAW not supported]<br />
- android.scaler.availableJpegMinDurations (static)<br />
- android.scaler.availableJpegSizes (static)<br />
- android.scaler.availableMaxDigitalZoom (static)<br />
- android.scaler.availableProcessedMinDurations (static)<br />
- android.scaler.availableProcessedSizes (static)<br />
- [full resolution not supported]<br />
- android.scaler.maxDigitalZoom (static)<br />
- android.scaler.cropRegion (dynamic)</p>
- <p> android.sensor.orientation (static)<br />
- android.sensor.timestamp (dynamic)</p>
- <p> android.statistics.faceDetectMode (controls)<br />
- android.statistics.info.availableFaceDetectModes (static)<br />
- android.statistics.faceDetectMode (dynamic)<br />
- android.statistics.faceIds (dynamic)<br />
- android.statistics.faceLandmarks (dynamic)<br />
- android.statistics.faceRectangles (dynamic)<br />
- android.statistics.faceScores (dynamic)</p>
- </blockquote>
- </li>
-</ul>
-<h3 id="interaction">Interaction between the application capture request, 3A control, and the
-processing pipeline</h3>
-
-<p>
-Depending on the settings in the 3A control block, the camera pipeline ignores some of the parameters in the application’s capture request
-and uses the values provided by the 3A control routines instead. For example, when auto-exposure is active, the exposure time,
-frame duration, and sensitivity parameters of the sensor are controlled by the platform 3A algorithm,
-and any app-specified values are ignored. The values chosen for the frame by the 3A routines must be
-reported in the output metadata. The following table describes the different modes of the 3A control block
-and the properties that are controlled by these modes. See the
-platform/system/media/camera/docs/docs.html file for definitions of these
-properties.
-</p>
-
-
-<table>
- <tr>
- <th>Parameter</th>
- <th>State</th>
- <th>Properties controlled</th>
- </tr>
-
- <tr>
- <td rowspan="5">android.control.aeMode</td>
- <td>OFF</td>
- <td>None</td>
- </tr>
- <tr>
- <td>ON</td>
- <td>
- <ul>
- <li>android.sensor.exposureTime</li>
- <li>android.sensor.frameDuration</li>
- <li>android.sensor.sensitivity</li>
- <li>android.lens.aperture (if supported)</li>
- <li>android.lens.filterDensity (if supported)</li>
- </ul>
- </tr>
- <tr>
- <td>ON_AUTO_FLASH</td>
- <td>Everything is ON, plus android.flash.firingPower, android.flash.firingTime, and android.flash.mode</td>
- </tr>
-
- <tr>
- <td>ON_ALWAYS_FLASH</td>
- <td>Same as ON_AUTO_FLASH</td>
- </tr>
-
- <tr>
- <td>ON_AUTO_FLASH_RED_EYE</td>
- <td>Same as ON_AUTO_FLASH</td>
- </tr>
-
- <tr>
- <td rowspan="2">android.control.awbMode</td>
- <td>OFF</td>
- <td>None</td>
- </tr>
-
- <tr>
- <td>WHITE_BALANCE_*</td>
- <td>android.colorCorrection.transform. Platform-specific adjustments if android.colorCorrection.mode is FAST or HIGH_QUALITY.</td>
- </tr>
-
- <tr>
- <td rowspan="2">android.control.afMode</td>
- <td>OFF</td>
- <td>None</td>
- </tr>
-
- <tr>
- <td>FOCUS_MODE_*</td>
- <td>android.lens.focusDistance</td>
- </tr>
-
- <tr>
- <td rowspan="2">android.control.videoStabilization</td>
- <td>OFF</td>
- <td>None</td>
- </tr>
-
- <tr>
- <td>ON</td>
- <td>Can adjust android.scaler.cropRegion to implement video stabilization</td>
- </tr>
-
- <tr>
- <td rowspan="3">android.control.mode</td>
- <td>OFF</td>
- <td>AE, AWB, and AF are disabled</td>
- </tr>
-
- <tr>
- <td>AUTO</td>
- <td>Individual AE, AWB, and AF settings are used</td>
- </tr>
-
- <tr>
- <td>SCENE_MODE_*</td>
- <td>Can override all parameters listed above. Individual 3A controls are disabled.</td>
- </tr>
-
-</table>
-
-<p>The controls exposed for the 3A algorithm mostly map 1:1 to the old API’s parameters
- (such as exposure compensation, scene mode, or white balance mode).
-</p>
-
-
-<p>
-The controls in the Image Processing block in <a href="#figure2">Figure 2</a> all operate on a similar principle, and generally each block has three modes:
-</p>
-
-<ul>
- <li>
- OFF: This processing block is disabled. The demosaic, color correction, and tone curve adjustment blocks cannot be disabled.
- </li>
- <li>
- FAST: In this mode, the processing block may not slow down the output frame rate compared to OFF mode, but should otherwise produce the best-quality output it can given that restriction. Typically, this would be used for preview or video recording modes, or burst capture for still images. On some devices, this may be equivalent to OFF mode (no processing can be done without slowing down the frame rate), and on some devices, this may be equivalent to HIGH_QUALITY mode (best quality still does not slow down frame rate).
- </li>
- <li>
- HIGH_QUALITY: In this mode, the processing block should produce the best quality result possible, slowing down the output frame rate as needed. Typically, this would be used for high-quality still capture. Some blocks include a manual control which can be optionally selected instead of FAST or HIGH_QUALITY. For example, the color correction block supports a color transform matrix, while the tone curve adjustment supports an arbitrary global tone mapping curve.
- </li>
-</ul>
-
-<p>See the <a href="">Android Camera Processing Pipeline Properties</a> spreadsheet for more information on all available properties.</p>
-
-<h2 id="metadata">Metadata support</h2>
-
-<p>To support the saving of DNG files by the Android framework, substantial metadata is required about the sensor’s characteristics. This includes information such as color spaces and lens shading functions.</p>
-<p>
-Most of this information is a static property of the camera subsystem, and can therefore be queried before configuring any output pipelines or submitting any requests. The new camera APIs greatly expand the information provided by the <code>getCameraInfo()</code> method to provide this information to the application.
-</p>
-<p>
-In addition, manual control of the camera subsystem requires feedback from the
-assorted devices about their current state, and the actual parameters used in
-capturing a given frame. If an application needs to implement a custom 3A
-routine (for example, to properly meter for an HDR burst), it needs to know the settings used to capture the latest set of results it has received in order to update the settings for the next request. Therefore, the new camera API adds a substantial amount of dynamic metadata to each captured frame. This includes the requested and actual parameters used for the capture, as well as additional per-frame metadata such as timestamps and statistics generator output.
-</p>
-
-<h2 id="3amodes">3A modes and state machines</h2>
-<p>While the actual 3A algorithms are up to the HAL implementation, a high-level
- state machine description is defined by the HAL interface to allow the HAL
- device and the framework to communicate about the current state of 3A and
-trigger 3A events.</p>
-<p>When the device is opened, all the individual 3A states must be
- STATE_INACTIVE. Stream configuration does not reset 3A. For example, locked
- focus must be maintained across the configure() call.</p>
-<p>Triggering a 3A action involves simply setting the relevant trigger entry in
- the settings for the next request to indicate start of trigger. For example,
- the trigger for starting an autofocus scan is setting the entry
- ANDROID_CONTROL_AF_TRIGGER to ANDROID_CONTROL_AF_TRIGGER_START for one
- request; and cancelling an autofocus scan is triggered by setting
- ANDROID_CONTROL_AF_TRIGGER to ANDROID_CONTRL_AF_TRIGGER_CANCEL. Otherwise,
- the entry will not exist or be set to ANDROID_CONTROL_AF_TRIGGER_IDLE. Each
- request with a trigger entry set to a non-IDLE value will be treated as an
- independent triggering event.</p>
-<p>At the top level, 3A is controlled by the ANDROID_CONTROL_MODE setting. It
- selects between no 3A (ANDROID_CONTROL_MODE_OFF), normal AUTO mode
- (ANDROID_CONTROL_MODE_AUTO), and using the scene mode setting
- (ANDROID_CONTROL_USE_SCENE_MODE):</p>
-<ul>
- <li>In OFF mode, each of the individual Auto-focus(AF), auto-exposure (AE),
-and auto-whitebalance (AWB) modes are effectively OFF,
- and none of the capture controls may be overridden by the 3A routines.</li>
- <li>In AUTO mode, AF, AE, and AWB modes all run
- their own independent algorithms, and have their own mode, state, and
- trigger metadata entries, as listed in the next section.</li>
- <li>In USE_SCENE_MODE, the value of the ANDROID_CONTROL_SCENE_MODE entry must
- be used to determine the behavior of 3A routines. In SCENE_MODEs other than
- FACE_PRIORITY, the HAL must override the values of
- ANDROID_CONTROL_AE/AWB/AF_MODE to be the mode it prefers for the selected
- SCENE_MODE. For example, the HAL may prefer SCENE_MODE_NIGHT to use
- CONTINUOUS_FOCUS AF mode. Any user selection of AE/AWB/AF_MODE when scene
- must be ignored for these scene modes.</li>
- <li>For SCENE_MODE_FACE_PRIORITY, the AE/AWB/AF_MODE controls work as in
- ANDROID_CONTROL_MODE_AUTO, but the 3A routines must bias toward metering
- and focusing on any detected faces in the scene.
- </li>
-</ul>
-
-<h3 id="autofocus">Auto-focus settings and result entries</h3>
-<p>Main metadata entries:</p>
-<p>ANDROID_CONTROL_AF_MODE: Control for selecting the current autofocus
-mode. Set by the framework in the request settings.</p>
-<p>AF_MODE_OFF: AF is disabled; the framework/app directly controls lens
-position.</p>
-<p>AF_MODE_AUTO: Single-sweep autofocus. No lens movement unless AF is
-triggered.</p>
-<p>AF_MODE_MACRO: Single-sweep up-close autofocus. No lens movement unless
-AF is triggered.</p>
-<p>AF_MODE_CONTINUOUS_VIDEO: Smooth continuous focusing, for recording
- video. Triggering immediately locks focus in current
-position. Canceling resumes cotinuous focusing.</p>
-<p>AF_MODE_CONTINUOUS_PICTURE: Fast continuous focusing, for
- zero-shutter-lag still capture. Triggering locks focus once currently
-active sweep concludes. Canceling resumes continuous focusing.</p>
-<p>AF_MODE_EDOF: Advanced extended depth of field focusing. There is no
- autofocus scan, so triggering one or canceling one has no effect.
-Images are focused automatically by the HAL.</p>
-<p>ANDROID_CONTROL_AF_STATE: Dynamic metadata describing the current AF
-algorithm state, reported by the HAL in the result metadata.</p>
-<p>AF_STATE_INACTIVE: No focusing has been done, or algorithm was
- reset. Lens is not moving. Always the state for MODE_OFF or MODE_EDOF.
-When the device is opened, it must start in this state.</p>
-<p>AF_STATE_PASSIVE_SCAN: A continuous focus algorithm is currently scanning
-for good focus. The lens is moving.</p>
-<p>AF_STATE_PASSIVE_FOCUSED: A continuous focus algorithm believes it is
- well focused. The lens is not moving. The HAL may spontaneously leave
-this state.</p>
-<p>AF_STATE_ACTIVE_SCAN: A scan triggered by the user is underway.</p>
-<p>AF_STATE_FOCUSED_LOCKED: The AF algorithm believes it is focused. The
-lens is not moving.</p>
-<p>AF_STATE_NOT_FOCUSED_LOCKED: The AF algorithm has been unable to
-focus. The lens is not moving.</p>
-<p>ANDROID_CONTROL_AF_TRIGGER: Control for starting an autofocus scan, the
- meaning of which depends on mode and state. Set by the framework in
-the request settings.</p>
-<p>AF_TRIGGER_IDLE: No current trigger.</p>
-<p>AF_TRIGGER_START: Trigger start of AF scan. Effect depends on mode and
-state.</p>
-<p>AF_TRIGGER_CANCEL: Cancel current AF scan if any, and reset algorithm to
-default.</p>
-<p>Additional metadata entries:</p>
-<p>ANDROID_CONTROL_AF_REGIONS: Control for selecting the regions of the field of
-view (FOV)
- that should be used to determine good focus. This applies to all AF
- modes that scan for focus. Set by the framework in the request
-settings.</p>
-
-<h3 id="autoexpose">Auto-exposure settings and result entries</h3>
-<p>Main metadata entries:</p>
-<p>ANDROID_CONTROL_AE_MODE: Control for selecting the current auto-exposure
-mode. Set by the framework in the request settings.</p>
-<p>
- AE_MODE_OFF: Autoexposure is disabled; the user controls exposure, gain,
- frame duration, and flash.
-</p>
-<p>AE_MODE_ON: Standard autoexposure, with flash control disabled. User may
- set flash to fire or to torch mode.
-</p>
-<p>AE_MODE_ON_AUTO_FLASH: Standard autoexposure, with flash on at HAL's
- discretion for precapture and still capture. User control of flash
- disabled.
-</p>
-<p>AE_MODE_ON_ALWAYS_FLASH: Standard autoexposure, with flash always fired
- for capture, and at HAL's discretion for precapture. User control of
- flash disabled.
-</p>
-<p>AE_MODE_ON_AUTO_FLASH_REDEYE: Standard autoexposure, with flash on at
- HAL's discretion for precapture and still capture. Use a flash burst
- at end of precapture sequence to reduce redeye in the final
- picture. User control of flash disabled.
-</p>
-<p>ANDROID_CONTROL_AE_STATE: Dynamic metadata describing the current AE
- algorithm state, reported by the HAL in the result metadata.
-</p>
-<p>AE_STATE_INACTIVE: Initial AE state after mode switch. When the device is
- opened, it must start in this state.
-</p>
-<p>AE_STATE_SEARCHING: AE is not converged to a good value and is adjusting
- exposure parameters.
-</p>
-<p>AE_STATE_CONVERGED: AE has found good exposure values for the current
- scene, and the exposure parameters are not changing. HAL may
- spontaneously leave this state to search for a better solution.
-</p>
-<p>AE_STATE_LOCKED: AE has been locked with the AE_LOCK control. Exposure
- values are not changing.
-</p>
-<p>AE_STATE_FLASH_REQUIRED: The HAL has converged exposure but believes
- flash is required for a sufficiently bright picture. Used for
- determining if a zero-shutter-lag frame can be used.
-</p>
-<p>AE_STATE_PRECAPTURE: The HAL is in the middle of a precapture
- sequence. Depending on AE mode, this sequence may involve firing the
- flash for metering or a burst of flash pulses for redeye reduction.
-</p>
-<p>ANDROID_CONTROL_AE_PRECAPTURE_TRIGGER: Control for starting a metering
- sequence before capturing a high-quality image. Set by the framework in
- the request settings.
-</p>
-<p>PRECAPTURE_TRIGGER_IDLE: No current trigger.
-</p>
-<p>PRECAPTURE_TRIGGER_START: Start a precapture sequence. The HAL should
- use the subsequent requests to measure good exposure/white balance
- for an upcoming high-resolution capture.
-</p>
-<p>Additional metadata entries:
-</p>
-<p>ANDROID_CONTROL_AE_LOCK: Control for locking AE controls to their current
- values.</p>
-<p>ANDROID_CONTROL_AE_EXPOSURE_COMPENSATION: Control for adjusting AE
- algorithm target brightness point.</p>
-<p>ANDROID_CONTROL_AE_TARGET_FPS_RANGE: Control for selecting the target frame
- rate range for the AE algorithm. The AE routine cannot change the frame
- rate to be outside these bounds.</p>
-<p>ANDROID_CONTROL_AE_REGIONS: Control for selecting the regions of the FOV
- that should be used to determine good exposure levels. This applies to
- all AE modes besides OFF.
-</p>
-
-<h3 id="autowb">Auto-whitebalance settings and result entries</h3>
-<p>Main metadata entries:</p>
-<p>ANDROID_CONTROL_AWB_MODE: Control for selecting the current white-balance
- mode.
-</p>
-<p>AWB_MODE_OFF: Auto-whitebalance is disabled. User controls color matrix.
-</p>
-<p>AWB_MODE_AUTO: Automatic white balance is enabled; 3A controls color
- transform, possibly using more complex transforms than a simple
- matrix.
-</p>
-<p>AWB_MODE_INCANDESCENT: Fixed white balance settings good for indoor
- incandescent (tungsten) lighting, roughly 2700K.
-</p>
-<p>AWB_MODE_FLUORESCENT: Fixed white balance settings good for fluorescent
- lighting, roughly 5000K.
-</p>
-<p>AWB_MODE_WARM_FLUORESCENT: Fixed white balance settings good for
- fluorescent lighting, roughly 3000K.
-</p>
-<p>AWB_MODE_DAYLIGHT: Fixed white balance settings good for daylight,
- roughly 5500K.
-</p>
-<p>AWB_MODE_CLOUDY_DAYLIGHT: Fixed white balance settings good for clouded
- daylight, roughly 6500K.
-</p>
-<p>AWB_MODE_TWILIGHT: Fixed white balance settings good for
- near-sunset/sunrise, roughly 15000K.
-</p>
-<p>AWB_MODE_SHADE: Fixed white balance settings good for areas indirectly
- lit by the sun, roughly 7500K.
-</p>
-<p>ANDROID_CONTROL_AWB_STATE: Dynamic metadata describing the current AWB
- algorithm state, reported by the HAL in the result metadata.
-</p>
-<p>AWB_STATE_INACTIVE: Initial AWB state after mode switch. When the device
- is opened, it must start in this state.
-</p>
-<p>AWB_STATE_SEARCHING: AWB is not converged to a good value and is
- changing color adjustment parameters.
-</p>
-<p>AWB_STATE_CONVERGED: AWB has found good color adjustment values for the
- current scene, and the parameters are not changing. HAL may
- spontaneously leave this state to search for a better solution.
-</p>
-<p>AWB_STATE_LOCKED: AWB has been locked with the AWB_LOCK control. Color
- adjustment values are not changing.
-</p>
-<p>Additional metadata entries:
-</p>
-<p>ANDROID_CONTROL_AWB_LOCK: Control for locking AWB color adjustments to
- their current values.
-</p>
-<p>ANDROID_CONTROL_AWB_REGIONS: Control for selecting the regions of the FOV
- that should be used to determine good color balance. This applies only
- to auto-whitebalance mode.
-</p>
-
-<h3 id="genstate">General state machine transition notes
-</h3>
-<p>Switching between AF, AE, or AWB modes always resets the algorithm's state
- to INACTIVE. Similarly, switching between CONTROL_MODE or
- CONTROL_SCENE_MODE if CONTROL_MODE == USE_SCENE_MODE resets all the
- algorithm states to INACTIVE.
-</p>
-<p>The tables below are per-mode.
-</p>
-
-<h3 id="af-state">AF state machines</h3>
-<table width="100%" border="1">
- <tr>
- <td colspan="4" scope="col"><h4>mode = AF_MODE_OFF or AF_MODE_EDOF</h4></td>
- </tr>
- <tr>
- <th scope="col">State</th>
- <th scope="col">Transformation cause</th>
- <th scope="col">New state</th>
- <th scope="col">Notes</th>
- </tr>
- <tr>
- <td>INACTIVE</td>
- <td> </td>
- <td> </td>
- <td>AF is disabled</td>
- </tr>
- <tr>
- <td colspan="4"><h4>mode = AF_MODE_AUTO or AF_MODE_MACRO</h4></td>
- </tr>
- <tr>
- <th scope="col">State</th>
- <th scope="col">Transformation cause</th>
- <th scope="col">New state</th>
- <th scope="col">Notes</th>
- </tr>
- <tr>
- <td>INACTIVE</td>
- <td>AF_TRIGGER</td>
- <td>ACTIVE_SCAN</td>
- <td>Start AF sweep<br />
- Lens now moving</td>
- </tr>
- <tr>
- <td>ACTIVE_SCAN</td>
- <td>AF sweep done</td>
- <td>FOCUSED_LOCKED</td>
- <td>If AF successful<br />
- Lens now locked </td>
- </tr>
- <tr>
- <td>ACTIVE_SCAN</td>
- <td>AF sweep done</td>
- <td>NOT_FOCUSED_LOCKED</td>
- <td>If AF unsuccessful<br />
-Lens now locked </td>
- </tr>
- <tr>
- <td>ACTIVE_SCAN</td>
- <td>AF_CANCEL</td>
- <td>INACTIVE</td>
- <td>Cancel/reset AF<br />
- Lens now locked</td>
- </tr>
- <tr>
- <td>FOCUSED_LOCKED</td>
- <td>AF_CANCEL</td>
- <td>INACTIVE</td>
- <td>Cancel/reset AF</td>
- </tr>
- <tr>
- <td>FOCUSED_LOCKED</td>
- <td>AF_TRIGGER</td>
- <td>ACTIVE_SCAN </td>
- <td>Start new sweep<br />
- Lens now moving</td>
- </tr>
- <tr>
- <td>NOT_FOCUSED_LOCKED</td>
- <td>AF_CANCEL</td>
- <td>INACTIVE</td>
- <td>Cancel/reset AF</td>
- </tr>
- <tr>
- <td>NOT_FOCUSED_LOCKED</td>
- <td>AF_TRIGGER</td>
- <td>ACTIVE_SCAN</td>
- <td>Start new sweep<br />
-Lens now moving</td>
- </tr>
- <tr>
- <td>All states</td>
- <td>mode change </td>
- <td>INACTIVE</td>
- <td> </td>
- </tr>
- <tr>
- <td colspan="4"><h4>mode = AF_MODE_CONTINUOUS_VIDEO</h4></td>
- </tr>
- <tr>
- <th scope="col">State</th>
- <th scope="col">Transformation cause</th>
- <th scope="col">New state</th>
- <th scope="col">Notes</th>
- </tr>
- <tr>
- <td>INACTIVE</td>
- <td>HAL initiates new scan</td>
- <td>PASSIVE_SCAN</td>
- <td>Start AF sweep<br />
-Lens now moving</td>
- </tr>
- <tr>
- <td>INACTIVE</td>
- <td>AF_TRIGGER</td>
- <td>NOT_FOCUSED_LOCKED</td>
- <td>AF state query <br />
- Lens now locked</td>
- </tr>
- <tr>
- <td>PASSIVE_SCAN</td>
- <td>HAL completes current scan</td>
- <td>PASSIVE_FOCUSED</td>
- <td>End AF scan<br />
- Lens now locked <br /></td>
- </tr>
- <tr>
- <td>PASSIVE_SCAN</td>
- <td>AF_TRIGGER</td>
- <td>FOCUSED_LOCKED</td>
- <td>Immediate transformation<br />
- if focus is good<br />
-Lens now locked</td>
- </tr>
- <tr>
- <td>PASSIVE_SCAN</td>
- <td>AF_TRIGGER</td>
- <td>NOT_FOCUSED_LOCKED</td>
- <td>Immediate transformation<br />
-if focus is bad<br />
-Lens now locked</td>
- </tr>
- <tr>
- <td>PASSIVE_SCAN</td>
- <td>AF_CANCEL</td>
- <td>INACTIVE</td>
- <td>Reset lens position<br />
- Lens now locked</td>
- </tr>
- <tr>
- <td>PASSIVE_FOCUSED</td>
- <td>HAL initiates new scan</td>
- <td>PASSIVE_SCAN</td>
- <td>Start AF scan<br />
- Lens now moving</td>
- </tr>
- <tr>
- <td>PASSIVE_FOCUSED</td>
- <td>AF_TRIGGER</td>
- <td>FOCUSED_LOCKED</td>
- <td>Immediate transformation<br />
-if focus is good<br />
-Lens now locked</td>
- </tr>
- <tr>
- <td>PASSIVE_FOCUSED</td>
- <td>AF_TRIGGER</td>
- <td>NOT_FOCUSED_LOCKED</td>
- <td>Immediate transformation<br />
-if focus is bad<br />
-Lens now locked</td>
- </tr>
- <tr>
- <td>FOCUSED_LOCKED</td>
- <td>AF_TRIGGER</td>
- <td>FOCUSED_LOCKED</td>
- <td>No effect</td>
- </tr>
- <tr>
- <td>FOCUSED_LOCKED</td>
- <td>AF_CANCEL</td>
- <td>INACTIVE</td>
- <td>Restart AF scan</td>
- </tr>
- <tr>
- <td>NOT_FOCUSED_LOCKED</td>
- <td>AF_TRIGGER</td>
- <td>NOT_FOCUSED_LOCKED</td>
- <td>No effect</td>
- </tr>
- <tr>
- <td>NOT_FOCUSED_LOCKED</td>
- <td>AF_CANCEL</td>
- <td>INACTIVE</td>
- <td>Restart AF scan</td>
- </tr>
- <tr>
- <td colspan="4"><h4>mode = AF_MODE_CONTINUOUS_PICTURE</h4></td>
- </tr>
- <tr>
- <th scope="col">State</th>
- <th scope="col">Transformation cause</th>
- <th scope="col">New state</th>
- <th scope="col">Notes</th>
- </tr>
- <tr>
- <td>INACTIVE</td>
- <td>HAL initiates new scan</td>
- <td>PASSIVE_SCAN</td>
- <td>Start AF scan<br />
- Lens now moving</td>
- </tr>
- <tr>
- <td>INACTIVE</td>
- <td>AF_TRIGGER</td>
- <td>NOT_FOCUSED_LOCKED</td>
- <td>AF state query<br />
- Lens now locked</td>
- </tr>
- <tr>
- <td>PASSIVE_SCAN</td>
- <td>HAL completes current scan</td>
- <td>PASSIVE_FOCUSED</td>
- <td>End AF scan<br />
- Lens now locked</td>
- </tr>
- <tr>
- <td>PASSIVE_SCAN</td>
- <td>AF_TRIGGER</td>
- <td>FOCUSED_LOCKED</td>
- <td>Eventual transformation once focus good<br />
- Lens now locked</td>
- </tr>
- <tr>
- <td>PASSIVE_SCAN</td>
- <td>AF_TRIGGER</td>
- <td>NOT_FOCUSED_LOCKED</td>
- <td>Eventual transformation if cannot focus<br />
-Lens now locked</td>
- </tr>
- <tr>
- <td>PASSIVE_SCAN</td>
- <td>AF_CANCEL</td>
- <td>INACTIVE</td>
- <td>Reset lens position<br />
- Lens now locked</td>
- </tr>
- <tr>
- <td>PASSIVE_FOCUSED</td>
- <td>HAL initiates new scan</td>
- <td>PASSIVE_SCAN</td>
- <td>Start AF scan<br />
-Lens now moving</td>
- </tr>
- <tr>
- <td>PASSIVE_FOCUSED</td>
- <td>AF_TRIGGER</td>
- <td>FOCUSED_LOCKED</td>
- <td>Immediate transformation if focus is good<br />
-Lens now locked</td>
- </tr>
- <tr>
- <td>PASSIVE_FOCUSED</td>
- <td>AF_TRIGGER</td>
- <td>NOT_FOCUSED_LOCKED</td>
- <td>Immediate transformation if focus is bad<br />
-Lens now locked</td>
- </tr>
- <tr>
- <td>FOCUSED_LOCKED</td>
- <td>AF_TRIGGER</td>
- <td>FOCUSED_LOCKED</td>
- <td>No effect</td>
- </tr>
- <tr>
- <td>FOCUSED_LOCKED</td>
- <td>AF_CANCEL</td>
- <td>INACTIVE</td>
- <td>Restart AF scan</td>
- </tr>
- <tr>
- <td>NOT_FOCUSED_LOCKED</td>
- <td>AF_TRIGGER</td>
- <td>NOT_FOCUSED_LOCKED</td>
- <td>No effect</td>
- </tr>
- <tr>
- <td>NOT_FOCUSED_LOCKED</td>
- <td>AF_CANCEL</td>
- <td>INACTIVE</td>
- <td>Restart AF scan</td>
- </tr>
-</table>
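-<p>To make the tables concrete, the AF_MODE_AUTO / AF_MODE_MACRO table above
- can be read as a small transition function. The following C sketch is
- illustrative only and is not part of the HAL interface; the enum names are
- abbreviated, and the sweep inputs stand in for the vendor AF algorithm:</p>
-<pre>
-/* Illustrative sketch of the AF_MODE_AUTO / AF_MODE_MACRO table. A mode
- * change always resets the state to INACTIVE, outside this function. */
-typedef enum {
-    AF_INACTIVE, AF_ACTIVE_SCAN, AF_FOCUSED_LOCKED, AF_NOT_FOCUSED_LOCKED
-} af_state_t;
-typedef enum { TRIGGER_IDLE, TRIGGER_START, TRIGGER_CANCEL } af_trigger_t;
-
-static af_state_t af_auto_transition(af_state_t state, af_trigger_t trigger,
-                                     int sweep_done, int sweep_succeeded) {
-    switch (state) {
-    case AF_INACTIVE:
-        if (trigger == TRIGGER_START) return AF_ACTIVE_SCAN;  /* lens moving */
-        break;
-    case AF_ACTIVE_SCAN:
-        if (trigger == TRIGGER_CANCEL) return AF_INACTIVE;    /* reset AF */
-        if (sweep_done)                                       /* lens locked */
-            return sweep_succeeded ? AF_FOCUSED_LOCKED : AF_NOT_FOCUSED_LOCKED;
-        break;
-    case AF_FOCUSED_LOCKED:
-    case AF_NOT_FOCUSED_LOCKED:
-        if (trigger == TRIGGER_START)  return AF_ACTIVE_SCAN; /* new sweep */
-        if (trigger == TRIGGER_CANCEL) return AF_INACTIVE;    /* reset AF */
-        break;
-    }
-    return state;  /* no transition */
-}
-</pre>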
-<h3 id="aeawb-state">AE and AWB state machines</h3>
-<p>The AE and AWB state machines are mostly identical. AE has additional
-FLASH_REQUIRED and PRECAPTURE states, so rows below that refer to those two
-states should be ignored for the AWB state machine.</p>
-<table width="100%" border="1">
- <tr>
- <td colspan="4" scope="col"><h4>mode = AE_MODE_OFF / AWB mode not
-AUTO</h4></td>
- </tr>
- <tr>
- <th scope="col">State</th>
- <th scope="col">Transformation cause</th>
- <th scope="col">New state</th>
- <th scope="col">Notes</th>
- </tr>
- <tr>
- <td>INACTIVE</td>
- <td> </td>
- <td> </td>
- <td>AE/AWB disabled</td>
- </tr>
- <tr>
- <td colspan="4"><h4>mode = AE_MODE_ON_* / AWB_MODE_AUTO</h4></td>
- </tr>
- <tr>
- <th scope="col">State</th>
- <th scope="col">Transformation cause</th>
- <th scope="col">New state</th>
- <th scope="col">Notes</th>
- </tr>
- <tr>
- <td>INACTIVE</td>
- <td>HAL initiates AE/AWB scan</td>
- <td>SEARCHING</td>
- <td> </td>
- </tr>
- <tr>
- <td>INACTIVE</td>
- <td>AE/AWB_LOCK on</td>
- <td>LOCKED</td>
- <td>Values locked</td>
- </tr>
- <tr>
- <td>SEARCHING</td>
- <td>HAL finishes AE/AWB scan</td>
- <td>CONVERGED</td>
- <td>Good values, not changing</td>
- </tr>
- <tr>
- <td>SEARCHING</td>
- <td>HAL finishes AE scan</td>
- <td>FLASH_REQUIRED</td>
- <td>Converged but too dark without flash</td>
- </tr>
- <tr>
- <td>SEARCHING</td>
- <td>AE/AWB_LOCK on</td>
- <td>LOCKED</td>
- <td>Values locked</td>
- </tr>
- <tr>
- <td>CONVERGED</td>
- <td>HAL initiates AE/AWB scan</td>
- <td>SEARCHING</td>
- <td>Values locked</td>
- </tr>
- <tr>
- <td>CONVERGED</td>
- <td>AE/AWB_LOCK on</td>
- <td>LOCKED</td>
- <td>Values locked</td>
- </tr>
- <tr>
- <td>FLASH_REQUIRED</td>
- <td>HAL initiates AE/AWB scan</td>
- <td>SEARCHING</td>
- <td>Values locked</td>
- </tr>
- <tr>
- <td>FLASH_REQUIRED</td>
- <td>AE/AWB_LOCK on</td>
- <td>LOCKED</td>
- <td>Values locked</td>
- </tr>
- <tr>
- <td>LOCKED</td>
- <td>AE/AWB_LOCK off</td>
- <td>SEARCHING</td>
- <td>Values not good after unlock</td>
- </tr>
- <tr>
- <td>LOCKED</td>
- <td>AE/AWB_LOCK off</td>
- <td>CONVERGED</td>
- <td>Values good after unlock</td>
- </tr>
- <tr>
- <td>LOCKED</td>
- <td>AE_LOCK off</td>
- <td>FLASH_REQUIRED</td>
- <td>Exposure good, but too dark</td>
- </tr>
- <tr>
- <td>All AE states </td>
- <td> PRECAPTURE_START</td>
- <td>PRECAPTURE</td>
- <td>Start precapture sequence</td>
- </tr>
- <tr>
- <td>PRECAPTURE</td>
- <td>Sequence done, AE_LOCK off </td>
- <td>CONVERGED</td>
- <td>Ready for high-quality capture</td>
- </tr>
- <tr>
- <td>PRECAPTURE</td>
- <td>Sequence done, AE_LOCK on </td>
- <td>LOCKED</td>
- <td>Ready for high-quality capture</td>
- </tr>
-</table>
-
-<h2 id="output">Output streams</h2>
-
-<p>Unlike the old camera subsystem, which has 3-4 different ways of producing data from the camera (ANativeWindow-based preview operations, preview callbacks, video callbacks, and takePicture callbacks), the new subsystem operates solely on the ANativeWindow-based pipeline for all resolutions and output formats. Multiple such streams can be configured at once, to send a single frame to many targets such as the GPU, the video encoder, RenderScript, or app-visible buffers (RAW Bayer, processed YUV buffers, or JPEG-encoded buffers).
-</p>
-
-<p>As an optimization, these output streams must be configured ahead of time, and only a limited number may exist at once. This allows for pre-allocation of memory buffers and configuration of the camera hardware, so that when requests are submitted with multiple or varying output pipelines listed, there won’t be delays or latency in fulfilling the request.
-</p>
-
-<p>
-To support backwards compatibility with the current camera API, at least 3
-simultaneous YUV output streams must be supported, plus one JPEG stream. This
-is required for video snapshot support with the application also receiving YUV
-buffers:</p>
-
-<ul>
- <li>One stream to the GPU/SurfaceView (opaque YUV format) for preview</li>
- <li>One stream to the video encoder (opaque YUV format) for recording</li>
- <li>One stream to the application (known YUV format) for preview frame callbacks</li>
- <li>One stream to the application (JPEG) for video snapshots.</li>
-</ul>
-
-<p>In addition, the new camera subsystem must support at least one RAW Bayer
-output stream at the same time. This means that the minimum output stream
-count is five (one RAW, three YUV, and one JPEG).
-</p>
-<h2 id="cropping">Cropping</h2>
-<p>Cropping of the full pixel array (for digital zoom and other use cases where
- a smaller FOV is desirable) is communicated through the
- ANDROID_SCALER_CROP_REGION setting. This is a per-request setting, and can
- change on a per-request basis, which is critical for implementing smooth
- digital zoom.</p>
-<p>The region is defined as a rectangle (x, y, width, height), with (x, y)
- describing the top-left corner of the rectangle. The rectangle is defined on
- the coordinate system of the sensor active pixel array, with (0,0) being the
- top-left pixel of the active pixel array. Therefore, the width and height
- cannot be larger than the dimensions reported in the
- ANDROID_SENSOR_ACTIVE_PIXEL_ARRAY static info field. The minimum allowed
- width and height are reported by the HAL through the
- ANDROID_SCALER_MAX_DIGITAL_ZOOM static info field, which describes the
- maximum supported zoom factor. Therefore, the minimum crop region width and
- height are:</p>
-<pre>
-{width, height} =
- { floor(ANDROID_SENSOR_ACTIVE_PIXEL_ARRAY[0] /
- ANDROID_SCALER_MAX_DIGITAL_ZOOM),
- floor(ANDROID_SENSOR_ACTIVE_PIXEL_ARRAY[1] /
- ANDROID_SCALER_MAX_DIGITAL_ZOOM) }
-</pre>
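-<p>For example, on the hypothetical 3 MP (2000 x 1500) sensor used in the
- examples below, with an assumed ANDROID_SCALER_MAX_DIGITAL_ZOOM of 4, the
- minimum crop region would be floor(2000 / 4) x floor(1500 / 4) = 500x375
- pixels.</p>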
-<p>If the crop region needs to fulfill specific requirements (for example, it
- needs to start on even coordinates, and its width/height needs to be even),
- the HAL must do the necessary rounding and write out the final crop region
- used in the output result metadata. Similarly, if the HAL implements video
- stabilization, it must adjust the result crop region to describe the region
- actually included in the output after video stabilization is applied. In
- general, a camera-using application must be able to determine the field of
- view it is receiving based on the crop region, the dimensions of the image
- sensor, and the lens focal length.</p>
-<p>Since the crop region applies to all streams, which may have different aspect
- ratios than the crop region, the exact sensor region used for each stream may
- be smaller than the crop region. Specifically, each stream should maintain
- square pixels and its aspect ratio by minimally further cropping the defined
- crop region. If the stream's aspect ratio is wider than the crop region, the
- stream should be further cropped vertically, and if the stream's aspect ratio
- is narrower than the crop region, the stream should be further cropped
- horizontally.</p>
-<p>In all cases, the stream crop must be centered within the full crop region,
- and each stream is only either cropped horizontally or vertically relative to
- the full crop region, never both.</p>
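-<p>This rule can be expressed as a short computation. The following C sketch
- is illustrative only; the worked examples below follow this rule, modulo any
- further rounding the HAL applies for hardware alignment requirements:</p>
-<pre>
-/* Illustrative per-stream crop: keep the stream's aspect ratio by
- * minimally cropping the request's crop region, centered, in one
- * dimension only. The HAL may round further for hardware alignment. */
-typedef struct { int x, y, width, height; } rect_t;
-
-static rect_t stream_crop(rect_t crop, int stream_w, int stream_h) {
-    rect_t out = crop;
-    /* Compare aspect ratios by cross-multiplying to avoid floats. */
-    if (stream_w * crop.height > crop.width * stream_h) {
-        /* Stream is wider than the crop region: crop vertically. */
-        out.height = crop.width * stream_h / stream_w;
-        out.y = crop.y + (crop.height - out.height) / 2;
-    } else if (stream_w * crop.height < crop.width * stream_h) {
-        /* Stream is narrower than the crop region: crop horizontally. */
-        out.width = crop.height * stream_w / stream_h;
-        out.x = crop.x + (crop.width - out.width) / 2;
-    }
-    return out;
-}
-</pre>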
-<p>For example, if two streams are defined, a 640x480 stream (4:3 aspect), and a
- 1280x720 stream (16:9 aspect), below demonstrates the expected output regions
- for each stream for a few sample crop regions, on a hypothetical 3 MP (2000 x
- 1500 pixel array) sensor.</p>
-<p>Crop region: (500, 375, 1000, 750) (4:3 aspect ratio)</p>
-<blockquote>
- <p> 640x480 stream crop: (500, 375, 1000, 750) (equal to crop region)<br />
- 1280x720 stream crop: (500, 469, 1000, 562) (marked with =)</p>
-</blockquote>
-<pre>0 1000 2000
- +---------+---------+---------+----------+
- | Active pixel array |
- | |
- | |
- + +-------------------+ + 375
- | | | |
- | O===================O |
- | I 1280x720 stream I |
- + I I + 750
- | I I |
- | O===================O |
- | | | |
- + +-------------------+ + 1125
- | Crop region, 640x480 stream |
- | |
- | |
- +---------+---------+---------+----------+ 1500</pre>
-<p>(TODO: Recreate these in Omnigraffle and replace.)</p>
-<p>Crop region: (500, 375, 1333, 750) (16:9 aspect ratio)</p>
-<blockquote>
- <p> 640x480 stream crop: (666, 375, 1000, 750) (marked with =)<br />
- 1280x720 stream crop: (500, 375, 1333, 750) (equal to crop region)</p>
-</blockquote>
-<pre>0 1000 2000
- +---------+---------+---------+----------+
- | Active pixel array |
- | |
- | |
- + +---O==================O---+ + 375
- | | I 640x480 stream I | |
- | | I I | |
- | | I I | |
- + | I I | + 750
- | | I I | |
- | | I I | |
- | | I I | |
- + +---O==================O---+ + 1125
- | Crop region, 1280x720 stream |
- | |
- | |
- +---------+---------+---------+----------+ 1500
-</pre>
-<p>Crop region: (500, 375, 750, 750) (1:1 aspect ratio)</p>
-<blockquote>
- <p> 640x480 stream crop: (500, 469, 750, 562) (marked with =)<br />
- 1280x720 stream crop: (500, 543, 750, 414) (marked with #)</p>
-</blockquote>
-<pre>0 1000 2000
- +---------+---------+---------+----------+
- | Active pixel array |
- | |
- | |
- + +--------------+ + 375
- | O==============O |
- | ################ |
- | # # |
- + # # + 750
- | # # |
- | ################ 1280x720 |
- | O==============O 640x480 |
- + +--------------+ + 1125
- | Crop region |
- | |
- | |
- +---------+---------+---------+----------+ 1500
-</pre>
-<p>And a final example, a 1024x1024 square aspect ratio stream instead of the
- 480p stream:</p>
-<p>Crop region: (500, 375, 1000, 750) (4:3 aspect ratio)</p>
-<blockquote>
- <p> 1024x1024 stream crop: (625, 375, 750, 750) (marked with #)<br />
- 1280x720 stream crop: (500, 469, 1000, 562) (marked with =)</p>
-</blockquote>
-<pre>0 1000 2000
- +---------+---------+---------+----------+
- | Active pixel array |
- | |
- | 1024x1024 stream |
- + +--###############--+ + 375
- | | # # | |
- | O===================O |
- | I 1280x720 stream I |
- + I I + 750
- | I I |
- | O===================O |
- | | # # | |
- + +--###############--+ + 1125
- | Crop region |
- | |
- | |
- +---------+---------+---------+----------+ 1500
-</pre>
-<h2 id="reprocessing">Reprocessing</h2>
-
-<p>Additional support for DNGs is provided by reprocessing support for RAW Bayer data.
-This support allows the camera pipeline to process a previously captured RAW buffer and metadata
-(an entire frame that was recorded previously), to produce a new rendered YUV or JPEG output.
-</p>
-<h2 id="errors">Error management</h2>
-<p>Camera HAL device ops functions that have a return value will all return
- -ENODEV / NULL in case of a serious error. This means the device cannot
- continue operation, and must be closed by the framework. Once this error is
- returned by some method, or if notify() is called with ERROR_DEVICE, only
- the close() method can be called successfully. All other methods will return
- -ENODEV / NULL.</p>
-<p>If a device op is called in the wrong sequence, for example if the framework
- calls configure_streams() before initialize(), the device must
- return -ENOSYS from the call, and do nothing.</p>
-<p>Transient errors in image capture must be reported through notify() as follows:</p>
-<ul>
- <li>The failure of an entire capture to occur must be reported by the HAL by
- calling notify() with ERROR_REQUEST. Individual errors for the result
- metadata or the output buffers must not be reported in this case.</li>
- <li>If the metadata for a capture cannot be produced, but some image buffers
- were filled, the HAL must call notify() with ERROR_RESULT.</li>
- <li>If an output image buffer could not be filled, but either the metadata was
- produced or some other buffers were filled, the HAL must call notify() with
- ERROR_BUFFER for each failed buffer.</li>
-</ul>
-<p>In each of these transient failure cases, the HAL must still call
- process_capture_result, with valid output buffer_handle_t. If the result
- metadata could not be produced, it should be NULL. If some buffers could not
- be filled, their sync fences must be set to the error state.</p>
-<p>Invalid input arguments result in -EINVAL from the appropriate methods. In
- that case, the framework must act as if that call had never been made.</p>
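-<p>From the HAL side, reporting one of these transient failures is a single
- notify() call followed by the normal result path. This sketch assumes the
- callback structures from camera3.h; the vendor pipeline handling is
- omitted:</p>
-<pre>
-#include &lt;hardware/camera3.h&gt;
-#include &lt;string.h&gt;
-
-/* Report a failed output buffer for one frame. The buffer itself is
- * still returned through process_capture_result(), with its release
- * sync fence set to the error state. */
-static void report_buffer_error(const camera3_callback_ops_t *ops,
-                                uint32_t frame_number,
-                                camera3_stream_t *failed_stream) {
-    camera3_notify_msg_t msg;
-    memset(&msg, 0, sizeof(msg));
-    msg.type = CAMERA3_MSG_ERROR;
-    msg.message.error.frame_number = frame_number;
-    msg.message.error.error_stream = failed_stream;
-    msg.message.error.error_code = CAMERA3_MSG_ERROR_BUFFER;
-    ops->notify(ops, &msg);
-}
-</pre>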
-<h2 id="stream-mgmt">Stream management</h2>
-<h3 id="configure-streams">configure_streams</h3>
-<p>Reset the HAL camera device processing pipeline and set up new input and
- output streams. This call replaces any existing stream configuration with
- the streams defined in the stream_list. This method will be called at
- least once after initialize() before a request is submitted with
- process_capture_request().</p>
-<p>The stream_list must contain at least one output-capable stream, and may
- not contain more than one input-capable stream.</p>
-<p>The stream_list may contain streams that are also in the currently-active
- set of streams (from the previous call to configure_streams()). These
- streams will already have valid values for usage, max_buffers, and the
- private pointer. If such a stream has already had its buffers registered,
- register_stream_buffers() will not be called again for the stream, and
- buffers from the stream can be immediately included in input requests.</p>
-<p>If the HAL needs to change the stream configuration for an existing
- stream due to the new configuration, it may rewrite the values of usage
- and/or max_buffers during the configure call. The framework will detect
- such a change, and will then reallocate the stream buffers, and call
- register_stream_buffers() again before using buffers from that stream in
- a request.</p>
-<p>If a currently-active stream is not included in stream_list, the HAL may
- safely remove any references to that stream. It will not be reused in a
- later configure() call by the framework, and all the gralloc buffers for
- it will be freed after the configure_streams() call returns.</p>
-<p>The stream_list structure is owned by the framework, and may not be
- accessed once this call completes. The address of an individual
- camera3_stream_t structure will remain valid for access by the HAL until
- the end of the first configure_streams() call which no longer includes
- that camera3_stream_t in the stream_list argument. The HAL may not change
- values in the stream structure outside of the private pointer, except for
- the usage and max_buffers members during the configure_streams() call
- itself.</p>
-<p>If the stream is new, the usage, max_buffers, and private pointer fields
- of the stream structure will all be set to 0. The HAL device must set
- these fields before the configure_streams() call returns. These fields
- are then used by the framework and the platform gralloc module to
- allocate the gralloc buffers for each stream.</p>
-<p>Before such a new stream can have its buffers included in a capture
- request, the framework will call register_stream_buffers() with that
- stream. However, the framework is not required to register buffers for
- _all_ streams before submitting a request. This allows for quick startup
- of (for example) a preview stream, with allocation for other streams
- happening later or concurrently.</p>
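-<p>Put together, the skeleton of a configure_streams() implementation looks
- roughly like the following sketch. Validation against the static metadata
- and the vendor pipeline reset are elided; the usage flag shown is only an
- example, and the private info helper is hypothetical:</p>
-<pre>
-#include &lt;errno.h&gt;
-#include &lt;hardware/camera3.h&gt;
-#include &lt;hardware/gralloc.h&gt;
-
-extern void *allocate_stream_info(camera3_stream_t *s);  /* hypothetical */
-
-static int sample_configure_streams(const struct camera3_device *dev,
-                                    camera3_stream_list_t *stream_list) {
-    int inputs = 0, outputs = 0;
-    for (uint32_t i = 0; i < stream_list->num_streams; i++) {
-        camera3_stream_t *s = stream_list->streams[i];
-        if (s->stream_type == CAMERA3_STREAM_INPUT ||
-            s->stream_type == CAMERA3_STREAM_BIDIRECTIONAL) inputs++;
-        if (s->stream_type == CAMERA3_STREAM_OUTPUT ||
-            s->stream_type == CAMERA3_STREAM_BIDIRECTIONAL) outputs++;
-        if (s->priv == NULL) {
-            /* New stream: fill in usage and max_buffers before returning. */
-            s->usage = GRALLOC_USAGE_HW_CAMERA_WRITE;  /* example flag */
-            s->max_buffers = 4;
-            s->priv = allocate_stream_info(s);
-        }
-    }
-    if (inputs > 1 || outputs < 1) return -EINVAL;
-    /* Reset and reconfigure the sensor and processing pipeline here. */
-    return 0;
-}
-</pre>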
-<h4>Preconditions</h4>
-<p>The framework will only call this method when no captures are being
- processed. That is, all results have been returned to the framework, and
- all in-flight input and output buffers have been returned and their
- release sync fences have been signaled by the HAL. The framework will not
- submit new requests for capture while the configure_streams() call is
- underway.</p>
-<h4>Postconditions</h4>
-<p>The HAL device must configure itself to provide maximum possible output
- frame rate given the sizes and formats of the output streams, as
- documented in the camera device's static metadata.</p>
-<h4>Performance expectations</h4>
-<p>This call is expected to be heavyweight and possibly take several hundred
- milliseconds to complete, since it may require resetting and
- reconfiguring the image sensor and the camera processing pipeline.
- Nevertheless, the HAL device should attempt to minimize the
- reconfiguration delay to minimize the user-visible pauses during
- application operational mode changes (such as switching from still
- capture to video recording).</p>
-<h4>Return values</h4>
-<ul>
- <li>0: On successful stream configuration<br />
- </li>
- <li>-EINVAL: If the requested stream configuration is invalid. Some examples
- of invalid stream configurations include:
- <ul>
- <li>Including more than 1 input-capable stream (INPUT or
- BIDIRECTIONAL)</li>
- <li>Not including any output-capable streams (OUTPUT or
- BIDIRECTIONAL)</li>
- <li>Including streams with unsupported formats, or an unsupported
- size for that format.</li>
- <li>Including too many output streams of a certain format.<br />
- Note that the framework submitting an invalid stream
- configuration is not normal operation, since stream
- configurations are checked before configure. An invalid
- configuration means that a bug exists in the framework code, or
- there is a mismatch between the HAL's static metadata and the
- requirements on streams.</li>
- </ul>
- </li>
- <li>-ENODEV: If there has been a fatal error and the device is no longer
- operational. Only close() can be called successfully by the
- framework after this error is returned.</li>
-</ul>
-<h3 id="register-buffers">register_stream_buffers</h3>
-<p>Register buffers for a given stream with the HAL device. This method is
- called by the framework after a new stream is defined by
- configure_streams, and before buffers from that stream are included in a
- capture request. If the same stream is listed in a subsequent
- configure_streams() call, register_stream_buffers will _not_ be called
- again for that stream.</p>
-<p>The framework does not need to register buffers for all configured
- streams before it submits the first capture request. This allows quick
- startup for preview (or similar use cases) while other streams are still
- being allocated.</p>
-<p>This method is intended to allow the HAL device to map or otherwise
- prepare the buffers for later use. The buffers passed in will already be
- locked for use. At the end of the call, all the buffers must be ready to
- be returned to the stream. The buffer_set argument is only valid for the
- duration of this call.</p>
-<p>If the stream format was set to HAL_PIXEL_FORMAT_IMPLEMENTATION_DEFINED,
- the camera HAL should inspect the passed-in buffers here to determine any
- platform-private pixel format information.</p>
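-<p>A corresponding register_stream_buffers() sketch follows; the mapping step
- is vendor-specific, so map_buffer_for_hw() here is a hypothetical helper:</p>
-<pre>
-#include &lt;errno.h&gt;
-#include &lt;hardware/camera3.h&gt;
-
-extern int map_buffer_for_hw(camera3_stream_t *stream,
-                             buffer_handle_t *handle);  /* hypothetical */
-
-static int sample_register_stream_buffers(
-        const struct camera3_device *dev,
-        const camera3_stream_buffer_set_t *buffer_set) {
-    if (buffer_set == NULL || buffer_set->buffers == NULL) return -EINVAL;
-    for (uint32_t i = 0; i < buffer_set->num_buffers; i++) {
-        /* Pre-map each gralloc buffer; for IMPLEMENTATION_DEFINED
-         * streams, inspect the handle for the platform-private format. */
-        if (map_buffer_for_hw(buffer_set->stream,
-                              buffer_set->buffers[i]) != 0)
-            return -ENOMEM;
-    }
-    return 0;
-}
-</pre>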
-<h4>Return values</h4>
-<ul>
- <li>0: On successful registration of the new stream buffers</li>
- <li>-EINVAL: If the stream_buffer_set does not refer to a valid active
- stream, or if the buffers array is invalid.</li>
- <li>-ENOMEM: If there was a failure in registering the buffers. The framework
- must consider all the stream buffers to be unregistered, and can
- try to register again later.</li>
- <li>-ENODEV: If there is a fatal error, and the device is no longer
- operational. Only close() can be called successfully by the
- framework after this error is returned.</li>
-</ul>
-<h2 id="request-creation">Request creation and submission</h2>
-<h3 id="default-settings">construct_default_request_settings</h3>
-<p>Create capture settings for standard camera use cases. The device must return a settings buffer that is configured to meet the
- requested use case, which must be one of the CAMERA3_TEMPLATE_*
-enums. All request control fields must be included.</p>
-<p>The HAL retains ownership of this structure, but the pointer to the
- structure must be valid until the device is closed. The framework and the
- HAL may not modify the buffer once it is returned by this call. The same
- buffer may be returned for subsequent calls for the same template, or for
- other templates.</p>
-<h4>Return values</h4>
-<ul>
- <li>Valid metadata: On successful creation of a default settings
- buffer.</li>
- <li>NULL: In case of a fatal error. After this is returned, only
- the close() method can be called successfully by the
- framework. </li>
-</ul>
-<h3 id="process-capture">process_capture_request</h3>
-<p>Send a new capture request to the HAL. The HAL should not return from
- this call until it is ready to accept the next request to process. Only
- one call to process_capture_request() will be made at a time by the
- framework, and the calls will all be from the same thread. The next call
- to process_capture_request() will be made as soon as a new request and
- its associated buffers are available. In a normal preview scenario, this
- means the function will be called again by the framework almost
- instantly.</p>
-<p>The actual request processing is asynchronous, with the results of
- capture being returned by the HAL through the process_capture_result()
- call. This call requires the result metadata to be available, but output
- buffers may simply provide sync fences to wait on. Multiple requests are
- expected to be in flight at once, to maintain full output frame rate.</p>
-<p>The framework retains ownership of the request structure. It is only
- guaranteed to be valid during this call. The HAL device must make copies
- of the information it needs to retain for the capture processing. The HAL
- is responsible for waiting on and closing the buffers' fences and
- returning the buffer handles to the framework.</p>
-<p>The HAL must write the file descriptor for the input buffer's release
- sync fence into input_buffer->release_fence, if input_buffer is not
- NULL. If the HAL returns -1 for the input buffer release sync fence, the
- framework is free to immediately reuse the input buffer. Otherwise, the
- framework will wait on the sync fence before refilling and reusing the
- input buffer.</p>
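-<p>In outline, a process_capture_request() implementation accepts the request,
- copies what it needs, and returns immediately; the queue type and helpers in
- this sketch are hypothetical:</p>
-<pre>
-#include &lt;errno.h&gt;
-#include &lt;hardware/camera3.h&gt;
-
-typedef struct pending_capture pending_capture_t;        /* hypothetical */
-extern pending_capture_t *copy_request(const camera3_capture_request_t *r);
-extern void enqueue_to_pipeline(pending_capture_t *p);   /* hypothetical */
-
-static int sample_process_capture_request(const struct camera3_device *dev,
-                                          camera3_capture_request_t *request) {
-    if (request == NULL || request->num_output_buffers == 0) return -EINVAL;
-    /* The request structure is only valid during this call, so deep-copy
-     * everything needed later, including the output buffer list. */
-    pending_capture_t *pending = copy_request(request);
-    /* Results arrive asynchronously via process_capture_result(); the HAL
-     * waits on and closes each buffer's acquire fence before writing. */
-    enqueue_to_pipeline(pending);
-    return 0;
-}
-</pre>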
-<h4>Return values</h4>
-<ul>
- <li>0: On a successful start to processing the capture request</li>
- <li>-EINVAL: If the input is malformed (the settings are NULL when not
- allowed, there are 0 output buffers, etc) and capture processing
- cannot start. Failures during request processing should be
- handled by calling camera3_callback_ops_t.notify(). In case of
- this error, the framework will retain responsibility for the
- stream buffers' fences and the buffer handles; the HAL should
- not close the fences or return these buffers with
- process_capture_result.</li>
- <li>-ENODEV: If the camera device has encountered a serious error. After this
- error is returned, only the close() method can be successfully
- called by the framework.</li>
-</ul>
-<h2 id="misc-methods">Miscellaneous methods</h2>
-<h3 id="get-metadata">get_metadata_vendor_tag_ops</h3>
-<p>Get methods to query for vendor extension metadata tag information. The
- HAL should fill in all the vendor tag operation methods, or leave ops
- unchanged if no vendor tags are defined.
-
- The definition of vendor_tag_query_ops_t can be found in
- system/media/camera/include/system/camera_metadata.h.</p>
-<h3 id="dump">dump</h3>
-<p>Print out debugging state for the camera device. This will be called by
- the framework when the camera service is asked for a debug dump, which
- happens when using the dumpsys tool, or when capturing a bugreport.
-
- The passed-in file descriptor can be used to write debugging text using
- dprintf() or write(). The text should be in ASCII encoding only.</p>
-<h3 id="flush">flush</h3>
-<p>Flush all currently in-process captures and all buffers in the pipeline
- on the given device. The framework will use this to dump all state as
- quickly as possible in order to prepare for a configure_streams() call.</p>
-<p>No buffers are required to be successfully returned, so every buffer
- held at the time of flush() (whether successfully filled or not) may be
- returned with CAMERA3_BUFFER_STATUS_ERROR. Note the HAL is still allowed
- to return valid (STATUS_OK) buffers during this call, provided they are
- successfully filled.</p>
-<p>All requests currently in the HAL are expected to be returned as soon as
- possible. Not-in-process requests should return errors immediately. Any
- interruptible hardware blocks should be stopped, and any uninterruptible
- blocks should be waited on.</p>
-<p>flush() should only return when there are no more outstanding buffers or
- requests left in the HAL. The framework may call configure_streams (as
- the HAL state is now quiesced) or may issue new requests.</p>
-<p>A flush() call should only take 100ms or less. The maximum time it can
- take is 1 second.</p>
-<h4>Version information</h4>
-<p>This is available only if device version >= CAMERA_DEVICE_API_VERSION_3_1.</p>
-<h4>Return values</h4>
-<ul>
- <li>0: On a successful flush of the camera HAL.</li>
- <li>-EINVAL: If the input is malformed (the device is not valid).</li>
- <li>-ENODEV: If the camera device has encountered a serious error. After this
- error is returned, only the close() method can be successfully
- called by the framework.</li>
-</ul>
diff --git a/src/devices/devices_toc.cs b/src/devices/devices_toc.cs
index 963a6a1..a250d52 100644
--- a/src/devices/devices_toc.cs
+++ b/src/devices/devices_toc.cs
@@ -50,7 +50,23 @@
</ul>
</li>
<li><a href="<?cs var:toroot ?>devices/bluetooth.html">Bluetooth</a></li>
- <li><a href="<?cs var:toroot ?>devices/camera.html">Camera</a></li>
+ <li class="nav-section">
+ <div class="nav-section-header">
+ <a href="<?cs var:toroot ?>devices/camera/camera.html">
+ <span class="en">Camera</span>
+ </a>
+ </div>
+ <ul>
+ <li><a href="<?cs var:toroot ?>devices/camera/camera3.html">Camera HAL3</a></li>
+ <li><a href="<?cs var:toroot ?>devices/camera/camera3_requests_hal.html">HAL Subsystem</a></li>
+ <li><a href="<?cs var:toroot ?>devices/camera/camera3_metadata.html">Metadata and Controls</a></li>
+ <li><a href="<?cs var:toroot ?>devices/camera/camera3_3Amodes.html">3A Modes and State</a></li>
+ <li><a href="<?cs var:toroot ?>devices/camera/camera3_crop_reprocess.html">Output and Cropping</a></li>
+ <li><a href="<?cs var:toroot ?>devices/camera/camera3_error_stream.html">Errors and Streams</a></li>
+ <li><a href="<?cs var:toroot ?>devices/camera/camera3_requests_methods.html">Request Creation</a></li>
+ </ul>
+ </li>
+
<li><a href="<?cs var:toroot ?>devices/drm.html">DRM</a></li>
<li><a href="<?cs var:toroot ?>devices/tech/encryption/index.html">Encryption</a></li>
<li class="nav-section">
@@ -112,6 +128,11 @@
</a>
</li>
<li>
+ <a href="<?cs var:toroot ?>devices/tech/security/dm-verity.html">
+ <span class="en">dm-verity on boot</span>
+ </a>
+ </li>
+ <li>
<a href="<?cs var:toroot ?>devices/tech/security/se-linux.html">
<span class="en">Security-Enhanced Linux</span>
</a>
@@ -166,6 +187,7 @@
<li><a href="<?cs var:toroot ?>devices/tech/dalvik/dalvik-bytecode.html">Bytecode Format</a></li>
<li><a href="<?cs var:toroot ?>devices/tech/dalvik/dex-format.html">.Dex Format</a></li>
<li><a href="<?cs var:toroot ?>devices/tech/dalvik/instruction-formats.html">Instruction Formats</a></li>
+ <li><a href="<?cs var:toroot ?>devices/tech/dalvik/art.html">Introducing ART</a></li>
</ul>
</li>
diff --git a/src/devices/images/camera_hal.png b/src/devices/images/camera_hal.png
deleted file mode 100644
index 48b3b69..0000000
--- a/src/devices/images/camera_hal.png
+++ /dev/null
Binary files differ
diff --git a/src/devices/low-ram.jd b/src/devices/low-ram.jd
index efa6377..711fadc 100644
--- a/src/devices/low-ram.jd
+++ b/src/devices/low-ram.jd
@@ -227,9 +227,9 @@
<li>By default, the Linux kernel swaps in 8 pages of memory at a time. When
using ZRAM, the incremental cost of reading 1 page at a time is negligible
and may help in case the device is under extreme memory pressure. To read
- only 1 page at a time, add the following to your init.rc:<br/>
+ only 1 page at a time, add the following to your init.rc:<br />
`write /proc/sys/vm/page-cluster 0`</li>
- <li>In your init.rc, after the `mount_all /fstab.X` line, add:<br/>
+ <li>In your init.rc, after the `mount_all /fstab.X` line, add:<br />
`swapon_all /fstab.X`</li>
<li>The memory cgroups are automatically configured at boot time if the
feature is enabled in kernel.</li>
diff --git a/src/devices/tech/dalvik/art.jd b/src/devices/tech/dalvik/art.jd
new file mode 100644
index 0000000..dd3c6f6
--- /dev/null
+++ b/src/devices/tech/dalvik/art.jd
@@ -0,0 +1,52 @@
+page.title=Introducing ART
+@jd:body
+
+<!--
+ Copyright 2013 The Android Open Source Project
+
+ Licensed under the Apache License, Version 2.0 (the "License");
+ you may not use this file except in compliance with the License.
+ You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
+-->
+<p>
+ART is a new Android runtime being introduced experimentally in the 4.4
+release. This is a preview of work in progress in KitKat that can be turned on
+in Settings > Developer options. It is available for the purpose of
+obtaining early developer and partner feedback.</p>
+
+<p><strong>Important</strong>: Dalvik must remain the default runtime or
+you risk breaking your Android implementations and third-party applications.</p>
+
+<p>
+Two runtimes are now available: the existing Dalvik runtime (libdvm.so) and
+ART (libart.so). A device can be built using either or both.
+(You can dual boot from Developer options if both are installed.)
+</p>
+
+<p>
+The <code>dalvikvm</code> command line tool can run with either of them now.
+See runtime_common.mk. That is included from build/target/product/runtime_libdvm.mk or
+build/target/product/runtime_libart.mk or both.</p>
+
+<p>
+A new <code>PRODUCT_RUNTIMES</code> variable controls which runtimes
+are included in a build. Include it within either
+build/target/product/core_minimal.mk or build/target/product/core_base.mk.
+</p>
+
+<p>
+Add this to the device makefile to have both runtimes
+built and installed, with Dalvik as the default:
+<br />
+<code>PRODUCT_RUNTIMES := runtime_libdvm_default</code>
+<br />
+<code>PRODUCT_RUNTIMES += runtime_libart</code>
+</p>
diff --git a/src/devices/tech/dalvik/index.jd b/src/devices/tech/dalvik/index.jd
index ed36231..7bc11bb 100644
--- a/src/devices/tech/dalvik/index.jd
+++ b/src/devices/tech/dalvik/index.jd
@@ -22,4 +22,8 @@
<p>Much of the documentation in this directory is intended to help
with the ongoing development of Dalvik, as opposed to most of the
other documentation on this site, which is geared more towards
-application development.</p>
\ No newline at end of file
+application development.</p>
+
+<p>Please note that Android 4.4 experimentally introduces a new virtual machine,
+ART, which will eventually replace Dalvik. See <a
+href="{@docRoot}devices/tech/dalvik/art.html">Introducing ART</a> for details.</p>
diff --git a/src/devices/tech/security/dm-verity.jd b/src/devices/tech/security/dm-verity.jd
new file mode 100644
index 0000000..79e375f
--- /dev/null
+++ b/src/devices/tech/security/dm-verity.jd
@@ -0,0 +1,318 @@
+page.title=dm-verity on boot
+@jd:body
+
+<!--
+ Copyright 2010 The Android Open Source Project
+
+ Licensed under the Apache License, Version 2.0 (the "License");
+ you may not use this file except in compliance with the License.
+ You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
+-->
+<div id="qv-wrapper">
+ <div id="qv">
+ <h2>In this document</h2>
+ <ol id="auto-toc">
+ </ol>
+ </div>
+</div>
+
+<h2 id="introduction">Introduction</h2>
+
+<p>Android 4.4 supports verified boot through the optional device-mapper-verity
+(dm-verity) kernel feature, which provides transparent integrity checking of
+block devices. dm-verity helps prevent persistent rootkits that can hold onto
+root privileges and compromise devices. This experimental feature helps Android
+users be sure that, when booting, a device is in the same state as when it was
+last used.</p>
+
+<p>Clever malware with root privileges can hide from detection programs and
+otherwise mask itself. The rooting software can do this because it is often
+more privileged than the detectors, enabling the software to "lie" to the
+detection programs.</p>
+
+<p>The dm-verity feature lets you look at a block device, the underlying storage
+layer of the file system, and determine if it matches its expected
+configuration. It does this using a cryptographic hash tree. For every block
+(typically 4k), there is a SHA256 hash.</p>
+
+<p>And since the hash values are stored in a tree of pages, only the top-level
+"root" hash must be trusted to verify the rest of the tree. The ability to
+modify any of the blocks would be equivalent to breaking the cryptographic hash.
+See the following diagram for a depiction of this structure.</p>
+
+<p><img src="images/dm-verity-hash-table.png" alt="dm-verity-hash-table" /><br />
+A public key is included on the boot partition, which must be verified
+externally by the OEM. That key is used to verify the signature for that hash
+and confirm the device's system partition is protected and unchanged.</p>
+
+<h2 id="operation">Operation</h2>
+
+<p>dm-verity protection lives in the kernel. So if rooting software compromises the
+system before the kernel comes up, it will retain that access. To mitigate this
+risk, most manufacturers verify the kernel using a key burned into the device.
+That key is not changeable once the device leaves the factory.</p>
+
+<p>Manufacturers use that key to verify the signature on the first-level
+bootloader, which in turn verifies the signature on subsequent levels, the
+application bootloader and eventually the kernel. Each manufacturer wishing to
+take advantage of verified boot should have a method for verifying the integrity
+of the kernel. Assuming the kernel has been verified, the kernel can look at a
+block device and verify it as it is mounted.</p>
+
+<p>One way of verifying a block device is to directly hash its contents and compare
+them to a stored value. However, attempting to verify an entire block device can
+take an extended period and consume much of a device's power. Devices would take
+long periods to boot and then be significantly drained prior to use.</p>
+
+<p>Instead, dm-verity verifies blocks individually and only when each one is
+accessed. When read into memory, the block is hashed in parallel. The hash is
+then verified up the tree. And since reading the block is such an expensive
+operation, the latency introduced by this block-level verification is
+comparatively nominal.</p>
+
+<p>If verification fails, the device generates an I/O error indicating the block
+cannot be read. It will appear as if the filesystem has been corrupted, as is
+expected.</p>
+
+<p>Applications may choose to proceed without the resulting data, such as when
+those results are not required to the application's primary function. However,
+if the application cannot continue without the data, it will fail.</p>
+
+<h2 id="prerequisites">Prerequisites</h2>
+
+<h3 id="block-otas">Switching to block-oriented OTAs</h3>
+
+<p>To enable dm-verity on your devices, you <strong>must</strong> move from file-based "over the
+air" (OTA) updates to block-oriented OTAs. This is needed because during OTA,
+Android attempts to change the contents of the system partition at the
+filesystem layer.<br />
+And since OTA works on a file-by-file basis, it is not guaranteed to write files
+in a consistent order, have a consistent last modified time or superblock, or
+even place the blocks in the same location on the block device. For this reason,
+<em>file-based OTAs will fail on a dm-verity-enabled device.</em> <strong>The device will
+not boot after OTA.</strong></p>
+
+<p>So you must use block-oriented OTAs. With block-oriented OTAs, you serve the
+device the difference between the two block images rather than the two sets of
+files. Many manufacturers have already moved to block-oriented OTAs to make them
+more reproducible and predictable.</p>
+
+<p>A block-oriented OTA checks a device build against the corresponding build
+server at the block device level, below the filesystem. This can be done in a
+couple of different ways, each with its own benefits and drawbacks:</p>
+
+<ul>
+<li><em>Copy the full system image to the device</em> - This is simple and makes patch
+generation easy. But it also makes the application of those patches quite
+expensive as the resulting images are large.</li>
+<li><em>Employ a binary differ</em> - These tools, such as <code>bsdiff</code>, simplify patch
+application as images are much smaller. But these tools tend to be memory
+intensive and therefore expensive in generating the patches themselves.</li>
+</ul>
+
+<h3 id="config-dm-verity">Configuring dm-verity</h3>
+
+<p>After switching to block-oriented OTAs, incorporate the latest Android kernel or
+use a stock upstream kernel and enable dm-verity support by including the
+relevant configuration option:<br />
+<code>CONFIG_DM_VERITY
+</code></p>
+<p>When using the Android kernel, dm-verity is turned on when the kernel is built.</p>
+
+<h2 id="implementation">Implementation</h2>
+
+<h3 id="summary">Summary</h3>
+
+<ol>
+<li>Generate an ext4 system image.</li>
+<li><a href="#hash-tree">Generate a hash tree</a> for that image.</li>
+<li><a href="#mapping-table">Build a dm-verity table</a> for that hash tree.</li>
+<li><a href="#signing">Sign that dm-verity table</a> to produce a table
+signature.</li>
+<li><a href="#metadata">Bundle the table signature</a> and dm-verity table
+into verity metadata.</li>
+<li>Concatenate the system image, the verity metadata, and the hash tree.</li>
+</ol>
+
+<p>See <a href="http://www.chromium.org/chromium-os/chromiumos-design-docs/verified-boot">The Chromium Projects - Verified
+Boot</a>
+for a detailed description of the hash tree and dm-verity table.</p>
+
+<h3 id="hash-tree">Generating the hash tree</h3>
+
+<p>As described in the <a href="#introduction">Introduction</a>, the hash tree is
+integral to dm-verity. The
+<a href="https://code.google.com/p/cryptsetup/wiki/DMVerity">cryptsetup</a> tool will
+generate a hash tree for you. Alternatively, a compatible one is defined here:</p>
+
+<pre>
+&lt;your block device name&gt; &lt;your block device name&gt; &lt;block size&gt; &lt;block size&gt; &lt;image size in blocks&gt; &lt;image size in blocks + 8&gt; &lt;root hash&gt; &lt;salt&gt;
+</pre>
+
+<p>To form the hash, the system image is split at layer 0 into 4k blocks, each
+assigned a SHA256 hash. Layer 1 is formed by joining only those SHA256 hashes
+into 4k blocks, resulting in a much smaller image. Layer 2 is formed
+identically, with the SHA256 hashes of Layer 1.</p>
+
+<p>This is done until the SHA256 hashes of the previous layer can fit in a single
+block. When you get the SHA256 hash of that block, you have the root hash of the tree.</p>
+
+<p>The size of the hash tree (and corresponding disk space usage) varies with the
+size of the verified partition. In practice, the size of hash trees tends to be
+small, often less than 30 MB.</p>
+
+<p>If you have a block in a layer that isn't completely filled naturally by the
+hashes of the previous layer, you should pad it with zeroes to achieve the
+expected 4k. This allows you to know the hash tree hasn't been removed and is
+instead completed with blank data.</p>
+
+<p>To generate the hash tree, concatenate the layer 2 hashes onto those for layer
+1, the layer 3 hashes onto those of layer 2, and so on. Write all of this
+out to disk. Note that this doesn't include layer 0 or the root hash.</p>
+
+<p>To recap, the general algorithm to construct the hash tree is as follows:</p>
+
+<ol>
+<li>Choose a random salt (hexadecimal encoding).</li>
+<li>Unsparse your system image into 4k blocks.</li>
+<li>For each block, get its (salted) SHA256 hash.</li>
+<li>Concatenate these hashes to form a level.</li>
+<li>Pad the level with 0s to a 4k block boundary.</li>
+<li>Concatenate the level to your hash tree.</li>
+<li>Repeat steps 2-6 using the previous level as the source for the next until
+you have only a single hash.</li>
+</ol>
+
+<p>The result of this is a single hash, which is your root hash. This and your salt
+are used during the construction of your dm-verity mapping hash table.</p>
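+
+<p>For reference, steps 3 through 5 for a single level might look like the
+following C sketch, using OpenSSL's SHA256. The salt is prepended to each
+block here; the exact salting convention must match your verity version:</p>
+
+<pre>
+#include &lt;openssl/sha.h&gt;
+#include &lt;stdlib.h&gt;
+
+#define BLOCK 4096
+
+/* Hash each 4k block of the current level (salted), concatenate the
+ * digests, and zero-pad to a 4k boundary. Repeat with the output as the
+ * next input until a level fits in a single block. */
+static unsigned char *next_level(const unsigned char *in, size_t in_len,
+                                 const unsigned char *salt, size_t salt_len,
+                                 size_t *out_len) {
+    size_t nblocks = in_len / BLOCK;
+    size_t raw = nblocks * SHA256_DIGEST_LENGTH;
+    size_t padded = (raw + BLOCK - 1) / BLOCK * BLOCK;
+    unsigned char *out = calloc(1, padded);  /* calloc gives 0 padding */
+    if (out == NULL) return NULL;
+    for (size_t i = 0; i < nblocks; i++) {
+        SHA256_CTX ctx;
+        SHA256_Init(&ctx);
+        SHA256_Update(&ctx, salt, salt_len);
+        SHA256_Update(&ctx, in + i * BLOCK, BLOCK);
+        SHA256_Final(out + i * SHA256_DIGEST_LENGTH, &ctx);
+    }
+    *out_len = padded;
+    return out;
+}
+</pre>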
+
+<h3 id="mapping-table">Building the dm-verity mapping table</h3>
+
+<p>Build the dm-verity mapping table, which identifies the block device (or target)
+for the kernel and the location of the hash tree (which is the same value). This
+mapping is used for <code>fstab</code> generation and booting. The table also identifies
+the size of the blocks and the hash_start, or the offset in hash size blocks
+(length of layer 0).</p>
+
+<p>See <a href="https://code.google.com/p/cryptsetup/wiki/DMVerity">cryptsetup</a> for a
+detailed description of the verity target mapping table fields.</p>
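+
+<p>For illustration only, a filled-in table in the format above might look
+like this, with a hypothetical device path, the block counts for a 512 MB
+image, and truncated hash and salt values:</p>
+
+<pre>
+/dev/block/platform/msm_sdcc.1/by-name/system /dev/block/platform/msm_sdcc.1/by-name/system 4096 4096 131072 131080 6aa3... a2e6...
+</pre>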
+
+<h3 id="signing">Signing the dm-verity table</h3>
+
+<p>Sign the dm-verity table to produce a table signature. When verifying a
+partition, the table signature is validated first. This is done against a key on
+your boot image in a fixed location. Keys are typically included in the
+manufacturers' build systems for automatic inclusion on devices in a fixed
+location.</p>
+
+<p>To verify the partition with this signature and key combination:</p>
+
+<ol>
+<li>Add an RSA-2048 key in libmincrypt-compatible format to the /boot partition
+at /verity_key. Identify the location of the key used to verify the hash
+tree.</li>
+<li>In the fstab for the relevant entry, add 'verify' to the fs_mgr flags.</li>
+</ol>
+
+<h3 id="metadata">Bundling the table signature into metadata</h3>
+
+<p>Bundle the table signature and dm-verity table into verity metadata. The entire
+block of metadata is versioned so it may be extended, such as to add a second
+kind of signature or change some ordering.</p>
+
+<p>As a sanity check, a magic number is associated with each set of table metadata
+that helps identify the table. Since the length is included in the ext4 system
+image header, this provides a way to search for the metadata without knowing the
+contents of the data itself.</p>
+
+<p>This makes sure you haven't elected to verify an unverified partition. If you
+have, the absence of this magic number will halt the verification process. This
+number resembles:<br />
+0xb001b001</p>
+
+<p>The byte values in hex are:</p>
+
+<ul>
+<li>first byte = b0</li>
+<li>second byte = 01</li>
+<li>third byte = b0</li>
+<li>fourth byte = 01</li>
+</ul>
+
+<p>The following diagram depicts the breakdown of the verity metadata:</p>
+
+<pre>&lt;magic number&gt;|&lt;version&gt;|&lt;signature&gt;|&lt;table length&gt;|&lt;table&gt;|&lt;padding&gt;
+\-------------------------------------------------------------------/
+\----------------------------------------------------------/ |
+ | |
+ | 32K
+ block content
+</pre>
+
+<p>And this table describes those metadata fields.</p>
+
+<table>
+<tr>
+<th>Field</th>
+<th>Purpose</th>
+<th>Size</th>
+<th>Value</th>
+</tr>
+<tr>
+<td>magic number</td>
+<td>used by fs_mgr as a sanity check</td>
+<td>4 bytes</td>
+<td>0xb001b001</td>
+</tr>
+<tr>
+<td>version</td>
+<td>used to version the metadata block</td>
+<td>4 bytes</td>
+<td>currently 0</td>
+</tr>
+<tr>
+<td>signature</td>
+<td>the signature of the table in PKCS1.5 padded form</td>
+<td>256 bytes</td>
+<td></td>
+</tr>
+<tr>
+<td>table length</td>
+<td>the length of the dm-verity table in bytes</td>
+<td>4 bytes</td>
+<td></td>
+</tr>
+<tr>
+<td>table</td>
+<td>the dm-verity table described earlier</td>
+<td>`table length` bytes</td>
+<td></td>
+</tr>
+<tr>
+<td>padding</td>
+<td>this structure is 0-padded to 32k in length</td>
+<td></td>
+<td>0</td>
+</tr>
+</table>
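+
+<p>Read as a C layout, the metadata block in this table corresponds roughly to
+the following struct. This is illustrative only, not a definition from the
+Android sources:</p>
+
+<pre>
+#include &lt;stdint.h&gt;
+
+/* Illustrative layout of the 32K verity metadata block. */
+struct verity_metadata {
+    uint32_t magic;           /* 0xb001b001, sanity check for fs_mgr */
+    uint32_t version;         /* currently 0 */
+    uint8_t  signature[256];  /* PKCS1.5-padded signature of the table */
+    uint32_t table_length;    /* length of the dm-verity table in bytes */
+    uint8_t  table[];         /* the table; the whole structure is then
+                               * 0-padded to 32K in length */
+};
+</pre>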
+
+<p>For additional assistance, contact
+<a href="mailto:gcondra@google.com">gcondra@google.com</a>.</p>
+
+<h2 id="supporting-docs">Supporting documentation</h2>
+
+<p><a href="https://code.google.com/p/cryptsetup/wiki/DMVerity">cryptsetup - dm-verity: device-mapper block integrity checking
+target</a><br />
+<a href="http://www.chromium.org/chromium-os/chromiumos-design-docs/verified-boot">The Chromium Projects - Verified
+Boot</a><br />
+<a
+href="http://git.kernel.org/?p=linux/kernel/git/torvalds/linux-2.6.git;a=blob;f=Documentation/device-mapper/verity.txt">Linux Kernel Documentation:
+verity.txt</a></p>
diff --git a/src/devices/tech/security/images/dm-verity-hash-table.png b/src/devices/tech/security/images/dm-verity-hash-table.png
new file mode 100644
index 0000000..3761dc6
--- /dev/null
+++ b/src/devices/tech/security/images/dm-verity-hash-table.png
Binary files differ
diff --git a/src/devices/tech/security/index.jd b/src/devices/tech/security/index.jd
index 2c0186c..e53895a 100644
--- a/src/devices/tech/security/index.jd
+++ b/src/devices/tech/security/index.jd
@@ -16,6 +16,13 @@
See the License for the specific language governing permissions and
limitations under the License.
-->
+<div id="qv-wrapper">
+ <div id="qv">
+ <h2>In this document</h2>
+ <ol id="auto-toc">
+ </ol>
+ </div>
+</div>
<h2 id="introduction">Introduction</h2>
<p>Android is a modern mobile platform that was designed to be truly open. Android
@@ -273,6 +280,14 @@
applications, files created by one application cannot be read or altered by
another application.</p>
+<h2 id="se-linux">Security-Enhanced Linux</h2>
+
+<p>Android uses Security-Enhanced
+Linux (SELinux) to apply access control policies and establish an environment of
+mandatory access control (MAC). See <a
+href="{@docRoot}devices/tech/security/se-linux.html">Validating
+Security-Enhanced Linux in
+Android</a> for details.</p>
<h2 id="crypto">Cryptography</h2>
diff --git a/src/devices/tech/security/se-linux.jd b/src/devices/tech/security/se-linux.jd
index d23be8c..acf9291 100644
--- a/src/devices/tech/security/se-linux.jd
+++ b/src/devices/tech/security/se-linux.jd
@@ -1,4 +1,4 @@
-page.title=Security-Enhanced Linux
+page.title=Validating Security-Enhanced Linux in Android
@jd:body
<!--
@@ -16,70 +16,121 @@
See the License for the specific language governing permissions and
limitations under the License.
-->
+<div id="qv-wrapper">
+ <div id="qv">
+ <h2>In this document</h2>
+ <ol id="auto-toc">
+ </ol>
+ </div>
+</div>
-<h2 id="introduction">Introduction</h2> <p>In Android 4.3,
-Android begins supporting Security-Enhanced Linux (SELinux), a tool for applying
-access control policies. SELinux enhances Android security, and contributions to
-it have been made by a number of companies and organizations; all Android code
-and contributors are publicly available for review on this same site <a
-href="http://source.android.com/">source.android.com</a>. With SELinux, Android
-can better control access to application data and system logs, reduce the
-effects of malicious software, and protect users from potential flaws in mobile
-code. </p>
+<h2 id="introduction">Introduction</h2>
+<p>
+As part of the Android <a href="{@docRoot}devices/tech/security/index.html">security
+model</a>, Android uses Security-Enhanced Linux (SELinux) to apply access
+control policies. SELinux enhances Android security, and contributions to it
+have been made by a number of companies and organizations; all Android code and
+contributors are publicly available for review on
+<a href="https://android.googlesource.com/">android.googlesource.com</a>. With SELinux,
+Android can
+better control access to application data and system logs, reduce the effects of
+malicious software, and protect users from potential flaws in code on mobile
+devices.
+</p>
+<p>
+Android includes SELinux in enforcing mode and a corresponding security policy
+that works by default across the <a
+href="https://android.googlesource.com/">Android Open Source
+Project</a>. In enforcing mode, illegitimate
+actions are prevented and all potential violations are logged by the kernel to
+<code>dmesg</code>. Android device manufacturers should gather information about errors so
+they may refine their software and SELinux policies before enforcing them.
+</p>
-<p>In this release, Android includes SELinux in permissive mode and a
-corresponding security policy that works by default across the <a
-href="https://android.googlesource.com/">Android Open Source Project</a>. In
-permissive mode, no actions are prevented. Instead, all potential violations are
-logged by the kernel to <code>dmesg</code>. This allows Android and Android device
-manufacturers to gather information about errors so they may refine their
-software and SELinux policies before enforcing them.</p>
+<h2 id="background">Background</h2>
+<p>
+Please note that Android upgraded its SELinux policy version to allow the
+SELinux mode to be set on a per-domain basis. For example, if you run all of
+your applications in a single domain, you could set that domain to permissive
+mode while leaving all other functions and their domains in enforcing mode.
+Domains are associated with applications by the key used to sign each
+application. This setting is made at the top of each SELinux policy source
+(*.te) file.
+</p>
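+<p>
+For example (the domain name here is hypothetical, and the statement syntax
+should be checked against your policy version):
+</p>
+<pre>
+# At the top of example_app.te: run this domain in permissive mode
+# while the rest of the policy remains in enforcing mode.
+permissive example_app;
+</pre>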
+<p>
+Android follows this model of isolating applications to a single domain. With
+this, only the root domain and root-level processes (such as <code>initd</code>,
+<code>installd</code> and
+<code>vold</code>) are now set to enforcing mode. <em>The application domain remains in
+permissive mode to allow further evaluation and prevent failures. Still, an
+errant application could trigger an action in the root domain that is not
+allowed, thereby causing the application to crash.</em>
+</p>
+<p>
+For this reason, device manufacturers should retain the default settings
+provided by Android and limit enforcing mode to the root domain only until
+they've resolved issues reported in <code>dmesg</code>. That said, device manufacturers may
+need to augment their SELinux implementation to account for their additions and
+other changes to the operating system. See the <em>Customization</em> section for
+instructions.
+</p>
-<h2 id="background">Background</h2> <p>Used properly, SELinux greatly limits the
-potential damage of compromised machines and accounts. When you adopt SELinux,
-you instill a structure by which software runs at only the minimum privilege
-level. This mitigates the effects of attacks and reduces the likelihood of
-errant processes overwriting or even transmitting data.</p>
-
-<p>SELinux provides a mandatory access control (MAC) umbrella over traditional
+<h2 id="mac">Mandatory access control</h2>
+<p>
+In conjunction with other Android security measures, Android's access control
+policy greatly limits the potential damage of compromised
+machines and accounts. Using tools like Android's discretionary and mandatory
+access controls gives you a structure to ensure your software runs
+only at the minimum privilege level. This mitigates the effects of
+attacks and reduces the likelihood of errant processes overwriting or even
+transmitting data.
+</p>
+<p>
+Starting in Android 4.3, SELinux provides a mandatory access control (MAC) umbrella over traditional
discretionary access control (DAC) environments. For instance, software must
typically run as the root user account to write to raw block devices. In a
traditional DAC-based Linux environment, if the root user becomes compromised
that user can write to every raw block device. However, SELinux can be used to
label these devices so the user role assigned the root privilege can write to
only those specified in the associated policy. In this way, root cannot
-overwrite data and system settings outside of the specific raw block device.</p>
+overwrite data and system settings outside of the specific raw block
+device.
+</p>
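+<p>
+A rough sketch of this labeling, with a hypothetical device path and domain
+name (the type and class names follow standard SELinux and AOSP sepolicy
+conventions):
+</p>
+<pre>
+# file_contexts: assign a label to a specific raw block device.
+/dev/block/platform/soc.0/by-name/userdata  u:object_r:userdata_block_device:s0
+
+# mydomain.te: permit writes to that device label only.
+allow mydomain userdata_block_device:blk_file { read write };
+</pre>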
+<p>
+See the <em>Use Cases</em> section for more examples of threats and ways to address
+them with SELinux.
+</p>
-<p>See the <em>Use Cases</em> section for more examples of threats and ways to
-address them with SELinux.</p>
-
-<h2 id="implementation">Implementation</h2> <p>Android’s initial SELinux
-implementation is launching in permissive mode - rather than the non-functional
-disabled mode or the most stringent enforcing mode - to act as a reference and
-facilitate testing and development.</p>
-
-<p>SELinux is launching in permissive mode on Android to enable the first phase
-of policy development, and it is accompanied by everything you need to enable
-SELinux now.</p>
-
-<p>You merely need to integrate the <a
-href="https://android.googlesource.com/kernel/common/">latest Android kernel</a>
-and then incorporate the files found in the ~<a
+<h2 id="implementation">Implementation</h2>
+<p>
+Android's SELinux implementation is in enforcing mode - rather than the
+non-functional disabled mode or the notification-only permissive mode - to act
+as a reference and facilitate testing and development. Although enforcing mode
+is set globally, please remember this can be overridden on a per-domain basis,
+as is the case with the application domain.
+</p>
+<p>
+SELinux for Android is accompanied by everything you need to enable SELinux
+now. You merely need to integrate the <a
+href="https://android.googlesource.com/kernel/common/">latest Android
+kernel</a> and then incorporate the files found in the
+<a
href="https://android.googlesource.com/platform/external/sepolicy/">platform/external/sepolicy</a>
-directory:<br>
+directory (where examples can also be found):<br/>
<a
href="https://android.googlesource.com/kernel/common/">https://android.googlesource.com/kernel/common/</a>
-<br>
+<br/>
<a
-href="https://android.googlesource.com/platform/external/sepolicy/">https://android.googlesource.com/platform/external/sepolicy/</a></p>
+href="https://android.googlesource.com/platform/external/sepolicy/">https://android.googlesource.com/platform/external/sepolicy/</a>
+</p>
-<p>Those files when compiled comprise the SELinux kernel security policy and
-cover the upstream Android operating system. Place those files within the
-<root>/device/manufacturer/device-name/sepolicy directory.</p>
-
-<p>Then just update your <code>BoardConfig.mk</code> makefile - located in the
-<device-name> directory containing the sepolicy subdirectory - to
-reference the sepolicy subdirectory once created, like so:</p>
+<p>
+Those files, when compiled, comprise the SELinux kernel security policy and cover
+the upstream Android operating system. Place those files within the
+<root>/device/manufacturer/device-name/sepolicy directory.<br/>
+Then just update your <code>BoardConfig.mk</code> makefile - located in the <device-name>
+directory containing the sepolicy subdirectory - to reference the sepolicy
+subdirectory once created, like so:
+</p>
<pre>
BOARD_SEPOLICY_DIRS := \
@@ -91,88 +142,100 @@
sepolicy.te
</pre>
-<p>After rebuilding your device, it is enabled with SELinux. You can now either
+<p>
+After rebuilding, your device is SELinux-enabled. You can now either
customize your SELinux policies to accommodate your own additions to the Android
operating system as described in the <em>Customization</em> section or verify
-your existing setup as covered in the <em>Validation</em> section.</p>
+your existing setup as covered in the <em>Validation</em> section.
+</p>
-<h2 id="customization">Customization</h2> <p>Once you’ve integrated this
-base level of functionality and thoroughly analyzed the results, you may add
-your own policy settings to cover your customizations to the Android operating
-system. Of course, these policies must still meet the <a
-href="{@docRoot}compatibility/index.html">Android Compatibility
-program</a> requirements and not remove the default SELinux settings.</p>
-
-<p>Manufacturers should not remove existing security settings. Otherwise, they
-risk breaking the Android SELinux implementation and the applications it
-governs. This includes third-party applications that will likely need to be
-improved to be compliant and operational. Applications must require no
-modification to continue functioning on SELinux-enabled devices.</p>
-
-<p>See section 9.7 of the Android 4.3 Compatibility Definition document for
-specific requirements:<br><a
-href="{@docRoot}compatibility/android-4.3-cdd.pdf">http://source.android.com/compatibility/android-4.3-cdd.pdf</a></p>
-
-<p>SELinux uses a whitelist approach, meaning it grants special privileges based
-upon role. Because the default policy provided by Android is so permissive, OEMs
-have great leeway in strengthening it. Here is how we recommend proceeding:</p>
+<h2 id="customization">Customization</h2>
+<p>
+Once you've integrated this base level of functionality and thoroughly analyzed
+the results, you may add your own policy settings to cover your customizations
+to the Android operating system. Of course, these policies must still meet the
+<a href="http://source.android.com/compatibility/index.html">Android
+Compatibility program</a> requirements and not remove the default SELinux
+settings.
+</p>
+<p>
+Manufacturers should not remove existing security settings. Otherwise, they risk
+breaking the Android SELinux implementation and the applications it governs.
+This includes third-party applications that will likely need to be improved to
+be compliant and operational. Applications must continue to function without
+modification on SELinux-enabled devices.
+</p>
+<p>
+See the <em>Kernel Security Features</em> section of the Android Compatibility
+Definition document for specific requirements:<br/>
+<a
+href="http://source.android.com/compatibility/index.html">http://source.android.com/compatibility/index.html</a>
+</p>
+<p>
+SELinux uses a whitelist approach, meaning it grants special privileges based
+upon role. Since Android's default SELinux policy already supports the Android
+Open Source Project, OEMs are not required to modify SELinux settings in any
+way. If they do customize SELinux settings, they should take great care not to
+break existing applications. Here is how we recommend proceeding:
+</p>
<ol>
-<li>
-<p>Use the <a
-href="https://android.googlesource.com/kernel/common/">latest Android
-kernel</a>.</p> </li>
-<li>
-<p>Adopt the <a
+<li>Use the <a href="https://android.googlesource.com/kernel/common/">latest
+Android kernel</a>.</li>
+<li>Adopt the <a
href="http://en.wikipedia.org/wiki/Principle_of_least_privilege">principle of
-least privilege</a>.</p></li>
-<li>
-<p>Address only your own additions to
-Android. The default policy works with the <a
-href="https://android.googlesource.com/">Android Open Source Project</a>
-codebase automatically.</p></li>
-<li>
-<p>Compartmentalize software components
-into modules that conduct singular tasks.</p></li>
-<li>
-<p>Create SELinux
-policies that isolate those tasks from unrelated functions.</p></li>
-<li>
-<p>Put those policies in *.te files (the extension for SELinux policy source
-files) within the <root>/device/manufacturer/device-name/sepolicy
-directory.</p></li>
-<li>
-<p>Release your SELinux implementation in permissive
-mode first.</p></li>
-<li><p>Analyze results and refine policy settings.</p>
-</li>
+least privilege</a>.</li>
+<li>Address only your own additions to Android. The default policy works with
+the <a href="https://android.googlesource.com/">Android Open Source Project</a>
+codebase automatically.</li>
+<li>Compartmentalize software components into modules that conduct singular
+tasks.</li>
+<li>Create SELinux policies that isolate those tasks from unrelated
+functions.</li>
+<li>Put those policies in *.te files (the extension for SELinux policy source
+files) within the <root>/device/manufacturer/device-name/sepolicy
+directory; a sketch of such a file follows this list.</li>
+<li>Release your SELinux implementation in permissive mode first.</li>
+<li>Analyze results and refine policy settings.</li>
</ol>
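+
+<p>
+As a minimal sketch of such a *.te file, the following isolates a hypothetical
+daemon into its own domain. The names are illustrative, and
+<code>init_daemon_domain</code> is a macro provided by the AOSP sepolicy tree;
+confirm its availability in your policy version:
+</p>
+<pre>
+# mydaemon.te: give the daemon its own domain and executable type.
+type mydaemon, domain;
+type mydaemon_exec, exec_type, file_type;
+
+# Transition into the mydaemon domain when init starts the executable.
+init_daemon_domain(mydaemon)
+</pre>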
-<p>Once integrated, OEM Android development should include a step to ensure
-SELinux compatibility going forward. In an ideal software development process,
-SELinux policy changes only when the software model changes and not the actual
-implementation.</p>
-
-<p>As device manufacturers begin to customize SELinux, they should first audit
-their additions to Android. If you’ve added a component that conducts a
-new function, the manufacturer will need to ensure the component meets the
-security policy applied by Android, as well as any associated policy crafted by
-the OEM, before turning on enforcement.</p>
-
-<p>To prevent unnecessary issues, it is better to be overbroad and
-over-compatible than too restrictive and incompatible, which results in broken
-device functions. Conversely, if a manufacturer’s changes will benefit
-others, it should supply the modifications to the default SELinux policy as a <a
-href="{@docRoot}source/submit-patches.html">patch</a>. If the
-patch is applied to the default security policy, the manufacturer will no longer
-need to make this change with each new Android release.</p>
+<p>
+Once integrated, OEM Android development should include a step to ensure
+SELinux compatibility going forward. In an ideal software development process,
+SELinux
+policy changes only when the software model changes and not the actual
+implementation.
+</p>
+<p>
+As device manufacturers begin to customize SELinux, they should first audit
+their additions to Android. If they've added a component that conducts a new
+function, they must ensure the component meets the security
+policy applied by Android, as well as any associated policy crafted by the OEM,
+before turning on enforcement.
+</p>
+<p>
+To prevent unnecessary issues, it is better to be overbroad and over-compatible
+than too restrictive and incompatible, which results in broken device functions.
+Conversely, if a manufacturer's changes will benefit others, it should supply
+the modifications to the default SELinux policy as a
+<a href="http://source.android.com/source/submit-patches.html">patch</a>. If the
+patch is applied to the default security policy, the manufacturer will no
+longer need to make this change with each new Android release.
+</p>
<h2 id="use-cases">Use Cases</h2> <p>Here are specific examples of exploits to
consider when crafting your own software and associated SELinux policies:</p>
<p><strong>Symlinks</strong> - Because symlinks appear as files, they are often read
just as that. This can lead to exploits. For instance, some privileged components such
-as init change the permissions of certain files, sometimes to be excessively
+as <code>init</code> change the permissions of certain files, sometimes to be excessively
open.</p>
<p>Attackers might then replace those files with symlinks to code they control,
@@ -224,7 +287,11 @@
<root>/device/manufacturer/device-name/sepolicy directory. These files
define domains and their labels. The new policy files get concatenated with the
existing policy files during compilation into a single SELinux kernel policy
-file.</p></li>
+file.</p>
+<p><strong>Important</strong>: Do not alter the app.te file provided by the
+Android Open Source Project. Doing so risks breaking all third-party applications.
+</p>
+</li>
<li>
<p><em>Updated <code>BoardConfig.mk</code> makefile</em> - Located in the
<device-name> directory containing the sepolicy subdirectory. It must be
@@ -268,20 +335,19 @@
<h2 id="validation">Validation</h2> <p>Android strongly encourages OEMs to test
their SELinux implementations thoroughly. As manufacturers implement SELinux,
they should initially release their own policies in permissive mode. If
-possible, apply the new policy to devices of employees first as a test.</p>
+possible, apply the new policy to a test pool of devices first.</p>
<p>Once applied, make sure SELinux is running in the correct mode on the device
by issuing the command: <code>getenforce</code></p>
-<p>This will print the SELinux mode: either Disabled, Enforcing, or Permissive.
-If permissive, you are compliant. Enforcing is explicitly not compliant in
-Android 4.3. (Because of its risk, enforcing mode comes with a much heavier
-testing burden.)</p>
+<p>This will print the global SELinux mode: either Disabled, Enforcing, or Permissive.
+Please note that this command shows only the global SELinux mode. To determine the
+SELinux mode for each domain, you must examine the corresponding files.</p>
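+
+<p>For example, on a device shell (illustrative output):</p>
+
+<pre>
+$ getenforce
+Permissive
+</pre>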
<p>Then check for errors. Errors are routed as event logs to <code>dmesg</code>
and viewable locally on the device. Manufacturers should examine the SELinux output
to <code>dmesg</code> on these devices and refine settings prior to public release in
-permissive mode.</p>
+permissive mode and the eventual switch to enforcing mode.</p>
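+
+<p>Because each denial line carries the <code>avc</code> tag, a quick way to
+pull SELinux denials out of the kernel log is:</p>
+
+<pre>
+adb shell dmesg | grep avc
+</pre>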
<p>With this output, manufacturers can readily identify when system users or
components are in violation of SELinux policy. Manufacturers can then repair
@@ -289,7 +355,29 @@
both.</p>
<p>Specifically, these log messages indicate what roles and processes would fail
-under policy enforcement and why. Android is taking this information, analyzing
+under policy enforcement and why. Here is an example:</p>
+
+<pre>
+denied { connectto } for pid=2671 comm="ping" path="/dev/socket/dnsproxyd"
+scontext=u:r:shell:s0 tcontext=u:r:netd:s0 tclass=unix_stream_socket
+</pre>
+
+<p>Interpret this output like so:</p>
+<ul>
+<li>The <code>{ connectto }</code> above represents the action being taken. Together with the
+<code>tclass</code> at the end (<code>unix_stream_socket</code>) it tells you roughly what was being done
+to what. In this case, something was trying to connect to a Unix stream
+socket.</li>
+<li>The <code>scontext</code> (<code>u:r:shell:s0</code>) tells you what context initiated the action. In
+this case, this is something running as the shell.</li>
+<li>The <code>tcontext</code> (<code>u:r:netd:s0</code>) tells you the context of the action's target. In
+this case, that's a <code>unix_stream_socket</code> owned by <code>netd</code>.</li>
+<li>The <code>comm="ping"</code> at the top gives you an additional hint about what was being
+run at the time the denial was generated. In this case, it's a pretty good
+hint.</li>
+</ul>
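+
+<p>For a denial like the one above, the standard SELinux
+<code>audit2allow</code> tool, if available in your environment, can draft a
+candidate rule. Treat the output as an aid to analysis, not as a rule to adopt
+blindly:</p>
+
+<pre>
+adb shell dmesg | audit2allow
+#============= shell ==============
+allow shell netd:unix_stream_socket connectto;
+</pre>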
+
+<p>Android is taking this information, analyzing
it and refining its default security policy so that it works on a wide range of
Android devices with little customization. With this policy, OEMs must only
accommodate their own changes to the Android operating system.</p>
@@ -298,11 +386,7 @@
href="{@docRoot}compatibility/cts-intro.html">Android
Compatibility Test Suite</a> (CTS).</p> <p>As stated, any new policies must still
meet the <a href="{@docRoot}compatibility/index.html">Android
-Compatibility program</a> requirements:<br><a
-href="{@docRoot}compatibility/android-4.3-cdd.pdf">http://source.android.com/compatibility/android-4.3-cdd.pdf</a></p>
-
-<p>If you run the devices through the CTS and find no errors in
-<code>dmesg</code>, you can consider your SELinux implementation compatible.</p>
+Compatibility program</a> requirements.</p>
<p>Finally, if possible, turn on enforcement internally (on devices of
employees) to raise the visibility of failures. Identify any user issues and
diff --git a/src/index.jd b/src/index.jd
index 08c2bef..5f2639a 100644
--- a/src/index.jd
+++ b/src/index.jd
@@ -43,7 +43,7 @@
<div class="col-8">
<h3>Updates</h3>
<a href="{@docRoot}source/index.html">
- <h4>Source Code Available for Android 4.4</h4>
+ <h4>Source Code Available for Android</h4>
<p>Android is an open-source software stack for a wide array of mobile devices with different form factors.
<img border="0" src="images/Android_Robot_100.png" alt="Android Partner icon" style="display:inline;float:right;margin:5px 10px">
We created Android in response to our own experiences launching mobile apps. We wanted to make sure there was
@@ -51,7 +51,7 @@
why we created Android and made its source code open.</p>
</a>
<a href="{@docRoot}compatibility/index.html">
- <h4>Compatibility Definition for Android 4.4</h4>
+ <h4>Compatibility Definition for Android</h4>
<p>Android's purpose is to establish an open platform for developers to build innovative apps. The Android
Compatibility program defines the technical details of the Android platform and provides tools for device manufacturers to
ensure developers' apps run on a variety of devices.</p>