page.title=Camera Version 3
@jd:body

<!--
  Copyright 2010 The Android Open Source Project

  Licensed under the Apache License, Version 2.0 (the "License");
  you may not use this file except in compliance with the License.
  You may obtain a copy of the License at

      http://www.apache.org/licenses/LICENSE-2.0

  Unless required by applicable law or agreed to in writing, software
  distributed under the License is distributed on an "AS IS" BASIS,
  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
  See the License for the specific language governing permissions and
  limitations under the License.
-->
<div id="qv-wrapper">
  <div id="qv">
    <h2>In this document</h2>
    <ol id="auto-toc">
    </ol>
  </div>
</div>

<p>Android's camera HAL connects the higher-level camera framework APIs in
<a href="http://developer.android.com/reference/android/hardware/package-summary.html">android.hardware</a>
to your underlying camera driver and hardware. The latest version of Android introduces a new
underlying implementation of the camera stack. If you have previously developed a camera HAL module
and driver for other versions of Android, be aware that there are significant changes in the camera
pipeline.</p>

<p>Version 1 of the camera HAL is still supported in future releases of Android because many devices
still rely on it. The Android camera service also supports implementing both HALs, which is useful
when you want to support a less capable front-facing camera with version 1 of the HAL and a more
advanced back-facing camera with version 3 of the HAL. Version 2 was a stepping stone to version 3
and is not supported.</p>

<p class="note"><strong>Note:</strong> The new camera HAL is in active development and can change
at any time. This document describes the high-level design of the camera subsystem and omits many
details. Watch for updates to the PDK repository, the HAL, and the HAL reference implementation
for more information.</p>

<h2 id="overview">Overview</h2>
<p>Version 1 of the camera subsystem was designed as a black box with high-level controls.
Roughly speaking, the old subsystem has three operating modes:</p>

<ul>
<li>Preview</li>
<li>Video Record</li>
<li>Still Capture</li>
</ul>

<p>Each mode has slightly different and overlapping capabilities. This made it hard to implement
new types of features, such as burst mode, since they would fall between two of these modes.</p>

<p>
Version 3 of the camera subsystem structures the operating modes into a single unified view,
which can be used to implement any of the previous modes and several others, such as burst mode.
In simple terms, the app framework requests a frame from the camera subsystem, and the camera
subsystem returns results to an output stream. In addition, metadata that contains information
such as color spaces and lens shading is generated for each set of results. The following
sections and diagram give you more detail about each component.</p>

<img src="images/camera2_block.png" />

<p class="img-caption"><strong>Figure 1.</strong> Camera block diagram</p>
<h3 id="supported-version">Supported version</h3>
<p>Camera devices that support this version of the HAL must return
CAMERA_DEVICE_API_VERSION_3_1 in camera_device_t.common.version and in
camera_info_t.device_version (from camera_module_t.get_camera_info).</p>
<p>Camera modules that may contain version 3.1 devices must implement at least
version 2.0 of the camera module interface (as defined by
camera_module_t.common.module_api_version).</p>
<p>See camera_common.h for more versioning details.</p>
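<p>As a quick sketch, the version requirements above can be expressed as a single check. This is
illustrative only: the macro mirrors how hardware.h packs versions (major version in the high
byte, minor in the low byte), and the constants and function name are redefined locally so the
example is self-contained rather than taken from the real headers.</p>

```c
#include <stdint.h>

/* Mirrors HARDWARE_DEVICE_API_VERSION(maj, min) from hardware.h:
 * major version in the high byte, minor version in the low byte. */
#define API_VERSION(maj, min) ((uint16_t)((((maj) & 0xff) << 8) | ((min) & 0xff)))

#define CAMERA_DEVICE_API_VERSION_3_1 API_VERSION(3, 1)
#define CAMERA_MODULE_API_VERSION_2_0 API_VERSION(2, 0)

/* Returns nonzero when a device reporting device_version, exposed by a
 * module reporting module_version, satisfies the v3.1 requirements above. */
static int hal_v3_1_versions_ok(uint16_t module_version, uint16_t device_version)
{
    return device_version == CAMERA_DEVICE_API_VERSION_3_1 &&
           module_version >= CAMERA_MODULE_API_VERSION_2_0;
}
```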
<h3 id="version-history">Version history</h3>
<h4>1.0</h4>
<p>Initial Android camera HAL (Android 4.0) [camera.h]:</p>
<ul>
  <li>Converted from the C++ CameraHardwareInterface abstraction layer.</li>
  <li>Supports the android.hardware.Camera API.</li>
</ul>
<h4>2.0</h4>
<p>Initial release of the expanded-capability HAL (Android 4.2) [camera2.h]:</p>
<ul>
  <li>Sufficient for implementing the existing android.hardware.Camera API.</li>
  <li>Allows for a ZSL queue in the camera service layer.</li>
  <li>Not tested for any new features such as manual capture control, Bayer RAW
  capture, or reprocessing of RAW data.</li>
</ul>
<h4>3.0</h4>
<p>First revision of the expanded-capability HAL:</p>
<ul>
  <li>Major version change since the ABI is completely different. No change to
  the required hardware capabilities or operational model from 2.0.</li>
  <li>Reworked input request and stream queue interfaces: the framework calls into the
  HAL with the next request and stream buffers already dequeued. Sync framework
  support is included, which is necessary for efficient implementations.</li>
  <li>Moved triggers into requests and most notifications into results.</li>
  <li>Consolidated all callbacks into the framework into one structure, and all
  setup methods into a single initialize() call.</li>
  <li>Made stream configuration into a single call to simplify stream
  management. Bidirectional streams replace the STREAM_FROM_STREAM construct.</li>
  <li>Limited mode semantics for older or limited hardware devices.</li>
</ul>
<h4>3.1</h4>
<p>Minor revision of the expanded-capability HAL:</p>
<ul>
  <li>configure_streams passes consumer usage flags to the HAL.</li>
  <li>flush call to drop all in-flight requests and buffers as fast as possible.</li>
</ul>
<h2 id="requests">Requests</h2>
<p>
The app framework issues requests for captured results to the camera subsystem. One request
corresponds to one set of results. A request encapsulates all configuration information about the
capturing and processing of those results. This includes resolution and pixel format; manual
sensor, lens, and flash control; 3A operating modes; RAW-to-YUV processing control; and statistics
generation. This allows for much more control over the results' output and processing. Multiple
requests can be in flight at once, and submitting requests is non-blocking. Requests are always
processed in the order they are received.
</p>
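<p>The in-flight and in-order rules can be modeled with a small FIFO. This is a toy model, not the
real interface: the actual camera3_capture_request_t also carries the settings metadata buffer and
the dequeued output buffers, and the real pipeline depth is device-specific.</p>

```c
#include <stdint.h>
#include <stddef.h>

/* Hypothetical, simplified view of a capture request; the real
 * camera3_capture_request_t carries much more. */
typedef struct {
    uint32_t frame_number;  /* assigned by the framework, monotonically increasing */
} capture_request;

/* A small ring buffer models the pipeline: several requests may be in
 * flight at once, but results always come back in submission order. */
enum { PIPELINE_DEPTH = 4 };
static capture_request in_flight[PIPELINE_DEPTH];
static size_t head, in_flight_count;

/* Submitting is non-blocking; returns -1 only when the model's queue is full. */
static int submit_request(capture_request r)
{
    if (in_flight_count == PIPELINE_DEPTH)
        return -1;
    in_flight[(head + in_flight_count++) % PIPELINE_DEPTH] = r;
    return 0;
}

/* The oldest request completes first: strict FIFO ordering of results. */
static uint32_t complete_oldest(void)
{
    uint32_t fn = in_flight[head].frame_number;
    head = (head + 1) % PIPELINE_DEPTH;
    in_flight_count--;
    return fn;
}
```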


<h2 id="hal">The HAL and camera subsystem</h2>
<p>
The camera subsystem includes the implementations for components in the camera pipeline, such as
the 3A algorithms and processing controls. The camera HAL provides interfaces for you to implement
your versions of these components. To maintain cross-platform compatibility between multiple
device manufacturers and ISP vendors, the camera pipeline model is virtual and does not directly
correspond to any real ISP. However, it is similar enough to real processing pipelines that you
can map it to your hardware efficiently. In addition, it is abstract enough to allow for multiple
different algorithms and orders of operation without compromising quality, efficiency, or
cross-device compatibility.</p>

<p>
The camera pipeline also supports triggers that the app framework can initiate to turn on things
such as auto-focus. It also sends notifications back to the app framework, notifying apps of
events such as an auto-focus lock or errors.</p>

<img id="figure2" src="images/camera2_hal.png" />
<p class="img-caption"><strong>Figure 2.</strong> Camera pipeline</p>

<p>
Note that some image processing blocks shown in the diagram above are not well defined in the
initial release.
</p>

<p>
The camera pipeline makes the following assumptions:
</p>

<ul>
  <li>RAW Bayer output undergoes no processing inside the ISP.</li>
  <li>Statistics are generated based on the raw sensor data.</li>
  <li>The various processing blocks that convert raw sensor data to YUV are in
  an arbitrary order.</li>
  <li>While multiple scale and crop units are shown, all scaler units share the output region
  controls (digital zoom). However, each unit may have a different output resolution and pixel
  format.</li>
</ul>
167
<h3 id="startup">Startup and expected operation sequence</h3>
<p>See <a
href="https://android.googlesource.com/platform/hardware/libhardware/+/master/include/hardware/camera3.h">platform/hardware/libhardware/include/hardware/camera3.h</a>
for definitions of these structures and methods.</p>
<ol>
  <li>The framework calls camera_module_t-&gt;common.open(), which returns a
  hardware_device_t structure.</li>
  <li>The framework inspects the hardware_device_t-&gt;version field and instantiates
  the appropriate handler for that version of the camera hardware device. If
  the version is CAMERA_DEVICE_API_VERSION_3_0, the device is cast to
  a camera3_device_t.</li>
  <li>The framework calls camera3_device_t-&gt;ops-&gt;initialize() with the framework
  callback function pointers. This is called only once after
  open(), before any other functions in the ops structure are called.</li>
  <li>The framework calls camera3_device_t-&gt;ops-&gt;configure_streams() with a list
  of input/output streams to the HAL device.</li>
  <li>The framework allocates gralloc buffers and calls
  camera3_device_t-&gt;ops-&gt;register_stream_buffers() for at least one of the
  output streams listed in configure_streams. The same stream is registered
  only once.</li>
  <li>The framework requests default settings for some number of use cases with
  calls to camera3_device_t-&gt;ops-&gt;construct_default_request_settings(). This
  may occur any time after step 3.</li>
  <li>The framework constructs and sends the first capture request to the HAL,
  with settings based on one of the sets of default settings and with at
  least one output stream that has been registered earlier by the
  framework. This is sent to the HAL with
  camera3_device_t-&gt;ops-&gt;process_capture_request(). The HAL must block the
  return of this call until it is ready for the next request to be sent.</li>
  <li>The framework continues to submit requests, possibly calling
  register_stream_buffers() for not-yet-registered streams and
  construct_default_request_settings() to get default settings buffers for
  other use cases.</li>
  <li>When the capture of a request begins (the sensor starts exposing for the
  capture), the HAL calls camera3_callback_ops_t-&gt;notify() with the SHUTTER
  event, including the frame number and the timestamp for the start of exposure.
  This notify call must be made before the first call to
  process_capture_result() for that frame number.</li>
  <li>After some pipeline delay, the HAL begins to return completed captures to
  the framework with camera3_callback_ops_t-&gt;process_capture_result(). These
  are returned in the same order as the requests were submitted. Multiple
  requests can be in flight at once, depending on the pipeline depth of the
  camera HAL device.</li>
  <li>After some time, the framework may stop submitting new requests, wait for
  the existing captures to complete (all buffers filled, all results
  returned), and then call configure_streams() again. This resets the camera
  hardware and pipeline for a new set of input/output streams. Some streams
  may be reused from the previous configuration; if these streams' buffers
  had already been registered with the HAL, they will not be registered
  again. The framework then continues from step 7 if at least one
  registered output stream remains. (Otherwise, step 5 is required
  first.)</li>
  <li>Alternatively, the framework may call camera3_device_t-&gt;common-&gt;close()
  to end the camera session. This may be called at any time when no other
  calls from the framework are active, although the call may block until all
  in-flight captures have completed (all results returned, all buffers
  filled). After the close call returns, no more calls to the
  camera3_callback_ops_t functions are allowed from the HAL. Once the
  close() call is underway, the framework may not call any other HAL device
  functions.</li>
  <li>In case of an error or other asynchronous event, the HAL must call
  camera3_callback_ops_t-&gt;notify() with the appropriate error/event
  message. After returning from a fatal device-wide error notification, the
  HAL should act as if close() had been called on it. However, the HAL must
  either cancel or complete all outstanding captures before calling
  notify(), so that once notify() is called with a fatal error, the
  framework will not receive further callbacks from the device. Methods
  besides close() should return -ENODEV or NULL after the notify() method
  returns from a fatal error message.</li>
</ol>
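<p>The required ordering of these calls can be sketched as a small state machine. The functions
below are mocks that only enforce ordering; the real entry points live behind
camera3_device_t-&gt;ops and do actual work. Names are local to this example.</p>

```c
/* States correspond to the steps above: a device must be opened, then
 * initialized, then have streams configured before requests are accepted. */
typedef enum {
    STATE_CLOSED,      /* before open() */
    STATE_OPENED,      /* after open(), before initialize() */
    STATE_INITIALIZED, /* after initialize(), before configure_streams() */
    STATE_CONFIGURED   /* streams configured; requests may be submitted */
} device_state;

static device_state state = STATE_CLOSED;

static int dev_open(void)
{
    if (state != STATE_CLOSED) return -1;
    state = STATE_OPENED;
    return 0;
}

static int dev_initialize(void)  /* called exactly once, right after open() */
{
    if (state != STATE_OPENED) return -1;
    state = STATE_INITIALIZED;
    return 0;
}

static int dev_configure_streams(void)  /* may be called again later (step 11) */
{
    if (state < STATE_INITIALIZED) return -1;
    state = STATE_CONFIGURED;
    return 0;
}

static int dev_process_capture_request(void)
{
    return state == STATE_CONFIGURED ? 0 : -1;
}

static int dev_close(void)  /* ends the session from any open state */
{
    if (state == STATE_CLOSED) return -1;
    state = STATE_CLOSED;
    return 0;
}
```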
<h3>Operational modes</h3>
<p>The camera 3 HAL device can implement one of two possible operational modes:
limited and full. Full support is expected from new higher-end devices. Limited
mode has hardware requirements roughly in line with those for a camera HAL
device v1 implementation and is expected from older or inexpensive devices.
Full is a strict superset of limited, and they share the same essential
operational flow, as documented above.</p>
<p>The HAL must indicate its level of support with the
android.info.supportedHardwareLevel static metadata entry, with 0 indicating
limited mode and 1 indicating full mode support.</p>
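<p>A trivial helper shows the mapping of the android.info.supportedHardwareLevel values documented
above; the enum and function names are invented for this example.</p>

```c
#include <stdint.h>

/* android.info.supportedHardwareLevel values, as documented above:
 * 0 = limited mode, 1 = full mode. */
enum { HARDWARE_LEVEL_LIMITED = 0, HARDWARE_LEVEL_FULL = 1 };

static const char *hardware_level_name(uint8_t level)
{
    switch (level) {
    case HARDWARE_LEVEL_LIMITED: return "limited";
    case HARDWARE_LEVEL_FULL:    return "full";
    default:                     return "unknown";
    }
}
```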
<p>Roughly speaking, limited-mode devices do not allow for application control
of capture settings (3A control only), high-rate capture of high-resolution
images, raw sensor readout, or support for YUV output streams above maximum
recording resolution (JPEG only for large images).</p>
<p>Here are the details of limited-mode behavior:</p>
<ul>
  <li>Limited-mode devices do not need to implement accurate synchronization
  between capture request settings and the actual image data
  captured. Instead, changes to settings may take effect some time in the
  future, and possibly not for the same output frame for each settings
  entry. Rapid changes in settings may result in some settings never being
  used for a capture. However, captures that include high-resolution output
  buffers (&gt; 1080p) have to use the settings as specified (but see below
  for processing rate).<br />
  <br />
  </li>
  <li>Captures in limited mode that include high-resolution (&gt; 1080p) output
  buffers may block in process_capture_request() until all the output buffers
  have been filled. A full-mode HAL device must process sequences of
  high-resolution requests at the rate indicated in the static metadata for
  that pixel format. The HAL must still call process_capture_result() to
  provide the output; the framework must simply be prepared for
  process_capture_request() to block until after process_capture_result() for
  that request completes, for high-resolution captures on limited-mode
  devices.<br />
  <br />
  </li>
  <li>Limited-mode devices do not need to support most of the
  settings/result/static info metadata. Full-mode devices must support all
  metadata fields listed in TODO. Specifically, only the following settings
  are expected to be consumed or produced by a limited-mode HAL device:
    <blockquote>
      <p>android.control.aeAntibandingMode (controls)<br />
      android.control.aeExposureCompensation (controls)<br />
      android.control.aeLock (controls)<br />
      android.control.aeMode (controls)<br />
      &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;[OFF means ON_FLASH_TORCH - TODO]<br />
      android.control.aeRegions (controls)<br />
      android.control.aeTargetFpsRange (controls)<br />
      android.control.afMode (controls)<br />
      &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;[OFF means infinity focus]<br />
      android.control.afRegions (controls)<br />
      android.control.awbLock (controls)<br />
      android.control.awbMode (controls)<br />
      &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;[OFF not supported]<br />
      android.control.awbRegions (controls)<br />
      android.control.captureIntent (controls)<br />
      android.control.effectMode (controls)<br />
      android.control.mode (controls)<br />
      &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;[OFF not supported]<br />
      android.control.sceneMode (controls)<br />
      android.control.videoStabilizationMode (controls)<br />
      android.control.aeAvailableAntibandingModes (static)<br />
      android.control.aeAvailableModes (static)<br />
      android.control.aeAvailableTargetFpsRanges (static)<br />
      android.control.aeCompensationRange (static)<br />
      android.control.aeCompensationStep (static)<br />
      android.control.afAvailableModes (static)<br />
      android.control.availableEffects (static)<br />
      android.control.availableSceneModes (static)<br />
      android.control.availableVideoStabilizationModes (static)<br />
      android.control.awbAvailableModes (static)<br />
      android.control.maxRegions (static)<br />
      android.control.sceneModeOverrides (static)<br />
      android.control.aeRegions (dynamic)<br />
      android.control.aeState (dynamic)<br />
      android.control.afMode (dynamic)<br />
      android.control.afRegions (dynamic)<br />
      android.control.afState (dynamic)<br />
      android.control.awbMode (dynamic)<br />
      android.control.awbRegions (dynamic)<br />
      android.control.awbState (dynamic)<br />
      android.control.mode (dynamic)</p>
      <p>android.flash.info.available (static)</p>
      <p>android.info.supportedHardwareLevel (static)</p>
      <p>android.jpeg.gpsCoordinates (controls)<br />
      android.jpeg.gpsProcessingMethod (controls)<br />
      android.jpeg.gpsTimestamp (controls)<br />
      android.jpeg.orientation (controls)<br />
      android.jpeg.quality (controls)<br />
      android.jpeg.thumbnailQuality (controls)<br />
      android.jpeg.thumbnailSize (controls)<br />
      android.jpeg.availableThumbnailSizes (static)<br />
      android.jpeg.maxSize (static)<br />
      android.jpeg.gpsCoordinates (dynamic)<br />
      android.jpeg.gpsProcessingMethod (dynamic)<br />
      android.jpeg.gpsTimestamp (dynamic)<br />
      android.jpeg.orientation (dynamic)<br />
      android.jpeg.quality (dynamic)<br />
      android.jpeg.size (dynamic)<br />
      android.jpeg.thumbnailQuality (dynamic)<br />
      android.jpeg.thumbnailSize (dynamic)</p>
      <p>android.lens.info.minimumFocusDistance (static)</p>
      <p>android.request.id (controls)<br />
      android.request.id (dynamic)</p>
      <p>android.scaler.cropRegion (controls)<br />
      &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;[ignores (x,y), assumes center-zoom]<br />
      android.scaler.availableFormats (static)<br />
      &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;[RAW not supported]<br />
      android.scaler.availableJpegMinDurations (static)<br />
      android.scaler.availableJpegSizes (static)<br />
      android.scaler.availableMaxDigitalZoom (static)<br />
      android.scaler.availableProcessedMinDurations (static)<br />
      android.scaler.availableProcessedSizes (static)<br />
      &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;[full resolution not supported]<br />
      android.scaler.maxDigitalZoom (static)<br />
      android.scaler.cropRegion (dynamic)</p>
      <p>android.sensor.orientation (static)<br />
      android.sensor.timestamp (dynamic)</p>
      <p>android.statistics.faceDetectMode (controls)<br />
      android.statistics.info.availableFaceDetectModes (static)<br />
      android.statistics.faceDetectMode (dynamic)<br />
      android.statistics.faceIds (dynamic)<br />
      android.statistics.faceLandmarks (dynamic)<br />
      android.statistics.faceRectangles (dynamic)<br />
      android.statistics.faceScores (dynamic)</p>
    </blockquote>
  </li>
</ul>
<h3 id="interaction">Interaction between the application capture request, 3A control, and the
processing pipeline</h3>

<p>
Depending on the settings in the 3A control block, the camera pipeline ignores some of the
parameters in the application’s capture request and uses the values provided by the 3A control
routines instead. For example, when auto-exposure is active, the exposure time, frame duration,
and sensitivity parameters of the sensor are controlled by the platform 3A algorithm, and any
app-specified values are ignored. The values chosen for the frame by the 3A routines must be
reported in the output metadata. The following table describes the different modes of the 3A
control block and the properties that are controlled by these modes. See the
platform/system/media/camera/docs/docs.html file for definitions of these properties.
</p>


<table>
  <tr>
    <th>Parameter</th>
    <th>State</th>
    <th>Properties controlled</th>
  </tr>
  <tr>
    <td rowspan="5">android.control.aeMode</td>
    <td>OFF</td>
    <td>None</td>
  </tr>
  <tr>
    <td>ON</td>
    <td>
      <ul>
        <li>android.sensor.exposureTime</li>
        <li>android.sensor.frameDuration</li>
        <li>android.sensor.sensitivity</li>
        <li>android.lens.aperture (if supported)</li>
        <li>android.lens.filterDensity (if supported)</li>
      </ul>
    </td>
  </tr>
  <tr>
    <td>ON_AUTO_FLASH</td>
    <td>Everything in ON, plus android.flash.firingPower, android.flash.firingTime, and android.flash.mode</td>
  </tr>
  <tr>
    <td>ON_ALWAYS_FLASH</td>
    <td>Same as ON_AUTO_FLASH</td>
  </tr>
  <tr>
    <td>ON_AUTO_FLASH_RED_EYE</td>
    <td>Same as ON_AUTO_FLASH</td>
  </tr>
  <tr>
    <td rowspan="2">android.control.awbMode</td>
    <td>OFF</td>
    <td>None</td>
  </tr>
  <tr>
    <td>WHITE_BALANCE_*</td>
    <td>android.colorCorrection.transform, plus platform-specific adjustments if android.colorCorrection.mode is FAST or HIGH_QUALITY</td>
  </tr>
  <tr>
    <td rowspan="2">android.control.afMode</td>
    <td>OFF</td>
    <td>None</td>
  </tr>
  <tr>
    <td>FOCUS_MODE_*</td>
    <td>android.lens.focusDistance</td>
  </tr>
  <tr>
    <td rowspan="2">android.control.videoStabilizationMode</td>
    <td>OFF</td>
    <td>None</td>
  </tr>
  <tr>
    <td>ON</td>
    <td>Can adjust android.scaler.cropRegion to implement video stabilization</td>
  </tr>
  <tr>
    <td rowspan="3">android.control.mode</td>
    <td>OFF</td>
    <td>AE, AWB, and AF are disabled</td>
  </tr>
  <tr>
    <td>AUTO</td>
    <td>Individual AE, AWB, and AF settings are used</td>
  </tr>
  <tr>
    <td>SCENE_MODE_*</td>
    <td>Can override all parameters listed above. Individual 3A controls are disabled.</td>
  </tr>
</table>

<p>The controls exposed for the 3A algorithm mostly map 1:1 to the old API’s parameters
(such as exposure compensation, scene mode, or white balance mode).</p>

<p>
The controls in the Image Processing block in <a href="#figure2">Figure 2</a> all operate on a
similar principle. Generally, each block has three modes:
</p>

<ul>
  <li>
  OFF: This processing block is disabled. The demosaic, color correction, and tone curve
  adjustment blocks cannot be disabled.
  </li>
  <li>
  FAST: In this mode, the processing block may not slow down the output frame rate compared to
  OFF mode, but should otherwise produce the best-quality output it can given that restriction.
  Typically, this is used for preview or video recording modes, or burst capture for still
  images. On some devices, this may be equivalent to OFF mode (no processing can be done without
  slowing down the frame rate), and on some devices, this may be equivalent to HIGH_QUALITY mode
  (the best quality still does not slow down the frame rate).
  </li>
  <li>
  HIGH_QUALITY: In this mode, the processing block should produce the best-quality result
  possible, slowing down the output frame rate as needed. Typically, this is used for
  high-quality still capture. Some blocks include a manual control that can optionally be
  selected instead of FAST or HIGH_QUALITY. For example, the color correction block supports a
  color transform matrix, while the tone curve adjustment supports an arbitrary global tone
  mapping curve.
  </li>
</ul>

<p>See the <a href="">Android Camera Processing Pipeline Properties</a> spreadsheet for more
information on all available properties.</p>

<h2 id="metadata">Metadata support</h2>

<p>To support the saving of DNG files by the Android framework, substantial metadata is required
about the sensor’s characteristics. This includes information such as color spaces and lens
shading functions.</p>
<p>
Most of this information is a static property of the camera subsystem and can therefore be
queried before configuring any output pipelines or submitting any requests. The new camera APIs
greatly expand the information provided by the <code>getCameraInfo()</code> method to provide
this information to the application.
</p>
<p>
In addition, manual control of the camera subsystem requires feedback from the assorted devices
about their current state and the actual parameters used in capturing a given frame. If an
application needs to implement a custom 3A routine (for example, to properly meter for an HDR
burst), it needs to know the settings used to capture the latest set of results it has received
in order to update the settings for the next request. Therefore, the new camera API adds a
substantial amount of dynamic metadata to each captured frame. This includes the requested and
actual parameters used for the capture, as well as additional per-frame metadata such as
timestamps and statistics generator output.
</p>
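<p>To make this feedback loop concrete, here is a toy custom-AE step that derives the next
request's exposure from the latest result's metadata. The frame_result structure and its field
names are invented for illustration; real dynamic metadata is a camera_metadata buffer with many
more entries.</p>

```c
#include <stdint.h>

/* Hypothetical per-frame dynamic metadata, reduced to the two fields a
 * toy AE loop needs; real results carry far more (timestamps, statistics,
 * the full set of capture parameters actually used). */
typedef struct {
    int64_t exposure_ns;  /* exposure time actually used for this frame */
    int     mean_luma;    /* 0-255 scene brightness from the statistics block */
} frame_result;

/* A minimal custom-AE step: compute the next request's exposure from the
 * metadata of the latest completed result. Illustrative only. */
static int64_t next_exposure_ns(const frame_result *latest, int target_luma)
{
    if (latest->mean_luma <= 0)
        return latest->exposure_ns * 2;  /* fully dark: open up quickly */
    /* Scale exposure proportionally toward the target brightness. */
    return latest->exposure_ns * target_luma / latest->mean_luma;
}
```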

<h2 id="3amodes">3A modes and state machines</h2>
<p>While the actual 3A algorithms are up to the HAL implementation, a high-level
state machine description is defined by the HAL interface to allow the HAL
device and the framework to communicate about the current state of 3A and
trigger 3A events.</p>
<p>When the device is opened, all the individual 3A states must be
STATE_INACTIVE. Stream configuration does not reset 3A. For example, locked
focus must be maintained across the configure() call.</p>
<p>Triggering a 3A action involves simply setting the relevant trigger entry in
the settings for the next request to indicate the start of the trigger. For example,
the trigger for starting an autofocus scan is setting the entry
ANDROID_CONTROL_AF_TRIGGER to ANDROID_CONTROL_AF_TRIGGER_START for one
request, and cancelling an autofocus scan is triggered by setting
ANDROID_CONTROL_AF_TRIGGER to ANDROID_CONTROL_AF_TRIGGER_CANCEL. Otherwise,
the entry will not exist or will be set to ANDROID_CONTROL_AF_TRIGGER_IDLE. Each
request with a trigger entry set to a non-IDLE value is treated as an
independent triggering event.</p>
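<p>The trigger rule can be illustrated with a small model: each request carries a trigger entry,
an absent entry behaves like IDLE, and every non-IDLE entry is one independent event. The enum
below is a stand-in for the real metadata values, not the actual generated constants.</p>

```c
#include <stddef.h>

/* Per-request trigger entry; an absent entry behaves like IDLE. */
typedef enum {
    AF_TRIGGER_IDLE,
    AF_TRIGGER_START,
    AF_TRIGGER_CANCEL
} af_trigger;

/* Every request whose trigger entry is non-IDLE counts as its own
 * independent triggering event, per the rule above. */
static int count_trigger_events(const af_trigger *requests, size_t n)
{
    int events = 0;
    for (size_t i = 0; i < n; i++)
        if (requests[i] != AF_TRIGGER_IDLE)
            events++;
    return events;
}
```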
<p>At the top level, 3A is controlled by the ANDROID_CONTROL_MODE setting. It
selects between no 3A (ANDROID_CONTROL_MODE_OFF), normal AUTO mode
(ANDROID_CONTROL_MODE_AUTO), and using the scene mode setting
(ANDROID_CONTROL_USE_SCENE_MODE):</p>
<ul>
  <li>In OFF mode, each of the individual auto-focus (AF), auto-exposure (AE),
  and auto-white-balance (AWB) modes is effectively OFF, and none of the
  capture controls may be overridden by the 3A routines.</li>
  <li>In AUTO mode, AF, AE, and AWB each run their own independent algorithms
  and have their own mode, state, and trigger metadata entries, as listed in
  the next section.</li>
  <li>In USE_SCENE_MODE, the value of the ANDROID_CONTROL_SCENE_MODE entry must
  be used to determine the behavior of the 3A routines. In SCENE_MODEs other
  than FACE_PRIORITY, the HAL must override the values of
  ANDROID_CONTROL_AE/AWB/AF_MODE to be the mode it prefers for the selected
  SCENE_MODE. For example, the HAL may prefer SCENE_MODE_NIGHT to use the
  CONTINUOUS_FOCUS AF mode. Any user selection of AE/AWB/AF_MODE is ignored
  for these scene modes.</li>
  <li>For SCENE_MODE_FACE_PRIORITY, the AE/AWB/AF_MODE controls work as in
  ANDROID_CONTROL_MODE_AUTO, but the 3A routines must bias toward metering
  and focusing on any detected faces in the scene.</li>
</ul>

<h3 id="autofocus">Auto-focus settings and result entries</h3>
<p>Main metadata entries:</p>
<p>ANDROID_CONTROL_AF_MODE: Control for selecting the current autofocus mode.
Set by the framework in the request settings.</p>
<p>AF_MODE_OFF: AF is disabled; the framework/app directly controls lens
position.</p>
<p>AF_MODE_AUTO: Single-sweep autofocus. No lens movement unless AF is
triggered.</p>
<p>AF_MODE_MACRO: Single-sweep up-close autofocus. No lens movement unless AF
is triggered.</p>
<p>AF_MODE_CONTINUOUS_VIDEO: Smooth continuous focusing, for recording video.
Triggering immediately locks focus in the current position. Canceling resumes
continuous focusing.</p>
<p>AF_MODE_CONTINUOUS_PICTURE: Fast continuous focusing, for zero-shutter-lag
still capture. Triggering locks focus once the currently active sweep
concludes. Canceling resumes continuous focusing.</p>
<p>AF_MODE_EDOF: Advanced extended depth of field focusing. There is no
autofocus scan, so triggering one or canceling one has no effect. Images are
focused automatically by the HAL.</p>
<p>ANDROID_CONTROL_AF_STATE: Dynamic metadata describing the current AF
algorithm state, reported by the HAL in the result metadata.</p>
<p>AF_STATE_INACTIVE: No focusing has been done, or the algorithm was reset.
The lens is not moving. Always the state for MODE_OFF or MODE_EDOF. When the
device is opened, it must start in this state.</p>
<p>AF_STATE_PASSIVE_SCAN: A continuous focus algorithm is currently scanning
for good focus. The lens is moving.</p>
<p>AF_STATE_PASSIVE_FOCUSED: A continuous focus algorithm believes it is well
focused. The lens is not moving. The HAL may spontaneously leave this
state.</p>
<p>AF_STATE_ACTIVE_SCAN: A scan triggered by the user is underway.</p>
<p>AF_STATE_FOCUSED_LOCKED: The AF algorithm believes it is focused. The lens
is not moving.</p>
<p>AF_STATE_NOT_FOCUSED_LOCKED: The AF algorithm has been unable to focus. The
lens is not moving.</p>
<p>ANDROID_CONTROL_AF_TRIGGER: Control for starting an autofocus scan, the
meaning of which depends on the mode and state. Set by the framework in the
request settings.</p>
<p>AF_TRIGGER_IDLE: No current trigger.</p>
<p>AF_TRIGGER_START: Trigger the start of an AF scan. The effect depends on the
mode and state.</p>
<p>AF_TRIGGER_CANCEL: Cancel the current AF scan if any, and reset the
algorithm to its default.</p>
<p>Additional metadata entries:</p>
<p>ANDROID_CONTROL_AF_REGIONS: Control for selecting the regions of the field
of view (FOV) that should be used to determine good focus. This applies to all
AF modes that scan for focus. Set by the framework in the request settings.</p>
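<p>The AF_MODE_AUTO transitions described above can be summarized as a small state machine sketch.
Continuous modes add the PASSIVE_* states and spontaneous transitions, which are omitted here;
the event names and function are local to this example.</p>

```c
/* A sketch of AF state transitions for AF_MODE_AUTO only, derived from the
 * entries above. */
typedef enum {
    AF_INACTIVE,           /* initial state, and the state after a cancel */
    AF_ACTIVE_SCAN,        /* a user-triggered scan is underway */
    AF_FOCUSED_LOCKED,     /* scan finished and focus was found */
    AF_NOT_FOCUSED_LOCKED  /* scan finished without finding focus */
} af_state;

typedef enum {
    EV_TRIGGER_START,   /* AF_TRIGGER_START in a request */
    EV_TRIGGER_CANCEL,  /* AF_TRIGGER_CANCEL in a request */
    EV_SCAN_DONE_GOOD,  /* the HAL's scan converged */
    EV_SCAN_DONE_BAD    /* the HAL's scan failed to converge */
} af_event;

static af_state af_step(af_state s, af_event e)
{
    switch (e) {
    case EV_TRIGGER_START:
        return AF_ACTIVE_SCAN;      /* a new trigger starts a fresh sweep */
    case EV_TRIGGER_CANCEL:
        return AF_INACTIVE;         /* cancel resets the algorithm */
    case EV_SCAN_DONE_GOOD:
        return s == AF_ACTIVE_SCAN ? AF_FOCUSED_LOCKED : s;
    case EV_SCAN_DONE_BAD:
        return s == AF_ACTIVE_SCAN ? AF_NOT_FOCUSED_LOCKED : s;
    }
    return s;
}
```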

<h3 id="autoexpose">Auto-exposure settings and result entries</h3>
<p>Main metadata entries:</p>
<p>ANDROID_CONTROL_AE_MODE: Control for selecting the current auto-exposure
mode. Set by the framework in the request settings.</p>
<p>AE_MODE_OFF: Autoexposure is disabled; the user controls exposure, gain,
frame duration, and flash.</p>
<p>AE_MODE_ON: Standard autoexposure, with flash control disabled. The user may
set the flash to fire or to torch mode.</p>
<p>AE_MODE_ON_AUTO_FLASH: Standard autoexposure, with flash on at the HAL's
discretion for precapture and still capture. User control of flash is
disabled.</p>
<p>AE_MODE_ON_ALWAYS_FLASH: Standard autoexposure, with flash always fired for
capture, and at the HAL's discretion for precapture. User control of flash is
disabled.</p>
<p>AE_MODE_ON_AUTO_FLASH_REDEYE: Standard autoexposure, with flash on at the
HAL's discretion for precapture and still capture. Uses a flash burst at the
end of the precapture sequence to reduce redeye in the final picture. User
control of flash is disabled.</p>
<p>ANDROID_CONTROL_AE_STATE: Dynamic metadata describing the current AE
algorithm state, reported by the HAL in the result metadata.</p>
<p>AE_STATE_INACTIVE: Initial AE state after a mode switch. When the device is
opened, it must start in this state.</p>
<p>AE_STATE_SEARCHING: AE is not converged to a good value and is adjusting
exposure parameters.</p>
<p>AE_STATE_CONVERGED: AE has found good exposure values for the current scene,
and the exposure parameters are not changing. The HAL may spontaneously leave
this state to search for a better solution.</p>
<p>AE_STATE_LOCKED: AE has been locked with the AE_LOCK control. Exposure
values are not changing.</p>
<p>AE_STATE_FLASH_REQUIRED: The HAL has converged exposure but believes flash
is required for a sufficiently bright picture. Used for determining if a
zero-shutter-lag frame can be used.</p>
<p>AE_STATE_PRECAPTURE: The HAL is in the middle of a precapture sequence.
Depending on the AE mode, this may involve firing the flash for metering or a
burst of flash pulses for redeye reduction.</p>
<p>ANDROID_CONTROL_AE_PRECAPTURE_TRIGGER: Control for starting a metering
sequence before capturing a high-quality image. Set by the framework in the
request settings.</p>
<p>PRECAPTURE_TRIGGER_IDLE: No current trigger.</p>
<p>PRECAPTURE_TRIGGER_START: Start a precapture sequence. The HAL should use
the subsequent requests to measure good exposure/white balance for an upcoming
high-resolution capture.</p>
<p>Additional metadata entries:</p>
<p>ANDROID_CONTROL_AE_LOCK: Control for locking AE controls to their current
values.</p>
667 values.</p>
668<p>ANDROID_CONTROL_AE_EXPOSURE_COMPENSATION: Control for adjusting AE
669 algorithm target brightness point.</p>
670<p>ANDROID_CONTROL_AE_TARGET_FPS_RANGE: Control for selecting the target frame
671 rate range for the AE algorithm. The AE routine cannot change the frame
672 rate to be outside these bounds.</p>
673<p>ANDROID_CONTROL_AE_REGIONS: Control for selecting the regions of the FOV
674 that should be used to determine good exposure levels. This applies to
675 all AE modes besides OFF.
676</p>
677
678<h3 id="autowb">Auto-whitebalance settings and result entries</h3>
679<p>Main metadata entries:</p>
680<p>ANDROID_CONTROL_AWB_MODE: Control for selecting the current white-balance
681 mode.
682</p>
683<p>AWB_MODE_OFF: Auto-whitebalance is disabled. User controls color matrix.
684</p>
685<p>AWB_MODE_AUTO: Automatic white balance is enabled; 3A controls color
686 transform, possibly using more complex transforms than a simple
687 matrix.
688</p>
689<p>AWB_MODE_INCANDESCENT: Fixed white balance settings good for indoor
690 incandescent (tungsten) lighting, roughly 2700K.
691</p>
692<p>AWB_MODE_FLUORESCENT: Fixed white balance settings good for fluorescent
693 lighting, roughly 5000K.
694</p>
695<p>AWB_MODE_WARM_FLUORESCENT: Fixed white balance settings good for
696 fluorescent lighting, roughly 3000K.
697</p>
698<p>AWB_MODE_DAYLIGHT: Fixed white balance settings good for daylight,
699 roughly 5500K.
700</p>
701<p>AWB_MODE_CLOUDY_DAYLIGHT: Fixed white balance settings good for clouded
702 daylight, roughly 6500K.
703</p>
704<p>AWB_MODE_TWILIGHT: Fixed white balance settings good for
705 near-sunset/sunrise, roughly 15000K.
706</p>
707<p>AWB_MODE_SHADE: Fixed white balance settings good for areas indirectly
708 lit by the sun, roughly 7500K.
709</p>
710<p>ANDROID_CONTROL_AWB_STATE: Dynamic metadata describing the current AWB
711 algorithm state, reported by the HAL in the result metadata.
712</p>
713<p>AWB_STATE_INACTIVE: Initial AWB state after mode switch. When the device
714 is opened, it must start in this state.
715</p>
716<p>AWB_STATE_SEARCHING: AWB is not converged to a good value and is
717 changing color adjustment parameters.
718</p>
719<p>AWB_STATE_CONVERGED: AWB has found good color adjustment values for the
720 current scene, and the parameters are not changing. HAL may
721 spontaneously leave this state to search for a better solution.
722</p>
723<p>AWB_STATE_LOCKED: AWB has been locked with the AWB_LOCK control. Color
724 adjustment values are not changing.
725</p>
726<p>Additional metadata entries:
727</p>
728<p>ANDROID_CONTROL_AWB_LOCK: Control for locking AWB color adjustments to
729 their current values.
730</p>
731<p>ANDROID_CONTROL_AWB_REGIONS: Control for selecting the regions of the FOV
732 that should be used to determine good color balance. This applies only
733 to auto-whitebalance mode.
734</p>
735
736<h3 id="genstate">General state machine transition notes
737</h3>
738<p>Switching between AF, AE, or AWB modes always resets the algorithm's state
739 to INACTIVE. Similarly, switching between CONTROL_MODE or
740 CONTROL_SCENE_MODE if CONTROL_MODE == USE_SCENE_MODE resets all the
741 algorithm states to INACTIVE.
742</p>
743<p>The tables below are per-mode.
744</p>
745
746<h3 id="af-state">AF state machines</h3>
747<table width="100%" border="1">
748 <tr>
749 <td colspan="4" scope="col"><h4>mode = AF_MODE_OFF or AF_MODE_EDOF</h4></td>
750 </tr>
751 <tr>
752 <th scope="col">State</th>
 <th scope="col">Transition cause</th>
754 <th scope="col">New state</th>
755 <th scope="col">Notes</th>
756 </tr>
757 <tr>
758 <td>INACTIVE</td>
759 <td>&nbsp;</td>
760 <td>&nbsp;</td>
761 <td>AF is disabled</td>
762 </tr>
763 <tr>
764 <td colspan="4"><h4>mode = AF_MODE_AUTO or AF_MODE_MACRO</h4></td>
765 </tr>
766 <tr>
767 <th scope="col">State</th>
 <th scope="col">Transition cause</th>
769 <th scope="col">New state</th>
770 <th scope="col">Notes</th>
771 </tr>
772 <tr>
773 <td>INACTIVE</td>
774 <td>AF_TRIGGER</td>
775 <td>ACTIVE_SCAN</td>
776 <td>Start AF sweep<br />
777 Lens now moving</td>
778 </tr>
779 <tr>
780 <td>ACTIVE_SCAN</td>
781 <td>AF sweep done</td>
782 <td>FOCUSED_LOCKED</td>
783 <td>If AF successful<br />
784 Lens now locked </td>
785 </tr>
786 <tr>
787 <td>ACTIVE_SCAN</td>
788 <td>AF sweep done</td>
789 <td>NOT_FOCUSED_LOCKED</td>
 <td>If AF unsuccessful<br />
Lens now locked</td>
792 </tr>
793 <tr>
794 <td>ACTIVE_SCAN</td>
795 <td>AF_CANCEL</td>
796 <td>INACTIVE</td>
797 <td>Cancel/reset AF<br />
798 Lens now locked</td>
799 </tr>
800 <tr>
801 <td>FOCUSED_LOCKED</td>
802 <td>AF_CANCEL</td>
803 <td>INACTIVE</td>
804 <td>Cancel/reset AF</td>
805 </tr>
806 <tr>
807 <td>FOCUSED_LOCKED</td>
808 <td>AF_TRIGGER</td>
809 <td>ACTIVE_SCAN </td>
810 <td>Start new sweep<br />
811 Lens now moving</td>
812 </tr>
813 <tr>
814 <td>NOT_FOCUSED_LOCKED</td>
815 <td>AF_CANCEL</td>
816 <td>INACTIVE</td>
817 <td>Cancel/reset AF</td>
818 </tr>
819 <tr>
820 <td>NOT_FOCUSED_LOCKED</td>
821 <td>AF_TRIGGER</td>
822 <td>ACTIVE_SCAN</td>
823 <td>Start new sweep<br />
824Lens now moving</td>
825 </tr>
826 <tr>
827 <td>All states</td>
828 <td>mode change </td>
829 <td>INACTIVE</td>
830 <td>&nbsp;</td>
831 </tr>
832 <tr>
833 <td colspan="4"><h4>mode = AF_MODE_CONTINUOUS_VIDEO</h4></td>
834 </tr>
835 <tr>
836 <th scope="col">State</th>
 <th scope="col">Transition cause</th>
838 <th scope="col">New state</th>
839 <th scope="col">Notes</th>
840 </tr>
841 <tr>
842 <td>INACTIVE</td>
843 <td>HAL initiates new scan</td>
844 <td>PASSIVE_SCAN</td>
845 <td>Start AF sweep<br />
846Lens now moving</td>
847 </tr>
848 <tr>
849 <td>INACTIVE</td>
850 <td>AF_TRIGGER</td>
851 <td>NOT_FOCUSED_LOCKED</td>
852 <td>AF state query <br />
853 Lens now locked</td>
854 </tr>
855 <tr>
856 <td>PASSIVE_SCAN</td>
857 <td>HAL completes current scan</td>
858 <td>PASSIVE_FOCUSED</td>
859 <td>End AF scan<br />
860 Lens now locked <br /></td>
861 </tr>
862 <tr>
863 <td>PASSIVE_SCAN</td>
864 <td>AF_TRIGGER</td>
865 <td>FOCUSED_LOCKED</td>
 <td>Immediate transition<br />
867 if focus is good<br />
868Lens now locked</td>
869 </tr>
870 <tr>
871 <td>PASSIVE_SCAN</td>
872 <td>AF_TRIGGER</td>
873 <td>NOT_FOCUSED_LOCKED</td>
 <td>Immediate transition<br />
875if focus is bad<br />
876Lens now locked</td>
877 </tr>
878 <tr>
879 <td>PASSIVE_SCAN</td>
880 <td>AF_CANCEL</td>
881 <td>INACTIVE</td>
882 <td>Reset lens position<br />
883 Lens now locked</td>
884 </tr>
885 <tr>
886 <td>PASSIVE_FOCUSED</td>
887 <td>HAL initiates new scan</td>
888 <td>PASSIVE_SCAN</td>
889 <td>Start AF scan<br />
890 Lens now moving</td>
891 </tr>
892 <tr>
893 <td>PASSIVE_FOCUSED</td>
894 <td>AF_TRIGGER</td>
895 <td>FOCUSED_LOCKED</td>
 <td>Immediate transition<br />
897if focus is good<br />
898Lens now locked</td>
899 </tr>
900 <tr>
901 <td>PASSIVE_FOCUSED</td>
902 <td>AF_TRIGGER</td>
903 <td>NOT_FOCUSED_LOCKED</td>
 <td>Immediate transition<br />
905if focus is bad<br />
906Lens now locked</td>
907 </tr>
908 <tr>
909 <td>FOCUSED_LOCKED</td>
910 <td>AF_TRIGGER</td>
911 <td>FOCUSED_LOCKED</td>
912 <td>No effect</td>
913 </tr>
914 <tr>
915 <td>FOCUSED_LOCKED</td>
916 <td>AF_CANCEL</td>
917 <td>INACTIVE</td>
918 <td>Restart AF scan</td>
919 </tr>
920 <tr>
921 <td>NOT_FOCUSED_LOCKED</td>
922 <td>AF_TRIGGER</td>
923 <td>NOT_FOCUSED_LOCKED</td>
924 <td>No effect</td>
925 </tr>
926 <tr>
927 <td>NOT_FOCUSED_LOCKED</td>
928 <td>AF_CANCEL</td>
929 <td>INACTIVE</td>
930 <td>Restart AF scan</td>
931 </tr>
932 <tr>
933 <td colspan="4"><h4>mode = AF_MODE_CONTINUOUS_PICTURE</h4></td>
934 </tr>
935 <tr>
936 <th scope="col">State</th>
 <th scope="col">Transition cause</th>
938 <th scope="col">New state</th>
939 <th scope="col">Notes</th>
940 </tr>
941 <tr>
942 <td>INACTIVE</td>
943 <td>HAL initiates new scan</td>
944 <td>PASSIVE_SCAN</td>
945 <td>Start AF scan<br />
946 Lens now moving</td>
947 </tr>
948 <tr>
949 <td>INACTIVE</td>
950 <td>AF_TRIGGER</td>
951 <td>NOT_FOCUSED_LOCKED</td>
952 <td>AF state query<br />
953 Lens now locked</td>
954 </tr>
955 <tr>
956 <td>PASSIVE_SCAN</td>
957 <td>HAL completes current scan</td>
958 <td>PASSIVE_FOCUSED</td>
959 <td>End AF scan<br />
960 Lens now locked</td>
961 </tr>
962 <tr>
963 <td>PASSIVE_SCAN</td>
964 <td>AF_TRIGGER</td>
965 <td>FOCUSED_LOCKED</td>
 <td>Eventual transition once focus is good<br />
967 Lens now locked</td>
968 </tr>
969 <tr>
970 <td>PASSIVE_SCAN</td>
971 <td>AF_TRIGGER</td>
972 <td>NOT_FOCUSED_LOCKED</td>
 <td>Eventual transition if focus cannot be found<br />
974Lens now locked</td>
975 </tr>
976 <tr>
977 <td>PASSIVE_SCAN</td>
978 <td>AF_CANCEL</td>
979 <td>INACTIVE</td>
980 <td>Reset lens position<br />
981 Lens now locked</td>
982 </tr>
983 <tr>
984 <td>PASSIVE_FOCUSED</td>
985 <td>HAL initiates new scan</td>
986 <td>PASSIVE_SCAN</td>
987 <td>Start AF scan<br />
988Lens now moving</td>
989 </tr>
990 <tr>
991 <td>PASSIVE_FOCUSED</td>
992 <td>AF_TRIGGER</td>
993 <td>FOCUSED_LOCKED</td>
 <td>Immediate transition if focus is good<br />
995Lens now locked</td>
996 </tr>
997 <tr>
998 <td>PASSIVE_FOCUSED</td>
999 <td>AF_TRIGGER</td>
1000 <td>NOT_FOCUSED_LOCKED</td>
 <td>Immediate transition if focus is bad<br />
1002Lens now locked</td>
1003 </tr>
1004 <tr>
1005 <td>FOCUSED_LOCKED</td>
1006 <td>AF_TRIGGER</td>
1007 <td>FOCUSED_LOCKED</td>
1008 <td>No effect</td>
1009 </tr>
1010 <tr>
1011 <td>FOCUSED_LOCKED</td>
1012 <td>AF_CANCEL</td>
1013 <td>INACTIVE</td>
1014 <td>Restart AF scan</td>
1015 </tr>
1016 <tr>
1017 <td>NOT_FOCUSED_LOCKED</td>
1018 <td>AF_TRIGGER</td>
1019 <td>NOT_FOCUSED_LOCKED</td>
1020 <td>No effect</td>
1021 </tr>
1022 <tr>
1023 <td>NOT_FOCUSED_LOCKED</td>
1024 <td>AF_CANCEL</td>
1025 <td>INACTIVE</td>
1026 <td>Restart AF scan</td>
1027 </tr>
1028</table>
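<p>The AF_MODE_AUTO / AF_MODE_MACRO table above can be expressed as a transition
function. The following is an illustrative sketch only; the enum and function
names are hypothetical and not part of the HAL interface.</p>

```c
#include <assert.h>

/* Hypothetical stand-ins for the ANDROID_CONTROL_AF_STATE and
 * ANDROID_CONTROL_AF_TRIGGER values defined in the camera metadata. */
typedef enum {
    AF_STATE_INACTIVE,
    AF_STATE_ACTIVE_SCAN,
    AF_STATE_FOCUSED_LOCKED,
    AF_STATE_NOT_FOCUSED_LOCKED,
} af_state_t;

typedef enum {
    AF_EVENT_TRIGGER,      /* AF_TRIGGER_START from the framework */
    AF_EVENT_CANCEL,       /* AF_TRIGGER_CANCEL from the framework */
    AF_EVENT_SWEEP_GOOD,   /* HAL finished the sweep; focus was found */
    AF_EVENT_SWEEP_BAD,    /* HAL finished the sweep; focus was not found */
} af_event_t;

/* Transition function for mode = AF_MODE_AUTO or AF_MODE_MACRO,
 * following the table row by row. */
af_state_t af_auto_transition(af_state_t state, af_event_t event)
{
    switch (state) {
    case AF_STATE_INACTIVE:
        if (event == AF_EVENT_TRIGGER) return AF_STATE_ACTIVE_SCAN; /* start sweep */
        break;
    case AF_STATE_ACTIVE_SCAN:
        if (event == AF_EVENT_SWEEP_GOOD) return AF_STATE_FOCUSED_LOCKED;
        if (event == AF_EVENT_SWEEP_BAD)  return AF_STATE_NOT_FOCUSED_LOCKED;
        if (event == AF_EVENT_CANCEL)     return AF_STATE_INACTIVE; /* reset AF */
        break;
    case AF_STATE_FOCUSED_LOCKED:
    case AF_STATE_NOT_FOCUSED_LOCKED:
        if (event == AF_EVENT_CANCEL)  return AF_STATE_INACTIVE;    /* reset AF */
        if (event == AF_EVENT_TRIGGER) return AF_STATE_ACTIVE_SCAN; /* new sweep */
        break;
    }
    return state; /* all other events leave the state unchanged */
}
```

<p>A mode change is handled outside this function, since switching modes always
resets the state to INACTIVE, as noted above.</p>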
1029<h3 id="aeawb-state">AE and AWB state machines</h3>
<p>The AE and AWB state machines are mostly identical. AE has additional
FLASH_REQUIRED and PRECAPTURE states, so rows below that refer to those two
states should be ignored for the AWB state machine.</p>
1033<table width="100%" border="1">
1034 <tr>
1035 <td colspan="4" scope="col"><h4>mode = AE_MODE_OFF / AWB mode not
1036AUTO</h4></td>
1037 </tr>
1038 <tr>
1039 <th scope="col">State</th>
 <th scope="col">Transition cause</th>
1041 <th scope="col">New state</th>
1042 <th scope="col">Notes</th>
1043 </tr>
1044 <tr>
1045 <td>INACTIVE</td>
1046 <td>&nbsp;</td>
1047 <td>&nbsp;</td>
1048 <td>AE/AWB disabled</td>
1049 </tr>
1050 <tr>
1051 <td colspan="4"><h4>mode = AE_MODE_ON_* / AWB_MODE_AUTO</h4></td>
1052 </tr>
1053 <tr>
1054 <th scope="col">State</th>
 <th scope="col">Transition cause</th>
1056 <th scope="col">New state</th>
1057 <th scope="col">Notes</th>
1058 </tr>
1059 <tr>
1060 <td>INACTIVE</td>
1061 <td>HAL initiates AE/AWB scan</td>
1062 <td>SEARCHING</td>
1063 <td>&nbsp;</td>
1064 </tr>
1065 <tr>
1066 <td>INACTIVE</td>
1067 <td>AE/AWB_LOCK on</td>
1068 <td>LOCKED</td>
1069 <td>Values locked</td>
1070 </tr>
1071 <tr>
1072 <td>SEARCHING</td>
1073 <td>HAL finishes AE/AWB scan</td>
1074 <td>CONVERGED</td>
1075 <td>Good values, not changing</td>
1076 </tr>
1077 <tr>
1078 <td>SEARCHING</td>
1079 <td>HAL finishes AE scan</td>
1080 <td>FLASH_REQUIRED</td>
1081 <td>Converged but too dark without flash</td>
1082 </tr>
1083 <tr>
1084 <td>SEARCHING</td>
1085 <td>AE/AWB_LOCK on</td>
1086 <td>LOCKED</td>
1087 <td>Values locked</td>
1088 </tr>
1089 <tr>
1090 <td>CONVERGED</td>
1091 <td>HAL initiates AE/AWB scan</td>
1092 <td>SEARCHING</td>
 <td>Values changing</td>
1094 </tr>
1095 <tr>
1096 <td>CONVERGED</td>
1097 <td>AE/AWB_LOCK on</td>
1098 <td>LOCKED</td>
1099 <td>Values locked</td>
1100 </tr>
1101 <tr>
1102 <td>FLASH_REQUIRED</td>
1103 <td>HAL initiates AE/AWB scan</td>
1104 <td>SEARCHING</td>
 <td>Values changing</td>
1106 </tr>
1107 <tr>
1108 <td>FLASH_REQUIRED</td>
1109 <td>AE/AWB_LOCK on</td>
1110 <td>LOCKED</td>
1111 <td>Values locked</td>
1112 </tr>
1113 <tr>
1114 <td>LOCKED</td>
1115 <td>AE/AWB_LOCK off</td>
1116 <td>SEARCHING</td>
1117 <td>Values not good after unlock</td>
1118 </tr>
1119 <tr>
1120 <td>LOCKED</td>
1121 <td>AE/AWB_LOCK off</td>
1122 <td>CONVERGED</td>
1123 <td>Values good after unlock</td>
1124 </tr>
1125 <tr>
1126 <td>LOCKED</td>
1127 <td>AE_LOCK off</td>
1128 <td>FLASH_REQUIRED</td>
1129 <td>Exposure good, but too dark</td>
1130 </tr>
1131 <tr>
1132 <td>All AE states </td>
1133 <td> PRECAPTURE_START</td>
1134 <td>PRECAPTURE</td>
1135 <td>Start precapture sequence</td>
1136 </tr>
1137 <tr>
1138 <td>PRECAPTURE</td>
1139 <td>Sequence done, AE_LOCK off </td>
1140 <td>CONVERGED</td>
1141 <td>Ready for high-quality capture</td>
1142 </tr>
1143 <tr>
1144 <td>PRECAPTURE</td>
1145 <td>Sequence done, AE_LOCK on </td>
1146 <td>LOCKED</td>
1147 <td>Ready for high-quality capture</td>
1148 </tr>
1149</table>
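<p>The precapture rows at the bottom of the table can be sketched as two small
helpers: PRECAPTURE_TRIGGER_START moves AE into PRECAPTURE from any state, and
the end of the sequence lands in CONVERGED or LOCKED depending on AE_LOCK. The
names below are illustrative stand-ins, not the camera3.h definitions.</p>

```c
#include <stdbool.h>

/* Hypothetical stand-ins for ANDROID_CONTROL_AE_STATE values. */
typedef enum {
    AE_STATE_INACTIVE,
    AE_STATE_SEARCHING,
    AE_STATE_CONVERGED,
    AE_STATE_LOCKED,
    AE_STATE_FLASH_REQUIRED,
    AE_STATE_PRECAPTURE,
} ae_state_t;

/* PRECAPTURE_TRIGGER_START moves AE into PRECAPTURE from any state. */
ae_state_t ae_on_precapture_start(ae_state_t state)
{
    (void)state;               /* the previous state does not matter */
    return AE_STATE_PRECAPTURE;
}

/* When the precapture sequence completes, AE lands in CONVERGED, or in
 * LOCKED if ANDROID_CONTROL_AE_LOCK is on. */
ae_state_t ae_on_precapture_done(bool ae_lock)
{
    return ae_lock ? AE_STATE_LOCKED : AE_STATE_CONVERGED;
}
```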
1150
1151<h2 id="output">Output streams</h2>
1152
1153<p>Unlike the old camera subsystem, which has 3-4 different ways of producing data from the camera (ANativeWindow-based preview operations, preview callbacks, video callbacks, and takePicture callbacks), the new subsystem operates solely on the ANativeWindow-based pipeline for all resolutions and output formats. Multiple such streams can be configured at once, to send a single frame to many targets such as the GPU, the video encoder, RenderScript, or app-visible buffers (RAW Bayer, processed YUV buffers, or JPEG-encoded buffers).
1154</p>
1155
1156<p>As an optimization, these output streams must be configured ahead of time, and only a limited number may exist at once. This allows for pre-allocation of memory buffers and configuration of the camera hardware, so that when requests are submitted with multiple or varying output pipelines listed, there won’t be delays or latency in fulfilling the request.
1157</p>
1158
<p>
To support backwards compatibility with the current camera API, at least 3 simultaneous YUV output streams must be supported, plus one JPEG stream. This is required for video snapshot support with the application also receiving YUV buffers:</p>

<ul>
 <li>One stream to the GPU/SurfaceView (opaque YUV format) for preview</li>
 <li>One stream to the video encoder (opaque YUV format) for recording</li>
 <li>One stream to the application (known YUV format) for preview frame callbacks</li>
 <li>One stream to the application (JPEG) for video snapshots</li>
</ul>
1168
1169<p> In addition, at least one RAW Bayer output must be supported at the same time for the new camera subsystem.
1170This means that the minimum output stream count is five (one RAW, three YUV, and one JPEG).
1171</p>
1172<h2 id="cropping">Cropping</h2>
1173<p>Cropping of the full pixel array (for digital zoom and other use cases where
1174 a smaller FOV is desirable) is communicated through the
1175 ANDROID_SCALER_CROP_REGION setting. This is a per-request setting, and can
1176 change on a per-request basis, which is critical for implementing smooth
1177 digital zoom.</p>
1178<p>The region is defined as a rectangle (x, y, width, height), with (x, y)
1179 describing the top-left corner of the rectangle. The rectangle is defined on
1180 the coordinate system of the sensor active pixel array, with (0,0) being the
1181 top-left pixel of the active pixel array. Therefore, the width and height
1182 cannot be larger than the dimensions reported in the
1183 ANDROID_SENSOR_ACTIVE_PIXEL_ARRAY static info field. The minimum allowed
1184 width and height are reported by the HAL through the
1185 ANDROID_SCALER_MAX_DIGITAL_ZOOM static info field, which describes the
1186 maximum supported zoom factor. Therefore, the minimum crop region width and
1187 height are:</p>
1188<pre>
1189{width, height} =
1190 { floor(ANDROID_SENSOR_ACTIVE_PIXEL_ARRAY[0] /
1191 ANDROID_SCALER_MAX_DIGITAL_ZOOM),
1192 floor(ANDROID_SENSOR_ACTIVE_PIXEL_ARRAY[1] /
1193 ANDROID_SCALER_MAX_DIGITAL_ZOOM) }
1194</pre>
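<p>The formula above can be checked with a small helper. This is an
illustrative sketch; the struct and function names are not part of the HAL
interface, and the sensor values in the usage below are hypothetical.</p>

```c
/* Minimum crop-region dimensions implied by the maximum digital zoom
 * factor, per the formula above. */
typedef struct { int width; int height; } dims_t;

dims_t min_crop_dims(int active_w, int active_h, float max_digital_zoom)
{
    dims_t d;
    /* The int cast floors the result for positive values. */
    d.width  = (int)(active_w / max_digital_zoom);
    d.height = (int)(active_h / max_digital_zoom);
    return d;
}
```

<p>For example, a 2000x1500 active array with a maximum digital zoom of 4.0
yields a minimum crop region of 500x375.</p>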
1195<p>If the crop region needs to fulfill specific requirements (for example, it
1196 needs to start on even coordinates, and its width/height needs to be even),
1197 the HAL must do the necessary rounding and write out the final crop region
1198 used in the output result metadata. Similarly, if the HAL implements video
1199 stabilization, it must adjust the result crop region to describe the region
1200 actually included in the output after video stabilization is applied. In
1201 general, a camera-using application must be able to determine the field of
1202 view it is receiving based on the crop region, the dimensions of the image
1203 sensor, and the lens focal length.</p>
1204<p>Since the crop region applies to all streams, which may have different aspect
1205 ratios than the crop region, the exact sensor region used for each stream may
1206 be smaller than the crop region. Specifically, each stream should maintain
1207 square pixels and its aspect ratio by minimally further cropping the defined
1208 crop region. If the stream's aspect ratio is wider than the crop region, the
1209 stream should be further cropped vertically, and if the stream's aspect ratio
1210 is narrower than the crop region, the stream should be further cropped
1211 horizontally.</p>
<p>In all cases, the stream crop must be centered within the full crop region,
 and each stream is only cropped either horizontally or vertically relative to
 the full crop region, never both.</p>
<p>For example, if two streams are defined, a 640x480 stream (4:3 aspect) and a
 1280x720 stream (16:9 aspect), the diagrams below show the expected output
 regions for each stream for a few sample crop regions, on a hypothetical 3 MP
 (2000 x 1500 pixel array) sensor.</p>
1219<p>Crop region: (500, 375, 1000, 750) (4:3 aspect ratio)</p>
1220<blockquote>
1221 <p> 640x480 stream crop: (500, 375, 1000, 750) (equal to crop region)<br />
1222 1280x720 stream crop: (500, 469, 1000, 562) (marked with =)</p>
1223</blockquote>
1224<pre>0 1000 2000
1225 +---------+---------+---------+----------+
1226 | Active pixel array |
1227 | |
1228 | |
1229 + +-------------------+ + 375
1230 | | | |
1231 | O===================O |
1232 | I 1280x720 stream I |
1233 + I I + 750
1234 | I I |
1235 | O===================O |
1236 | | | |
1237 + +-------------------+ + 1125
1238 | Crop region, 640x480 stream |
1239 | |
1240 | |
1241 +---------+---------+---------+----------+ 1500</pre>
1243<p>Crop region: (500, 375, 1333, 750) (16:9 aspect ratio)</p>
1244<blockquote>
1245 <p> 640x480 stream crop: (666, 375, 1000, 750) (marked with =)<br />
1246 1280x720 stream crop: (500, 375, 1333, 750) (equal to crop region)</p>
1247</blockquote>
1248<pre>0 1000 2000
1249 +---------+---------+---------+----------+
1250 | Active pixel array |
1251 | |
1252 | |
1253 + +---O==================O---+ + 375
1254 | | I 640x480 stream I | |
1255 | | I I | |
1256 | | I I | |
1257 + | I I | + 750
1258 | | I I | |
1259 | | I I | |
1260 | | I I | |
1261 + +---O==================O---+ + 1125
1262 | Crop region, 1280x720 stream |
1263 | |
1264 | |
1265 +---------+---------+---------+----------+ 1500
1266</pre>
1267<p>Crop region: (500, 375, 750, 750) (1:1 aspect ratio)</p>
1268<blockquote>
1269 <p> 640x480 stream crop: (500, 469, 750, 562) (marked with =)<br />
 1280x720 stream crop: (500, 543, 750, 414) (marked with #)</p>
1271</blockquote>
1272<pre>0 1000 2000
1273 +---------+---------+---------+----------+
1274 | Active pixel array |
1275 | |
1276 | |
1277 + +--------------+ + 375
1278 | O==============O |
1279 | ################ |
1280 | # # |
1281 + # # + 750
1282 | # # |
1283 | ################ 1280x720 |
1284 | O==============O 640x480 |
1285 + +--------------+ + 1125
1286 | Crop region |
1287 | |
1288 | |
1289 +---------+---------+---------+----------+ 1500
1290</pre>
<p>As a final example, here is a 1024x1024 square-aspect-ratio stream in place
 of the 480p stream:</p>
1293<p>Crop region: (500, 375, 1000, 750) (4:3 aspect ratio)</p>
1294<blockquote>
1295 <p> 1024x1024 stream crop: (625, 375, 750, 750) (marked with #)<br />
1296 1280x720 stream crop: (500, 469, 1000, 562) (marked with =)</p>
1297</blockquote>
1298<pre>0 1000 2000
1299 +---------+---------+---------+----------+
1300 | Active pixel array |
1301 | |
1302 | 1024x1024 stream |
1303 + +--###############--+ + 375
1304 | | # # | |
1305 | O===================O |
1306 | I 1280x720 stream I |
1307 + I I + 750
1308 | I I |
1309 | O===================O |
1310 | | # # | |
1311 + +--###############--+ + 1125
1312 | Crop region |
1313 | |
1314 | |
1315 +---------+---------+---------+----------+ 1500
1316</pre>
1317<h2 id="reprocessing">Reprocessing</h2>
1318
<p>Reprocessing support for RAW Bayer data enables DNG output. It allows the
camera pipeline to process a previously captured RAW buffer and its metadata
(an entire frame that was recorded previously) to produce a new rendered YUV or
JPEG output.</p>
1323<h2 id="errors">Error management</h2>
<p>Camera HAL device ops functions that have a return value all return
 -ENODEV / NULL in case of a serious error. This means the device cannot
 continue operation and must be closed by the framework. Once this error is
 returned by any method, or if notify() is called with ERROR_DEVICE, only
 the close() method can be called successfully. All other methods will return
 -ENODEV / NULL.</p>
<p>If a device op is called in the wrong sequence, for example if the framework
 calls configure_streams() before initialize(), the device must
 return -ENOSYS from the call and do nothing.</p>
1333<p>Transient errors in image capture must be reported through notify() as follows:</p>
1334<ul>
1335 <li>The failure of an entire capture to occur must be reported by the HAL by
1336 calling notify() with ERROR_REQUEST. Individual errors for the result
1337 metadata or the output buffers must not be reported in this case.</li>
1338 <li>If the metadata for a capture cannot be produced, but some image buffers
1339 were filled, the HAL must call notify() with ERROR_RESULT.</li>
1340 <li>If an output image buffer could not be filled, but either the metadata was
1341 produced or some other buffers were filled, the HAL must call notify() with
1342 ERROR_BUFFER for each failed buffer.</li>
1343</ul>
1344<p>In each of these transient failure cases, the HAL must still call
1345 process_capture_result, with valid output buffer_handle_t. If the result
1346 metadata could not be produced, it should be NULL. If some buffers could not
1347 be filled, their sync fences must be set to the error state.</p>
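<p>The notification rules above can be sketched as a classification helper.
This is a simplified, illustrative sketch: the enum values stand in for the
camera3.h error codes, and in a real HAL an ERROR_RESULT and one ERROR_BUFFER
per failed buffer can both be reported for the same capture.</p>

```c
#include <stdbool.h>

/* Simplified stand-ins for the camera3 notify() error codes. */
typedef enum {
    ERROR_NONE,
    ERROR_REQUEST,  /* the entire capture failed */
    ERROR_RESULT,   /* the result metadata could not be produced */
    ERROR_BUFFER,   /* an individual output buffer could not be filled */
} capture_error_t;

/* Pick the highest-priority notification for a capture failure, per the
 * rules above: a whole-capture failure is ERROR_REQUEST and subsumes the
 * per-result and per-buffer errors. */
capture_error_t classify_failure(bool whole_capture_failed,
                                 bool metadata_failed,
                                 bool buffer_failed)
{
    if (whole_capture_failed)
        return ERROR_REQUEST;  /* do not also report RESULT/BUFFER errors */
    if (metadata_failed)
        return ERROR_RESULT;   /* some image buffers may still be filled */
    if (buffer_failed)
        return ERROR_BUFFER;   /* reported once per failed buffer */
    return ERROR_NONE;
}
```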
1348<p>Invalid input arguments result in -EINVAL from the appropriate methods. In
1349 that case, the framework must act as if that call had never been made.</p>
1350<h2 id="stream-mgmt">Stream management</h2>
1351<h3 id="configure-streams">configure_streams</h3>
1352<p>Reset the HAL camera device processing pipeline and set up new input and
1353 output streams. This call replaces any existing stream configuration with
1354 the streams defined in the stream_list. This method will be called at
1355 least once after initialize() before a request is submitted with
1356 process_capture_request().</p>
1357<p>The stream_list must contain at least one output-capable stream, and may
1358 not contain more than one input-capable stream.</p>
1359<p>The stream_list may contain streams that are also in the currently-active
 set of streams (from the previous call to configure_streams()). These
1361 streams will already have valid values for usage, max_buffers, and the
1362 private pointer. If such a stream has already had its buffers registered,
1363 register_stream_buffers() will not be called again for the stream, and
1364 buffers from the stream can be immediately included in input requests.</p>
1365<p>If the HAL needs to change the stream configuration for an existing
1366 stream due to the new configuration, it may rewrite the values of usage
1367 and/or max_buffers during the configure call. The framework will detect
1368 such a change, and will then reallocate the stream buffers, and call
1369 register_stream_buffers() again before using buffers from that stream in
1370 a request.</p>
1371<p>If a currently-active stream is not included in stream_list, the HAL may
1372 safely remove any references to that stream. It will not be reused in a
1373 later configure() call by the framework, and all the gralloc buffers for
1374 it will be freed after the configure_streams() call returns.</p>
1375<p>The stream_list structure is owned by the framework, and may not be
1376 accessed once this call completes. The address of an individual
1377 camera3_stream_t structure will remain valid for access by the HAL until
 the end of the first configure_streams() call that no longer includes
1379 that camera3_stream_t in the stream_list argument. The HAL may not change
1380 values in the stream structure outside of the private pointer, except for
1381 the usage and max_buffers members during the configure_streams() call
1382 itself.</p>
<p>If the stream is new, the usage, max_buffers, and private pointer fields
1384 of the stream structure will all be set to 0. The HAL device must set
1385 these fields before the configure_streams() call returns. These fields
1386 are then used by the framework and the platform gralloc module to
1387 allocate the gralloc buffers for each stream.</p>
1388<p>Before such a new stream can have its buffers included in a capture
1389 request, the framework will call register_stream_buffers() with that
1390 stream. However, the framework is not required to register buffers for
1391 _all_ streams before submitting a request. This allows for quick startup
1392 of (for example) a preview stream, with allocation for other streams
1393 happening later or concurrently.</p>
1394<h4>Preconditions</h4>
1395<p>The framework will only call this method when no captures are being
1396 processed. That is, all results have been returned to the framework, and
1397 all in-flight input and output buffers have been returned and their
1398 release sync fences have been signaled by the HAL. The framework will not
1399 submit new requests for capture while the configure_streams() call is
1400 underway.</p>
1401<h4>Postconditions</h4>
1402<p>The HAL device must configure itself to provide maximum possible output
1403 frame rate given the sizes and formats of the output streams, as
1404 documented in the camera device's static metadata.</p>
1405<h4>Performance expectations</h4>
1406<p>This call is expected to be heavyweight and possibly take several hundred
1407 milliseconds to complete, since it may require resetting and
1408 reconfiguring the image sensor and the camera processing pipeline.
1409 Nevertheless, the HAL device should attempt to minimize the
1410 reconfiguration delay to minimize the user-visible pauses during
1411 application operational mode changes (such as switching from still
1412 capture to video recording).</p>
1413<h4>Return values</h4>
1414<ul>
1415 <li>0: On successful stream configuration<br />
1416 </li>
1417 <li>-EINVAL: If the requested stream configuration is invalid. Some examples
1418 of invalid stream configurations include:
1419 <ul>
1420 <li>Including more than 1 input-capable stream (INPUT or
1421 BIDIRECTIONAL)</li>
1422 <li>Not including any output-capable streams (OUTPUT or
1423 BIDIRECTIONAL)</li>
1424 <li>Including streams with unsupported formats, or an unsupported
1425 size for that format.</li>
1426 <li>Including too many output streams of a certain format.<br />
1427 Note that the framework submitting an invalid stream
1428 configuration is not normal operation, since stream
1429 configurations are checked before configure. An invalid
1430 configuration means that a bug exists in the framework code, or
1431 there is a mismatch between the HAL's static metadata and the
1432 requirements on streams.</li>
1433 </ul>
1434 </li>
1435 <li>-ENODEV: If there has been a fatal error and the device is no longer
1436 operational. Only close() can be called successfully by the
1437 framework after this error is returned.</li>
1438</ul>
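<p>The stream-count rules above can be sketched as a validation helper. The
stream representation here is a simplified stand-in for the camera3.h stream
structure, used only for illustration.</p>

```c
#include <errno.h>
#include <stddef.h>

/* Simplified stand-ins for the camera3 stream types. */
typedef enum {
    STREAM_OUTPUT,
    STREAM_INPUT,
    STREAM_BIDIRECTIONAL,
} stream_type_t;

/* Validate a stream list per the rules above: at least one output-capable
 * stream (OUTPUT or BIDIRECTIONAL) and at most one input-capable stream
 * (INPUT or BIDIRECTIONAL). */
int validate_stream_list(const stream_type_t *streams, size_t count)
{
    size_t inputs = 0, outputs = 0;
    for (size_t i = 0; i < count; i++) {
        if (streams[i] == STREAM_INPUT || streams[i] == STREAM_BIDIRECTIONAL)
            inputs++;
        if (streams[i] == STREAM_OUTPUT || streams[i] == STREAM_BIDIRECTIONAL)
            outputs++;
    }
    if (outputs < 1 || inputs > 1)
        return -EINVAL;  /* invalid configuration */
    return 0;            /* success */
}
```

<p>Format and size checks against the static metadata would be layered on top
of this counting step in a real implementation.</p>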
1439<h3 id="register-buffers">register_stream_buffers</h3>
1440<p>Register buffers for a given stream with the HAL device. This method is
1441 called by the framework after a new stream is defined by
1442 configure_streams, and before buffers from that stream are included in a
1443 capture request. If the same stream is listed in a subsequent
1444 configure_streams() call, register_stream_buffers will _not_ be called
1445 again for that stream.</p>
1446<p>The framework does not need to register buffers for all configured
1447 streams before it submits the first capture request. This allows quick
1448 startup for preview (or similar use cases) while other streams are still
1449 being allocated.</p>
1450<p>This method is intended to allow the HAL device to map or otherwise
1451 prepare the buffers for later use. The buffers passed in will already be
1452 locked for use. At the end of the call, all the buffers must be ready to
1453 be returned to the stream. The buffer_set argument is only valid for the
1454 duration of this call.</p>
1455<p>If the stream format was set to HAL_PIXEL_FORMAT_IMPLEMENTATION_DEFINED,
1456 the camera HAL should inspect the passed-in buffers here to determine any
1457 platform-private pixel format information.</p>
<h4>Return values</h4>
<ul>
  <li>0: On successful registration of the new stream buffers</li>
  <li>-EINVAL: If the stream_buffer_set does not refer to a valid active
    stream, or if the buffers array is invalid.</li>
  <li>-ENOMEM: If there was a failure in registering the buffers. The framework
    must consider all the stream buffers to be unregistered, and can
    try to register again later.</li>
  <li>-ENODEV: If there is a fatal error, and the device is no longer
    operational. Only close() can be called successfully by the
    framework after this error is returned.</li>
</ul>
<h2 id="request-creation">Request creation and submission</h2>
<h3 id="default-settings">construct_default_request_settings</h3>
<p>Create capture settings for standard camera use cases. The device must
  return a settings buffer that is configured to meet the requested use
  case, which must be one of the CAMERA3_TEMPLATE_* enums. All request
  control fields must be included.</p>
<p>The HAL retains ownership of this structure, but the pointer to the
  structure must be valid until the device is closed. The framework and the
  HAL may not modify the buffer once it is returned by this call. The same
  buffer may be returned for subsequent calls for the same template, or for
  other templates.</p>
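<p>The ownership rule above (HAL keeps the buffer, same pointer may be
returned on repeated calls, valid until close) is commonly satisfied by
caching one immutable settings buffer per template. The sketch below assumes
hypothetical template IDs and a toy settings struct in place of the real
CAMERA3_TEMPLATE_* enums and camera_metadata buffers.</p>

```c
#include <assert.h>
#include <stddef.h>

/* Hypothetical template IDs standing in for the CAMERA3_TEMPLATE_* enums. */
enum { TEMPLATE_PREVIEW = 1, TEMPLATE_STILL_CAPTURE = 2, TEMPLATE_COUNT = 3 };

/* Minimal stand-in for a camera_metadata settings buffer. */
typedef struct { int template_type; int ae_mode; } fake_settings_t;

/* The HAL retains ownership: settings are built once per template, every
 * control field is filled in, and the same pointer is returned on
 * subsequent calls. The framework must treat the buffer as read-only. */
static const fake_settings_t *
construct_default_request_settings_sketch(int type)
{
    static fake_settings_t cache[TEMPLATE_COUNT];
    static int built[TEMPLATE_COUNT];

    if (type <= 0 || type >= TEMPLATE_COUNT)
        return NULL;               /* unknown template: fatal error case */
    if (!built[type]) {
        cache[type].template_type = type;
        cache[type].ae_mode = 1;   /* all request control fields included */
        built[type] = 1;
    }
    return &cache[type];
}
```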
<h4>Return values</h4>
<ul>
  <li>Valid metadata: On successful creation of a default settings
    buffer.</li>
  <li>NULL: In case of a fatal error. After this is returned, only
    the close() method can be called successfully by the
    framework.</li>
</ul>
<h3 id="process-capture">process_capture_request</h3>
<p>Send a new capture request to the HAL. The HAL should not return from
  this call until it is ready to accept the next request to process. Only
  one call to process_capture_request() will be made at a time by the
  framework, and the calls will all be from the same thread. The next call
  to process_capture_request() will be made as soon as a new request and
  its associated buffers are available. In a normal preview scenario, this
  means the function will be called again by the framework almost
  instantly.</p>
<p>The actual request processing is asynchronous, with the results of
  capture being returned by the HAL through the process_capture_result()
  call. This call requires the result metadata to be available, but output
  buffers may simply provide sync fences to wait on. Multiple requests are
  expected to be in flight at once, to maintain full output frame rate.</p>
<p>The framework retains ownership of the request structure. It is only
  guaranteed to be valid during this call. The HAL device must make copies
  of the information it needs to retain for the capture processing. The HAL
  is responsible for waiting on and closing the buffers' fences and
  returning the buffer handles to the framework.</p>
<p>The HAL must write the file descriptor for the input buffer's release
  sync fence into input_buffer-&gt;release_fence, if input_buffer is not
  NULL. If the HAL returns -1 for the input buffer release sync fence, the
  framework is free to immediately reuse the input buffer. Otherwise, the
  framework will wait on the sync fence before refilling and reusing the
  input buffer.</p>
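<p>Two of the rules above are easy to get wrong: the request struct is only
valid during the call (so the HAL must copy what it needs), and a malformed
request must fail synchronously with -EINVAL. The sketch below illustrates
just those two rules with hypothetical stand-in structs; real request
validation, fence handling, and queuing are far more involved.</p>

```c
#include <assert.h>
#include <errno.h>
#include <stddef.h>
#include <string.h>

/* Hypothetical minimal stand-ins for camera3_capture_request_t. */
typedef struct { int exposure_ns; } fake_settings_t;
typedef struct {
    unsigned frame_number;
    const fake_settings_t *settings;  /* NULL means "reuse last settings" */
    size_t num_output_buffers;
} fake_capture_request_t;

/* What the HAL retains after the call returns; processing then continues
 * asynchronously and results arrive later via process_capture_result(). */
typedef struct { unsigned frame_number; fake_settings_t settings; } hal_pending_t;

static int process_capture_request_sketch(const fake_capture_request_t *req,
                                          hal_pending_t *queue_slot)
{
    if (req == NULL || req->num_output_buffers == 0)
        return -EINVAL;            /* malformed: capture cannot start */

    queue_slot->frame_number = req->frame_number;
    if (req->settings != NULL)     /* copy: the pointer dies after return */
        memcpy(&queue_slot->settings, req->settings, sizeof(*req->settings));
    return 0;                      /* accepted; results come asynchronously */
}
```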
<h4>Return values</h4>
<ul>
  <li>0: On a successful start to processing the capture request</li>
  <li>-EINVAL: If the input is malformed (the settings are NULL when not
    allowed, there are 0 output buffers, etc.) and capture processing
    cannot start. Failures during request processing should be
    handled by calling camera3_callback_ops_t.notify(). In case of
    this error, the framework will retain responsibility for the
    stream buffers' fences and the buffer handles; the HAL should
    not close the fences or return these buffers with
    process_capture_result.</li>
  <li>-ENODEV: If the camera device has encountered a serious error. After this
    error is returned, only the close() method can be successfully
    called by the framework.</li>
</ul>
<h2 id="misc-methods">Miscellaneous methods</h2>
<h3 id="get-metadata">get_metadata_vendor_tag_ops</h3>
<p>Get methods to query for vendor extension metadata tag information. The
  HAL should fill in all the vendor tag operation methods, or leave ops
  unchanged if no vendor tags are defined. The definition of
  vendor_tag_query_ops_t can be found in
  system/media/camera/include/system/camera_metadata.h.</p>
<h3 id="dump">dump</h3>
<p>Print out debugging state for the camera device. This will be called by
  the framework when the camera service is asked for a debug dump, which
  happens when using the dumpsys tool or when capturing a bugreport. The
  passed-in file descriptor can be used to write debugging text using
  dprintf() or write(). The text should be in ASCII encoding only.</p>
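<p>A dump() body typically amounts to a series of dprintf() calls against the
framework-supplied descriptor. The sketch below shows the pattern; the
particular state fields printed (frame count, state string) are hypothetical
examples, not fields of any real HAL struct.</p>

```c
#include <assert.h>
#include <stdio.h>
#include <string.h>
#include <unistd.h>

/* Sketch of a dump() implementation: write ASCII debugging text to the
 * file descriptor passed in by the framework (dumpsys / bugreport capture).
 * The state arguments here are hypothetical. */
static void dump_sketch(int fd, int frames_in_flight, const char *hal_state)
{
    dprintf(fd, "Camera HAL state: %s\n", hal_state);
    dprintf(fd, "  Frames in flight: %d\n", frames_in_flight);
}
```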
<h3 id="flush">flush</h3>
<p>Flush all currently in-process captures and all buffers in the pipeline
  on the given device. The framework will use this to dump all state as
  quickly as possible in order to prepare for a configure_streams() call.</p>
<p>No buffers are required to be successfully returned, so every buffer
  held at the time of flush() (whether successfully filled or not) may be
  returned with CAMERA3_BUFFER_STATUS_ERROR. Note the HAL is still allowed
  to return valid (STATUS_OK) buffers during this call, provided they are
  successfully filled.</p>
<p>All requests currently in the HAL are expected to be returned as soon as
  possible. Not-in-process requests should return errors immediately. Any
  interruptible hardware blocks should be stopped, and any uninterruptible
  blocks should be waited on.</p>
<p>flush() should only return when there are no more outstanding buffers or
  requests left in the HAL. The framework may then call configure_streams (as
  the HAL state is now quiesced) or may issue new requests.</p>
<p>A flush() call should ideally return in 100ms or less, and must return
  within 1 second at most.</p>
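<p>The buffer-return policy above can be condensed to: every in-flight buffer
is handed back, unfilled ones with STATUS_ERROR and completed ones optionally
with STATUS_OK. A toy sketch of that policy, using hypothetical stand-in
types rather than the real camera3 buffer structs:</p>

```c
#include <assert.h>
#include <stddef.h>

/* Hypothetical status values mirroring CAMERA3_BUFFER_STATUS_*. */
enum { BUF_STATUS_OK = 0, BUF_STATUS_ERROR = 1 };

typedef struct { int filled; int status; int returned; } fake_buffer_t;

/* Sketch: flush returns every in-flight buffer as quickly as possible.
 * Unfilled buffers go back with STATUS_ERROR; buffers that happened to
 * complete may still be returned with STATUS_OK. Nothing may remain in
 * the HAL once flush() returns. */
static int flush_sketch(fake_buffer_t *bufs, size_t n)
{
    for (size_t i = 0; i < n; i++) {
        bufs[i].status = bufs[i].filled ? BUF_STATUS_OK : BUF_STATUS_ERROR;
        bufs[i].returned = 1;
    }
    return 0;   /* HAL is now quiesced */
}
```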
<h4>Version information</h4>
<p>This is available only if device version &gt;= CAMERA_DEVICE_API_VERSION_3_1.</p>
<h4>Return values</h4>
<ul>
  <li>0: On a successful flush of the camera HAL.</li>
  <li>-EINVAL: If the input is malformed (the device is not valid).</li>
  <li>-ENODEV: If the camera device has encountered a serious error. After this
    error is returned, only the close() method can be successfully
    called by the framework.</li>
</ul>