camera_metadata: Specify frame durations and minimum frame durations

Change-Id: Ic52c54c3c26e3547a1064fc5afc8ebbac5b392ad
diff --git a/camera/docs/docs.html b/camera/docs/docs.html
index 4de5b9d..d222383 100644
--- a/camera/docs/docs.html
+++ b/camera/docs/docs.html
@@ -9367,13 +9367,11 @@
 
             <td class="entry_description">
               <p>The minimum frame duration that is supported
-for each resolution in availableJpegSizes.<wbr/> Should
-correspond to the frame duration when only that JPEG
-stream is active and captured in a burst,<wbr/> with all
-processing set to FAST</p>
+for each resolution in <a href="#static_android.scaler.availableJpegSizes">android.<wbr/>scaler.<wbr/>available<wbr/>Jpeg<wbr/>Sizes</a>.<wbr/></p>
             </td>
 
             <td class="entry_units">
+              ns
             </td>
 
             <td class="entry_range">
@@ -9391,7 +9389,10 @@
           </tr>
           <tr class="entry_cont">
             <td class="entry_details" colspan="5">
-              <p>When multiple streams are configured,<wbr/> the minimum
+              <p>This corresponds to the minimum steady-state frame duration when only
+that JPEG stream is active and captured in a burst,<wbr/> with all
+processing (typically in android.<wbr/>*.<wbr/>mode) set to FAST.<wbr/></p>
+<p>When multiple streams are configured,<wbr/> the minimum
 frame duration will be &gt;= max(individual stream min
 durations)</p>
             </td>
@@ -9516,14 +9517,13 @@
             </td> <!-- entry_type -->
 
             <td class="entry_description">
-              <p>The minimum frame duration that is supported
-for each resolution in availableProcessedSizes.<wbr/> Should
-correspond to the frame duration when only that processed
-stream is active,<wbr/> with all processing set to
-FAST</p>
+              <p>For each available processed output size (defined in
+<a href="#static_android.scaler.availableProcessedSizes">android.<wbr/>scaler.<wbr/>available<wbr/>Processed<wbr/>Sizes</a>),<wbr/> this property lists the
+minimum supportable frame duration for that size.<wbr/></p>
             </td>
 
             <td class="entry_units">
+              ns
             </td>
 
             <td class="entry_range">
@@ -9541,9 +9541,11 @@
           </tr>
           <tr class="entry_cont">
             <td class="entry_details" colspan="5">
-              <p>When multiple streams are configured,<wbr/> the minimum
-frame duration will be &gt;= max(individual stream min
-durations)</p>
+              <p>This should correspond to the frame duration when only that processed
+stream is active,<wbr/> with all processing (typically in android.<wbr/>*.<wbr/>mode)
+set to FAST.<wbr/></p>
+<p>When multiple streams are configured,<wbr/> the minimum frame duration will
+be &gt;= max(individual stream min durations).<wbr/></p>
             </td>
           </tr>
 
@@ -9646,13 +9648,13 @@
             </td> <!-- entry_type -->
 
             <td class="entry_description">
-              <p>The minimum frame duration that is supported
-for each raw resolution in availableRawSizes.<wbr/> Should
-correspond to the frame duration when only the raw stream
-is active.<wbr/></p>
+              <p>For each available raw output size (defined in
+<a href="#static_android.scaler.availableRawSizes">android.<wbr/>scaler.<wbr/>available<wbr/>Raw<wbr/>Sizes</a>),<wbr/> this property lists the minimum
+supportable frame duration for that size.<wbr/></p>
             </td>
 
             <td class="entry_units">
+              ns
             </td>
 
             <td class="entry_range">
@@ -9670,7 +9672,9 @@
           </tr>
           <tr class="entry_cont">
             <td class="entry_details" colspan="5">
-              <p>When multiple streams are configured,<wbr/> the minimum
+              <p>This should correspond to the frame duration when only the raw stream
+is active.<wbr/></p>
+<p>When multiple streams are configured,<wbr/> the minimum
 frame duration will be &gt;= max(individual stream min
 durations)</p>
             </td>
@@ -9918,7 +9922,7 @@
 
             <td class="entry_description">
               <p>Duration from start of frame exposure to
-start of next frame exposure</p>
+start of next frame exposure.<wbr/></p>
             </td>
 
             <td class="entry_units">
@@ -9926,8 +9930,9 @@
             </td>
 
             <td class="entry_range">
-              <p>see <a href="#static_android.sensor.info.maxFrameDuration">android.<wbr/>sensor.<wbr/>info.<wbr/>max<wbr/>Frame<wbr/>Duration</a>,<wbr/>
-android.<wbr/>scaler.<wbr/>info.<wbr/>available<wbr/>Min<wbr/>Frame<wbr/>Durations</p>
+              <p>See <a href="#static_android.sensor.info.maxFrameDuration">android.<wbr/>sensor.<wbr/>info.<wbr/>max<wbr/>Frame<wbr/>Duration</a>,<wbr/>
+android.<wbr/>scaler.<wbr/>available*Min<wbr/>Durations.<wbr/> Exposure time has priority,<wbr/>
+so the duration is set to <code>max(duration,<wbr/> exposureTime + overhead)</code>.<wbr/></p>
             </td>
 
             <td class="entry_tags">
@@ -9943,8 +9948,89 @@
           </tr>
           <tr class="entry_cont">
             <td class="entry_details" colspan="5">
-              <p>Exposure time has priority,<wbr/> so duration is set to
-max(duration,<wbr/> exposure time + overhead)</p>
+              <p>The maximum frame rate that can be supported by a camera subsystem is
+a function of many factors:</p>
+<ul>
+<li>Requested resolutions of output image streams</li>
+<li>Availability of binning /<wbr/> skipping modes on the imager</li>
+<li>The bandwidth of the imager interface</li>
+<li>The bandwidth of the various ISP processing blocks</li>
+</ul>
+<p>Since these factors can vary greatly between different ISPs and
+sensors,<wbr/> the camera abstraction tries to represent the bandwidth
+restrictions with as simple a model as possible.<wbr/></p>
+<p>The model presented has the following characteristics:</p>
+<ul>
+<li>The image sensor is always configured to output the smallest
+resolution possible given the application's requested output stream
+sizes.<wbr/>  The smallest resolution is defined as being at least as large
+as the largest requested output stream size; the camera pipeline must
+never digitally upsample sensor data when the crop region covers the
+whole sensor.<wbr/> In general,<wbr/> this means that if only small output stream
+resolutions are configured,<wbr/> the sensor can provide a higher frame
+rate.<wbr/></li>
+<li>Since any request may use any or all of the currently configured
+output streams,<wbr/> the sensor and ISP must be configured to support
+scaling a single capture to all the streams at the same time.<wbr/>  This
+means the camera pipeline must be ready to produce the largest
+requested output size without any delay.<wbr/>  Therefore,<wbr/> the overall
+frame rate of a given configured stream set is governed only by the
+largest requested stream resolution.<wbr/></li>
+<li>Using more than one output stream in a request does not affect the
+frame duration.<wbr/></li>
+<li>JPEG streams act like processed YUV streams in requests that do not
+include them; in requests that directly reference them,<wbr/> they act as
+JPEG streams.<wbr/> This is because supporting a
+JPEG stream requires the underlying YUV data to always be ready for
+use by a JPEG encoder,<wbr/> but the encoder will only be used (and impact
+frame duration) on requests that actually reference a JPEG stream.<wbr/></li>
+<li>The JPEG processor can run concurrently with the rest of the camera
+pipeline,<wbr/> but cannot process more than one capture at a time.<wbr/></li>
+</ul>
+<p>The necessary information for the application,<wbr/> given the model above,<wbr/>
+is provided via the android.<wbr/>scaler.<wbr/>available*Min<wbr/>Durations fields.<wbr/>
+These are used to determine the maximum frame rate /<wbr/> minimum frame
+duration that is possible for a given stream configuration.<wbr/></p>
+<p>Specifically,<wbr/> the application can use the following rules to
+determine the minimum frame duration it can request from the HAL
+device:</p>
+<ol>
+<li>Given the application's currently configured set of output
+streams,<wbr/> <code>S</code>,<wbr/> divide them into three sets: streams in a JPEG format
+<code>SJ</code>,<wbr/> streams in a raw sensor format <code>SR</code>,<wbr/> and the rest ('processed')
+<code>SP</code>.<wbr/></li>
+<li>For each subset of streams,<wbr/> find the largest resolution (by pixel
+count) in the subset.<wbr/> This gives (at most) three resolutions <code>RJ</code>,<wbr/>
+<code>RR</code>,<wbr/> and <code>RP</code>.<wbr/></li>
+<li>If <code>RJ</code> is greater than <code>RP</code>,<wbr/> set <code>RP</code> equal to <code>RJ</code>.<wbr/> If there is
+no exact match for <code>RP == RJ</code> (in particular there isn't an available
+processed resolution at the same size as <code>RJ</code>),<wbr/> then set <code>RP</code> equal
+to the smallest processed resolution that is larger than <code>RJ</code>.<wbr/> If
+there are no processed resolutions larger than <code>RJ</code>,<wbr/> then set <code>RP</code> to
+the processed resolution closest to <code>RJ</code>.<wbr/></li>
+<li>If <code>RP</code> is greater than <code>RR</code>,<wbr/> set <code>RR</code> equal to <code>RP</code>.<wbr/> If there is
+no exact match for <code>RR == RP</code> (in particular there isn't an available
+raw resolution at the same size as <code>RP</code>),<wbr/> then set <code>RR</code> equal to
+the smallest raw resolution that is larger than <code>RP</code>.<wbr/> If
+there are no raw resolutions larger than <code>RP</code>,<wbr/> then set <code>RR</code> to
+the raw resolution closest to <code>RP</code>.<wbr/></li>
+<li>Look up the matching minimum frame durations in the property lists
+<a href="#static_android.scaler.availableJpegMinDurations">android.<wbr/>scaler.<wbr/>available<wbr/>Jpeg<wbr/>Min<wbr/>Durations</a>,<wbr/>
+<a href="#static_android.scaler.availableRawMinDurations">android.<wbr/>scaler.<wbr/>available<wbr/>Raw<wbr/>Min<wbr/>Durations</a>,<wbr/> and
+<a href="#static_android.scaler.availableProcessedMinDurations">android.<wbr/>scaler.<wbr/>available<wbr/>Processed<wbr/>Min<wbr/>Durations</a>.<wbr/>  This gives three
+minimum frame durations <code>FJ</code>,<wbr/> <code>FR</code>,<wbr/> and <code>FP</code>.<wbr/></li>
+<li>If a stream of requests does not use a JPEG stream,<wbr/> then the minimum
+supported frame duration for each request is <code>max(FR,<wbr/> FP)</code>.<wbr/></li>
+<li>If every request in a stream uses the JPEG stream,<wbr/> then the minimum
+supported frame duration for each request is <code>max(FR,<wbr/> FP,<wbr/> FJ)</code>.<wbr/></li>
+<li>If a mix of JPEG-using and non-JPEG-using requests is submitted by
+the application,<wbr/> then the HAL will have to delay JPEG-using requests
+whenever the JPEG encoder is still busy processing an older capture.<wbr/>
+This will happen whenever a JPEG-using request starts capture less
+than <code>FJ</code> <em>ns</em> after a previous JPEG-using request.<wbr/> The minimum
+supported frame duration will vary between the values calculated in
+#6 and #7.<wbr/></li>
+</ol>
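+<p>As a rough illustration only,<wbr/> the following sketch translates the
+rules above into code.<wbr/> The <code>pixels</code> helper and the parallel
+size/<wbr/>duration arrays are hypothetical stand-ins for the real static
+metadata,<wbr/> not camera API.<wbr/></p>
+<pre><code>final class MinFrameDuration {
+
+  static long pixels(int[] size) { return (long) size[0] * size[1]; }
+
+  // Smallest listed size at least as large as target; if none is
+  // larger, the listed size closest to target (tail of steps 3-4).
+  static int matchIndex(int[][] sizes, long target) {
+    int best = -1;
+    for (int i = 0; i &lt; sizes.length; i++) {
+      long p = pixels(sizes[i]);
+      boolean covers = p &gt;= target;
+      boolean bestCovers = best &gt;= 0 &amp;&amp; pixels(sizes[best]) &gt;= target;
+      if (best &lt; 0 || (covers &amp;&amp; !bestCovers)
+          || (covers == bestCovers
+              &amp;&amp; Math.abs(p - target) &lt; Math.abs(pixels(sizes[best]) - target))) {
+        best = i;
+      }
+    }
+    return best;
+  }
+
+  // rj/rr/rp: the largest configured JPEG/raw/processed stream sizes
+  // from steps 1-2, or null when no stream of that type is configured.
+  static long minDuration(int[] rj, int[] rr, int[] rp, boolean usesJpeg,
+      int[][] jpegSizes, long[] jpegDurs, int[][] rawSizes, long[] rawDurs,
+      int[][] procSizes, long[] procDurs) {
+    long fj = 0, fr = 0, fp = 0;
+    if (rj != null) {
+      fj = jpegDurs[matchIndex(jpegSizes, pixels(rj))];        // step 5
+      if (rp == null || pixels(rp) &lt; pixels(rj)) rp = rj;      // step 3
+    }
+    if (rp != null) fp = procDurs[matchIndex(procSizes, pixels(rp))];
+    if (rr != null) {
+      if (rp != null &amp;&amp; pixels(rp) &gt; pixels(rr)) rr = rp;      // step 4
+      fr = rawDurs[matchIndex(rawSizes, pixels(rr))];
+    }
+    // Steps 6-7; for mixed bursts (step 8) the achievable duration
+    // varies between these two values.
+    return usesJpeg ? Math.max(fj, Math.max(fr, fp)) : Math.max(fr, fp);
+  }
+}
+</code></pre>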
             </td>
           </tr>
 
@@ -11162,7 +11248,7 @@
 
             <td class="entry_description">
               <p>Duration from start of frame exposure to
-start of next frame exposure</p>
+start of next frame exposure.<wbr/></p>
             </td>
 
             <td class="entry_units">
@@ -11170,8 +11256,9 @@
             </td>
 
             <td class="entry_range">
-              <p>see <a href="#static_android.sensor.info.maxFrameDuration">android.<wbr/>sensor.<wbr/>info.<wbr/>max<wbr/>Frame<wbr/>Duration</a>,<wbr/>
-android.<wbr/>scaler.<wbr/>info.<wbr/>available<wbr/>Min<wbr/>Frame<wbr/>Durations</p>
+              <p>See <a href="#static_android.sensor.info.maxFrameDuration">android.<wbr/>sensor.<wbr/>info.<wbr/>max<wbr/>Frame<wbr/>Duration</a>,<wbr/>
+android.<wbr/>scaler.<wbr/>available*Min<wbr/>Durations.<wbr/> Exposure time has priority,<wbr/>
+so the duration is set to <code>max(duration,<wbr/> exposureTime + overhead)</code>.<wbr/></p>
             </td>
 
             <td class="entry_tags">
@@ -11187,8 +11274,89 @@
           </tr>
           <tr class="entry_cont">
             <td class="entry_details" colspan="5">
-              <p>Exposure time has priority,<wbr/> so duration is set to
-max(duration,<wbr/> exposure time + overhead)</p>
+              <p>The maximum frame rate that can be supported by a camera subsystem is
+a function of many factors:</p>
+<ul>
+<li>Requested resolutions of output image streams</li>
+<li>Availability of binning /<wbr/> skipping modes on the imager</li>
+<li>The bandwidth of the imager interface</li>
+<li>The bandwidth of the various ISP processing blocks</li>
+</ul>
+<p>Since these factors can vary greatly between different ISPs and
+sensors,<wbr/> the camera abstraction tries to represent the bandwidth
+restrictions with as simple a model as possible.<wbr/></p>
+<p>The model presented has the following characteristics:</p>
+<ul>
+<li>The image sensor is always configured to output the smallest
+resolution possible given the application's requested output stream
+sizes.<wbr/>  The smallest resolution is defined as being at least as large
+as the largest requested output stream size; the camera pipeline must
+never digitally upsample sensor data when the crop region covers the
+whole sensor.<wbr/> In general,<wbr/> this means that if only small output stream
+resolutions are configured,<wbr/> the sensor can provide a higher frame
+rate.<wbr/></li>
+<li>Since any request may use any or all of the currently configured
+output streams,<wbr/> the sensor and ISP must be configured to support
+scaling a single capture to all the streams at the same time.<wbr/>  This
+means the camera pipeline must be ready to produce the largest
+requested output size without any delay.<wbr/>  Therefore,<wbr/> the overall
+frame rate of a given configured stream set is governed only by the
+largest requested stream resolution.<wbr/></li>
+<li>Using more than one output stream in a request does not affect the
+frame duration.<wbr/></li>
+<li>JPEG streams act like processed YUV streams in requests that do not
+include them; in requests that directly reference them,<wbr/> they act as
+JPEG streams.<wbr/> This is because supporting a
+JPEG stream requires the underlying YUV data to always be ready for
+use by a JPEG encoder,<wbr/> but the encoder will only be used (and impact
+frame duration) on requests that actually reference a JPEG stream.<wbr/></li>
+<li>The JPEG processor can run concurrently with the rest of the camera
+pipeline,<wbr/> but cannot process more than one capture at a time.<wbr/></li>
+</ul>
+<p>The necessary information for the application,<wbr/> given the model above,<wbr/>
+is provided via the android.<wbr/>scaler.<wbr/>available*Min<wbr/>Durations fields.<wbr/>
+These are used to determine the maximum frame rate /<wbr/> minimum frame
+duration that is possible for a given stream configuration.<wbr/></p>
+<p>Specifically,<wbr/> the application can use the following rules to
+determine the minimum frame duration it can request from the HAL
+device:</p>
+<ol>
+<li>Given the application's currently configured set of output
+streams,<wbr/> <code>S</code>,<wbr/> divide them into three sets: streams in a JPEG format
+<code>SJ</code>,<wbr/> streams in a raw sensor format <code>SR</code>,<wbr/> and the rest ('processed')
+<code>SP</code>.<wbr/></li>
+<li>For each subset of streams,<wbr/> find the largest resolution (by pixel
+count) in the subset.<wbr/> This gives (at most) three resolutions <code>RJ</code>,<wbr/>
+<code>RR</code>,<wbr/> and <code>RP</code>.<wbr/></li>
+<li>If <code>RJ</code> is greater than <code>RP</code>,<wbr/> set <code>RP</code> equal to <code>RJ</code>.<wbr/> If there is
+no exact match for <code>RP == RJ</code> (in particular there isn't an available
+processed resolution at the same size as <code>RJ</code>),<wbr/> then set <code>RP</code> equal
+to the smallest processed resolution that is larger than <code>RJ</code>.<wbr/> If
+there are no processed resolutions larger than <code>RJ</code>,<wbr/> then set <code>RP</code> to
+the processed resolution closest to <code>RJ</code>.<wbr/></li>
+<li>If <code>RP</code> is greater than <code>RR</code>,<wbr/> set <code>RR</code> equal to <code>RP</code>.<wbr/> If there is
+no exact match for <code>RR == RP</code> (in particular there isn't an available
+raw resolution at the same size as <code>RP</code>),<wbr/> then set <code>RR</code> equal to
+the smallest raw resolution that is larger than <code>RP</code>.<wbr/> If
+there are no raw resolutions larger than <code>RP</code>,<wbr/> then set <code>RR</code> to
+the raw resolution closest to <code>RP</code>.<wbr/></li>
+<li>Look up the matching minimum frame durations in the property lists
+<a href="#static_android.scaler.availableJpegMinDurations">android.<wbr/>scaler.<wbr/>available<wbr/>Jpeg<wbr/>Min<wbr/>Durations</a>,<wbr/>
+<a href="#static_android.scaler.availableRawMinDurations">android.<wbr/>scaler.<wbr/>available<wbr/>Raw<wbr/>Min<wbr/>Durations</a>,<wbr/> and
+<a href="#static_android.scaler.availableProcessedMinDurations">android.<wbr/>scaler.<wbr/>available<wbr/>Processed<wbr/>Min<wbr/>Durations</a>.<wbr/>  This gives three
+minimum frame durations <code>FJ</code>,<wbr/> <code>FR</code>,<wbr/> and <code>FP</code>.<wbr/></li>
+<li>If a stream of requests does not use a JPEG stream,<wbr/> then the minimum
+supported frame duration for each request is <code>max(FR,<wbr/> FP)</code>.<wbr/></li>
+<li>If every request in a stream uses the JPEG stream,<wbr/> then the minimum
+supported frame duration for each request is <code>max(FR,<wbr/> FP,<wbr/> FJ)</code>.<wbr/></li>
+<li>If a mix of JPEG-using and non-JPEG-using requests is submitted by
+the application,<wbr/> then the HAL will have to delay JPEG-using requests
+whenever the JPEG encoder is still busy processing an older capture.<wbr/>
+This will happen whenever a JPEG-using request starts capture less
+than <code>FJ</code> <em>ns</em> after a previous JPEG-using request.<wbr/> The minimum
+supported frame duration will vary between the values calculated in
+#6 and #7.<wbr/></li>
+</ol>
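+<p>As a rough illustration only,<wbr/> the following sketch translates the
+rules above into code.<wbr/> The <code>pixels</code> helper and the parallel
+size/<wbr/>duration arrays are hypothetical stand-ins for the real static
+metadata,<wbr/> not camera API.<wbr/></p>
+<pre><code>final class MinFrameDuration {
+
+  static long pixels(int[] size) { return (long) size[0] * size[1]; }
+
+  // Smallest listed size at least as large as target; if none is
+  // larger, the listed size closest to target (tail of steps 3-4).
+  static int matchIndex(int[][] sizes, long target) {
+    int best = -1;
+    for (int i = 0; i &lt; sizes.length; i++) {
+      long p = pixels(sizes[i]);
+      boolean covers = p &gt;= target;
+      boolean bestCovers = best &gt;= 0 &amp;&amp; pixels(sizes[best]) &gt;= target;
+      if (best &lt; 0 || (covers &amp;&amp; !bestCovers)
+          || (covers == bestCovers
+              &amp;&amp; Math.abs(p - target) &lt; Math.abs(pixels(sizes[best]) - target))) {
+        best = i;
+      }
+    }
+    return best;
+  }
+
+  // rj/rr/rp: the largest configured JPEG/raw/processed stream sizes
+  // from steps 1-2, or null when no stream of that type is configured.
+  static long minDuration(int[] rj, int[] rr, int[] rp, boolean usesJpeg,
+      int[][] jpegSizes, long[] jpegDurs, int[][] rawSizes, long[] rawDurs,
+      int[][] procSizes, long[] procDurs) {
+    long fj = 0, fr = 0, fp = 0;
+    if (rj != null) {
+      fj = jpegDurs[matchIndex(jpegSizes, pixels(rj))];        // step 5
+      if (rp == null || pixels(rp) &lt; pixels(rj)) rp = rj;      // step 3
+    }
+    if (rp != null) fp = procDurs[matchIndex(procSizes, pixels(rp))];
+    if (rr != null) {
+      if (rp != null &amp;&amp; pixels(rp) &gt; pixels(rr)) rr = rp;      // step 4
+      fr = rawDurs[matchIndex(rawSizes, pixels(rr))];
+    }
+    // Steps 6-7; for mixed bursts (step 8) the achievable duration
+    // varies between these two values.
+    return usesJpeg ? Math.max(fj, Math.max(fr, fp)) : Math.max(fr, fp);
+  }
+}
+</code></pre>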
             </td>
           </tr>
 
diff --git a/camera/docs/metadata_properties.xml b/camera/docs/metadata_properties.xml
index bacf3fc..77e3c1c 100644
--- a/camera/docs/metadata_properties.xml
+++ b/camera/docs/metadata_properties.xml
@@ -2273,11 +2273,15 @@
             <size>n</size>
           </array>
           <description>The minimum frame duration that is supported
-          for each resolution in availableJpegSizes. Should
-          correspond to the frame duration when only that JPEG
-          stream is active and captured in a burst, with all
-          processing set to FAST</description>
-          <details>When multiple streams are configured, the minimum
+          for each resolution in android.scaler.availableJpegSizes.
+          </description>
+          <units>ns</units>
+          <details>
+          This corresponds to the minimum steady-state frame duration when only
+          that JPEG stream is active and captured in a burst, with all
+          processing (typically in android.*.mode) set to FAST.
+
+          When multiple streams are configured, the minimum
           frame duration will be &amp;gt;= max(individual stream min
           durations)</details>
           <tag id="BC" />
@@ -2313,14 +2317,20 @@
           <array>
             <size>n</size>
           </array>
-          <description>The minimum frame duration that is supported
-          for each resolution in availableProcessedSizes. Should
-          correspond to the frame duration when only that processed
-          stream is active, with all processing set to
-          FAST</description>
-          <details>When multiple streams are configured, the minimum
-          frame duration will be &amp;gt;= max(individual stream min
-          durations)</details>
+          <description>For each available processed output size (defined in
+          android.scaler.availableProcessedSizes), this property lists the
+          minimum supportable frame duration for that size.
+          </description>
+          <units>ns</units>
+          <details>
+          This should correspond to the frame duration when only that processed
+          stream is active, with all processing (typically in android.*.mode)
+          set to FAST.
+
+          When multiple streams are configured, the minimum frame duration will
+          be &amp;gt;= max(individual stream min durations).
+          </details>
           <tag id="BC" />
         </entry>
         <entry name="availableProcessedSizes" type="int32" visibility="public"
@@ -2367,11 +2377,17 @@
           <array>
             <size>n</size>
           </array>
-          <description>The minimum frame duration that is supported
-          for each raw resolution in availableRawSizes. Should
-          correspond to the frame duration when only the raw stream
-          is active.</description>
-          <details>When multiple streams are configured, the minimum
+          <description>
+          For each available raw output size (defined in
+          android.scaler.availableRawSizes), this property lists the minimum
+          supportable frame duration for that size.
+          </description>
+          <units>ns</units>
+          <details>
+          This should correspond to the frame duration when only the raw
+          stream is active.
+
+          When multiple streams are configured, the minimum
           frame duration will be &amp;gt;= max(individual stream min
           durations)</details>
           <tag id="BC" />
@@ -2409,12 +2425,97 @@
         </entry>
         <entry name="frameDuration" type="int64" visibility="public">
           <description>Duration from start of frame exposure to
-          start of next frame exposure</description>
+          start of next frame exposure.</description>
           <units>nanoseconds</units>
-          <range>see android.sensor.info.maxFrameDuration,
-          android.scaler.info.availableMinFrameDurations</range>
-          <details>Exposure time has priority, so duration is set to
-          max(duration, exposure time + overhead)</details>
+          <range>See android.sensor.info.maxFrameDuration,
+          android.scaler.available*MinDurations. Exposure time has priority,
+          so the duration is set to `max(duration, exposureTime + overhead)`.</range>
+          <details>
+          The maximum frame rate that can be supported by a camera subsystem is
+          a function of many factors:
+
+          * Requested resolutions of output image streams
+          * Availability of binning / skipping modes on the imager
+          * The bandwidth of the imager interface
+          * The bandwidth of the various ISP processing blocks
+
+          Since these factors can vary greatly between different ISPs and
+          sensors, the camera abstraction tries to represent the bandwidth
+          restrictions with as simple a model as possible.
+
+          The model presented has the following characteristics:
+
+          * The image sensor is always configured to output the smallest
+          resolution possible given the application's requested output stream
+          sizes.  The smallest resolution is defined as being at least as large
+          as the largest requested output stream size; the camera pipeline must
+          never digitally upsample sensor data when the crop region covers the
+          whole sensor. In general, this means that if only small output stream
+          resolutions are configured, the sensor can provide a higher frame
+          rate.
+          * Since any request may use any or all of the currently configured
+          output streams, the sensor and ISP must be configured to support
+          scaling a single capture to all the streams at the same time.  This
+          means the camera pipeline must be ready to produce the largest
+          requested output size without any delay.  Therefore, the overall
+          frame rate of a given configured stream set is governed only by the
+          largest requested stream resolution.
+          * Using more than one output stream in a request does not affect the
+          frame duration.
+          * JPEG streams act like processed YUV streams in requests that do
+          not include them; in requests that directly reference them, they
+          act as JPEG streams. This is because supporting a
+          JPEG stream requires the underlying YUV data to always be ready for
+          use by a JPEG encoder, but the encoder will only be used (and impact
+          frame duration) on requests that actually reference a JPEG stream.
+          * The JPEG processor can run concurrently with the rest of the
+          camera pipeline, but cannot process more than one capture at a time.
+
+          The necessary information for the application, given the model above,
+          is provided via the android.scaler.available*MinDurations fields.
+          These are used to determine the maximum frame rate / minimum frame
+          duration that is possible for a given stream configuration.
+
+          Specifically, the application can use the following rules to
+          determine the minimum frame duration it can request from the HAL
+          device:
+
+          1. Given the application's currently configured set of output
+          streams, `S`, divide them into three sets: streams in a JPEG format
+          `SJ`, streams in a raw sensor format `SR`, and the rest ('processed')
+          `SP`.
+          1. For each subset of streams, find the largest resolution (by pixel
+          count) in the subset. This gives (at most) three resolutions `RJ`,
+          `RR`, and `RP`.
+          1. If `RJ` is greater than `RP`, set `RP` equal to `RJ`. If there is
+          no exact match for `RP == RJ` (in particular there isn't an available
+          processed resolution at the same size as `RJ`), then set `RP` equal
+          to the smallest processed resolution that is larger than `RJ`. If
+          there are no processed resolutions larger than `RJ`, then set `RP` to
+          the processed resolution closest to `RJ`.
+          1. If `RP` is greater than `RR`, set `RR` equal to `RP`. If there is
+          no exact match for `RR == RP` (in particular there isn't an available
+          raw resolution at the same size as `RP`), then set `RR` equal to
+          the smallest raw resolution that is larger than `RP`. If
+          there are no raw resolutions larger than `RP`, then set `RR` to
+          the raw resolution closest to `RP`.
+          1. Look up the matching minimum frame durations in the property lists
+          android.scaler.availableJpegMinDurations,
+          android.scaler.availableRawMinDurations, and
+          android.scaler.availableProcessedMinDurations.  This gives three
+          minimum frame durations `FJ`, `FR`, and `FP`.
+          1. If a stream of requests does not use a JPEG stream, then the minimum
+          supported frame duration for each request is `max(FR, FP)`.
+          1. If every request in a stream uses the JPEG stream, then the minimum
+          supported frame duration for each request is `max(FR, FP, FJ)`.
+          1. If a mix of JPEG-using and non-JPEG-using requests is submitted by
+          the application, then the HAL will have to delay JPEG-using requests
+          whenever the JPEG encoder is still busy processing an older capture.
+          This will happen whenever a JPEG-using request starts capture less
+          than `FJ` _ns_ after a previous JPEG-using request. The minimum
+          supported frame duration will vary between the values calculated in
+          \#6 and \#7.
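+
+          As a rough illustration only, the following sketch translates the
+          rules above into code. The `pixels` helper and the parallel
+          size/duration arrays are hypothetical stand-ins for the real
+          static metadata, not camera API.
+
+              final class MinFrameDuration {
+
+                static long pixels(int[] size) { return (long) size[0] * size[1]; }
+
+                // Smallest listed size at least as large as target; if none is
+                // larger, the listed size closest to target (tail of steps 3-4).
+                static int matchIndex(int[][] sizes, long target) {
+                  int best = -1;
+                  for (int i = 0; i &lt; sizes.length; i++) {
+                    long p = pixels(sizes[i]);
+                    boolean covers = p &gt;= target;
+                    boolean bestCovers = best &gt;= 0 &amp;&amp; pixels(sizes[best]) &gt;= target;
+                    if (best &lt; 0 || (covers &amp;&amp; !bestCovers)
+                        || (covers == bestCovers
+                            &amp;&amp; Math.abs(p - target) &lt; Math.abs(pixels(sizes[best]) - target))) {
+                      best = i;
+                    }
+                  }
+                  return best;
+                }
+
+                // rj/rr/rp: the largest configured JPEG/raw/processed stream sizes
+                // from steps 1-2, or null when no stream of that type is configured.
+                static long minDuration(int[] rj, int[] rr, int[] rp, boolean usesJpeg,
+                    int[][] jpegSizes, long[] jpegDurs, int[][] rawSizes, long[] rawDurs,
+                    int[][] procSizes, long[] procDurs) {
+                  long fj = 0, fr = 0, fp = 0;
+                  if (rj != null) {
+                    fj = jpegDurs[matchIndex(jpegSizes, pixels(rj))];        // step 5
+                    if (rp == null || pixels(rp) &lt; pixels(rj)) rp = rj;      // step 3
+                  }
+                  if (rp != null) fp = procDurs[matchIndex(procSizes, pixels(rp))];
+                  if (rr != null) {
+                    if (rp != null &amp;&amp; pixels(rp) &gt; pixels(rr)) rr = rp;      // step 4
+                    fr = rawDurs[matchIndex(rawSizes, pixels(rr))];
+                  }
+                  // Steps 6-7; for mixed bursts (step 8) the achievable duration
+                  // varies between these two values.
+                  return usesJpeg ? Math.max(fj, Math.max(fr, fp)) : Math.max(fr, fp);
+                }
+              }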
+          </details>
           <tag id="V1" />
           <tag id="BC" />
         </entry>