page.title=HDR Video Playback
@jd:body

<!--
Copyright 2015 The Android Open Source Project

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
-->
<div id="qv-wrapper">
<div id="qv">
<h2>In this document</h2>
<ol id="auto-toc">
</ol>
</div>
</div>

<p>High dynamic range (HDR) video is the next frontier in high-quality
video decoding, bringing unmatched scene reproduction qualities. It does
so by significantly increasing the dynamic range of the luminance component
(from the current 100 cd/m<sup>2</sup> to thousands of cd/m<sup>2</sup>) and by using a much
wider color space (BT.2020). This is now a central element of the 4K UHD
evolution in the TV space.</p>

<p>In Android 7.0, initial HDR support has been added, which includes the
creation of proper constants for the discovery and setup of HDR video
pipelines. That means defining codec types and display modes and specifying
how HDR data must be passed to MediaCodec and supplied to HDR decoders. HDR
is only supported in tunneled video playback mode.</p>

<p>The purpose of this document is to help application developers support HDR
stream playback, and help OEMs and SoCs enable the HDR features on Android 7.0.</p>

<h2 id="technologies">Supported HDR technologies</h2>

<p>As of the Android 7.0 release, the following HDR technologies are supported.</p>

<table>
<tbody>
<tr>
<th>Technology
</th>
<th>Dolby-Vision
</th>
<th>HDR10
</th>
<th>VP9-HLG
</th>
<th>VP9-PQ
</th>
</tr>
<tr>
<th>Codec
</th>
<td>AVC/HEVC
</td>
<td>HEVC
</td>
<td>VP9
</td>
<td>VP9
</td>
</tr>
<tr>
<th>Transfer Function
</th>
<td>ST-2084
</td>
<td>ST-2084
</td>
<td>HLG
</td>
<td>ST-2084
</td>
</tr>
<tr>
<th>HDR Metadata Type
</th>
<td>Dynamic
</td>
<td>Static
</td>
<td>None
</td>
<td>Static
</td>
</tr>
</tbody>
</table>

<p>In Android 7.0, <b>only HDR playback via tunneled mode is defined</b>,
but devices may add support for playback of HDR on SurfaceViews using opaque
video buffers. In other words:</p>
<ul>
<li>There is no standard Android API to check if HDR playback is supported
using non-tunneled decoders.</li>
<li>Tunneled video decoders that advertise HDR playback capability must
support HDR playback when connected to HDR-capable displays.</li>
<li>GL composition of HDR content is not supported by the AOSP Android
7.0 release.</li>
</ul>

<h2 id="discovery">Discovery</h2>

<p>HDR playback requires an HDR-capable decoder and a connection to an
HDR-capable display. Optionally, some technologies require a specific
extractor.</p>

<h3 id="display">Display</h3>

<p>Applications shall use the new <code>Display.getHdrCapabilities</code>
API to query the HDR technologies supported by the specified display. This is
essentially the information in the EDID Static Metadata Data Block as defined
in CTA-861.3:</p>

<ul>
<li><code>public Display.HdrCapabilities getHdrCapabilities()</code><br>
Returns the display's HDR capabilities.</li>

<li><code>Display.HdrCapabilities</code><br>
Encapsulates the HDR capabilities of a given display. For example, what HDR
types it supports and details about the desired luminance data.</li>
</ul>

<p><b>Constants:</b></p>

<ul>
<li><code>int HDR_TYPE_DOLBY_VISION</code><br>
Dolby Vision support.</li>

<li><code>int HDR_TYPE_HDR10</code><br>
HDR10 / PQ support.</li>

<li><code>int HDR_TYPE_HLG</code><br>
Hybrid Log-Gamma support.</li>

<li><code>float INVALID_LUMINANCE</code><br>
Invalid luminance value.</li>
</ul>

<p><b>Public Methods:</b></p>

<ul>
<li><code>float getDesiredMaxAverageLuminance()</code><br>
Returns the desired content max frame-average luminance data in cd/m<sup>2</sup> for
this display.</li>

<li><code>float getDesiredMaxLuminance()</code><br>
Returns the desired content max luminance data in cd/m<sup>2</sup> for this display.</li>

<li><code>float getDesiredMinLuminance()</code><br>
Returns the desired content min luminance data in cd/m<sup>2</sup> for this display.</li>

<li><code>int[] getSupportedHdrTypes()</code><br>
Gets the supported HDR types of this display (see constants). Returns an empty
array if HDR is not supported by the display.</li>
</ul>
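
<p>As an illustration, a minimal sketch (the class and method names are
illustrative, not platform APIs) that checks whether a given display reports
HDR10 support:</p>

<blockquote><pre>
import android.view.Display;

public class HdrDisplayCheck {
    // Returns true if the display advertises HDR10 support in its
    // HdrCapabilities (derived from the EDID static metadata block).
    public static boolean hasHdr10(Display display) {
        Display.HdrCapabilities caps = display.getHdrCapabilities();
        if (caps == null) {
            return false;
        }
        for (int type : caps.getSupportedHdrTypes()) {
            if (type == Display.HdrCapabilities.HDR_TYPE_HDR10) {
                return true;
            }
        }
        return false;
    }
}
</pre></blockquote>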

<h3 id="decoder">Decoder</h3>

<p>Applications shall use the existing
<a href="https://developer.android.com/reference/android/media/MediaCodecInfo.CodecCapabilities.html#profileLevels">
<code>CodecCapabilities.profileLevels</code></a> API to verify support for the
new HDR-capable profiles:</p>

<h4>Dolby-Vision</h4>

<p><code>MediaFormat</code> mime constant:</p>
<blockquote><pre>
String MIMETYPE_VIDEO_DOLBY_VISION
</pre></blockquote>

<p><code>MediaCodecInfo.CodecProfileLevel</code> profile constants:</p>
<blockquote><pre>
int DolbyVisionProfileDvavPen
int DolbyVisionProfileDvavPer
int DolbyVisionProfileDvheDen
int DolbyVisionProfileDvheDer
int DolbyVisionProfileDvheDtb
int DolbyVisionProfileDvheDth
int DolbyVisionProfileDvheDtr
int DolbyVisionProfileDvheStn
</pre></blockquote>

<p>Dolby Vision video layers and metadata must be concatenated into a single
buffer per frame by video applications. This is done automatically by the
Dolby-Vision capable MediaExtractor.</p>

<h4>HEVC HDR10</h4>

<p><code>MediaCodecInfo.CodecProfileLevel</code> profile constants:</p>
<blockquote><pre>
int HEVCProfileMain10HDR10
</pre></blockquote>

<h4>VP9 HLG &amp; PQ</h4>

<p><code>MediaCodecInfo.CodecProfileLevel</code> profile
constants:</p>
<blockquote><pre>
int VP9Profile2HDR
int VP9Profile3HDR
</pre></blockquote>

<p>If a platform supports an HDR-capable decoder, it shall also support an
HDR-capable extractor.</p>

<p>Only tunneled decoders are guaranteed to play back HDR content. Playback
by non-tunneled decoders may result in the HDR information being lost and
the content being flattened into an SDR color volume.</p>
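
<p>For example, a sketch (assuming API level 24; the helper name is
illustrative) that scans the codec list for a tunneled, HDR10-capable HEVC
decoder:</p>

<blockquote><pre>
import android.media.MediaCodecInfo;
import android.media.MediaCodecInfo.CodecCapabilities;
import android.media.MediaCodecInfo.CodecProfileLevel;
import android.media.MediaCodecList;
import android.media.MediaFormat;

public class HdrDecoderCheck {
    // Returns true if some HEVC decoder advertises both the Main10HDR10
    // profile and tunneled playback, which HDR playback requires in
    // Android 7.0.
    public static boolean hasTunneledHdr10Decoder() {
        MediaCodecList list = new MediaCodecList(MediaCodecList.REGULAR_CODECS);
        for (MediaCodecInfo info : list.getCodecInfos()) {
            if (info.isEncoder()) {
                continue;
            }
            for (String type : info.getSupportedTypes()) {
                if (!MediaFormat.MIMETYPE_VIDEO_HEVC.equals(type)) {
                    continue;
                }
                CodecCapabilities caps = info.getCapabilitiesForType(type);
                boolean tunneled = caps.isFeatureSupported(
                        CodecCapabilities.FEATURE_TunneledPlayback);
                for (CodecProfileLevel pl : caps.profileLevels) {
                    if (tunneled
                            &amp;&amp; pl.profile == CodecProfileLevel.HEVCProfileMain10HDR10) {
                        return true;
                    }
                }
            }
        }
        return false;
    }
}
</pre></blockquote>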

<h3 id="extractor">Extractor</h3>

<p>The following containers are supported for the various HDR technologies
on Android 7.0:</p>

<table>
<tbody>
<tr>
<th>Technology
</th>
<th>Dolby-Vision
</th>
<th>HDR10
</th>
<th>VP9-HLG
</th>
<th>VP9-PQ
</th>
</tr>
<tr>
<th>Container
</th>
<td>MP4
</td>
<td>MP4
</td>
<td>WebM
</td>
<td>WebM
</td>
</tr>
</tbody>
</table>

<p>Discovery of whether a track (of a file) requires HDR support is not
supported by the platform. Applications may parse the codec-specific data
to determine if a track requires a specific HDR profile.</p>
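
<p>Where the extractor does expose the profile in the track format (this
document only requires that for Dolby Vision HDR tracks; treat its presence
elsewhere as an assumption), a lookup sketch:</p>

<blockquote><pre>
import android.media.MediaExtractor;
import android.media.MediaFormat;

public class HdrTrackProbe {
    // Returns the profile advertised in a track's format, or -1 if the
    // extractor does not expose one.
    public static int trackProfile(MediaExtractor extractor, int track) {
        MediaFormat format = extractor.getTrackFormat(track);
        return format.containsKey(MediaFormat.KEY_PROFILE)
                ? format.getInteger(MediaFormat.KEY_PROFILE)
                : -1;
    }
}
</pre></blockquote>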

<h3 id="summary">Summary</h3>

<p>Component requirements for each HDR technology are shown in the following table:</p>

<div style="overflow:auto">
<table>
<tbody>
<tr>
<th>Technology
</th>
<th>Dolby-Vision
</th>
<th>HDR10
</th>
<th>VP9-HLG
</th>
<th>VP9-PQ
</th>
</tr>
<tr>
<th>Supported HDR type (Display)
</th>
<td>HDR_TYPE_DOLBY_VISION
</td>
<td>HDR_TYPE_HDR10
</td>
<td>HDR_TYPE_HLG
</td>
<td>HDR_TYPE_HDR10
</td>
</tr>
<tr>
<th>Container (Extractor)
</th>
<td>MP4
</td>
<td>MP4
</td>
<td>WebM
</td>
<td>WebM
</td>
</tr>
<tr>
<th>Decoder
</th>
<td>MIMETYPE_VIDEO_DOLBY_VISION
</td>
<td>MIMETYPE_VIDEO_HEVC
</td>
<td>MIMETYPE_VIDEO_VP9
</td>
<td>MIMETYPE_VIDEO_VP9
</td>
</tr>
<tr>
<th>Profile (Decoder)
</th>
<td>One of the Dolby profiles
</td>
<td>HEVCProfileMain10HDR10
</td>
<td>VP9Profile2HDR or
VP9Profile3HDR
</td>
<td>VP9Profile2HDR or
VP9Profile3HDR
</td>
</tr>
</tbody>
</table>
</div>
<br>

<p>Notes:</p>
<ul>
<li>Dolby-Vision bitstreams are packaged in an MP4 container in a way defined
by Dolby. Applications may implement their own Dolby-capable extractors as
long as they package the access units from the corresponding layers into a
single access unit for the decoder as defined by Dolby.</li>
<li>A platform may support an HDR-capable extractor, but no corresponding
HDR-capable decoder.</li>
</ul>

<h2 id="playback">Playback</h2>

<p>After an application has verified support for HDR playback, it can play
back HDR content nearly the same way as it plays back non-HDR content,
with the following caveats (a configuration sketch follows the list):</p>

<ul>
<li>For Dolby-Vision, whether a specific media file/track requires an
HDR-capable decoder is not immediately available. The application must
have this information in advance or be able to obtain this information by
parsing the codec-specific data section of the MediaFormat.</li>
<li><code>CodecCapabilities.isFormatSupported</code> does not consider whether
the tunneled decoder feature is required for supporting such a profile.</li>
</ul>
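
<p>A minimal configuration sketch (assuming API level 24; the surface and
audio session ID are placeholders supplied by the caller):</p>

<blockquote><pre>
import android.media.MediaCodec;
import android.media.MediaCodecInfo.CodecCapabilities;
import android.media.MediaFormat;
import android.view.Surface;

public class TunneledHdrPlayback {
    // Configures a decoder for tunneled playback; tunneling is requested
    // through the input format along with the audio session to sync to.
    public static MediaCodec configureTunneled(String codecName,
            MediaFormat format, Surface surface, int audioSessionId)
            throws java.io.IOException {
        format.setFeatureEnabled(CodecCapabilities.FEATURE_TunneledPlayback, true);
        format.setInteger(MediaFormat.KEY_AUDIO_SESSION_ID, audioSessionId);
        MediaCodec codec = MediaCodec.createByCodecName(codecName);
        codec.configure(format, surface, null /* crypto */, 0 /* flags */);
        return codec;
    }
}
</pre></blockquote>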

<h2 id="enablinghdr">Enabling HDR platform support</h2>

<p>SoC vendors and OEMs must do additional work to enable HDR platform
support for a device.</p>

<h3 id="platformchanges">Platform changes in Android 7.0 for HDR</h3>

<p>Here are some key changes in the platform (application/native layer)
that OEMs and SoCs need to be aware of.</p>


<h3 id="display-platform">Display</h3>

<h4>Hardware composition</h4>

<p>HDR-capable platforms must support blending HDR content with non-HDR
content. The exact blending characteristics and operations are not defined
by Android as of release 7.0, but the process generally follows these steps:</p>
<ol>
<li>Determine a linear color space/volume that contains all layers to be
composited, based on the layers' color, mastering, and potential dynamic
metadata.
<br>If compositing directly to a display, this could be the linear space
that matches the display's color volume.</li>
<li>Convert all layers to the common color space (for PQ-coded layers, see
the sketch after this list).</li>
<li>Perform the blending.</li>
<li>If displaying through HDMI:
<ol style="list-style-type: lower-alpha">
<li>Determine the color, mastering, and potential dynamic metadata for the
blended scene.</li>
<li>Convert the resulting blended scene to the derived color
space/volume.</li>
</ol></li>
<li>If displaying directly to the display, convert the resulting blended
scene to the required display signals to produce that scene.</li>
</ol>
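
<p>For reference, a sketch of the SMPTE ST-2084 (PQ) EOTF that step 2 would
apply to linearize PQ-coded layers (constants are from the ST-2084 definition;
the class name is illustrative):</p>

<blockquote><pre>
public class Pq {
    private static final double M1 = 2610.0 / 16384.0;
    private static final double M2 = 2523.0 / 4096.0 * 128.0;
    private static final double C1 = 3424.0 / 4096.0;
    private static final double C2 = 2413.0 / 4096.0 * 32.0;
    private static final double C3 = 2392.0 / 4096.0 * 32.0;

    // Maps a nonlinear PQ code value in [0, 1] to absolute luminance
    // in cd/m^2 (up to 10,000).
    public static double eotf(double n) {
        double p = Math.pow(n, 1.0 / M2);
        double num = Math.max(p - C1, 0.0);
        double den = C2 - C3 * p;
        return 10000.0 * Math.pow(num / den, 1.0 / M1);
    }
}
</pre></blockquote>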

<h4>Display discovery</h4>

<p>HDR display discovery is only supported via HWC2. Partners must selectively
enable the HWC2 adapter that is released with Android 7.0 for this feature
to work. Therefore, platforms must add support for HWC2 or extend the AOSP
framework to allow a way to provide this information. HWC2 exposes a new
API to propagate HDR Static Data to the framework and the application.</p>

<h4>HDMI</h4>

<ul>
<li>A connected HDMI display advertises
its HDR capability through HDMI EDID as defined in CTA-861.3
section 4.2.</li>
<li>The following EOTF mapping shall be used:
<ul>
<li>ET_0 Traditional gamma - SDR Luminance Range: not mapped to any HDR
type</li>
<li>ET_1 Traditional gamma - HDR Luminance Range: not mapped to any HDR
type</li>
<li>ET_2 SMPTE ST 2084 - mapped to HDR type HDR10</li>
</ul></li>
<li>The signaling of Dolby Vision or HLG support over HDMI is done as defined
by their relevant bodies.</li>
<li>Note that the HWC2 API uses float desired luminance values, so the 8-bit
EDID values must be translated in a suitable fashion (see the sketch below).</li>
</ul>
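
<p>A sketch of that translation, assuming the CTA-861.3 luminance code-value
formulas (an assumption of this example; verify against the specification):</p>

<blockquote><pre>
public class Cta861Luminance {
    // Desired content max luminance: 50 * 2^(CV / 32) cd/m^2.
    public static float maxLuminance(int codeValue) {
        return (float) (50.0 * Math.pow(2.0, codeValue / 32.0));
    }

    // Desired content min luminance is defined relative to the max:
    // maxLum * (CV / 255)^2 / 100 cd/m^2.
    public static float minLuminance(float maxLuminance, int codeValue) {
        return (float) (maxLuminance * Math.pow(codeValue / 255.0, 2.0) / 100.0);
    }
}
</pre></blockquote>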

<h3 id="decoders">Decoders</h3>

<p>Platforms must add HDR-capable tunneled decoders and advertise their HDR
support. Generally, HDR-capable decoders must:</p>
<ul>
<li>Support tunneled decoding (<code>FEATURE_TunneledPlayback</code>).</li>
<li>Support HDR static metadata
(<code>OMX.google.android.index.describeHDRColorInfo</code>) and its
propagation to the display/hardware composition. For HLG, appropriate metadata
must be submitted to the display.</li>
<li>Support color description
(<code>OMX.google.android.index.describeColorAspects</code>) and its
propagation to the display/hardware composition.</li>
<li>Support HDR embedded metadata as defined by the relevant standard.</li>
</ul>

<h4>Dolby Vision decoder support</h4>

<p>To support Dolby Vision, platforms must add a Dolby-Vision capable
HDR OMX decoder. Given the specifics of Dolby Vision, this is normally a
wrapper decoder around one or more AVC and/or HEVC decoders as well as a
compositor. Such decoders must:</p>
<ul>
<li>Support the mime type "video/dolby-vision".</li>
<li>Advertise supported Dolby Vision profiles/levels.</li>
<li>Accept access units that contain the sub-access-units of all layers as
defined by Dolby.</li>
<li>Accept codec-specific data defined by Dolby. For example, data containing
the Dolby Vision profile/level and possibly the codec-specific data for the
internal decoders.</li>
<li>Support adaptive switching between Dolby Vision profiles/levels as
required by Dolby.</li>
</ul>

<p>When configuring the decoder, the actual Dolby profile is not communicated
to the codec. This is only done via codec-specific data after the decoder
has been started. A platform could choose to support multiple Dolby Vision
decoders: one for AVC profiles and another for HEVC profiles, to be able to
initialize the underlying codecs at configure time. If a single Dolby Vision
decoder supports both types of profiles, it must also support switching
between those dynamically in an adaptive fashion.</p>
<p>If a platform provides a Dolby-Vision capable decoder in addition to the
general HDR decoder support, it must:</p>

<ul>
<li>Provide a Dolby-Vision aware extractor, even if it does not support
HDR playback.</li>
<li>Provide a decoder that supports at least Dolby Vision profile X/level
Y.</li>
</ul>

<h4>HDR10 decoder support</h4>

<p>To support HDR10, platforms must add an HDR10-capable OMX decoder. This
is normally a tunneled HEVC decoder that also supports parsing and handling
HDMI-related metadata. Such a decoder (in addition to the general HDR decoder
support) must:</p>
<ul>
<li>Support the mime type "video/hevc".</li>
<li>Advertise the HEVCMain10HDR10 profile. HEVCMain10HDR10 profile support
also requires supporting the HEVCMain10 profile, which requires supporting
the HEVCMain profile at the same levels.</li>
<li>Support parsing the mastering metadata SEI blocks, as well as other HDR
related info contained in the SPS.</li>
</ul>

<h4>VP9 decoder support</h4>

<p>To support VP9 HDR, platforms must add a VP9 Profile2-capable HDR OMX
decoder. This is normally a tunneled VP9 decoder that also supports handling
HDMI-related metadata. Such decoders (in addition to the general HDR decoder
support) must:</p>
<ul>
<li>Support the mime type "video/x-vnd.on2.vp9".</li>
<li>Advertise the VP9Profile2HDR profile. VP9Profile2HDR profile support also
requires supporting the VP9Profile2 profile at the same level.</li>
</ul>

<h3 id="extractors">Extractors</h3>

<h4>Dolby Vision extractor support</h4>

<p>Platforms that support Dolby Vision decoders must add Dolby extractor
(DolbyExtractor) support for Dolby Vision content.</p>
<ul>
<li>A regular MP4 extractor can only extract the base layer from a file,
but not the enhancement or metadata layers, so a special Dolby extractor is
needed to extract the data from the file.</li>
<li>The Dolby extractor must expose 1 to 2 tracks for each Dolby video track
(group):
<ul>
<li>A Dolby Vision HDR track with the type "video/dolby-vision" for the
combined 2/3-layer Dolby stream. The HDR track's access-unit format, which
defines how to package the access units from the base/enhancement/metadata
layers into a single buffer to be decoded into a single HDR frame, is to be
defined by Dolby.</li>
<li>If a Dolby Vision video track contains a separate (backward compatible)
base layer (BL), the extractor must also expose this as a separate "video/avc"
or "video/hevc" track. The extractor must provide regular AVC/HEVC access
units for this track.</li>
<li>The BL track must have the same track-unique-ID ("track-ID") as the
HDR track so the app understands that these are two encodings of the same
video.</li>
<li>The application can decide which track to choose based on the platform's
capability.</li>
</ul>
</li>
<li>The Dolby Vision profile/level must be exposed in the track format of
the HDR track.</li>
<li>If a platform provides a Dolby-Vision capable decoder, it must also provide
a Dolby-Vision aware extractor, even if it does not support HDR playback.</li>
</ul>

<h4>HDR10 and VP9 HDR extractor support</h4>

<p>There are no additional extractor requirements to support HDR10 or VP9
HLG. Platforms must extend the MP4 extractor to support VP9 PQ in MP4. HDR
static metadata must be propagated in the VP9 PQ bitstream, such that this
metadata is passed to the VP9 PQ decoder and to the display via the normal
MediaExtractor => MediaCodec pipeline.</p>

<h3 id="stagefright">Stagefright extensions for Dolby Vision support</h3>

<p>Platforms must add Dolby Vision format support to Stagefright:</p>
<ul>
<li>Support for port definition query for the compressed port.</li>
<li>Support for profile/level enumeration for the DV decoder.</li>
<li>Support for exposing the DV profile/level for DV HDR tracks.</li>
</ul>

<h2 id="implementationnotes">Technology-specific implementation details</h2>

<h3 id="hdr10decoder">HDR10 decoder pipeline</h3>

<p><img src="../images/hdr10_decoder_pipeline.png"></p>

<p class="img-caption"><strong>Figure 1.</strong> HDR10 pipeline</p>

<p>HDR10 bitstreams are packaged in MP4 containers. Applications use a regular
MP4 extractor to extract the frame data and send it to the decoder.</p>

<ul>
<li><b>MPEG4 Extractor</b><br>
HDR10 bitstreams are recognized as just a normal HEVC stream by the
MPEG4Extractor, and the HDR track with the type "video/hevc" will be
extracted. The framework picks an HEVC video decoder that supports the
Main10HDR10 profile to decode that track.</li>

<li><b>HEVC Decoder</b><br>
HDR information is in either the SEI or the SPS. The HEVC decoder first receives
frames that contain the HDR information. The decoder then extracts the HDR
information and notifies the application that it is decoding an HDR video. HDR
information is bundled into the decoder output format, which is propagated to
the surface later (see the sketch after this list).</li>
</ul>
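
<p>On the application side, a sketch of detecting that bundled HDR information
in the decoder's output format (for example, inside
<code>MediaCodec.Callback#onOutputFormatChanged</code>; the class name is
illustrative):</p>

<blockquote><pre>
import android.media.MediaFormat;
import java.nio.ByteBuffer;

public class HdrFormatProbe {
    // Returns true if the output format carries HDR10 static metadata
    // (a CTA-861.3 static metadata blob under KEY_HDR_STATIC_INFO).
    public static boolean isHdr10Output(MediaFormat outputFormat) {
        if (!outputFormat.containsKey(MediaFormat.KEY_HDR_STATIC_INFO)) {
            return false;
        }
        ByteBuffer staticInfo =
                outputFormat.getByteBuffer(MediaFormat.KEY_HDR_STATIC_INFO);
        if (staticInfo == null) {
            return false;
        }
        return staticInfo.remaining() > 0;
    }
}
</pre></blockquote>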

<h4>Vendor actions</h4>
<ol>
<li>Advertise the supported HDR decoder profile and level OMX type. Example:<br>
<code>OMX_VIDEO_HEVCProfileMain10HDR10</code> (and <code>Main10</code>)</li>
<li>Implement support for index:
'<code>OMX.google.android.index.describeHDRColorInfo</code>'</li>
<li>Implement support for index:
'<code>OMX.google.android.index.describeColorAspects</code>'</li>
<li>Implement support for SEI parsing of mastering metadata.</li>
</ol>

<h3 id="dvdecoder">Dolby Vision decoder pipeline</h3>

<p><img src="../images/dolby_vision_decoder_pipleline.png"></p>

<p class="img-caption"><strong>Figure 2.</strong> Dolby Vision pipeline</p>

<p>Dolby bitstreams are packaged in MP4 containers as defined by
Dolby. Applications could, in theory, use a regular MP4 extractor to extract
the base layer, enhancement layer, and metadata layer independently; however,
this does not fit the current Android MediaExtractor/MediaCodec model.</p>

<ul>
<li>DolbyExtractor:
<ul>
<li>Dolby bitstreams are recognized by a DolbyExtractor, which exposes the
various layers as 1 to 2 tracks for each Dolby video track (group):
<ul>
<li>An HDR track with the type "video/dolby-vision" for the combined
2/3-layer Dolby stream. The HDR track's access-unit format, which defines
how to package the access units from the base/enhancement/metadata layers
into a single buffer to be decoded into a single HDR frame, is to be defined
by Dolby.</li>
<li>(Optional, only if the BL is backward compatible) A BL track contains
only the base layer, which must be decodable by a regular MediaCodec decoder,
for example, an AVC/HEVC decoder. The extractor should provide regular AVC/HEVC
access units for this track. This BL track must have the same track-unique-ID
("track-ID") as the Dolby track so the application understands that these
are two encodings of the same video.</li>
</ul></li>
<li>The application can decide which track to choose based on the platform's
capability (see the sketch after this list).</li>
<li>Because an HDR track has a specific HDR type, the framework will pick
a Dolby video decoder to decode that track. The BL track will be decoded by
a regular AVC/HEVC video decoder.</li>
</ul></li>

<li>DolbyDecoder:
<ul>
<li>The DolbyDecoder receives access units that contain the required access
units for all layers (EL+BL+MD or BL+MD).</li>
<li>CSD (codec-specific data, such as SPS+PPS+VPS) information for the
individual layers can be packaged into one CSD frame to be defined by
Dolby. Having a single CSD frame is required.</li>
</ul></li>
</ul>
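
<p>A track-selection sketch (the helper <code>hasDecoderForType</code> is
illustrative, not a platform API):</p>

<blockquote><pre>
import android.media.MediaCodecList;
import android.media.MediaExtractor;
import android.media.MediaFormat;

public class DolbyTrackSelector {
    // Prefers the Dolby Vision HDR track when a decoder for it exists;
    // otherwise falls back to the backward-compatible AVC/HEVC BL track.
    public static int selectTrack(MediaExtractor extractor) {
        int fallback = -1;
        int trackCount = extractor.getTrackCount();
        for (int i = 0; i &lt; trackCount; i++) {
            MediaFormat format = extractor.getTrackFormat(i);
            String mime = format.getString(MediaFormat.KEY_MIME);
            if (MediaFormat.MIMETYPE_VIDEO_DOLBY_VISION.equals(mime)) {
                if (hasDecoderForType(format)) {
                    return i; // Prefer the HDR track when it is decodable.
                }
            } else if (MediaFormat.MIMETYPE_VIDEO_AVC.equals(mime)
                    || MediaFormat.MIMETYPE_VIDEO_HEVC.equals(mime)) {
                fallback = i; // Backward-compatible BL track.
            }
        }
        return fallback;
    }

    private static boolean hasDecoderForType(MediaFormat format) {
        // On older releases, strip KEY_FRAME_RATE before this call.
        return new MediaCodecList(MediaCodecList.REGULAR_CODECS)
                .findDecoderForFormat(format) != null;
    }
}
</pre></blockquote>

<p>Matching on the shared track-ID would work equally well; the sketch keys
on the MIME type for brevity.</p>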

<h4>Dolby actions</h4>
<ol>
<li>Define the packaging of access units for the various Dolby container
schemes (e.g. BL+EL+MD) for the abstract Dolby decoder (i.e. the buffer
format expected by the HDR decoder).</li>
<li>Define the packaging of CSD for the abstract Dolby decoder.</li>
</ol>

<h4>Vendor actions</h4>
<ol>
<li>Implement the Dolby extractor. This can also be done by Dolby.</li>
<li>Integrate the DolbyExtractor into the framework. The entry point is
<code>frameworks/av/media/libstagefright/MediaExtractor.cpp</code>.</li>
<li>Declare an HDR decoder profile and level OMX
type. Example: <code>OMX_VIDEO_DOLBYPROFILETYPE</code> and
<code>OMX_VIDEO_DOLBYLEVELTYPE</code>.</li>
<li>Implement support for index:
'<code>OMX.google.android.index.describeColorAspects</code>'</li>
<li>Propagate the dynamic HDR metadata to the app and surface in each
frame. Typically this information must be packaged into the decoded frame
as defined by Dolby, because the HDMI standard does not provide a way to
pass this to the display.</li>
</ol>

<h3 id="v9decoder">VP9 decoder pipeline</h3>

<p><img src="../images/vp9-pq_decoder_pipleline.png"></p>

<p class="img-caption"><strong>Figure 3.</strong> VP9-PQ pipeline</p>

<p>VP9 bitstreams are packaged in WebM containers in a way defined by the WebM
team. Applications need to use a WebM extractor to extract HDR metadata from
the bitstream before sending frames to the decoder.</p>

<ul>
<li>WebM Extractor:
<ul>
<li>The WebM extractor extracts the HDR <a
href="http://www.webmproject.org/docs/container/#colour">metadata</a>
and frames from the <a
href="http://www.webmproject.org/docs/container/#location-of-the-colour-element-in-an-mkv-file">
container</a>.</li>
</ul></li>

<li>VP9 Decoder:
<ul>
<li>The decoder receives Profile2 bitstreams and decodes them as normal VP9
streams.</li>
<li>The decoder receives any HDR static metadata from the framework (see the
sketch after this list).</li>
<li>The decoder receives static metadata via the bitstream access units for VP9
PQ streams.</li>
<li>The VP9 decoder must be able to propagate the HDR static/dynamic metadata
to the display.</li>
</ul></li>
</ul>
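
<p>On the application side, a sketch of that pass-through (assuming the
extractor has populated the track format; the class name is illustrative):</p>

<blockquote><pre>
import android.media.MediaCodec;
import android.media.MediaExtractor;
import android.media.MediaFormat;
import android.view.Surface;

public class Vp9HdrConfig {
    // Any KEY_HDR_STATIC_INFO the WebM extractor placed in the track
    // format travels to the decoder as part of configure().
    public static MediaCodec configure(MediaExtractor extractor, int track,
            Surface surface) throws java.io.IOException {
        MediaFormat format = extractor.getTrackFormat(track);
        MediaCodec codec = MediaCodec.createDecoderByType(
                format.getString(MediaFormat.KEY_MIME));
        codec.configure(format, surface, null /* crypto */, 0 /* flags */);
        return codec;
    }
}
</pre></blockquote>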

<h4>Vendor actions</h4>

<ol>
<li>Implement support for index:
<code>OMX.google.android.index.describeHDRColorInfo</code></li>
<li>Implement support for index:
<code>OMX.google.android.index.describeColorAspects</code></li>
<li>Propagate the HDR static metadata.</li>
</ol>