page.title=HDR Video Playback
@jd:body

<!--
 Copyright 2015 The Android Open Source Project

 Licensed under the Apache License, Version 2.0 (the "License");
 you may not use this file except in compliance with the License.
 You may obtain a copy of the License at

      http://www.apache.org/licenses/LICENSE-2.0

 Unless required by applicable law or agreed to in writing, software
 distributed under the License is distributed on an "AS IS" BASIS,
 WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 See the License for the specific language governing permissions and
 limitations under the License.
-->
<div id="qv-wrapper">
  <div id="qv">
    <h2>In this document</h2>
    <ol id="auto-toc">
    </ol>
  </div>
</div>
26
27<p>High dynamic range (HDR) video is the next frontier in high-quality
28video decoding, bringing unmatched scene reproduction qualities. It does
29so by significantly increasing the dynamic range of the luminance component
30(from the current 100 cd/m<sup>2</sup> to 1000s of cd/m<sup>2</sup>) and by using a much wider
31color space (BT 2020). This is now a central element of the 4K UHD evolution
32in the TV space.</p>
33
34<p>In Android 7.0, initial HDR support has been added, which includes the
35creation of proper constants for the discovery and setup of HDR video
36pipelines. That means defining codec types and display modes and specifying
37how HDR data must be passed to MediaCodec and supplied to HDR decoders. HDR
38is only supported in tunneled video playback mode.</p>
39
40<p>The purpose of this document is to help application developers support HDR stream
41playback, and help OEMs and SOCs enable the HDR features on Android 7.0.</p>

<h2 id="technologies">Supported HDR technologies</h2>

<p>As of the Android 7.0 release, the following HDR technologies are
supported.</p>

<table>
<tbody>
<tr>
<th>Technology</th>
<th>Dolby-Vision</th>
<th>HDR10</th>
<th>VP9-HLG</th>
<th>VP9-PQ</th>
</tr>
<tr>
<th>Codec</th>
<td>AVC/HEVC</td>
<td>HEVC</td>
<td>VP9</td>
<td>VP9</td>
</tr>
<tr>
<th>Transfer Function</th>
<td>ST-2084</td>
<td>ST-2084</td>
<td>HLG</td>
<td>ST-2084</td>
</tr>
<tr>
<th>HDR Metadata Type</th>
<td>Dynamic</td>
<td>Static</td>
<td>None</td>
<td>Static</td>
</tr>
</tbody>
</table>

<p>In Android 7.0, <b>only HDR playback via tunneled mode is defined</b>,
but devices may add support for playback of HDR on SurfaceViews using opaque
video buffers. In other words:</p>
<ul>
<li>There is no standard Android API to check if HDR playback is supported
using non-tunneled decoders.</li>
<li>Tunneled video decoders that advertise HDR playback capability must
support HDR playback when connected to HDR-capable displays.</li>
<li>GL composition of HDR content is not supported by the AOSP Android
7.0 release.</li>
</ul>

<h2 id="discovery">Discovery</h2>

<p>HDR playback requires an HDR-capable decoder and a connection to an
HDR-capable display. Optionally, some technologies require a specific
extractor.</p>

<h3 id="display">Display</h3>

<p>Applications shall use the new <code>Display.getHdrCapabilities</code>
API to query the HDR technologies supported by the specified display. This is
basically the information in the EDID Static Metadata Data Block as defined
in CTA-861.3:</p>

<ul>
<li><code>public Display.HdrCapabilities getHdrCapabilities()</code><br>
Returns the display's HDR capabilities.</li>

<li><code>Display.HdrCapabilities</code><br>
Encapsulates the HDR capabilities of a given display, such as the HDR
types it supports and details about the desired luminance data.</li>
</ul>

<p><b>Constants:</b></p>

<ul>
<li><code>int HDR_TYPE_DOLBY_VISION</code><br>
Dolby Vision support.</li>

<li><code>int HDR_TYPE_HDR10</code><br>
HDR10 / PQ support.</li>

<li><code>int HDR_TYPE_HLG</code><br>
Hybrid Log-Gamma support.</li>

<li><code>float INVALID_LUMINANCE</code><br>
Invalid luminance value.</li>
</ul>

<p><b>Public Methods:</b></p>

<ul>
<li><code>float getDesiredMaxAverageLuminance()</code><br>
Returns the desired content max frame-average luminance data in
cd/m<sup>2</sup> for this display.</li>

<li><code>float getDesiredMaxLuminance()</code><br>
Returns the desired content max luminance data in cd/m<sup>2</sup> for this
display.</li>

<li><code>float getDesiredMinLuminance()</code><br>
Returns the desired content min luminance data in cd/m<sup>2</sup> for this
display.</li>

<li><code>int[] getSupportedHdrTypes()</code><br>
Gets the supported HDR types of this display (see constants). Returns an empty
array if HDR is not supported by the display.</li>
</ul>
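
<p>The display discovery flow above can be sketched as follows. This is a
minimal illustration assuming an Activity context; only the
<code>Display</code> and <code>Display.HdrCapabilities</code> APIs are from
the platform, and the class and method names are hypothetical.</p>

```java
import android.app.Activity;
import android.view.Display;

public class HdrDisplayCheck {
    /** Returns true if the default display advertises HDR10 support. */
    public static boolean displaySupportsHdr10(Activity activity) {
        Display display = activity.getWindowManager().getDefaultDisplay();
        Display.HdrCapabilities caps = display.getHdrCapabilities();
        for (int type : caps.getSupportedHdrTypes()) {
            if (type == Display.HdrCapabilities.HDR_TYPE_HDR10) {
                // The display also reports its desired luminance range.
                float maxNits = caps.getDesiredMaxLuminance(); // cd/m^2
                return true;
            }
        }
        return false; // An empty array means the display has no HDR support.
    }
}
```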

<h3 id="decoder">Decoder</h3>

<p>Applications shall use the existing
<a href="https://developer.android.com/reference/android/media/MediaCodecInfo.CodecCapabilities.html#profileLevels">
<code>CodecCapabilities.profileLevels</code></a> API to verify support for the
new HDR-capable profiles:</p>

<h4>Dolby-Vision</h4>

<p><code>MediaFormat</code> mime constant:</p>
<blockquote><pre>
String MIMETYPE_VIDEO_DOLBY_VISION
</pre></blockquote>

<p><code>MediaCodecInfo.CodecProfileLevel</code> profile constants:</p>
<blockquote><pre>
int DolbyVisionProfileDvavPen
int DolbyVisionProfileDvavPer
int DolbyVisionProfileDvheDen
int DolbyVisionProfileDvheDer
int DolbyVisionProfileDvheDtb
int DolbyVisionProfileDvheDth
int DolbyVisionProfileDvheDtr
int DolbyVisionProfileDvheStn
</pre></blockquote>

<p>Dolby Vision video layers and metadata must be concatenated into a single
buffer per frame by video applications. This is done automatically by the
Dolby-Vision capable MediaExtractor.</p>

<h4>HEVC HDR10</h4>

<p><code>MediaCodecInfo.CodecProfileLevel</code> profile constants:</p>
<blockquote><pre>
int HEVCProfileMain10HDR10
</pre></blockquote>

<h4>VP9 HLG &amp; PQ</h4>

<p><code>MediaCodecInfo.CodecProfileLevel</code> profile
constants:</p>
<blockquote><pre>
int VP9Profile2HDR
int VP9Profile3HDR
</pre></blockquote>

<p>If a platform supports an HDR-capable decoder, it shall also support an
HDR-capable extractor.</p>

<p>Only tunneled decoders are guaranteed to play back HDR content. Playback
by non-tunneled decoders may result in the HDR information being lost and
the content being flattened into an SDR color volume.</p>
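
<p>The profile check above can be sketched with
<code>MediaCodecList</code>. A minimal example for the VP9 HDR profiles
(the helper class and method names are illustrative):</p>

```java
import android.media.MediaCodecInfo;
import android.media.MediaCodecInfo.CodecProfileLevel;
import android.media.MediaCodecList;

public class HdrDecoderCheck {
    /** Returns true if any decoder advertises a VP9 HDR profile. */
    public static boolean hasVp9HdrDecoder() {
        MediaCodecList list = new MediaCodecList(MediaCodecList.REGULAR_CODECS);
        for (MediaCodecInfo info : list.getCodecInfos()) {
            if (info.isEncoder()) continue;
            for (String type : info.getSupportedTypes()) {
                if (!type.equalsIgnoreCase("video/x-vnd.on2.vp9")) continue;
                // Scan CodecCapabilities.profileLevels for the HDR profiles.
                for (CodecProfileLevel pl :
                        info.getCapabilitiesForType(type).profileLevels) {
                    if (pl.profile == CodecProfileLevel.VP9Profile2HDR
                            || pl.profile == CodecProfileLevel.VP9Profile3HDR) {
                        return true;
                    }
                }
            }
        }
        return false;
    }
}
```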

<h3 id="extractor">Extractor</h3>

<p>The following containers are supported for the various HDR technologies
on Android 7.0:</p>

<table>
<tbody>
<tr>
<th>Technology</th>
<th>Dolby-Vision</th>
<th>HDR10</th>
<th>VP9-HLG</th>
<th>VP9-PQ</th>
</tr>
<tr>
<th>Container</th>
<td>MP4</td>
<td>MP4</td>
<td>WebM</td>
<td>WebM</td>
</tr>
</tbody>
</table>

<p>Discovery of whether a track (of a file) requires HDR support is not
supported by the platform. Applications may parse the codec-specific data
to determine if a track requires a specific HDR profile.</p>

<h3 id="summary">Summary</h3>

<p>Component requirements for each HDR technology are shown in the following table:</p>

<div style="overflow:auto">
<table>
<tbody>
<tr>
<th>Technology</th>
<th>Dolby-Vision</th>
<th>HDR10</th>
<th>VP9-HLG</th>
<th>VP9-PQ</th>
</tr>
<tr>
<th>Supported HDR type (Display)</th>
<td>HDR_TYPE_DOLBY_VISION</td>
<td>HDR_TYPE_HDR10</td>
<td>HDR_TYPE_HLG</td>
<td>HDR_TYPE_HDR10</td>
</tr>
<tr>
<th>Container (Extractor)</th>
<td>MP4</td>
<td>MP4</td>
<td>WebM</td>
<td>WebM</td>
</tr>
<tr>
<th>Decoder</th>
<td>MIMETYPE_VIDEO_DOLBY_VISION</td>
<td>MIMETYPE_VIDEO_HEVC</td>
<td>MIMETYPE_VIDEO_VP9</td>
<td>MIMETYPE_VIDEO_VP9</td>
</tr>
<tr>
<th>Profile (Decoder)</th>
<td>One of the Dolby profiles</td>
<td>HEVCProfileMain10HDR10</td>
<td>VP9Profile2HDR or
VP9Profile3HDR</td>
<td>VP9Profile2HDR or
VP9Profile3HDR</td>
</tr>
</tbody>
</table>
</div>
<br>

<p>Notes:</p>
<ul>
<li>Dolby-Vision bitstreams are packaged in an MP4 container in a way defined
by Dolby. Applications may implement their own Dolby-capable extractors as
long as they package the access units from the corresponding layers into a
single access unit for the decoder as defined by Dolby.</li>
<li>A platform may support an HDR-capable extractor but no corresponding
HDR-capable decoder.</li>
</ul>

<h2 id="playback">Playback</h2>

<p>After an application has verified support for HDR playback, it can play
back HDR content nearly the same way as it plays back non-HDR content,
with the following caveats:</p>

<ul>
<li>For Dolby-Vision, whether a specific media file/track requires
an HDR-capable decoder is not immediately available. The application must
have this information in advance or be able to obtain it by
parsing the codec-specific data section of the MediaFormat.</li>
<li><code>CodecCapabilities.isFormatSupported</code> does not consider whether
the tunneled decoder feature is required for supporting such a profile.</li>
</ul>
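
<p>Because <code>isFormatSupported</code> does not cover the tunneling
requirement, an application can check both conditions explicitly. A minimal
sketch (the helper class and method names are illustrative):</p>

```java
import android.media.MediaCodecInfo;
import android.media.MediaCodecInfo.CodecCapabilities;
import android.media.MediaCodecList;
import android.media.MediaFormat;

public class HdrSupportCheck {
    /** Returns true if some decoder supports the format in tunneled mode. */
    public static boolean isTunneledHdrSupported(MediaFormat format) {
        String mime = format.getString(MediaFormat.KEY_MIME);
        MediaCodecList list = new MediaCodecList(MediaCodecList.REGULAR_CODECS);
        for (MediaCodecInfo info : list.getCodecInfos()) {
            if (info.isEncoder()) continue;
            try {
                CodecCapabilities caps = info.getCapabilitiesForType(mime);
                // isFormatSupported checks the profile/level; tunneling
                // must be checked separately (see the caveat above).
                if (caps.isFormatSupported(format)
                        && caps.isFeatureSupported(
                                CodecCapabilities.FEATURE_TunneledPlayback)) {
                    return true;
                }
            } catch (IllegalArgumentException e) {
                // This codec does not handle the mime type; keep scanning.
            }
        }
        return false;
    }
}
```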

<h2 id="enablinghdr">Enabling HDR platform support</h2>

<p>SoC vendors and OEMs must do additional work to enable HDR platform
support for a device.</p>

<h3 id="platformchanges">Platform changes in Android 7.0 for HDR</h3>

<p>Here are some key changes in the platform (application/native layer)
that OEMs and SoCs need to be aware of.</p>

<h3 id="display">Display</h3>

<h4>Hardware composition</h4>

<p>HDR-capable platforms must support blending HDR content with non-HDR
content. The exact blending characteristics and operations are not defined
by Android as of release 7.0, but the process generally follows these steps:</p>
<ol>
<li>Determine a linear color space/volume that contains all layers to be
composited, based on the layers' color, mastering, and potential dynamic
metadata.
<br>If compositing directly to a display, this could be the linear space
that matches the display's color volume.</li>
<li>Convert all layers to the common color space.</li>
<li>Perform the blending.</li>
<li>If displaying through HDMI:
<ol style="list-style-type: lower-alpha">
<li>Determine the color, mastering, and potential dynamic metadata for the
blended scene.</li>
<li>Convert the resulting blended scene to the derived color
space/volume.</li>
</ol></li>
<li>If displaying directly to the display, convert the resulting blended
scene to the required display signals to produce that scene.</li>
</ol>

<h4>Display discovery</h4>

<p>HDR display discovery is only supported via HWC2. Partners must selectively
enable the HWC2 adapter that is released with Android 7.0 for this feature
to work. Therefore, platforms must add support for HWC2 or extend the AOSP
framework to allow a way to provide this information. HWC2 exposes a new
API to propagate HDR static data to the framework and the application.</p>

<h4>HDMI</h4>

<ul>
<li>A connected HDMI display advertises
its HDR capability through HDMI EDID as defined in CTA-861.3
section 4.2.</li>
<li>The following EOTF mapping shall be used:
<ul>
<li>ET_0 Traditional gamma - SDR Luminance Range: not mapped to any HDR
type</li>
<li>ET_1 Traditional gamma - HDR Luminance Range: not mapped to any HDR
type</li>
<li>ET_2 SMPTE ST 2084 - mapped to HDR type HDR10</li>
</ul></li>
<li>The signaling of Dolby Vision or HLG support over HDMI is done as defined
by their relevant bodies.</li>
<li>Note that the HWC2 API uses float desired luminance values, so the 8-bit
EDID values must be translated in a suitable fashion.</li>
</ul>
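
<p>The translation can follow the formulas CTA-861.3 gives for the desired
content luminance code values. A pure-Java sketch of that math, assuming the
standard formulas 50&middot;2<sup>CV/32</sup> for max luminance and
max&middot;(CV/255)<sup>2</sup>/100 for min luminance; the class and method
names are illustrative, not part of any API:</p>

```java
/**
 * Sketch of the 8-bit EDID code value (CV) to float luminance translation
 * described in CTA-861.3 (HDR Static Metadata Data Block).
 */
public class EdidLuminance {
    /** Desired content max (or max frame-average) luminance in cd/m^2. */
    static float maxLuminance(int cv) {
        return 50f * (float) Math.pow(2.0, cv / 32.0);
    }

    /** Desired content min luminance in cd/m^2; depends on the max value. */
    static float minLuminance(int cvMin, float maxLum) {
        float f = cvMin / 255f;
        return maxLum * f * f / 100f;
    }

    public static void main(String[] args) {
        float max = maxLuminance(96); // CV 96 -> 50 * 2^3 = 400 cd/m^2
        System.out.println(max);
        System.out.println(minLuminance(255, max)); // 400 * 1 / 100 = 4 cd/m^2
    }
}
```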

<h3 id="decoders">Decoders</h3>

<p>Platforms must add HDR-capable tunneled decoders and advertise their HDR
support. Generally, HDR-capable decoders must:</p>
<ul>
<li>Support tunneled decoding (<code>FEATURE_TunneledPlayback</code>).</li>
<li>Support HDR static metadata
(<code>OMX.google.android.index.describeHDRColorInfo</code>) and its
propagation to the display/hardware composition. For HLG, appropriate metadata
must be submitted to the display.</li>
<li>Support color description
(<code>OMX.google.android.index.describeColorAspects</code>) and its
propagation to the display/hardware composition.</li>
<li>Support HDR embedded metadata as defined by the relevant standard.</li>
</ul>

<h4>Dolby Vision decoder support</h4>

<p>To support Dolby Vision, platforms must add a Dolby-Vision capable
HDR OMX decoder. Given the specifics of Dolby Vision, this is normally a
wrapper decoder around one or more AVC and/or HEVC decoders as well as a
compositor. Such decoders must:</p>
<ul>
<li>Support the mime type "video/dolby-vision."</li>
<li>Advertise supported Dolby Vision profiles/levels.</li>
<li>Accept access units that contain the sub-access-units of all layers as
defined by Dolby.</li>
<li>Accept codec-specific data defined by Dolby. For example, data containing
the Dolby Vision profile/level and possibly the codec-specific data for the
internal decoders.</li>
<li>Support adaptive switching between Dolby Vision profiles/levels as
required by Dolby.</li>
</ul>

<p>When configuring the decoder, the actual Dolby profile is not communicated
to the codec. This is only done via codec-specific data after the decoder
has been started. A platform could choose to support multiple Dolby Vision
decoders, for example one for AVC profiles and another for HEVC profiles,
so it can initialize the underlying codecs at configure time. If a single
Dolby Vision decoder supports both types of profiles, it must also support
switching between those dynamically in an adaptive fashion.</p>
<p>If a platform provides a Dolby-Vision capable decoder in addition to the
general HDR decoder support, it must:</p>

<ul>
<li>Provide a Dolby-Vision aware extractor, even if it does not support
HDR playback.</li>
<li>Provide a decoder that supports at least Dolby Vision profile X/level
Y.</li>
</ul>

<h4>HDR10 decoder support</h4>

<p>To support HDR10, platforms must add an HDR10-capable OMX decoder. This
is normally a tunneled HEVC decoder that also supports parsing and handling
HDMI-related metadata. Such a decoder (in addition to the general HDR decoder
support) must:</p>
<ul>
<li>Support the mime type "video/hevc."</li>
<li>Advertise support for the HEVCMain10HDR10 profile. HEVCMain10HDR10 profile
support also requires supporting the HEVCMain10 profile, which in turn requires
supporting the HEVCMain profile at the same levels.</li>
<li>Support parsing the mastering metadata SEI blocks, as well as other
HDR-related info contained in the SPS.</li>
</ul>

<h4>VP9 decoder support</h4>

<p>To support VP9 HDR, platforms must add a VP9 Profile2-capable HDR OMX
decoder. This is normally a tunneled VP9 decoder that also supports handling
HDMI-related metadata. Such decoders (in addition to the general HDR decoder
support) must:</p>
<ul>
<li>Support the mime type "video/x-vnd.on2.vp9."</li>
<li>Advertise support for the VP9Profile2HDR profile. VP9Profile2HDR profile
support also requires supporting the VP9Profile2 profile at the same level.</li>
</ul>

<h3 id="extractors">Extractors</h3>

<h4>Dolby Vision extractor support</h4>

<p>Platforms that support Dolby Vision decoders must add support for a Dolby
extractor (called Dolby Extractor) for Dolby Vision content.</p>
<ul>
<li>A regular MP4 extractor can only extract the base layer from a file,
but not the enhancement or metadata layers, so a special Dolby extractor is
needed to extract the data from the file.</li>
<li>The Dolby extractor must expose one or two tracks for each Dolby video
track (group):
<ul>
<li>A Dolby Vision HDR track with the type "video/dolby-vision" for the
combined 2- or 3-layer Dolby stream. The HDR track's access-unit format, which
defines how to package the access units from the base/enhancement/metadata
layers into a single buffer to be decoded into a single HDR frame, is to be
defined by Dolby.</li>
<li>If a Dolby Vision video track contains a separate (backward compatible)
base layer (BL), the extractor must also expose this as a separate "video/avc"
or "video/hevc" track. The extractor must provide regular AVC/HEVC access
units for this track.</li>
<li>The BL track must have the same track-unique-ID ("track-ID") as the
HDR track so the app understands that these are two encodings of the same
video.</li>
<li>The application can decide which track to choose based on the platform's
capability.</li>
</ul>
</li>
<li>The Dolby Vision profile/level must be exposed in the track format of
the HDR track.</li>
<li>If a platform provides a Dolby-Vision capable decoder, it must also provide
a Dolby-Vision aware extractor, even if it does not support HDR playback.</li>
</ul>

<h4>HDR10 and VP9 HDR extractor support</h4>

<p>There are no additional extractor requirements to support HDR10 or VP9
HLG. Platforms must extend the MP4 extractor to support VP9 PQ in MP4. HDR
static metadata must be propagated in the VP9 PQ bitstream, such that this
metadata is passed to the VP9 PQ decoder and to the display via the normal
MediaExtractor =&gt; MediaCodec pipeline.</p>

<h3 id="stagefright">Stagefright extensions for Dolby Vision support</h3>

<p>Platforms must add Dolby Vision format support to Stagefright:</p>
<ul>
<li>Support for port definition query for the compressed port.</li>
<li>Support profile/level enumeration for the DV decoder.</li>
<li>Support exposing the DV profile/level for DV HDR tracks.</li>
</ul>

<h2 id="implementationnotes">Technology-specific implementation details</h2>

<h3 id="hdr10decoder">HDR10 decoder pipeline</h3>

<p><img src="../images/hdr10_decoder_pipeline.png"></p>

<p class="img-caption"><strong>Figure 1.</strong> HDR10 pipeline</p>

<p>HDR10 bitstreams are packaged in MP4 containers. Applications use a regular
MP4 extractor to extract the frame data and send it to the decoder.</p>

<ul>
<li><b>MPEG4 Extractor</b><br>
HDR10 bitstreams are recognized as just a normal HEVC stream by the
MPEG4Extractor, and the HDR track with the type "video/HEVC" will be
extracted. The framework picks an HEVC video decoder that supports the
Main10HDR10 profile to decode that track.</li>

<li><b>HEVC Decoder</b><br>
HDR information is in either the SEI or the SPS. The HEVC decoder first
receives frames that contain the HDR information. The decoder then extracts
the HDR information and notifies the application that it is decoding an HDR
video. The HDR information is bundled into the decoder output format, which
is propagated to the surface later.</li>
</ul>
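
<p>On the application side, the bundled HDR information can be observed when
the output format changes. A sketch using the asynchronous MediaCodec
callback, assuming API 24+ and that the decoder populates
<code>MediaFormat.KEY_HDR_STATIC_INFO</code>; the class name is
illustrative and the other callback methods are left empty for brevity:</p>

```java
import android.media.MediaCodec;
import android.media.MediaFormat;
import java.nio.ByteBuffer;

public class HdrFormatListener {
    static MediaCodec.Callback newCallback() {
        return new MediaCodec.Callback() {
            @Override
            public void onOutputFormatChanged(MediaCodec codec,
                                              MediaFormat format) {
                if (format.containsKey(MediaFormat.KEY_HDR_STATIC_INFO)) {
                    // HDR static metadata as propagated by the HDR10 decoder.
                    ByteBuffer staticInfo = format.getByteBuffer(
                            MediaFormat.KEY_HDR_STATIC_INFO);
                    // Inspect or log the metadata here.
                }
            }
            @Override public void onInputBufferAvailable(
                    MediaCodec codec, int index) {}
            @Override public void onOutputBufferAvailable(
                    MediaCodec codec, int index, MediaCodec.BufferInfo info) {}
            @Override public void onError(
                    MediaCodec codec, MediaCodec.CodecException e) {}
        };
    }
}
```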

<h4>Vendor actions</h4>
<ol>
<li>Advertise the supported HDR decoder profile and level OMX type. Example:<br>
<code>OMX_VIDEO_HEVCProfileMain10HDR10</code> (and <code>Main10</code>)</li>
<li>Implement support for index:
<code>OMX.google.android.index.describeHDRColorInfo</code></li>
<li>Implement support for index:
<code>OMX.google.android.index.describeColorAspects</code></li>
<li>Implement support for SEI parsing of mastering metadata.</li>
</ol>

<h3 id="dvdecoder">Dolby Vision decoder pipeline</h3>

<p><img src="../images/dolby_vision_decoder_pipleline.png"></p>

<p class="img-caption"><strong>Figure 2.</strong> Dolby Vision pipeline</p>

<p>Dolby bitstreams are packaged in MP4 containers as defined by
Dolby. Applications could, in theory, use a regular MP4 extractor to extract
the base layer, enhancement layer, and metadata layer independently; however,
this does not fit the current Android MediaExtractor/MediaCodec model.</p>

<ul>
<li>DolbyExtractor:
<ul>
<li>Dolby bitstreams are recognized by a DolbyExtractor, which exposes the
various layers as one or two tracks for each Dolby video track (group):
<ul>
<li>An HDR track with the type "video/dolby-vision" for the combined
2- or 3-layer Dolby stream. The HDR track's access-unit format, which defines
how to package the access units from the base/enhancement/metadata layers
into a single buffer to be decoded into a single HDR frame, is to be defined
by Dolby.</li>
<li>(Optional, only if the BL is backward compatible) A BL track contains
only the base layer, which must be decodable by a regular MediaCodec decoder,
for example, an AVC/HEVC decoder. The extractor should provide regular
AVC/HEVC access units for this track. This BL track must have the same
track-unique-ID ("track-ID") as the Dolby track so the application
understands that these are two encodings of the same video.</li>
</ul></li>
<li>The application can decide which track to choose based on the platform's
capability.</li>
<li>Because an HDR track has a specific HDR type, the framework will pick
a Dolby video decoder to decode that track. The BL track will be decoded by
a regular AVC/HEVC video decoder.</li>
</ul></li>

<li>DolbyDecoder:
<ul>
<li>The DolbyDecoder receives access units that contain the required access
units for all layers (EL+BL+MD or BL+MD).</li>
<li>CSD (codec-specific data, such as SPS+PPS+VPS) information for the
individual layers can be packaged into one CSD frame to be defined by
Dolby. Having a single CSD frame is required.</li>
</ul></li>
</ul>

<h4>Dolby actions</h4>
<ol>
<li>Define the packaging of access units for the various Dolby container
schemes (e.g. BL+EL+MD) for the abstract Dolby decoder (i.e. the buffer
format expected by the HDR decoder).</li>
<li>Define the packaging of CSD for the abstract Dolby decoder.</li>
</ol>

<h4>Vendor actions</h4>
<ol>
<li>Implement the Dolby extractor. This can also be done by Dolby.</li>
<li>Integrate the DolbyExtractor into the framework. The entry point is
<code>frameworks/av/media/libstagefright/MediaExtractor.cpp</code>.</li>
<li>Declare the HDR decoder profile and level OMX
type. Example: <code>OMX_VIDEO_DOLBYPROFILETYPE</code> and
<code>OMX_VIDEO_DOLBYLEVELTYPE</code>.</li>
<li>Implement support for index:
<code>OMX.google.android.index.describeColorAspects</code></li>
<li>Propagate the dynamic HDR metadata to the app and surface in each
frame. Typically this information must be packaged into the decoded frame
as defined by Dolby, because the HDMI standard does not provide a way to
pass this to the display.</li>
</ol>

<h3 id="v9decoder">VP9 decoder pipeline</h3>

<p><img src="../images/vp9-pq_decoder_pipleline.png"></p>

<p class="img-caption"><strong>Figure 3.</strong> VP9-PQ pipeline</p>

<p>VP9 bitstreams are packaged in WebM containers in a way defined by the WebM
team. Applications need to use a WebM extractor to extract HDR metadata from
the bitstream before sending frames to the decoder.</p>

<ul>
<li>WebM Extractor:
<ul>
<li>The WebM extractor extracts the HDR <a
href="http://www.webmproject.org/docs/container/#colour">metadata</a>
and frames from the <a
href="http://www.webmproject.org/docs/container/#location-of-the-colour-element-in-an-mkv-file">
container</a>.</li>
</ul></li>

<li>VP9 Decoder:
<ul>
<li>The decoder receives Profile2 bitstreams and decodes them as normal VP9
streams.</li>
<li>The decoder receives any HDR static metadata from the framework.</li>
<li>The decoder receives static metadata via the bitstream access units for
VP9 PQ streams.</li>
<li>The VP9 decoder must be able to propagate the HDR static/dynamic metadata
to the display.</li>
</ul></li>
</ul>

<h4>Vendor actions</h4>

<ol>
<li>Implement support for index:
<code>OMX.google.android.index.describeHDRColorInfo</code></li>
<li>Implement support for index:
<code>OMX.google.android.index.describeColorAspects</code></li>
<li>Propagate HDR static metadata.</li>
</ol>