Create an HD Video Player With HDR Tech
A walkthrough of implementing HDR in a video player.
What Is HDR and Why Does It Matter
Streaming technology has improved significantly, pushing video resolutions ever higher: from 480p and below (standard definition, or SD) to 720p and above (high definition, or HD).
Video quality is vital for any app that plays video. Research I recently came across backs this up: 62% of people are more likely to perceive a brand negatively after a poor-quality video experience, and 57% of people are less likely to share a poor-quality video. With this in mind, it's no wonder that so many solutions for enhancing video quality are emerging.
One such solution is HDR (high dynamic range), a processing technique used in imaging and photography that mimics what the human eye can see by bringing out detail in dark areas and improving contrast. When used in a video player, HDR delivers richer colours and a higher level of detail.
Many HDR solutions, however, are let down by annoying restrictions: a lack of unified technical specifications, a high level of difficulty in implementation, and a requirement for ultra-high-definition source videos. I looked for a solution without such restrictions and, luckily, found one: the HDR Vivid SDK from HMS Core Video Kit. This solution is packed with image-processing features like the optoelectronic transfer function (OETF), tone mapping, and HDR2SDR. With these features, the SDK can equip a video player with richer colours, a higher level of detail, and more.
I used the SDK together with the HDR Ability SDK (which can also be used independently) to try the latter's brightness adjustment feature and found that they could deliver an even better HDR video playback experience. And on that note, I'd like to share how I used these two SDKs to create a video player.
Before Development
1. Configure the app information as needed in AppGallery Connect.
2. Integrate the HMS Core SDK.
For Android Studio, the SDK can be integrated via the Maven repository. This must be done before starting development.
3. Configure the obfuscation scripts.
4. Add permissions, including those for accessing the Internet, obtaining the network status, accessing the Wi-Fi network, writing data into the external storage, reading data from the external storage, reading device information, checking whether a device is rooted, and obtaining the wake lock. (The last three permissions are optional.)
App Development
Preparations
1. Check whether the device is capable of decoding an HDR Vivid video. If the device has such a capability, the following function will return true. A short sketch of using this check appears after the function.
public boolean isSupportDecode() {
    // Check whether the device supports MediaCodec.
    MediaCodecList mcList = new MediaCodecList(MediaCodecList.ALL_CODECS);
    MediaCodecInfo[] mcInfos = mcList.getCodecInfos();
    for (MediaCodecInfo mci : mcInfos) {
        // Filter out encoders; only decoders are relevant here.
        if (mci.isEncoder()) {
            continue;
        }
        String[] types = mci.getSupportedTypes();
        String typesArr = Arrays.toString(types);
        // Filter out non-HEVC decoders.
        if (!typesArr.contains("hevc")) {
            continue;
        }
        for (String type : types) {
            // Check whether 10-bit HEVC decoding is supported.
            MediaCodecInfo.CodecCapabilities codecCapabilities = mci.getCapabilitiesForType(type);
            for (MediaCodecInfo.CodecProfileLevel codecProfileLevel : codecCapabilities.profileLevels) {
                if (codecProfileLevel.profile == MediaCodecInfo.CodecProfileLevel.HEVCProfileMain10
                        || codecProfileLevel.profile == MediaCodecInfo.CodecProfileLevel.HEVCProfileMain10HDR10
                        || codecProfileLevel.profile == MediaCodecInfo.CodecProfileLevel.HEVCProfileMain10HDR10Plus) {
                    // true means supported.
                    return true;
                }
            }
        }
    }
    // false means unsupported.
    return false;
}
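Here is a brief sketch of how the check above might gate playback. playWithHdrVivid and playAsSdr are hypothetical helpers, not part of the SDK; they stand in for however your player starts HDR and SDR playback respectively.
if (isSupportDecode()) {
    // The device can decode 10-bit HEVC, so the HDR Vivid pipeline can be used.
    playWithHdrVivid(videoUri);   // hypothetical helper
} else {
    // Fall back to a plain SDR playback path on devices without the capability.
    playAsSdr(videoUri);          // hypothetical helper
}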
2. Parse the video to obtain its resolution, OETF, colour space, and colour format, and save this information in a custom class. In the example below, the class is named VideoInfo. A sketch of how the information can be obtained follows the class definition.
public class VideoInfo {
    private int width;
    private int height;
    private int tf;          // OETF (transfer function) of the source
    private int colorSpace;
    private int colorFormat;
    private long durationUs;
}
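One way to fill VideoInfo is with Android's MediaExtractor and MediaFormat; this is an assumption on my part, not something the SDK mandates. The colour-related keys require API level 24 or higher and may be absent for some files, and how their values map to the SDK's own constants would need to be verified. The setter methods on VideoInfo are also assumed.
VideoInfo parseVideoInfo(String videoPath) throws IOException {
    MediaExtractor extractor = new MediaExtractor();
    extractor.setDataSource(videoPath);
    VideoInfo info = new VideoInfo();
    for (int i = 0; i < extractor.getTrackCount(); i++) {
        MediaFormat format = extractor.getTrackFormat(i);
        String mime = format.getString(MediaFormat.KEY_MIME);
        if (mime == null || !mime.startsWith("video/")) {
            continue; // Skip audio and other non-video tracks.
        }
        extractor.selectTrack(i);
        info.setWidth(format.getInteger(MediaFormat.KEY_WIDTH));
        info.setHeight(format.getInteger(MediaFormat.KEY_HEIGHT));
        info.setDurationUs(format.getLong(MediaFormat.KEY_DURATION));
        if (format.containsKey(MediaFormat.KEY_COLOR_TRANSFER)) {
            info.setTf(format.getInteger(MediaFormat.KEY_COLOR_TRANSFER)); // OETF of the source
        }
        if (format.containsKey(MediaFormat.KEY_COLOR_STANDARD)) {
            info.setColorSpace(format.getInteger(MediaFormat.KEY_COLOR_STANDARD));
        }
        if (format.containsKey(MediaFormat.KEY_COLOR_FORMAT)) {
            info.setColorFormat(format.getInteger(MediaFormat.KEY_COLOR_FORMAT));
        }
        break; // Use the first video track.
    }
    extractor.release();
    return info;
}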
3. Create a SurfaceView object that will be used by the SDK to process the rendered images.
// surface_view is defined in a layout file.
SurfaceView surfaceView = (SurfaceView) view.findViewById(R.id.surface_view);
4. Create a thread that parses video streams from the video, as sketched below.
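The article does not show this thread, so the following is a minimal sketch built around a standard MediaCodec decode loop. It assumes decoder is a MediaCodec decoder that has already been configured with the SDK's input Surface (created in the next section) and extractor is a MediaExtractor positioned on the video track.
new Thread(() -> {
    MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
    boolean inputDone = false;
    boolean outputDone = false;
    while (!outputDone) {
        if (!inputDone) {
            int inIndex = decoder.dequeueInputBuffer(10_000);
            if (inIndex >= 0) {
                ByteBuffer inputBuffer = decoder.getInputBuffer(inIndex);
                int sampleSize = extractor.readSampleData(inputBuffer, 0);
                if (sampleSize < 0) {
                    // No more samples: signal end of stream to the decoder.
                    decoder.queueInputBuffer(inIndex, 0, 0, 0, MediaCodec.BUFFER_FLAG_END_OF_STREAM);
                    inputDone = true;
                } else {
                    decoder.queueInputBuffer(inIndex, 0, sampleSize, extractor.getSampleTime(), 0);
                    extractor.advance();
                }
            }
        }
        int outIndex = decoder.dequeueOutputBuffer(bufferInfo, 10_000);
        if (outIndex >= 0) {
            boolean eos = (bufferInfo.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0;
            // true renders the decoded frame to the Surface the decoder was configured with.
            decoder.releaseOutputBuffer(outIndex, true);
            if (eos) {
                outputDone = true;
            }
        }
    }
}, "video-parser-thread").start();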
Rendering and Transcoding a Video
1. Create and then initialize an instance of HdrVividRender.
HdrVividRender hdrVividRender = new HdrVividRender();
hdrVividRender.init();
2. Configure the OETF and resolution for the video source.
// Configure the OETF.
hdrVividRender.setTransFunc(2);
// Configure the resolution.
hdrVividRender.setInputVideoSize(3840, 2160);
When the SDK is used on an Android device, only the rendering mode for input is supported.
3. Configure the brightness for the output. This step is optional.
hdrVividRender.setBrightness(700);
4. Create a Surface object to serve as the input. This method is called when HdrVividRender works in rendering mode, and the created Surface object is passed to the SDK as the inputSurface parameter of configure. A sketch of connecting this Surface to a MediaCodec decoder follows the snippet.
Surface inputSurface = hdrVividRender.createInputSurface();
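Here is a brief sketch (not from the article) of handing the input Surface to a MediaCodec decoder, assuming videoFormat is the MediaFormat obtained while parsing the video. Decoded frames are then rendered onto inputSurface, where the SDK picks them up, instead of directly onto the screen.
MediaCodec decoder = MediaCodec.createDecoderByType(videoFormat.getString(MediaFormat.KEY_MIME));
// Route decoded frames to the SDK's input Surface rather than a display Surface.
decoder.configure(videoFormat, inputSurface, null, 0);
decoder.start();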
5. Configure the output. Which of the following settings apply depends on the output mode:
- Set the dimensions of the rendered Surface object. This step is necessary in the rendering mode for output.
// surfaceView is the video playback window.
hdrVividRender.setOutputSurfaceSize(surfaceView.getWidth(), surfaceView.getHeight());
- Set the colour space for the buffered output video. This applies to the transcoding mode for output and is optional; when no colour space is set, BT.709 is used by default.
hdrVividRender.setColorSpace(HdrVividRender.COLORSPACE_P3);
- Set the colour format for the buffered output video. This applies to the transcoding mode for output and is optional; when no colour format is specified, R8G8B8A8 is used by default.
hdrVividRender.setColorFormat(HdrVividRender.COLORFORMAT_R8G8B8A8);
6. When the rendering mode is used as the output mode, the following APIs are required.
hdrVividRender.configure(inputSurface, new HdrVividRender.InputCallback() {
    @Override
    public int onGetDynamicMetaData(HdrVividRender hdrVividRender, long pts) {
        // Set the static metadata, which needs to be obtained from the video source.
        HdrVividRender.StaticMetaData lastStaticMetaData = new HdrVividRender.StaticMetaData();
        hdrVividRender.setStaticMetaData(lastStaticMetaData);
        // Set the dynamic metadata, which also needs to be obtained from the video source.
        ByteBuffer dynamicMetaData = ByteBuffer.allocateDirect(10);
        hdrVividRender.setDynamicMetaData(20000, dynamicMetaData);
        return 0;
    }
}, surfaceView.getHolder().getSurface(), null);
7. When the transcoding mode is used as the output mode, call the following APIs.
hdrVividRender.configure(inputSurface, new HdrVividRender.InputCallback() {
    @Override
    public int onGetDynamicMetaData(HdrVividRender hdrVividRender, long pts) {
        // Set the static metadata, which needs to be obtained from the video source.
        HdrVividRender.StaticMetaData lastStaticMetaData = new HdrVividRender.StaticMetaData();
        hdrVividRender.setStaticMetaData(lastStaticMetaData);
        // Set the dynamic metadata, which also needs to be obtained from the video source.
        ByteBuffer dynamicMetaData = ByteBuffer.allocateDirect(10);
        hdrVividRender.setDynamicMetaData(20000, dynamicMetaData);
        return 0;
    }
}, null, new HdrVividRender.OutputCallback() {
    @Override
    public void onOutputBufferAvailable(HdrVividRender hdrVividRender, ByteBuffer byteBuffer,
            HdrVividRender.BufferInfo bufferInfo) {
        // Process the buffered data.
    }
});
HdrVividRender.OutputCallback is used to process the returned buffered data asynchronously. If this callback is not used, the read method can be called instead. For example:
hdrVividRender.read(new BufferInfo(), 10); // 10 is a timestamp, which is determined by your app.
8. Start the processing flow.
hdrVividRender.start();
9. Stop the processing flow.
hdrVividRender.stop();
10. Release the resources that have been occupied. A condensed sketch of the whole call sequence follows the snippet.
hdrVividRender.release();
hdrVividRender = null;
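For orientation, the following condenses steps 1 through 10 in rendering mode into one sequence. It is only a sketch: inputCallback stands for the HdrVividRender.InputCallback implementation from step 6, and the decode loop that feeds inputSurface runs on its own thread.
HdrVividRender hdrVividRender = new HdrVividRender();
hdrVividRender.init();
hdrVividRender.setTransFunc(2);
hdrVividRender.setInputVideoSize(3840, 2160);
hdrVividRender.setBrightness(700);
Surface inputSurface = hdrVividRender.createInputSurface();
hdrVividRender.setOutputSurfaceSize(surfaceView.getWidth(), surfaceView.getHeight());
hdrVividRender.configure(inputSurface, inputCallback, surfaceView.getHolder().getSurface(), null);
hdrVividRender.start();
// ... decode frames onto inputSurface while playback runs ...
hdrVividRender.stop();
hdrVividRender.release();
hdrVividRender = null;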
During the above steps, I noticed that when the dimensions of the Surface change, setOutputSurfaceSize has to be called to re-configure the dimensions of the Surface output.
Besides, in the rendering mode for output, when WisePlayer is switched from the background to the foreground or vice versa, the Surface object is destroyed and then re-created. In this case, the HdrVividRender instance may not have been destroyed. If so, the setOutputSurface API needs to be called so that a new output Surface can be set.
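One way to handle this is through a SurfaceHolder.Callback, as in the sketch below. The exact parameter of setOutputSurface is an assumption here (the article names the API but not its signature), so treat this as an outline rather than a definitive implementation.
surfaceView.getHolder().addCallback(new SurfaceHolder.Callback() {
    @Override
    public void surfaceCreated(SurfaceHolder holder) {
        // The old Surface was destroyed and a new one created: hand it to the SDK.
        hdrVividRender.setOutputSurface(holder.getSurface());
    }

    @Override
    public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
        // Re-configure the output dimensions whenever the Surface size changes.
        hdrVividRender.setOutputSurfaceSize(width, height);
    }

    @Override
    public void surfaceDestroyed(SurfaceHolder holder) {
        // Nothing to do in this sketch.
    }
});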
Setting Up HDR Capabilities
HDR capabilities are provided in the class HdrAbility. It can be used to adjust brightness when the HDR Vivid SDK is rendering or transcoding an HDR Vivid video.
1. Initialize the brightness adjustment function.
HdrAbility.init(getApplicationContext());
2. Enable the HDR feature on the device. Then, the maximum brightness of the device screen will increase.
HdrAbility.setHdrAbility(true);
3. Configure the alternative maximum brightness of white points in the output video image data.
HdrAbility.setBrightness(600);
4. Highlight the video layer.
HdrAbility.setHdrLayer(surfaceView, true);
5. Configure highlighting for the subtitle layer or the bullet comment layer. A combined sketch of these calls follows the list.
HdrAbility.setCaptionsLayer(captionView, 1.5f);
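Putting these calls together, a typical setup might look like the following. This is only a sketch: captionView is assumed to be the view that hosts subtitles or bullet comments, and the calls would normally run during player setup, before HdrVividRender starts processing.
HdrAbility.init(getApplicationContext());
// Raise the maximum screen brightness for HDR playback.
HdrAbility.setHdrAbility(true);
// Alternative maximum brightness of white points in the output video image data.
HdrAbility.setBrightness(600);
// Highlight the video layer rendered into surfaceView.
HdrAbility.setHdrLayer(surfaceView, true);
// Highlight the subtitle/bullet-comment layer (captionView is an assumption).
HdrAbility.setCaptionsLayer(captionView, 1.5f);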
Summary
Video quality has a major influence on the user experience of mobile apps. HDR is often used to post-process video, but it is held back by a number of restrictions, which are resolved by the HDR Vivid SDK from Video Kit.
This SDK is loaded with image-processing features such as the OETF, tone mapping, and HDR2SDR, allowing it to mimic what the human eye can see and deliver immersive videos. Those videos can be enhanced even further with the HDR Ability SDK from the same kit. The functionality and straightforward integration process of these SDKs make them ideal for implementing the HDR feature in a mobile app.