[libav-api] H.264 decoder using libavcodec
nfxjfg at googlemail.com
Fri Aug 8 01:18:11 CEST 2014
On Tue, 5 Aug 2014 03:10:36 -0700
Prashanth Bhat <prashanth_b_bhat at yahoo.com> wrote:
> I'm using libavcodec to perform decoding of H.264 frames.
> I'm on a Linux environment (14.04 Xubuntu), and Intel Haswell CPU.
> My program decodes the frames without rendering them on the screen. With 4 simultaneous 1080p decodes at 15 fps, CPU utilization is around 90% (not bad), but the load average shown by 'top' is 20+.
> This looks excessive. I don't think I'm using the hardware decoding ability of the Haswell CPU. Could someone please advise how to find out if I'm effectively using the decoder?
Hardware decoding is a completely different beast from CPU decoding.
libavcodec provides only part of what is involved: you also need to use
VA-API (Intel's hardware-decoding API) and set it up to work with
libavcodec. This is non-trivial, and unfortunately there isn't even a
sample. There are examples for DXVA, VDA, and VDPAU, but none for
VAAPI yet.
Since VDPAU is conceptually still fairly similar to VAAPI, you might
have some success understanding how it works and what you have to do
by looking at the VDPAU example:
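For orientation, the VAAPI hookup with the 2014-era libavcodec API looks roughly like the sketch below. It is an outline of the glue only, not the missing sample: the VADisplay, config ID and context ID are assumed to come from your own plain libva calls (vaInitialize(), vaCreateConfig(), vaCreateContext()).

```c
#include <libavcodec/avcodec.h>
#include <libavcodec/vaapi.h>   /* struct vaapi_context */
#include <libavutil/mem.h>
#include <va/va.h>

/* Pick the VAAPI pixel format when the decoder offers it. */
static enum AVPixelFormat pick_vaapi(AVCodecContext *avctx,
                                     const enum AVPixelFormat *fmts)
{
    const enum AVPixelFormat *p;
    for (p = fmts; *p != AV_PIX_FMT_NONE; p++)
        if (*p == AV_PIX_FMT_VAAPI_VLD)
            return *p;
    return fmts[0];              /* fall back to software decoding */
}

/* Attach an already-created VA-API decode session to the codec context.
 * display/config/context are assumed to come from vaInitialize(),
 * vaCreateConfig() and vaCreateContext() respectively. */
static int attach_vaapi(AVCodecContext *avctx, VADisplay display,
                        VAConfigID config, VAContextID context)
{
    struct vaapi_context *vactx = av_mallocz(sizeof(*vactx));
    if (!vactx)
        return AVERROR(ENOMEM);
    vactx->display         = display;
    vactx->config_id       = config;
    vactx->context_id      = context;
    avctx->hwaccel_context = vactx;
    avctx->get_format      = pick_vaapi;
    return 0;
}
```

Decoded frames then carry a VASurfaceID in frame->data[3]; if you need the pixels in system memory you have to map or copy the surface yourself, which can eat much of the saving.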
> The API calls I'm making are pretty standard -
> The above calls are made during initialization. The below calls are made on each frame -
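(The actual call lists didn't survive the archive. For readers following along, a typical software-decode sequence with the 2014-era API looks roughly like the sketch below; it is not necessarily what the original code does, and `bitstream`/`bitstream_size` are hypothetical placeholders for one encoded H.264 access unit.)

```c
#include <stdint.h>
#include <libavcodec/avcodec.h>

/* One-time setup. */
static AVCodecContext *open_h264(void)
{
    avcodec_register_all();
    AVCodec *codec        = avcodec_find_decoder(AV_CODEC_ID_H264);
    AVCodecContext *avctx = avcodec_alloc_context3(codec);
    if (avcodec_open2(avctx, codec, NULL) < 0)
        return NULL;
    return avctx;
}

/* Per frame: feed one encoded packet, get at most one picture back. */
static int decode_one(AVCodecContext *avctx, AVFrame *frame,
                      uint8_t *bitstream, int bitstream_size)
{
    AVPacket pkt;
    int got_frame = 0;

    av_init_packet(&pkt);
    pkt.data = bitstream;       /* hypothetical buffer: one access unit */
    pkt.size = bitstream_size;
    if (avcodec_decode_video2(avctx, frame, &got_frame, &pkt) < 0)
        return -1;
    return got_frame;           /* 1 if "frame" now holds a picture */
}
```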
> The following points are probably relevant -
> a) The default context allocated by avcodec_alloc_context3() does not have any hw_accel associated with it. I allocated an h264_vaapi accelerator, but this doesn't lead to any improvement.
> b) The codec capabilities do not have the HW_ACCEL bit set.
> Any help would be appreciated.
> libav-api mailing list
> libav-api at libav.org