[libav-api] Quality degradation with delay variance

Luca Barbato lu_zero at gentoo.org
Wed Sep 18 17:23:26 CEST 2013

On 18/09/13 16:12, Janis Zemitis wrote:
> Hi all,
> I am decoding a real-time stream sent from another machine; everything works
> flawlessly on LAN. To test performance I started simulating various
> network characteristics using netem on the server side.
> Artifacts are within the bounds of expectation for corruption, duplication, and
> packet loss, but I get an unusable image (a white overlay over the whole
> frame) when adding even the slightest amount of variance to the delay parameter.
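For reference, a netem setup along the lines described above might look like the following. This is a sketch, not the poster's exact commands; the interface name `eth0` and the delay/variance values are assumptions, and the commands require root:

```shell
# Plain fixed delay: typically harmless to the decoder.
tc qdisc add dev eth0 root netem delay 100ms

# Adding variance (jitter) to the delay: even a small value
# can reorder packets, which is what the decoder then sees.
tc qdisc change dev eth0 root netem delay 100ms 10ms

# Remove the netem qdisc when done.
tc qdisc del dev eth0 root
```

Note that netem implements delay variance by holding packets for differing amounts of time, so a delayed packet can overtake a less-delayed one; over UDP this reaches the receiver as packet reordering rather than simple lateness.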

How bad is it compared to normal packet loss?


More information about the libav-api mailing list