[mythtv-users] Playback On Intel

David Edwards david at more.fool.me.uk
Sun May 11 15:10:59 UTC 2014


A while ago, I ditched my ION-based front end and replaced it with an
Intel i5-4570S with HD4600 graphics in a passively cooled Streacom
case. It's nice and cool, silent, and I thought the CPU should be
plenty fast enough to do software decoding and deinterlacing if VAAPI
doesn't work out.

I'm running MythTV 0.27 on Mythbuntu 14.04, which I believe pretty
much has the latest 2014Q1 Intel graphics stack.

I'm watching on a brand new Panasonic plasma which is entirely capable
of displaying a good picture, connected via HDMI.

Unfortunately, I'm having all sorts of issues with playback quality.
I've tried the High Quality, OpenGL High Quality and VAAPI Normal
playback profiles, but they all have different problems.

Decoding - Using High Quality and OpenGL High Quality I get problems
in dark areas of the picture. A pale cloud of decoding artefacts
slowly builds in shadow areas over a few seconds, then abruptly drops
back to black. This repeats every 10 seconds or so. This does not
happen when using VAAPI to decode.

Judder - On SD MPEG-2 broadcasts there is a noticeable judder every
second or so, particularly on horizontal pans. On HD H.264 broadcasts
the judder is there, but less noticeable. I am in the UK, so these are
25fps interlaced. On ripped DVDs (deinterlaced with Handbrake where
applicable) at 25fps and Blu-rays at 23.976fps, playback is generally
smooth, but there is an occasional jump. This happens on all playback
profiles. The refresh rate on the TV appears to be correctly switching
based on the content.

Deinterlacing - High Quality (Linear Blend) is unwatchable, VAAPI
Normal (Bob 2x) is almost okay on HD material, but poor on SD. OpenGL
High Quality (Greedy High Motion 2x) is good.

Consequently, none of the playback profiles is perfect and I am
starting to hate watching TV.

Some comments and thoughts:

1. The decoding issue seems to me to be a bug in ffmpeg. Does anyone
have any thoughts on this?
2. The worst of the judder can *sometimes* be fixed by running
"DISPLAY=:0.0 xrandr" from an SSH session - i.e. with no parameters -
during playback. The picture jerks once and then settles to be smooth.
It is also sometimes fixed by returning to the playback settings menu
and then starting playback again.
3. Motion Adaptive deinterlacing was added to the Intel graphics stack
in 2013Q2 (see https://01.org/linuxgraphics/downloads/2013/2013q2-intel-graphics-stack-release),
but does not appear as an option in the playback profile settings.
4. Refresh rate switching doesn't select non-integer rates even though
they can be selected manually using "xrandr" from the command line.
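For anyone wanting to reproduce points 2 and 4 above, here is a small sketch. The output name "HDMI1", the mode, and the DISPLAY value are assumptions for illustration - check your own setup with "xrandr --query". The helper just builds the command as a string (a dry run), so you can inspect it before running it for real:

```shell
#!/bin/sh
# Point 2's workaround is simply xrandr with no arguments, which forces
# a re-probe of the outputs during playback:
#   DISPLAY=:0.0 xrandr

# Point 4: build the xrandr invocation that selects a non-integer
# refresh rate manually. Output name "HDMI1" is an assumption; list the
# real names with "xrandr --query".
build_rate_cmd() {
    # $1 = output name, $2 = mode, $3 = refresh rate
    printf 'xrandr --output %s --mode %s --rate %s\n' "$1" "$2" "$3"
}

# Print the command for a 23.976Hz Blu-ray rip:
build_rate_cmd HDMI1 1920x1080 23.976
```

Prefix the printed command with "DISPLAY=:0.0 " when running it from an SSH session, as in point 2.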

It seems to me that some of these problems may well be bugs; if anyone
can suggest what I can do to aid them getting investigated or fixed,
please let me know.

Is no-one else having this sort of trouble? I'm starting to think that
I need to return to Nvidia... but VAAPI and the Intel driver have had
so much work done on them recently that it seems *so* close to being
usable.

David
