[mythtv-users] XvMC and (non)deinterlacing

Paul Leppert phlepper at gmail.com
Tue Mar 1 04:40:46 UTC 2005


I am using 0.17 and an HD-3000 capture card for high definition.  With
an nVidia video card (GeForce 6800) without XvMC, but with the kernel
deinterlacer, I get a great-looking picture with few, if any, motion
artifacts when playing 1080i content on a progressive monitor.

However, when I watch the same content using XvMC with kernel
deinterlacing, I get much lower CPU utilization (a good thing), but
plenty of motion artifacts, as if deinterlacing weren't being applied
at all.  This is with the "Deinterlace playback" checkbox checked and
the kernel deinterlacer selected (as opposed to putting kerneldeint in
the filter box, which, according to the documentation, isn't used with
XvMC).
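(For anyone hitting the same thing: a sketch of what I'd check first, under the assumption that the NVIDIA binary driver's XvMC support is in play. On these drivers, XvMC has historically required an /etc/X11/XvMCConfig file naming the NVIDIA XvMC library; if that file is missing or the library isn't installed, XvMC behavior can be flaky. The exact file path and library name below are assumptions that may vary by driver version and distribution.)

```shell
# Sketch, not a definitive fix: verify the NVIDIA XvMC wrapper config.
# The config file is expected to contain the XvMC library name, e.g.:
cat /etc/X11/XvMCConfig        # should print something like libXvMCNVIDIA_dynamic.so.1

# Confirm that an NVIDIA XvMC library is actually installed and visible
# to the dynamic linker:
ldconfig -p | grep -i xvmc
```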

Is there some way to get deinterlacing to work correctly?  The extra
processing headroom from XvMC is great (I'm using an AMD 3200,
non-64), but the motion artifacts push me back to running without
XvMC.  I've tried the other deinterlacing options as well, including
Bob, all without success.

Any ideas?
-- 
I hear and I forget. I see and I remember. I do and I understand.  --  Confucius
