[mythtv-users] Mythfrontend idle cpu consumption help
Brian J. Murrell
brian at interlinx.bc.ca
Wed Nov 11 16:27:58 UTC 2009
On Wed, 2009-11-11 at 10:46 -0500, Johnny wrote:
> This is off topic, but I am feeding the TV through component inputs
> and it is a nice Sharp CRT.
So you have an interlaced output.
> I am using VDPAU Advanced 2x for
And deinterlacing your input content. Presumably the original content
is interlaced if you are deinterlacing, especially considering it is
probably being captured from a source destined for an interlaced output
device (i.e. an NTSC broadcast).
> With quality scaling you would be hard pressed to see
> the difference. Scaling down and back up will be a problem. But
> scaling up and back down isn't really an issue, especially if it is
> done well.
I always wonder why, for those of us who are capturing interlaced
content and outputting to interlaced target devices, we have to do all
this deinterlacing and, of course, re-interlacing for the interlaced
target we are outputting to. It's silly.
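The round trip being objected to can be sketched as follows (again illustrative names, not MythTV code): a naive "bob" deinterlacer doubles each field to full height, and re-interlacing for the CRT then discards half of every bobbed frame. In this trivial case the result happens to match the source exactly, which is the point: all the interpolation work is wasted. A temporal deinterlacer does more sophisticated synthesis, and re-interlacing throws that away too, along with any precision lost in the intermediate processing.

```python
# Sketch of the deinterlace/re-interlace round trip for an interlaced
# display. Field passthrough would skip both steps entirely.

def split_fields(frame):
    """Inverse of weaving: recover the top (even) and bottom (odd) fields."""
    return frame[0::2], frame[1::2]

def bob(field):
    """Naive bob deinterlace: repeat each field line to fake full height."""
    out = []
    for line in field:
        out.extend([line, line])
    return out

frame = ["T0", "B0", "T1", "B1"]
top, bottom = split_fields(frame)

# Deinterlace each field up to a full progressive frame...
prog_top = bob(top)        # ['T0', 'T0', 'T1', 'T1']
prog_bottom = bob(bottom)  # ['B0', 'B0', 'B1', 'B1']

# ...then re-interlace for the CRT: half of each bobbed frame is discarded.
reinterlaced = [prog_top[i] if i % 2 == 0 else prog_bottom[i]
                for i in range(len(frame))]
print(reinterlaced)        # ['T0', 'B0', 'T1', 'B1'] -- back where we started
```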
If we simply kept the interlaced content interlaced and gave it to the
interlaced target device in its original interlaced format, we'd save a
whole lot of CPU and image manipulation (and the loss that goes with it).
My understanding is that this is because X or the drivers (nvidia in my
case) or something else in the pipeline is simply unable to give the
target device real, interlaced output.
Sure makes me pine for my Matrox G400 and directfb, where interlaced
output was available and it was a simple 1:1 playback from recorded
source material to output device.
Of course, I suppose with the transition to digital outputs and digital
sets (LCD, etc.), the target device can be truly progressive and want
progressive input, in which case the deinterlacing is actually needed.