[mythtv-users] HD rendering: what takes the horsepower?

Johnny jarpublic at gmail.com
Sun Sep 6 04:12:23 UTC 2009


> This is something that's been bugging me....
>
> If I have HD source at 1,920x1,080 and I play it back at that resolution I
> better have some really powerful graphics hardware.
>
> On the other hand, if I play it back at, say, 1,024x768, where do I need the
> horsepower?  Is the CPU still going to be taxed at the same level as with
> the full resolution playback?  What about the GPU?

Well, the processing happens in multiple steps. Decoding the video
requires the same amount of power regardless of the playback
resolution. However, MPEG-2 content is significantly less
computationally intensive to decode than H.264, so the power required
for decoding depends on the codec and bitrate of the source material
(higher bitrates typically go with higher resolutions, but not
necessarily).
The other big processing-intensive step is deinterlacing. I believe
the video card generally does the scaling, so if you are using the CPU
for deinterlacing it will do so at the source resolution. With VDPAU
or the OpenGL deinterlacers, however, it depends on the order in which
the card does the scaling and the deinterlacing. It would make sense
to scale first and then do the rest of the processing, but I don't
know what actually happens inside the GPU.
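
A similar sketch for that second point: the work the deinterlacer does
depends on how many pixels it is fed, which is why the order of
scaling and deinterlacing inside the GPU matters. Again, this is just
an illustration with made-up weights, not anything MythTV or VDPAU is
documented to do:

# Toy model of the two possible orderings. The deinterlacer's cost
# depends on the resolution it runs at; the weights are invented.
def deinterlace_cost(width, height):
    return width * height * 2.0

def scale_cost(out_width, out_height):
    return out_width * out_height * 0.5

def deinterlace_then_scale(src, out):
    # CPU-style path: deinterlace at full source resolution, then scale.
    return deinterlace_cost(*src) + scale_cost(*out)

def scale_then_deinterlace(src, out):
    # Hypothetical GPU path: scale first, so the deinterlacer sees
    # fewer pixels.
    return scale_cost(*out) + deinterlace_cost(*out)

src, out = (1920, 1080), (1024, 768)
print(deinterlace_then_scale(src, out))  # more work for the deinterlacer
print(scale_then_deinterlace(src, out))  # less, if the card scales first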

