[mythtv-users] Correctly behaving 576i (or 480i) from vdpau-capable chip?

Paul Gardiner lists at glidos.net
Sun May 30 14:39:45 UTC 2010


Paul Gardiner wrote:
> Ronald Frazier wrote:
>> On Sun, May 23, 2010 at 8:13 AM, Paul Gardiner <lists at glidos.net> wrote:
>>> Hi,
>>> New TV!! So I want to build a new front end, mainly to be able
>>> to connect via HDMI. For the next year or so, I don't foresee
>>> having any HD content to play, so I don't need vdpau, but I'd
>>> like to build a vdpau-capable system for future proofing.
>>>
>>> For now though, with entirely SD content (PAL 576i), I'd rather let
>>> the TV do all the processing (deinterlacing and scaling), and
>>> not use vdpau at all. I'm planning on using software decoding,
>>> the "2 x Interlaced" software deinterlacer, with X set up to
>>> a 576i mode. The TV accepts that mode via HDMI.
>>>
>>> So what I'd like to know is whether the vdpau-capable chips
>>> can be set up to run that way. I know some chips/drivers
>>> don't support interlaced modes. In the past there were problems
>>> with nVidia cards and interlaced modes (although possibly
>>> only if using XVMC).
>>
>> If I'm reading correctly, you want to buy a VDPAU-capable card now but
>> let the TV do the scaling and deinterlacing anyway. Is there a
>> specific reason you want to take that approach? I welcome someone to
>> correct me if I'm wrong, but I believe the processing done by the
>> VDPAU card beats pretty much everything except (maybe) the highest end
>> TVs. Even the cheapest $30 G210 based chip can do Advanced 2X deint on
>> an SD stream (and every other deint on HD). I'm not sure what you'd
>> gain by ignoring the capabilities of the card you are already going to
>> be buying. I'm not aware of any significant VDPAU bugs in mythtv
>> either (I haven't encountered any in the last 5 months, and in fact
>> the OSD looks much nicer on a VDPAU system).
>>
> 
> That's the thing, the TV is pretty high end. It has something called
> 600Hz intelligent frame creation, which makes motion smoother than
> I've ever seen before from a TV. (It also causes the odd artifact
> when it gets confused, but overall it's a definite win). That
> feature can still be enabled if the frontend does the deinterlacing
> and scaling, but I can't be sure without trying it whether that
> might look slightly worse than the TV doing it.
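
(For the record, the 576i pass-through setup I originally had in mind
was roughly the following in xorg.conf. I haven't tried this exact
snippet; the identifiers are just placeholders, the modeline is the
one usually quoted for PAL 576i, and both it and the mode name would
want checking against what the TV's EDID advertises and what the
driver reports validating in /var/log/Xorg.0.log (Option "ModeDebug"
makes the nvidia driver log why it accepts or rejects a mode):

    Section "Device"
        Identifier "nvidia0"
        Driver     "nvidia"
        Option     "ModeDebug" "true"
    EndSection

    Section "Monitor"
        Identifier "TV"
        # Commonly quoted PAL 576i timing - verify against the TV's EDID
        ModeLine "720x576i" 13.500 720 732 795 864 576 580 586 625 -hsync -vsync interlace
    EndSection

    Section "Screen"
        Identifier "Screen0"
        Device     "nvidia0"
        Monitor    "TV"
        SubSection "Display"
            Modes "720x576i"
        EndSubSection
    EndSection

MythTV would then be left on software decoding with the
"2 x Interlaced" deinterlacer, as above.)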

Turns out you were right. I actually get a significantly better picture
using vdpau than even the TV's internal decoder. The TV suffers from
a little posterization (enough to be annoying) when using its internal
decoder (either live TV or playing Myth recordings via DLNA). With
vdpau on an ION-based board, I get a much better picture, with no
significant posterization. I can still use the TV's Intelligent Frame
Creation. Somehow that part of the processing doesn't degrade the
quality.
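
For anyone wanting to check a particular board before buying, the
vdpauinfo tool (usually packaged simply as "vdpauinfo") lists what the
chip's decoder and video mixer support. As I understand it, MythTV's
Advanced 2X deinterlacer relies on the mixer's temporal-spatial
feature, so that's the line to look for:

    $ vdpauinfo
    (look for DEINTERLACE_TEMPORAL_SPATIAL in the video mixer feature
     list, and for the codecs/resolutions you care about under the
     decoder capabilities)

The deinterlacer itself is then selected per playback profile in
mythfrontend's TV playback settings, with VDPAU decoding chosen for
that profile.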

Cheers,
	Paul.


