[mythtv-users] Re: Any reason for PVR-350 over 250?/nvidia TV-out problems...

David Wood obsidian at panix.com
Fri Jul 16 13:47:38 EDT 2004


On Thu, 15 Jul 2004, John Smith wrote:

>>> So, any reason(s) to choose a pvr-350 instead of a
>>> pvr-250?
>
>> 1) It produces the very best picture quality for TV
> playback.
>
> Could you quantify it a little? ;) Is picture quality
> with other cards/on-board video (like the Geforce 440
> MX) more blurry, etc.?

This is an interesting point that I wish I had learned a little more about 
earlier. I'll try to keep it simple, although that is really futile.  :) 
Apologies for whatever details I get wrong.

First of all, you have to know about interlacing. I won't rehash that here 
- too basic, but you can google it and get the idea quickly.

TV is interlaced. (Leaving aside progressive DVDs, HDTV, and anything else 
I'm forgetting about.)

Computers _generally_ deal with non-interlaced ("progressive") video 
signals. Your VGA connector, for instance.

TV-Out on your computer, though, is interlaced again (by definition). It 
takes "progressive" data from a buffer and interlaces it on the way out.

Now, your average computer TV-*in* (capture card) will take the interlaced 
signal and _preserve_ it that way. The fields are kept separate in the 
digital video data. Sometimes, when you watch it on your (VGA) computer 
monitor, you can see them! And that's actually a playback error. 
Displaying an interlaced "signal" on a non-interlaced display looks wrong. 
You see comb patterns when things move. Maybe there are other issues too 
(like timing), if your playback system is sloppy enough to comb on you 
anyway.
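The comb comes from naively "weaving" the two fields back together. 
Another toy sketch of mine:

def weave(top_field, bottom_field):
    # Interleave the two fields back into one progressive frame.
    # The catch: the fields were captured ~1/60s apart, so anything
    # that moved in between ends up on alternating lines - that's
    # the comb pattern you see on motion.
    frame = []
    for top_line, bottom_line in zip(top_field, bottom_field):
        frame.append(top_line)
        frame.append(bottom_line)
    return frame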

Basically, it's harder than it sounds to properly convert a TV signal to 
play on a digital, "progressive display" computer. So a lot of folks get 
it wrong. Including Myth and/or the tools it depends on, apparently - even 
though this should be our bread and butter. Well, it's being worked on, at 
any rate.

(Simplifying a bit) the right way to display it is to "deinterlace" the 
signal. That's a complicated, messy process where we combine the two 
"fields" of the signal back into one progressive image, through various 
means ("bob" line-doubling, blending the fields, fancier motion-adaptive 
filters). We lose some quality. We burn some CPU time doing it too; maybe 
quite a bit.
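"Bob" is about the simplest of those means - treat each field as its own 
frame and double every line back up to full height. A sketch (again mine, 
and cruder than what real players do):

def deinterlace_bob(top_field, bottom_field):
    # Each field becomes its own full-height frame by doubling
    # every scanline. No combing, and you get 60 frames/s out of
    # 30 interlaced ones - but only half the vertical detail.
    # That's the quality we lose.
    frame_a = [line for line in top_field for _ in (0, 1)]
    frame_b = [line for line in bottom_field for _ in (0, 1)]
    return frame_a, frame_b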

Now the sad irony is that TV-out on your computer (*almost* all the time) 
takes the progressive video that you've just created and converts it 
_back_ into interlaced video (and probably scales it, too). This is messy 
_again_. You lose some more quality. And by the way, if you didn't 
properly *de-*interlace your TV data in the first place, you'll _still_ 
see comb artifacts and everything else that's probably wrong with your 
recorded TV signal - only worse, because now you've taken the wrong 
result, converted it back to analog, and re-interlaced it. This really 
makes things weird. It can make the problem more obvious, or it can even 
sometimes hide it a little. The cruel irony.

Notice how I said this happens "almost" all the time?

It doesn't happen for PVR-350 owners.

Originally, I heard what everyone else was saying about the 350 - that it 
was "better." But I didn't understand why; I just assumed better-quality 
components or whatever. But it's not that - it's fundamental. This is why:

The 250 and the 350 both record an interlaced signal.

But only the 350 can output that _exact same_ interlaced signal back to 
your TV. Without reprocessing it. Without (unnecessarily) losing any 
quality. Without doing any work.

That's why it's (probably) worth what it costs. It may even be worth the 
pain of dealing with the (still sketchy) 350 display driver situation.

Now, most people are like me, stuck with the 
interlace-deinterlace-reinterlace madness. We have hope, because there is 
a "right" way to deal with it. We just have to fix our software. We have 
to properly de-interlace the recorded TV signal, as cleanly as we can 
manage, before it is reinterlaced again.
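Strung together, the round trip looks something like this (a 
self-contained sketch of mine; rows here are lists of pixel values, and 
blending is just one "clean" deinterlace choice among several):

def blend_and_reinterlace(top_field, bottom_field):
    # Deinterlace by averaging the two fields: the comb is gone,
    # but so is the motion between the fields.
    blended = [[(a + b) // 2 for a, b in zip(t_row, b_row)]
               for t_row, b_row in zip(top_field, bottom_field)]
    # Double the lines back up to full frame height.
    frame = [row for row in blended for _ in (0, 1)]
    # TV-out then splits the frame back into fields. Both fields
    # come out identical, so the original 1/60s of motion never
    # reaches the TV. The 350 skips this whole function.
    return frame[0::2], frame[1::2]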

Our output will never be as good as the 350 (barring some harrowing 
advance in driver sophistication - if some other hardware can be made to 
do the same trick as the 350?). But, considering most of us are just using 
"standard" NTSC televisions, when we fix what's outrageously broken we can 
probably get the difference in quality down below what you can detect 
anyway.

Apparently not yet, though.
