[mythtv-users] Re: Any reason for PVR-350 over 250?/nvidia TV-out problems...

David Wood obsidian at panix.com
Fri Jul 16 16:16:20 EDT 2004


First of all, all good points - all stemming from my glossing over things
to try to keep it short (funny, right?) and more understandable (yeah, a
real riot).

On Fri, 16 Jul 2004, Anthony Spinoza wrote:

> ?? half the HDTV formats are interlaced

I know. Just wanted to talk about NTSC for the moment to keep from 
confusing things.

>> TV-Out on your computer, though, is interlaced again (by definition). It
>> takes "progressive" data from a buffer, and interlaces it, when sending
>> it out.
>
> You have to be more specific: NTSC TV out, or any of
> the EDTV / HDTV interlaced modes?

Yeah, for me TV-Out == S-Video NTSC. Sorry.
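
(Just to be concrete about what "interlaces it on the way out" means - a toy
sketch in Python, nothing to do with any actual driver code: the encoder
reads a whole progressive frame but only sends every other scanline per
field.)

    # Toy illustration only. A progressive frame is a full stack of
    # scanlines; the TV encoder sends it as two fields, each carrying
    # every other line, 1/60th of a second apart.
    def fields_from_progressive(frame):
        top_field = frame[0::2]       # even scanlines, first pass
        bottom_field = frame[1::2]    # odd scanlines, next pass
        return top_field, bottom_field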

>> But only the 350 can output that _exact same_ interlaced signal back to
>> your TV. Without reprocessing it. Without (unnecessarily) losing any
>> quality. Without doing any work.
>
> Oh it does plenty of work; just because the mpeg2 is
> done in hardware doesn't mean it's not changing the
> picture.

Of course. I was only talking about "unnecessary" DSP like 
deinterlace/reinterlace. I deliberately left aside the issue of hardware 
compression/decompression (which I was afraid would only confuse things).
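
(If a toy example helps - this is Python with made-up numbers, not Myth's
actual filter code - here's why the deinterlace/reinterlace round trip
isn't free:)

    # Toy values and a toy blend filter, just to show the round trip is
    # not an identity operation.
    def blend_deinterlace(frame):
        # crude linear-blend-style pass: smear each interior scanline
        # with its neighbours from the other field
        out = list(frame)
        for i in range(1, len(frame) - 1):
            out[i] = (frame[i - 1] + frame[i + 1]) / 2.0
        return out

    def reinterlace(frame):
        return frame[0::2], frame[1::2]   # split the fields back apart

    original = [10.0, 200.0, 12.0, 198.0, 14.0, 196.0]  # toy scanlines
    after = reinterlace(blend_deinterlace(original))
    # after != (original[0::2], original[1::2]) - the original field
    # detail is gone, and we burned CPU to lose it.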

>> Our output will never be as good as the 350 (barring some harrowing
>> advance in driver sophistication - if some other hardware can be made
>> to do the same trick as the 350?).
>
> Huh? You can achieve this same "trick" with any
> capture card and any tv out card. Don't deinterlace,
> and match the recording resolution to the playback
> resolution, which is essentially all the 350 is doing.
> It's just nice to have all the chips on one card.

I gather that nothing but the 350 is capable of doing what you describe on 
the TV-out (i.e. S-Video/composite NTSC) - that is, taking interlaced data, 
in the form of an MPEG stream or something else, and outputting it 
directly. For that matter, many seem to have trouble even getting the 
native capture resolution, and end up using 800x600 or 640x480 instead, 
with scaling on top of that.

I had the impression most commodity TV-out hardware is focused on 
interlacing non-interlaced (computer-generated) images, and has no 
provision for simply reproducing an interlaced source.
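
(For reference, the typical nvidia TV-out setup I see posted looks roughly
like the snippet below - option names as I remember them from the nvidia
binary driver's README, and the identifiers are just placeholders. Note
that everything in it is a progressive desktop mode, which the TV encoder
chip then scales and interlaces on its own terms:)

    Section "Screen"
        Identifier   "TVScreen"          # placeholder
        Device       "NvidiaCard"        # placeholder
        Monitor      "TV"                # placeholder
        DefaultDepth 24
        Option       "TVStandard"  "NTSC-M"
        Option       "TVOutFormat" "SVIDEO"
        SubSection "Display"
            Depth 24
            Modes "800x600" "640x480"    # progressive modes; the encoder
                                         # rescales and interlaces these
        EndSubSection
    EndSection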

>> But, considering most of us are just using "standard" NTSC televisions,
>
> Most, for now, probably not next year.

That soon?  :)

> what's broken? I'm not trying to be an arse, I just
> don't see what's broken. Myth works with nearly every
> tv capture card that has drivers in linux; it will
> encode pure interlaced, or deinterlace, and play back
> interlaced, or deinterlace on the fly.
>
> NTSC myth setup, no deinterlacing
>
> bt878 480x480 interlaced --> encode --> TVout at 480x480
> NTSC interlaced timing

In theory you're right. In practice, though, I see a lot of "TV-Out 
Problem" threads, and traffic on mythdev about the filters.
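
(For anyone who wants to try that recipe, my understanding is that it maps
onto roughly these knobs - setting names from memory, so treat them as
approximate:)

    mythtv-setup, recording profile:  width 480, height 480
                                      (match the bt878 capture size)
    mythfrontend, TV Settings -> Playback:  leave "Deinterlace playback" off
    X server / TV-out:  a mode with 480 active lines and NTSC interlaced
                        timing - which is exactly the part that most
                        cards/drivers don't seem to give you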

