[mythtv-users] NVidia non-interlaced video output quality-loss?

Thomas Börkel thomas at boerkel.de
Sun Feb 15 06:11:52 EST 2004


Hi!

steve at nexusuk.org wrote:

>>What are the exact consequences of the non-interlaced video output with 
>>the NVidia cards?
> 
> A loss of picture sharpness - basically it's slightly blurry in the 
> vertical direction, and high-motion scenes will look a bit smeared as 
> it combines the 2 fields (which differ significantly from each other 
> in high-motion scenes).  It's basically the same as turning on 
> deinterlacing in Myth.
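
If I understand the blending right, it is roughly a per-line average of 
the two fields - something like this sketch (my own illustration in 
Python, not MythTV's actual code):

    # One frame as a list of rows of pixel values; even rows come from
    # one field, odd rows from the other.
    def blend_deinterlace(frame):
        out = []
        for y in range(len(frame)):
            nxt = frame[min(y + 1, len(frame) - 1)]
            # Mix each line with its neighbour from the other field, so
            # nothing is thrown away but fast motion smears slightly.
            out.append([(a + b) // 2 for a, b in zip(frame[y], nxt)])
        return out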

Don't I have to turn deinterlacing on in Myth if I have an NVidia card 
and my captured MPEG-2 (from the PVR-250) is interlaced?

> Horizontal resolution should be unaffected.
> 
>>Assuming that analog TV is 352x380 native, would I then only have 
>>352x240? Isn't that a big loss in quality?
> 
> Where are you getting 352x380 from?  PAL is 768x576 and NTSC is 720x480.  

Sorry, typo! I meant 352x480. I thought that analog TV only carries 
about half the possible horizontal resolution, hence 352 instead of 
roughly 704 samples per line.

> Deinterlacing only affects the vertical resolution, and it doesn't just 
> throw away one field; it blends them together, so the output won't be 
> nearly as bad as just throwing away one field (nor will it be anywhere 
> near as good as keeping all the fields and watching an interlaced 
> picture).
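
Just to make sure I follow: "throwing away one field" would be something 
like the sketch below (again my own illustration, not Myth's code), and 
that really would halve the vertical detail - 480 stored lines carrying 
only 240 distinct ones:

    # Keep only one field and line-double it back to full height.
    def single_field(frame):
        out = []
        for y in range(0, len(frame) - 1, 2):
            out.append(frame[y])  # keep the even-field line
            out.append(frame[y])  # duplicate it in place of the odd line
        return out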

OK, thanks. I am curious how it will look (I'm buying my MythTV box 
components next week).

Thomas


