[mythtv-users] Anyone know WHY I have to deinterlace when using a 1080i tv?

Doug Larrick doug at parkercat.org
Mon Nov 20 21:33:50 UTC 2006


Patrick Ouellette wrote:
> Howdy Campers,
> 
> I've noticed that I have to have de-interlacing enabled when watching
> MythTV (live or recorded) to get a good picture.  The crazy part of this
> is my display device is a 1080i format television (CRT, not DLP/LCD).
> 
> If I turn off de-interlacing the picture is jumpy/ragged as I would
> expect an interlaced video to appear on a progressive display.
> 
> TV is a 5-year-old Samsung that does 480p and 1080i (ONLY), driven by an
> Nvidia 6600 card via its component outputs.  I have been using XvMC,
> but I also recall noticing this effect without XvMC.
> 
> I'm pretty sure the video card is in 1080i mode since thin lines on the
> Myth menus flicker.
> 
> The real question: is this a Myth issue, an Nvidia driver issue, or
> something else?

AFAICT it's an nVidia driver issue.  Its XVideo support doesn't include
proper output of interlaced source material onto an interlaced mode
(unless you're using the TV-out, which only gives you 480i).  You might
(or might not) have better luck with OpenGL-based video output.  Or you
can run a 1920x540p @ 60Hz mode with bob deinterlacing, which seems to
be good enough to fool most people's 1080i TVs into thinking it's real
1080i.  You'll want to enable "use separate modes for video and GUI" so
the menus don't look all weird.
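
For the record, here's roughly what such a mode looks like in xorg.conf.
This is only a sketch (the timings come from halving the vertical timings
of the standard 74.25 MHz 1080i mode, and the "SamsungCRT" identifier is
just a placeholder), so check the numbers against what your particular set
will actually accept:

  Section "Monitor"
      Identifier "SamsungCRT"
      # 1920x540p @ ~60 Hz: 74.25 MHz pixel clock, 33.75 kHz horizontal
      # sync (the same line rate as 1080i), ~60.05 Hz vertical refresh
      ModeLine "1920x540" 74.25 1920 2008 2052 2200 540 542 547 562 +HSync +VSync
  EndSection

Then list "1920x540" in the Modes line of your Screen section's Display
subsection so it's available for playback, and leave the GUI on the 1080i
mode you already use.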

-Doug
