[mythtv-users] Anyone know WHY I have to deinterlace when using a 1080i tv?

Steven Adeff adeffs.mythtv at gmail.com
Tue Nov 21 19:14:24 UTC 2006


On 11/20/06, Patrick Ouellette <pat at flying-gecko.net> wrote:
> Howdy Campers,
>
> I've noticed that I have to have de-interlacing enabled when watching
> MythTV (live or recorded) to get a good picture.  The crazy part is
> that my display device is a 1080i-format television (CRT, not DLP/LCD).
>
> If I turn off de-interlacing, the picture is jumpy/ragged, as I would
> expect interlaced video to appear on a progressive display.
>
> The TV is a 5-year-old Samsung that does 480p and 1080i (ONLY), driven
> by an Nvidia 6600 card via its component outputs.  I have been using
> XvMC, but I also recall noticing this effect without XvMC.
>
> I'm pretty sure the video card is in 1080i mode since thin lines on the
> Myth menus flicker.
>
> The real question: is this a Myth issue, an Nvidia driver issue, or
> something else?
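
First, a quick sanity check on the 1080i question: the nvidia driver
logs the TV mode it actually picks, and you can pin it down explicitly
in xorg.conf. The Option names below come from the nvidia driver
README; treat this as a sketch for a component-out setup, not a
known-good config for your exact TV:

    Section "Device"
        Identifier "Videocard0"
        Driver     "nvidia"
        Option     "TVOutFormat" "COMPONENT"   # YPbPr component dongle
        Option     "TVStandard"  "HD1080i"     # force 1080i instead of 480p
    EndSection

    # then confirm what mode the driver actually chose:
    grep -i tv /var/log/Xorg.0.log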

Beyond that, try enabling OpenGL in MythTV's TV Playback settings.
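
That setting should be under Utilities/Setup -> Setup -> TV Settings ->
Playback, though the exact wording varies by MythTV version. Also worth
knowing: with XvMC enabled the choice of deinterlacers is limited, so
it's worth testing with XvMC off as well. If you do stick with XvMC,
make sure it's pointed at the nvidia library; on a typical install
(paths assumed, check your distro) that means:

    # /etc/X11/XvMCConfig should contain the nvidia XvMC library name
    echo "libXvMCNVIDIA_dynamic.so.1" > /etc/X11/XvMCConfig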

-- 
Steve
Before you ask, read the FAQ!
http://www.mythtv.org/wiki/index.php/Frequently_Asked_Questions
then search the Wiki, and this list,
http://www.gossamer-threads.com/lists/mythtv/
Mailing list etiquette -
http://www.mythtv.org/wiki/index.php/Mailing_List_etiquette

