[mythtv-users] nVidia interlace problem

Alex Butcher mythlist at assursys.co.uk
Thu Nov 13 15:24:03 UTC 2008


On Thu, 13 Nov 2008, Paul Gardiner wrote:

> I've seen some posts about problems with interlacing on nVidia
> cards, but I've been unable to pick up exactly what the problem
> is. Please can someone give a brief explanation? Is it only
> a problem with TV out, or would it cause problems with the VGA-to-SCART
> trick? Are there any signs of it being fixed?

I'm using a homemade VGA-to-RGB-SCART cable that generates composite sync
from the VGA sync signals. My card is a generic GeForce4 MX440, and thus I'm
forced to use nVidia's legacy drivers (currently 96.43.07). Normal graphical
output is properly interlaced using a 720x576@50Hz PAL modeline; things like
the MythTV UI are razor-sharp.
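
For reference, the mode is along these lines; the timings below are the
commonly circulated 625-line PAL ones, so treat this as a sketch rather than
my exact configuration (your cable and TV may need slight adjustment):

  # xorg.conf: 720x576 interlaced PAL. 13.875 MHz dot clock / 888 total
  # columns = 15.625 kHz line rate; / 625 total lines = 25 frames
  # (50 fields) per second.
  ModeLine "720x576PAL" 13.875  720 744 808 888  576 581 586 625 \
           -HSync -VSync Interlace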

Output to Xvideo (e.g. MythTV playback) appears to have every other scanline
doubled, so I need to use the linearblend deinterlacer to get correct
shapes (e.g. the BBC Four logo is excessively aliased without
deinterlacing).
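
For comparison outside MythTV, mplayer's linear-blend postprocessing filter
over Xv shows much the same result (the filename is just a placeholder):

  # Xv output plus linear-blend deinterlacing - roughly equivalent to
  # MythTV's linearblend deinterlacer.
  mplayer -vo xv -vf pp=lb recording.mpg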

Configuring MythTV to use OpenGL or plain X11 (NO_XV=1 /usr/bin/mythfrontend)
output for playback results in correct interlaced output (i.e. as sharp as
the UI), but performance suffers so badly on a P4 2.53GHz machine, even with
SD material, as to be unusable. mplayer using OpenGL output, however, is
usable.
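
For anyone wanting to reproduce this, the invocations I mean are as follows
(paths and filenames are whatever your install uses):

  # Force mythfrontend to fall back from Xv to plain X11 output:
  NO_XV=1 /usr/bin/mythfrontend

  # mplayer with OpenGL output, which remains usable here:
  mplayer -vo gl recording.mpg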

> Cheers,
> 	Paul.

HTH,
Alex.
