Talk:Deinterlacing


Interlaced picture, interlaced display

Regarding the problems with this configuration: the "interlaced" filter is specifically designed to address the sync problems attributed to Nvidia cards on this page; they are actually caused by a limitation of Xv. The original patch submission has a very good explanation, so maybe it could be worked into this page? See http://svn.mythtv.org/trac/ticket/6391 Ali1234 15:13, 10 February 2010 (UTC)

Linear Blend Requires XvMC?

I thought that XvMC only supported Bob, so why would linear blend require XvMC?

Deinterlacing and SDTVs

Is there any benefit to deinterlacing for display on regular SDTV (as in standard non-HDTV)?

A regular CRT TV running at 50 Hz (PAL) or 60 Hz (NTSC) does not need deinterlacing unless the graphics adapter does not support the correct resolution for the local SD signal. For example, PAL runs at 576i, but some graphics cards will only do TV-Out at 640x480 or 800x600, so deinterlacing the image before it gets scaled twice can improve the picture quality. If you do have such a graphics card, which probably does not support Xv either, you should consider getting a new one (e.g. an Nvidia FX5200- or 6200-based card for around €35).
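To illustrate what a simple software deinterlacer such as linear blend actually does, here is a minimal stand-alone C sketch that blends each line of an 8-bit luma plane with its neighbours to soften the comb artifacts caused by the two fields being sampled at different times. This is only an illustration written for this page, not MythTV's actual filter code, and the 1-2-1 weighting is an assumption.

 #include <stdint.h>
 #include <stdio.h>
 
 /* Blend each line with the lines above and below (1-2-1 weighting).
  * This is the basic idea behind a linear-blend style deinterlacer. */
 static void linear_blend(const uint8_t *src, uint8_t *dst, int width, int height)
 {
     for (int y = 0; y < height; y++) {
         const uint8_t *above = src + (y > 0 ? y - 1 : y) * width;
         const uint8_t *cur   = src + y * width;
         const uint8_t *below = src + (y < height - 1 ? y + 1 : y) * width;
         for (int x = 0; x < width; x++)
             dst[y * width + x] = (uint8_t)((above[x] + 2 * cur[x] + below[x] + 2) / 4);
     }
 }
 
 int main(void)
 {
     /* Tiny synthetic "frame": even lines bright, odd lines dark, i.e. an
      * exaggerated comb pattern between the two fields. */
     enum { W = 8, H = 4 };
     uint8_t src[W * H], dst[W * H];
     for (int y = 0; y < H; y++)
         for (int x = 0; x < W; x++)
             src[y * W + x] = (y % 2) ? 32 : 224;
 
     linear_blend(src, dst, W, H);
 
     for (int y = 0; y < H; y++)
         printf("line %d: %3d -> %3d\n", y, src[y * W], dst[y * W]);
     return 0;
 }

The blended output trades some vertical resolution for the removal of the comb pattern, which is exactly the trade-off you make when deinterlacing before the image is scaled to the TV-Out resolution.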

-Does the 6150 (nForce onboard) support 576-line output? How? --Turpie 02:10, 28 September 2007 (UTC)

I do not know if it supports 576-line output, but the usual method is to use Xv. Xv is an additional video layer (on Windows the equivalent is the overlay buffer) where the graphics adapter does any scaling. The CPU just dumps the video into Xv and the GPU scales it to whatever resolution you have set on the VGA output (e.g. from 576 lines to 768 lines). The TV-Out on the video board, however, runs at the native TV resolution (PAL 576 lines, NTSC 480 lines), so the GPU scales the normal graphics buffer down to that resolution. If the contents of the Xv buffer are already 576 lines (for PAL), they are sent to the TV-Out unscaled, even though your graphics adapter has been set to run at 800x600 or 1024x768.
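To make the division of labour concrete, the sketch below is a small stand-alone libXv program (not MythTV code) that simply asks the X server which Xv adaptors and ports the driver exposes; MythTV hands its decoded frames to one of these ports and the GPU does the scaling. Compile with gcc xvlist.c -lXv -lX11; the exact output depends on your driver.

 #include <stdio.h>
 #include <X11/Xlib.h>
 #include <X11/extensions/Xvlib.h>
 
 int main(void)
 {
     Display *dpy = XOpenDisplay(NULL);
     if (!dpy) { fprintf(stderr, "cannot open display\n"); return 1; }
 
     unsigned int version, release, req_base, ev_base, err_base;
     if (XvQueryExtension(dpy, &version, &release,
                          &req_base, &ev_base, &err_base) != Success) {
         fprintf(stderr, "Xv extension not available\n");
         return 1;
     }
 
     unsigned int num_adaptors = 0;
     XvAdaptorInfo *adaptors = NULL;
     XvQueryAdaptors(dpy, DefaultRootWindow(dpy), &num_adaptors, &adaptors);
 
     /* List each adaptor the driver exposes; video players grab one of
      * these ports and the hardware scales whatever is put into it. */
     for (unsigned int i = 0; i < num_adaptors; i++)
         printf("adaptor %u: %s (%lu ports)\n",
                i, adaptors[i].name, adaptors[i].num_ports);
 
     XvFreeAdaptorInfo(adaptors);
     XCloseDisplay(dpy);
     return 0;
 }

The xvinfo command-line tool reports the same information in more detail, including the supported image formats.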

Note that you usually cannot take screen dumps of the contents of the Xv buffer, as it is a separate system from the normal graphics buffer on the adapter.

So using Xv gives much better quality (no scaling on the TV-Out) and much better performance (scaling to e.g. 800x600 is done by the GPU, not the CPU).

If you use TV-Out you should still pay attention to the resolution of your graphics adapter. For PAL use 800x600, as it is close to the 576 lines; for NTSC use 640x480. These are low resolutions, but the normal desktop is more readable than it would be at 1024x768.

For LCD TVs, do not use TV-Out. Set the graphics adapter to the native resolution of the TV (in my case 1360x768) and use the VGA or DVI output of the graphics adapter.
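As a rough example of that last point, the xorg.conf fragment below pins the desktop to the panel's native mode. The identifiers are placeholders, and whether your driver accepts 1360x768 directly depends on the card and the TV's EDID, so treat it as a sketch rather than a known-good configuration.

 Section "Screen"
     Identifier   "Screen0"
     Device       "Device0"
     Monitor      "Monitor0"
     DefaultDepth 24
     SubSection "Display"
         Depth 24
         Modes "1360x768"   # native panel resolution; 800x600 (PAL) or 640x480 (NTSC) for TV-Out instead
     EndSubSection
 EndSection

You can also switch modes at runtime with xrandr, e.g. xrandr -s 1360x768, to check that the TV accepts the mode before making it permanent.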

Decoding and deinterlacing performance of VDPAU capable video cards

This seems to be a good source for info on the different cards' capabilities: http://www.nvnews.net/vbulletin/showthread.php?t=133465