Deinterlacing is the process of converting traditional interlaced video into a progressive picture that can be displayed on modern non-interlaced display devices such as LCD or plasma screens.
There is a lot of general information on deinterlacing at Wikipedia in these two articles:
While targeted at DivX authoring, the visual examples from the following site are valuable in demonstrating artifacts that one may see with various forms of deinterlacing:
MythTV has several options for deinterlacing. The configuration options can be found under Mythfrontend->Setup->Video->Playback. On Playback Profiles (3/9), edit each entry; the primary and fallback deinterlacers may be set on the third sub-page.
Choosing a Deinterlacer
The best deinterlacers can require lots of CPU, and/or VDPAU or OpenGL video hardware, particularly for 1080i content. You may find that you have enough resources to use a powerful deinterlacer for your SDTV (480i) content, but you need to use a less resource-hungry deinterlacer for 1080i. (720p content does not require deinterlacing.)
To tune for your system, you may simply need to try different deinterlacers to see which look best on your screen and which don't overtax your CPU. If you see signs of tearing (discontinuities in the video), jittery video, or prebuffering pauses while your CPU usage is high, you may be overtaxing things. If you see lots of "combing," where moving objects have alternating teeth, you may not be deinterlacing at all.
Note that the deinterlacers available vary based on your renderer. For example, you will find many more deinterlacing choices if you use the OpenGL renderer, and the VDPAU system provides some high-quality hardware deinterlacers. (VDPAU is available only in MythTV 0.22 and later.)
There is a python script available for Comparing deinterlacers.
The best deinterlacers will double the frame rate (typically 29.97 fps in the USA) to 59.94 fps, which is the rate at which the fields (half-frames) of interlaced video occur; for each field, they build a whole frame. This requires that your display be able to operate at the doubled frame rate. Surprisingly, some displays run just a hair below that, in which case MythTV will switch to your "fallback" deinterlacer, as set on your configuration screen. Doubling the frame rate also requires considerably more resources. Some video comes at 25 fps, which doubles to 50 fps and should pose no problem for the monitor refresh rate. You may also encounter loss of deinterlacing when using time stretch.
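As a back-of-the-envelope check of the rates involved (these are the standard NTSC and PAL values, not anything MythTV-specific):

```python
from fractions import Fraction

# NTSC frame rate is exactly 30000/1001 fps; a 2x deinterlacer outputs
# one progressive frame per field, i.e. double that rate.
ntsc_frame = Fraction(30000, 1001)
ntsc_field = 2 * ntsc_frame
print(float(ntsc_frame))   # ≈ 29.97
print(float(ntsc_field))   # ≈ 59.94

# A display that runs "a hair below" 59.94 Hz cannot keep up with the
# doubled rate, which is when MythTV falls back to the fallback deinterlacer.

# PAL: 25 fps doubles to exactly 50 fields/s, matching 50 Hz displays.
pal_field = 2 * Fraction(25)
print(float(pal_field))    # 50.0
```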
Your TV probably has a fairly good quality hardware deinterlacer, as do some (usually higher-end) A/V receivers. To make use of those hardware deinterlacers, your video card must output an interlaced video signal to the TV. In X, this typically requires a special modeline for the display. Because you will not want interlaced output for 720p or 1080p video, you will want to set up MythTV to use different video modes depending on what type of stream is being played. When using the display's native deinterlacing, set MythTV's deinterlacer to "none." You will not be able to zoom, and may have trouble with time stretch, but you will get quality deinterlacing with minimal CPU.
Please note that many people have had problems getting NVIDIA video cards to properly output an interlaced video signal. Some have played tricks, like using a 60-frames-per-second, 540-line modeline.
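As a sketch, an interlaced 1080i modeline in xorg.conf might look like the following; the timings shown are the standard CEA-861 1080i values, but verify them against your display's documentation or EDID before use:

```
Section "Monitor"
    Identifier "TV"
    # Standard CEA-861 1080i timing: 74.25 MHz pixel clock,
    # 2200x1125 total raster, interlaced.
    Modeline "1920x1080i" 74.25 1920 2008 2052 2200 1080 1084 1094 1125 interlace +hsync +vsync
EndSection
```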
Available Deinterlacing Systems
These are arranged in approximate order of preference. You probably want the best one your CPU and GPU can handle. Hardware acceleration (in OpenGL and inherent in VDPAU) will let you handle a better deinterlacer. MythTV has a number of 2x deinterlacers which attempt to turn 30 Hz (or 25 Hz) interlaced video into 60 Hz (or 50 Hz) progressive video. The aim is to improve on non-2x deinterlacers which look fine for static scenes but create jumpiness when the scene contains motion. If the display is not capable of displaying 2x the frame rate, then MythTV will use the fallback deinterlacer.
High / normal / slim quality
The high / normal / slim quality settings use the Xv-blit renderer for on-screen elements like the volume, position and menus. The downside is that these elements are scaled with the video: when watching low-resolution video (576i) on a 1080p display, the menus appear very blurry. To avoid this, use OpenGL, VAAPI or VDPAU. The deinterlacers in this mode are (in order of preference):
- One field - discards one of the two fields, halving vertical resolution (low-resolution use only)
- Linear blend - linear blend of the two fields
- Kernel (1x and 2x) - linear blend when needed (requires SSE support)
- Greedy HiMotion (1x and 2x) - motion compensation in forward frame
- Yadif (1x and 2x) - motion compensation both forwards and backwards in frames
- Interlaced 2x - attempts to synchronize video with output display
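To illustrate the two simplest entries above, here is a toy sketch of "one field" and "linear blend" on a frame reduced to a list of scanline brightness values. Real deinterlacers work per pixel and per field; this is not MythTV's code, just a demonstration of the idea:

```python
def one_field(frame):
    """Keep only the even (top-field) scanlines and line-double them.
    Halves vertical resolution, hence 'low-resolution use only'."""
    out = []
    for i in range(0, len(frame), 2):
        out += [frame[i], frame[i]]  # duplicate each kept scanline
    return out

def linear_blend(frame):
    """Average each scanline with the next one, mixing the two fields
    together to soften combing artifacts."""
    out = []
    for i in range(len(frame)):
        nxt = frame[min(i + 1, len(frame) - 1)]  # clamp at bottom edge
        out.append((frame[i] + nxt) / 2)
    return out

# Strong combing: alternating scanlines (fields) disagree completely.
frame = [0, 100, 0, 100]
print(one_field(frame))     # [0, 0, 0, 0]
print(linear_blend(frame))  # [50.0, 50.0, 50.0, 100.0]
```

One field simply throws the odd-line field away; linear blend keeps both fields but smears them together, trading combing for vertical softness.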
It is recommended that you try each in order until you find the best your system can cope with without stutter.
OpenGL is recommended for those who have interlaced video but no nVidia card. It allows the UI elements to scale correctly and offers the same deinterlacing methods as the section above.
VAAPI has the following limited deinterlacing options that are hardware accelerated:
- One field (1x)
- Bob (2x)
All deinterlacers in VDPAU are hardware assisted, except "none." Temporal and Advanced are best, but Advanced requires heavy computation and may struggle with 1080i on weaker hardware, although most modern GPUs should not have any issues.
- One field (1x)
- Bob (2x)
- Temporal (1x or 2x)
- Advanced (1x or 2x)
Temporal 2x will work with most 8400 GS or 9400 GS Nvidia cards for 1080i.
The Advanced 2x deinterlacer is thought by many to be the best, but in the 8-series it requires a GeForce 8600 (the 8800 does not support VDPAU), or a 9600 or above. Reports on the 9500 are not available, but it is likely to work.
Note that for some configurations, such as the ION chipset, you may see tearing even with VDPAU enabled. This can happen because the X11 "Composite" extension is running in parallel and unnecessarily stealing GPU performance; it can be disabled in X11.
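One way to disable the Composite extension is an `Extensions` section in xorg.conf; this is standard X server configuration syntax, but note that disabling compositing affects the whole X session:

```
Section "Extensions"
    # Turn off the Composite extension so no compositor can
    # compete with VDPAU for GPU time.
    Option "Composite" "Disable"
EndSection
```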