[mythtv-users] XvMC De-interlacing
dean vanden heuvel
deanv at cox.net
Sun Aug 15 15:20:46 EDT 2004
If you have an nvidia card (I believe at least a GeForce4 MX 400 or better), you can use the *interlace* mode of the driver. If you send interlaced content to your TV, the TV's built-in interlace handling (which is there to handle real broadcast content) should take over.
In my case, I have a Samsung DLP (native 1280x720) that accepts 1080i. I HAD mucho interlace issues as well...XvMC made the overall picture quality better, but there were nasty combing effects, and XvMC did not allow any deinterlacer to be used (what is this *poor man's deinterlace* of which you speak?). Without XvMC and with deinterlacing enabled (kerneldeint), the picture looked like older film rather than live TV, but most of the interlace combs were gone.
So I have created a modeline that drives the TV in its 1080i mode directly from my PVR, and now I can use XvMC or not...in either case I do not need to insert any de-interlacing, as the TV does this work just as it normally would for a 1080i broadcast signal.
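For anyone wanting to try the same thing, a 1080i modeline of this kind might look like the following in XF86Config/xorg.conf. The timing numbers below are the standard SMPTE 1920x1080 interlaced figures (74.25 MHz pixel clock, ~33.75 kHz horizontal, 60 Hz field rate), not necessarily the exact ones I used, and the Identifier is just a placeholder:

```
# Hypothetical xorg.conf snippet -- standard 1920x1080i/60 timing.
Section "Monitor"
    Identifier "SamsungDLP"
    # clock  hdisp hsyncstart hsyncend htotal  vdisp vsyncstart vsyncend vtotal  flags
    ModeLine "1920x1080i" 74.25  1920 2008 2052 2200  1080 1084 1094 1125  Interlace +HSync +VSync
EndSection
```

Whether your TV accepts it depends on how tolerant it is of timings slightly off from broadcast spec, so check your TV's manual for its supported scan rates first.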
However, to drive this mode my PVR must do all of the *scaling* up to 1920x1080, and I think my TV just MIGHT be better at scaling, so I would like to drive a 720x480 interlaced mode instead. Problem is, that mode results in a horizontal scan rate too low for my TV, and for some reason I have not been able to get *doublescan* to work...any ideas?
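To spell out the scan-rate problem: a 720x480 interlaced mode at NTSC-like timing runs at roughly 15.7 kHz horizontal, and many fixed-panel HDTVs won't sync below about 30 kHz. DoubleScan paints each line twice, so if the pixel clock is also doubled the field rate stays at ~60 Hz while the horizontal rate rises to ~31.5 kHz. A hypothetical modeline along those lines (numbers are illustrative arithmetic only, and combining Interlace with DoubleScan is exactly the combination I can't get working, so driver support may be the catch):

```
# Hypothetical: NTSC-like 720x480i timing with DoubleScan.
# Base timing is ~13.5 MHz / ~15.7 kHz horizontal; doubling the clock
# to 27 MHz keeps ~60 Hz fields while raising the scan rate to ~31.5 kHz.
ModeLine "720x480i-ds" 27.00  720 736 800 858  480 486 492 525  Interlace DoubleScan +HSync +VSync
```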
On Sun, 15 Aug 2004 11:08:57 -0500
Brian Foddy <bfoddy at visi.com> wrote:
> On Sunday 15 August 2004 08:05 am, john roberts wrote:
> > All HD folks:
> > I've been watching the Olympic Games on NBC OTA HD (1080i content). I was
> > wondering if anyone else has been watching and could comment on the quality
> > of the picture.
> > My processor is a 2.6G P4 and I've always had to enable XvMC for mythtv not
> > to stutter all the time while watching live TV. Because of this I can only
> > use the "poor man's XvMC deinterlacing" feature vs. the MythTV
> > de-interlacing software variations (which I hear are better).
> > With XvMC and my 1080i content - watching on a 720x480p Samsung - I see two
> > things when I watch the games:
> > 1) On an object which has very small variations in color I see what almost
> > looks like the effect you see when watching a DiVX or an XViD rip (if that
> > makes sense). Kind of blotchy vs. smooth transitions in color.
> > 2) During movement (this must be the interlacing issue) things seem just
> > slightly jagged and blurred. Nothing like it used to be WITHOUT XvMC
> > de-interlacing turned on - but still not "smooth".
> > Anyone have any ideas on the above issues? Or - maybe someone can comment
> > on NBC's HD content for the games.
> I see #2 a lot, even on non-hd (pvr250) with XvMC enabled. Since my
> frontend is only a P3-1400, I have no choice but to use it. However, my video
> card is set to 1280x720p (the TV may convert back to 1080i internally),
> so I don't attribute it to interlacing.
> I'm really surprised your 2.6 P4 can't handle the work itself...
> Oh, would it ever be nice for Nvidia to enable the MPEG decoder on their
> cards in Linux... It's always been my biggest complaint with them...