[mythtv-users] Help with interlaced DVI output configuration

Steven Adeff adeffs.mythtv at gmail.com
Thu Mar 22 13:34:39 UTC 2007


On 3/22/07, jason maxwell <decepticon at gmail.com> wrote:
> On 3/21/07, Chris Weiland <hobbiticus at gmail.com> wrote:
> > I've searched and searched and can't seem to find a definitive answer to my
> > questions, so hopefully the all-knowing body of the mythtv discussion list
> > will know.
> >
> > My Situation:
> > I've got a 32" Samsung LCD TV that has a native resolution of 720p.  I have
> > a cable box with an HDMI output that I've hooked into my TV's HDMI input in
> > the past, which resulted in a very clean and crisp picture.  I guess I can't
> > be 100% sure that it was outputting in 1080i, but it says it was.  My myth
> > box is set up to run at the 720p resolution, outputting over the DVI
> > interface on my geforce 6200, through a DVI-HDMI cable, into the HDMI input
> > on the TV.  As you might suspect, 1080i content played without
> > deinterlacing is full of interlacing artifacts.
> >
> > My Problem:
> > I'm very sensitive to even the slightest visual artifacts, and none of the
> > built-in deinterlacing methods in mythtv are as good as I would like them to
> > be.  So, logic (along with searching through lots of mailing lists) suggests
> > that if I can output 1080i from my myth box, the (much better) TV
> > deinterlacer/scaler will kick in and give me the same crisp and clear images
> > that I get when I hook up my cable box directly.  This should theoretically
> > also save the CPU cycles that myth is currently spending on
> > deinterlacing/scaling.
> >
> > The Solution:
> > Output in 1080i! (and turn off mythtv's deinterlacing)
> >
> >
> > So, first of all, am I correct in everything I've mentioned above?
> >
> >
> > I'm not exactly sure how to go about doing this.  My TV's EDID info says
> > that it will accept 1080i sources, and the nvidia
> > driver seems to have the 1080i modes built in.  However, if I try to use the
> > 1080i modes, the X log complains that the vsync is out of range.  This is
> > "true" because the TV wants 60hz, but the mode is listed as 120hz.  So, I
> > turn off the vertical sync mode validation.  Now it says that the 1080i
> > screen size is bigger than the LCD's native resolution.  Fine, so I turn off
> > the DFP native resolution mode validation.
> >
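> > For reference, the relevant piece of my xorg.conf currently looks roughly
> > like the sketch below (the identifiers are just what I happen to call
> > things, and the option names are typed from memory of the nvidia README,
> > so double-check them there):
> >
> >   Section "Screen"
> >       Identifier     "Screen0"
> >       Device         "nvidia"
> >       Monitor        "Samsung"
> >       DefaultDepth   24
> >       # skip the two checks that were rejecting the 1080i modes
> >       Option         "ModeValidation" "NoVertRefreshCheck, NoDFPNativeResolutionCheck"
> >       SubSection     "Display"
> >           Depth      24
> >           Modes      "1920x1080" "1280x720"
> >       EndSubSection
> >   EndSection
> >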
> > Now it works!  Except not.  X accepts the 1080i video mode, and the change
> > is apparent on my TV because it is only showing the top half of the screen,
> > and the horizontal scan lines are out of sync.  It's a little hard to
> > explain, but it's almost like instead of drawing lines 0,1,2,3,4... it's
> > drawing lines 1,0,3,2,5,4..., except making each scan line twice as big, so
> > the screen only shows half of the picture.  I couldn't think of what to do
> > after that.
> >
> > I also see that I can define a custom modeline.  I had to do a lot of mode
> > validation disabling to get this to work too.  I have to admit that I haven't
> > played with this too much, but I can't get the driver to like anything that
> > I create.  I think once I finally got X to start using my modeline, all I
> > got was a blank screen.  That's where I gave up.
> >
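> > In case a concrete modeline helps: is something like the standard 74.25 MHz
> > CEA 1080i timing what I should be feeding it?  (Copied together from various
> > how-tos, so treat the numbers as a sketch rather than gospel.)
> >
> >   # 1920x1080 interlaced, 60 fields per second (74.25 MHz pixel clock)
> >   ModeLine "1920x1080_60i" 74.250  1920 2008 2052 2200  1080 1084 1094 1125 Interlace +HSync +VSync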
> >
> > I've seen elsewhere that interlaced modes over the DVI output stopped
> > working after driver version 6629, but that older driver doesn't support the 6200
> > card.  However, this was from a post back in 2005, so hopefully that's not
> > the case anymore.  I've also seen some people say to enable TwinView with
> > only one display device connected for some reason, but no one ever explained
> > what settings they used for that.  Others have said that I need to turn off
> > some vsync settings in the nvidia-settings program.  I haven't investigated
> > these issues myself yet, so hopefully someone can tell me so I don't have to
> > waste my time.
> >
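> > If anyone has actually tried the TwinView trick with a single display, is it
> > just something along these lines?  (Pure guesswork on my part from skimming
> > the nvidia README, so please correct me.)
> >
> >   Option  "TwinView"  "True"
> >   # one metamode per mode, since only the one panel is connected
> >   Option  "MetaModes" "1920x1080_60i; 1280x720"
> >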
> I have also found flaws with the quality of deinterlacing that myth
> currently provides, and have hit similar roadblocks. I'm not so
> unhappy that I can't live with it, but the occasional combing does get
> on my nerves, especially because I specifically built an overkill
> system to ensure the best quality HD playback, yet still can't match
> the TV's built-in deinterlacing. I thought TVs' scalers/deinterlacers
> were on the low end, and that's why a lot of videophiles use
> standalone processors. Surely the algorithms used in the quality
> processors would be available programmatically, no? Maybe the demand
> isn't high enough for someone to put the effort in to implement them
> in myth.
> -J

Work on better scaling and deinterlacing is going on in the mythtv-vid
branch in SVN.
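
If you want to experiment with it, check out that branch and build it the
usual way; the checkout should be roughly this (branch path from memory,
so confirm it on the wiki first):

  svn checkout http://svn.mythtv.org/svn/branches/mythtv-vid mythtv-vid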


-- 
Steve
Before you ask, read the FAQ!
http://www.mythtv.org/wiki/index.php/Frequently_Asked_Questions
then search the Wiki, and this list,
http://www.gossamer-threads.com/lists/mythtv/
Mailinglist etiquette -
http://www.mythtv.org/wiki/index.php/Mailing_List_etiquette

