[mythtv-users] Struggling with Xwindows DVI to HDTV 1080i
adeffs at gmail.com
Thu Dec 29 20:52:05 EST 2005
On Thursday 29 December 2005 01:16, Len Reed wrote:
> Steve Adeff wrote:
> > On Tuesday 27 December 2005 18:38, Len Reed wrote:
> >>I've got an nvidia 6200 with DVI out connected to my Mitsubishi HDTV
> >>(DVI to HDMI cable). The TV does 720p and 1080i on HDMI: it's worked
> >>from both the cable box and from a DVD player that does upscaling.
> >>I can get the TV to recognize that it's getting 1080i input from the
> >>computer. (The info on the screen says so.) I can't get it to deal
> >>with 720p for some reason. I can get the TV to handle lower resolution
> >>SVGA and XGA modes up to 1024x768 fine.
> >>With 1080i I get what is close to the twm screen, but there are two problems:
> >>1. The screen is greatly overscanned. Perhaps 20% is not displayed.
> >>2. The interlacing is off, or at least that's my guess. Everything is
> >>displayed twice, with one flickering image directly below another. They
> >>are close: the bar at the top of an xterm has its two images overlapping.
> >>I've tried every modeline I can find, and have tried two different
> >>modeline calculators, but I can't get the two images to converge. The
> >>TV seems to be reporting things correctly to X (59-61 Vsync, reasonable
> >>Hsync, etc.) Telling X to ignore the TV's info doesn't help in any case.
> >>It seems like it should be easy enough to play with the vertical
> >>blanking interval to fix this, and that I'm close. But I'm guessing,
> >>and I'm not making progress. Is there a reasonable way to tweak the
> >>modeline to iterate toward a solution here?
> >>Fedora core 4, x86_64
> >>Athlon-64x2 (3800+)
> >>nvidia 6200 card
> >>latest nvidia X driver, compiled on the machine
> > newer nvidia drivers don't support interlace modes over DVI.
> Seriously?? I sure didn't see that in their README. After my original
> posting, I bought (from somewhere that I can return it) a 6600 card that
> has component video out and it works fine. (It exhibited exactly the
> same problem with DVI, though.) The card with HDTV encoding is a
> satisfactory solution if not an ideal one (both technically and in
> cost.) It sure seems stupid to have the card encode HDTV to have the TV
> turn it back into digital for the DLP display when I should be able to
> do it over DVI. Certainly the cable box's 1080i DVI is a bit clearer
> than its component video. Is there any way that the open source (nv)
> driver will work at 1920x1080i over DVI, or is it a waste of more time to even try?
The NVidia component output uses an HDTV output chip, so it will work. It's
interlaced over DVI that doesn't, since that path uses the normal monitor
output. It's a known bug that NVidia won't fix, since relatively few people
use it. Apparently one of the older drivers does support it, though (search
the list archives). Some TVs will accept a 1920x540p signal over DVI and
display it as 1080i, so that's worth a try too.
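If you want to experiment with the 1920x540p trick, something like the
following xorg.conf fragment is a reasonable starting point. The timings here
are an assumption, derived from the standard 74.25 MHz 1080i numbers with the
vertical cut in half, and the Identifier names are placeholders; your set may
want slightly different values.

```
# Hypothetical xorg.conf fragment -- timings are a guess based on the
# standard SMPTE 1080i parameters with the vertical halved.
Section "Monitor"
    Identifier "HDTV"
    # 1920x540 progressive at ~60 Hz; some sets will display this as 1080i.
    # 74.25 MHz / 2200 = 33.75 kHz hsync; / 562 lines = ~60.05 Hz refresh.
    ModeLine "1920x540" 74.25  1920 2008 2052 2200  540 542 547 562 +hsync +vsync
EndSection

Section "Device"
    Identifier "nvidia0"
    Driver     "nvidia"
    # Tell the nvidia driver to use the modeline timings as given over DVI
    # instead of substituting its own EDID-derived timings.
    Option     "ExactModeTimingsDVI" "true"
EndSection
```

Without ExactModeTimingsDVI the nvidia driver tends to quietly replace your
modeline with the nearest EDID mode, which is why hand-tuned modelines often
seem to have no effect over DVI.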
> > Overscan won't change what you see for TV (i.e., the overscan is the same
> > whether from your computer or your cable box), so change the GUI overscan
> > settings for your TV to fix the GUI. If you want to fix TV video, SVN
> > lets you adjust overscan, or find the service menu info for your TV to
> > lessen its overscan.
> OK, I haven't laid mythtv into this mix yet, so I'll worry about it
> then. The component video from the new card at 720p and 1080i exhibits
> only small overscan, about what I'd want.
Newer TVs, especially LCD and rear-projection sets, have very minor overscan,
which is nice; it's the tube TVs that exhibit obscenely large overscan. Plus,
with TV content you're missing with Myth what you'd miss without it, so it's a
wash really. If you insist on seeing it all, then venture into the service
menu, or get an ISF technician to your house and pay him to do it.
> While you're listening, Steve, you were complaining about the VIA
> IEEE1394 chipset. I can't find anything but VIA. Fry's had a dozen
> cards, all VIA. Did you have to mail order to get something with the TI
> chip? Do you have a recommendation?
So far I've had great luck with the TI chipset in my laptop, no problems with
it. I had a VIA on my Myth motherboard that just refused to work, and from
what I've seen the people with VIA chipsets aren't having much better luck
than that. For that computer I ended up getting a card from newegg (a Syba
with an NEC firewire chipset) that I'd say has had a 95% success rate (just
for channel changing; I haven't set up firewire capture yet). When I get back
from vacation I'm getting another cable box and will have one capturing via
firewire and one via a PVR-150, so we'll see how that goes. My advice right
now is to stay away from VIA and be willing to spend a few bucks on a quality
firewire card. I also found that some of the packaged libraries seem to not
be perfect; I ended up compiling from scratch, same library version, and my
success rate went way up.
> Thanks again for the help.