[mythtv-users] nVidia DVI out?
brad+myth at templetons.com
Wed Dec 8 07:33:06 UTC 2004
On Tue, Dec 07, 2004 at 09:48:54PM -0800, Scott Alfter wrote:
> On Tue, Dec 07, 2004 at 11:17:51PM -0500, Mark L. Cukier wrote:
> > Does anyone have any experience getting the DVI output on an nVidia card
> > to work? (GeForce FX 5200)
> I just plugged it in and it worked after a reboot.
> > I'm trying to connect the video card to a Sony Widescreen
> > HDTV. As always, help is much appreciated.
> My understanding is that TVs can be a little more fussy about the modes
> they'll accept. Here's a 720p mode you might try:
> Mode "1280x720@60"
> DotClock 74.250
> HTimings 1280 1352 1392 1648
> VTimings 720 728 736 752
> Flags "+HSync" "+VSync"
> (The 30" LCD I'm using now (a Vizio L30) was designed to work as a computer
> display as well as a TV, so it's probably not typical of what you'd find in
> the TV section of your local electronics store.)
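For anyone trying the mode quoted above: in XFree86/X.org of that era, a custom mode goes in the Monitor section of XF86Config-4 (or xorg.conf) and is then referenced from the Screen section. A minimal sketch, where the Identifier names and sync ranges are illustrative only (check your set's manual for its real sync limits), not taken from the original posts:

```
Section "Monitor"
    Identifier  "HDTV"          # illustrative name
    HorizSync   30-70           # illustrative range; use your TV's real limits
    VertRefresh 50-75
    # The 720p mode quoted above, written as a one-line Modeline:
    # name        clock  hdisp hss  hse  htot  vdisp vss vse vtot flags
    Modeline "1280x720@60" 74.250 1280 1352 1392 1648 720 728 736 752 +hsync +vsync
EndSection

Section "Screen"
    Identifier "Screen0"
    Monitor    "HDTV"
    SubSection "Display"
        Modes  "1280x720@60"
    EndSubSection
EndSection
```

The Modeline form is equivalent to the Mode/HTimings/VTimings block quoted above, just collapsed onto one line.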
Yes, quite amazingly, though the new DLP and LCD-projector HDTVs have a
native resolution in the range of 1280x720, many of them don't actually
take a 720p input. Or if they do, they "upscale" it to 1080i to display it!
Of course this is a strange use of "upscale": it is more pixels, but at
half the frame rate, so it's arguably not an upscale at all.
Bizarre. You would think instead they would want to run native at 720p
and translate 1080i down to 720p.
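As a sanity check on the quoted timings: the refresh rate a modeline implies is just the dot clock divided by the total pixels per frame (htotal x vtotal). Plugging in the values from the 720p mode quoted above:

```python
# Refresh rate implied by a modeline: dot_clock / (htotal * vtotal)
dot_clock_hz = 74.25e6   # "DotClock 74.250" is in MHz
htotal = 1648            # last value on the HTimings line
vtotal = 752             # last value on the VTimings line

refresh_hz = dot_clock_hz / (htotal * vtotal)
print(f"{refresh_hz:.2f} Hz")  # just under 60 Hz
```

An interlaced 1080i mode at the same field rate delivers only half as many full frames per second, which is the sense in which "upscaling" 720p to 1080i trades temporal resolution for spatial resolution.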