[mythtv-users] Video cards, modelines, 720p, HD DLP TVs...

Steven Adeff adeffs.mythtv at gmail.com
Wed Jan 31 13:54:16 UTC 2007


On 1/30/07, matthew.garman at gmail.com <matthew.garman at gmail.com> wrote:
> My parents and I have nearly identical Samsung DLP TVs (mine is the
> HLR5067W, FWIW).  On my parents' MythTV, they're using an Asus M2NPV
> motherboard with integrated nvidia geForce 6150 graphics.  This
> motherboard comes with a component out "dongle".  To enable the
> component out, I had to put the "ConnectedMonitor" "TV" option in
> xorg.conf.
>
> What's interesting is this: when using component
> out/ConnectedMonitor=TV, the graphics card/driver supports 720
> (1280x720) natively---I didn't have to specify a modeline.
>
> Now my setup: I have a Chaintec 7NIF2 motherboard, with integrated
> geForce4 MX graphics; only VGA out (might be svideo, but don't want
> to use that with my TV ;).  Since I'm using VGA, I have to specify a
> modeline.
>
> I've since found that this onboard GPU just won't do XvMC
> consistently enough to make this PC useable for HD.  I have an AGP
> 6200 graphics card lying around.
>
> When I used it, with the VGA out, XvMC worked consistently, but the
> screen had *awful* flickering problems (i.e. every second or so, the
> whole screen would flicker).
>
> My guess is that there's a problem with my modeline that is causing
> the flicker.  FWIW, that AGP card also has DVI out, and I just
> bought a DVI->HDMI cable, but I haven't had a chance to test that
> yet.
>
> So, several questions:
>     - if the GPU/driver can support 1280x720 natively when using
>       ConnectedMonitor=TV, why is an explicit modeline required for
>       VGA/DVI?

It isn't, if you're using HDMI: you can use the ModePool once the
frequency bounds have been gathered from EDID. You don't need to enable
the EDID modes themselves, you just need to let the driver determine
the operating bounds of the TV through EDID, which enables the
ModePool. This will then let you "create" your own modelines. I wrote a
bit about this on the MythWiki's modeline page.
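
As a rough illustration, the relevant xorg.conf pieces might look like
the sketch below. The option names are from memory and the identifiers
are made up, so treat it as a starting point to check against the
nvidia driver README, not a known-good config:

    Section "Device"
        Identifier  "nvidia0"
        Driver      "nvidia"
        # let the driver read the TV's EDID so it can build the ModePool
        Option      "UseEDID" "true"
        Option      "UseEDIDFreqs" "true"
    EndSection

    Section "Monitor"
        Identifier  "DLP"
        # no HorizSync/VertRefresh here -- the EDID bounds are used instead
    EndSection

With that in place you can add your own modelines (or request modes by
name) and the driver will validate them against the EDID bounds,
logging the result in /var/log/Xorg.0.log.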

>     - does anyone have a definitive answer on whether or not you can
>       damage a DLP TV with bad modelines?  (I don't have windows, so
>       I can't use the PowerStrip program everyone recommends to find
>       my ideal modeline.)

The Definitive Answer: Maybe. It really depends on the TV. Some TVs
have protection which will prevent the electronics from accepting a
signal that would damage them. Some have a simple version of this
(which won't always work), some have an advanced version (basically
fool-proof), and some have none. But this is where EDID comes in: it
tells the hardware producing the signal what the bounds of the TV are,
as well as offering some known parameters. I've found that the bounds
it sends are for the most part correct (at least not so far off that
using them would damage the TV), whereas the modeline information it
sends can often be wrong.
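
If you would rather pin the bounds down by hand, they go in the Monitor
section. The ranges below are placeholders to show the form only, not
values for any particular set -- use the figures from your TV's manual
or from the EDID readout in /var/log/Xorg.0.log:

    Section "Monitor"
        Identifier   "DLP"
        # illustrative ranges only, not real figures for this TV
        HorizSync    30.0 - 70.0
        VertRefresh  59.0 - 61.0
    EndSection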

If you use the ModePool to "create" your own modes then you can pretty
much guarantee that your mode won't damage your TV, but you can't
guarantee that it will work. If memory serves correctly, the 9xxx
series of nvidia-settings can cycle through a bunch of resolutions to
try out, so you could start with one that works and jump around. Though
if you're using this mainly for Myth, I would go with one of the
standard ATSC modes...
"1280x720_60" "1920x1080_60i" "1920x1080_60" "1920x1080_30"
"720x480_60" "720x480_60i"

>     - anyone happen to have this particular TV and a good modeline?

see above

>     - might there be some other problem (other than modeline)
>       causing the flickering with the AGP card?

Once you get a proper modeline with the right frequency, the only other
source of flicker would be a deinterlacing problem.
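
If you want to double-check which mode and frequencies the driver
actually settled on, the nvidia driver can be asked to log its mode
validation in detail. The option name here is from memory, so confirm
it against your driver version's README:

    Section "Device"
        Identifier  "nvidia0"
        Driver      "nvidia"
        # dump mode validation details to /var/log/Xorg.0.log
        Option      "ModeDebug" "true"
    EndSection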

>     - in my situation, is there an advantage to use DVI/HDMI over
>       VGA?

Most DLP HDTVs don't scale DVI input, so you get some overscan but 1:1
pixel mapping. LCD and plasma are a little different on that. VGA
inputs are almost always set up to allow some sort of scaling so that
the whole screen can be displayed. But since LCD/plasma are fixed-pixel
displays, if you use the proper mode for their native resolution you
should get 1:1 mapping.

So basically it depends on whether you're using this just for Myth or
as a PC monitor in general.

>     - for my parents, is the component out the best route, or should
>       I set them up with DVI/VGA as well?

DVI will most likely be the best route. Component and VGA require
conversion to analog by the graphics card, plus whatever the TV then
does to that analog signal. This may or may not decrease the quality
significantly, but the larger your display, the more noticeable the
errors become. Plus DVI is usually a more manageable and cheaper
cable. (hint: www.monoprice.com)

-- 
Steve
Before you ask, read the FAQ!
http://www.mythtv.org/wiki/index.php/Frequently_Asked_Questions
then search the Wiki, and this list,
http://www.gossamer-threads.com/lists/mythtv/
Mailinglist etiquette -
http://www.mythtv.org/wiki/index.php/Mailing_List_etiquette
