[mythtv-users] Re: Resolution, screen size, and other related 'size' question

Chad masterclc at gmail.com
Mon May 23 08:11:12 UTC 2005


On 5/21/05, Michael T. Dean <mtdean at thirdcontact.com> wrote:
> Michael T. Dean wrote:
> 
> > Chad wrote:
> >
> >> On 4/28/05, Chad <masterclc at gmail.com> wrote:
> >
> >>> What resolution should I have in my xorg.conf?
> >>
> > Ideally, since your TV has a native resolution of 1280x720, you would
> > use that resolution.  Then, your TV doesn't have to scale the video
> > and you get a pixel-for-pixel representation of the output, giving you
> > the highest quality possible.  (Note that although some
> > (older/cheaper) TVs have a native resolution of 1280x720, they may
> > not be able to accept input at that resolution from a computer.  If
> > that's the case for yours, you would probably get the best results
> > using the maximum resolution allowed.  However, this is most likely
> > not the case with your Samsung.)
> 
> I knew this section of my reply was too small.  (Yeah, I'm the first to
> admit my replies can be^H^H^H^H^Hare usually a bit wordy. :)
> 
> I forgot to mention that the above holds true when talking about the
> quality of the GUI.  If you render the GUI at the resolution at which it
> > is displayed, you'll get the best picture quality.
> 
> However, when it comes to TV, you might actually get better results
> sending a different signal to the TV and allowing its built-in scaler to
> do the scaling for you.  This may be the case if all you have is SDTV.
> Also, if you have HDTV, you might actually get better quality for your
> 1080i broadcasts by outputting 1080i.  Probably the only way to know for
> sure is to try it out.
> 
> IIRC, Myth can be set up to use XRandR to dynamically change the output
> resolution based on the source resolution.  To get the best of all
> possible worlds (assuming the TV's scaler works better than the
> computer-based scaling mechanism), you would want to set up your
> xorg.conf to allow 1080i, 720p, 480p, and 480i resolutions (assuming
> your TV accepts all of these as input) and set up Myth to use XRandR.
> 
> Mike

Wow...

Thanks for both extremely in-depth replies.  This was exactly the
kind of information I was hoping for.

In roughly reverse order, and basically just for completeness:

The TV does accept all of those resolutions as inputs.  
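
Here's roughly what I'm planning to try in xorg.conf once I sit down
with it.  This is just a sketch using the standard ATSC timings I dug
up, not something I've tested on this set yet, so the modelines (and
the mode names) may well need tweaking for my particular Samsung:

    Section "Monitor"
        Identifier  "Samsung"
        # Standard ATSC/CEA timings; placeholders until I can check
        # them against the TV's manual.
        # (Haven't worked out a 480i modeline yet, so it's left out.)
        ModeLine "1920x1080i" 74.25 1920 2008 2052 2200 1080 1084 1094 1125 Interlace +HSync +VSync
        ModeLine "1280x720"   74.25 1280 1390 1430 1650  720  725  730  750 +HSync +VSync
        ModeLine "720x480"    27.00  720  736  798  858  480  489  495  525 -HSync -VSync
    EndSection

    Section "Screen"
        Identifier   "Screen0"
        Monitor      "Samsung"
        DefaultDepth 24
        SubSection "Display"
            Depth   24
            # 1280x720 first so the GUI comes up at the TV's native res
            Modes   "1280x720" "1920x1080i" "720x480"
        EndSubSection
    EndSection

Once X knows about all the modes, "xrandr -q" should list them, and I
gather the Myth setting that flips between them is the separate
GUI/playback video mode option somewhere under the Appearance settings
(going from memory on the exact wording there).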

I'm not sure how to tell if my DVI is DVI-A/I or DVI-D, but I'll check
out the image and see how it looks sometime this week.

I'm not yet using an HDTV card, but I believe I'm getting one very
shortly as a gift...  ;)

My current capture cards are all cheap bt8x8 cards, but they seem to
give a decent image on my monitor; I'm looking forward to seeing them
on the big screen this week.

It is extremely odd that VGA does a digital-to-analog conversion
first and then the monitor has to convert back to digital.  It would
seem intelligent to have changed this WAAYYY back in the 'day' rather
than waiting until LCDs came out that require DVI input (probably the
"fix" I'm referring to).

Awesome, tons of info.  Thank you!

