[mythtv-users] OT: HDTV TV's
Michael T. Dean
mtdean at thirdcontact.com
Thu May 11 12:55:44 EDT 2006
On 05/11/2006 12:22 PM, Greg Woods wrote:
> On Thu, 2006-05-11 at 11:54 -0400, Michael T. Dean wrote:
>> On 05/11/2006 09:42 AM, Greg Woods wrote:
>>> You can't even count on what the manual says either. The manual for my
>>> Pioneer PRO-R06 says that it's 1024x768 resolution, but EDID shows that
>>> it can also do 1280x768, and in fact that works in X.
>> Input signal != output resolution. Scaling...
> I don't know enough about all this stuff to understand what you mean
> here (if I did, I'd probably have HD working by now).
> What I do know is, when I first turn on a signal, or switch inputs on
> the TV, the TV will display the type of signal. I always assumed that
> was the type of signal it was receiving, but maybe I'm wrong?
> With X/VGA input, using the 1280x768 modeline, it says it's "Standard
> FULL2", and it displays full 16:9 screen, which results in a "stretched"
> picture since the recording being displayed came from an analog channel
> ( via PVR-500 card) and is therefore a 4:3 image. But it does nicely
> fill the screen. (It's actually better for watching hockey games because
> the bigger picture allows one to see the puck better, but it looks
> strange if it's a show with a lot of closeups; everybody looks really
> fat :-).
And, if your TV actually has a native resolution of 1024x768, the image
represented by the input signal is being scaled to fit on the available
pixels.
> If I use the 1024x768 modeline, it says it is "Standard 4:3"
> and uses the gray bars on the sides.
Chances are it assumes that a 1024x768 resolution is coming from a
computer, so it scales the image to use fewer than 1024x768 pixels (hence
the gray bars) so that you can see your Windows Start Bar in spite of
the TV's built-in overscan to make it "easy" for you to use with a
computer. (After all, who wouldn't want to do word processing on an
HDTV that's 10 feet away from his/her seating position?)
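The side bars in 4:3 mode come from simple aspect-ratio arithmetic: a
4:3 image fit onto a 16:9 screen only needs three quarters of the width.
A minimal sketch of that math (the `pillarbox_fraction` helper is my own
illustrative name, not anything the TV actually runs):

```python
def pillarbox_fraction(content_aspect, display_aspect):
    """Fraction of the screen width the content occupies when a
    narrower image is fit to the display height (pillarboxing)."""
    return content_aspect / display_aspect

# 4:3 content shown on a 16:9 widescreen panel
used = pillarbox_fraction(4 / 3, 16 / 9)   # 0.75 of the width
bar_each = (1 - used) / 2                  # ~12.5% gray bar per side
```

Stretching instead of pillarboxing avoids the bars but widens everything
by that same 4/3 factor, which is why faces look "fat" in FULL mode.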
> If I display an HD channel from the
> Comcast box via HDMI, the TV says it's "1080i". I wish I understood what
> all this means )-:
In many situations input signal resolution does not match output
resolution, so the image is scaled to fit. I'm guessing that since your
manual says the TV has a native resolution of 1024x768, it actually
does. However, it will accept input signals at higher resolutions.
This is extremely common. Most existing HDTVs and HD-Ready TVs (where
existing means those in people's homes) are 1280x720 pixels, but they
happily accept a 1080i (1920x1080) input signal. They just scale it to fit.
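The fit-to-panel scaling works out like this; a minimal sketch with a
made-up helper name (`scale_to_panel`), assuming the set does a simple
aspect-preserving downscale:

```python
def scale_to_panel(src_w, src_h, panel_w, panel_h):
    """Scale a source image uniformly so it fits the panel while
    preserving its aspect ratio; returns the displayed size."""
    s = min(panel_w / src_w, panel_h / src_h)
    return round(src_w * s), round(src_h * s)

# A 1080i (1920x1080) signal on a 1280x720 native panel:
# both dimensions shrink by the same 2/3 factor, filling the screen.
scale_to_panel(1920, 1080, 1280, 720)  # -> (1280, 720)
```

Real TVs use more elaborate filtering, of course, but the geometry is
the same: the "1080" describes the input signal, not the pixels that
end up on the glass.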
This way, when someone tells me I spent too much money on my TV because
he got his new huge-screen HDTV for only $1000 from a
friend-of-a-friend-of-... and it's a "1080" TV, just like mine (mine is
1080p), I can say, "Wow. You got a great deal!" I still know, though,
that he's talking about its ability to accept a "1080" input signal
(probably 1080i) and scale it to fit on its available pixels (unless, of
course, he bought the TV from a friend-of-... off the back of a truck).