[mythtv-users] Resolution

James L. Paul james at mauibay.net
Wed Dec 3 03:05:14 EST 2003



On Tuesday 02 December 2003 17:19, Jim Paris wrote:
> > My goal is to use the lowest resolution and bitrate I can without
> > noticing a quality drop. For me and my dvd player, that happens to be
> > MPEG2 1/2 D1, so I use 352x240 at about 2500k bitrate. (Which also
> > happens to be perfect for CVD and DVD which my standalone player works
> > great with.)
>
> Why lower the resolution?  2500k is 2500k is 2500k no matter what the
> resolution.  What's the point of 1/2 D1?  All that introduces is extra
> scaling (scale down, encode, decode, scale up), so the only thing you
> gain is encoding/decoding speed.

Untrue. Bitrate is rather meaningless without resolution. The bitrate 
represents the amount of data available to encode the pixel information. For 
any given resolution, frame rate, and bitrate you can calculate the number of 
bits per pixel. (Of course this is somewhat more complicated because different 
compression frame types represent pixel information differently, but that's 
irrelevant for this discussion.) If you double the number of pixels, you cut 
the number of bits per pixel in half, spreading the data more thinly and 
representing the original image less faithfully. You can double the bitrate to 
compensate for the extra pixels, or you can decrease the number of pixels to 
gain "bandwidth" in the image and increase the quality.
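
To put rough numbers on that, here's a quick back-of-the-envelope sketch in 
Python (purely illustrative; I'm assuming a constant bitrate and 29.97 fps 
NTSC, which a real MPEG-2 encoder only approximates):

# Average bits available per pixel, assuming the bitrate is spread evenly
# over every pixel drawn per second (real MPEG-2 allocates bits unevenly
# across I/P/B frames, as noted above).
def bits_per_pixel(bitrate_bps, width, height, fps=29.97):
    return bitrate_bps / (width * height * fps)

bitrate = 2500 * 1000  # 2500 kbit/s

# Doubling the pixel count at the same bitrate halves the bits per pixel.
print(bits_per_pixel(bitrate, 352, 480))  # ~0.49 bits/pixel
print(bits_per_pixel(bitrate, 704, 480))  # ~0.25 bits/pixel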

The point is that you get twice the pixel depth with 1/2 D1 that you get with 
D1. Since the encoder has half the pixel area to encode, it has twice as many 
bits per pixel to work with. The image is roughly equivalent in quality to D1 
at twice the bitrate. Since any number of pixels greater than 1/2 D1 will be 
effectively "wasted" on a TV display, I'd rather use the appropriate number of 
pixels so as to maximize the number of bits per pixel.
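
For example, plugging in the 2500k figure from above (again assuming 29.97 fps 
NTSC, and again only an approximation of what the encoder actually achieves):

# 2500 kbit/s spread over full D1 (720x480), 352x480, and the 352x240
# mentioned in the quoted message, at an assumed 29.97 fps.
bitrate = 2500 * 1000
fps = 29.97
for width, height in [(720, 480), (352, 480), (352, 240)]:
    print("%dx%d: %.2f bits/pixel" % (width, height, bitrate / (width * height * fps)))
# 720x480: 0.24 bits/pixel
# 352x480: 0.49 bits/pixel  (about twice the pixel depth of full D1)
# 352x240: 0.99 bits/pixel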

Put another way, my TV can't show me 720x480; it simply doesn't have that 
many lines of resolution. So I'd rather not waste bitrate on pixels I can't 
see. I'd rather reduce the number of pixels, which increases the bits per 
pixel, reduces compression artifacts, and results in a higher-quality image on 
a TV.

There is no "extra scaling" introduced. When I play my recorded video on my 
TV, it's at the stream's native resolution. (I'm ignoring the fact that I'm 
using an anamorphic format; the number of pixels represented is unchanged.) 
TVs are not pixel-based and do not scale NTSC or PAL video. The "resolution" 
of a good consumer TV is approximately equivalent to 352x480, which is roughly 
the number of pixels needed to represent an S-VHS-quality image. The signal 
quality of most analog cable TV sources in the US is about VHS quality or 
slightly better, commonly about 280 lines. So 352x480 is actually overkill for 
my cable source, which is effectively about 352x280. Encoding at 720x480 would 
certainly waste more than half my bitrate on detail I can't see!
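
As a crude sanity check on that last claim, treating the bitrate as if it 
were spread evenly over the encoded pixels (which is only an approximation):

# Pixels encoded at full D1 versus the ~352x280 of real detail in my
# cable source (per the estimate above).
encoded = 720 * 480
useful = 352 * 280
print("share of the 720x480 pixel budget beyond what the source resolves: %.0f%%"
      % (100.0 * (encoded - useful) / encoded))   # roughly 71%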

Gee. That was long-winded. Sorry. I hope it's clear enough.

> -jim



