[mythtv-users] quality

match at ece.utah.edu
Tue Sep 12 18:52:05 EDT 2006


From: Sergei Gerasenko <gerases at publicschoolworks.com>

> > Which nVidia card?
> 
> nVidia Corporation GeForce 6200 TurboCache(TM) (rev a1)

No surprises there...
> 
> > How interesting. My experiences are very different. I run 3 MythTV 
> > boxen with CRT computer monitors. One is a 19" Dell, next is a 20" 
> > Viewsonic and lastly a Marquee 8500 CRT projector fed through its 
> > RGBHV interface. The picture is stellar on each one viewing SD 
> > content, 
> 
> Can you send a couple screenshots so I can compare?

No. Since we're discussing what actually hits your eyeballs, all I 
could do would be to take a snapshot with my 3 Megapixel camera and 
that would tell us more about the camera than what you see on the 
screen.

> What's SD content by the way?

Standard Definition (as opposed to HD, or High Definition). Hey! You 
gotta call it SOMETHING!  It's good ol' NTSC, which was adopted back 
in 1941, or good ol' PAL or SECAM in other parts of the world.
> 
> > you're calling "kinda bad" quality? Is it pixelated? Snowy? Colors 
> > off? What?
> 
> Sorry. It's pixelated and somewhat blurry. Unfortunately, I can't yet
> figure out how to take a screenshot of the thing because it's running
> full screen. 
> 
> > Bitrate too low maybe?
> 
> I use the default LiveTV profile.

Dunno what the "default" is...

The thing is, a computer monitor has a bandwidth usually over 80 MHz, 
while a TV set has a bandwidth of about 4 or 5 MHz, so a computer 
monitor will show faults that a TV set won't. A TV set is much less 
demanding than a computer monitor. A signal with a bandwidth over 4 
or 5 MHz is wasted on a TV, but the computer monitor makes good use 
of it. A little fuzziness or pixelation won't be seen on a TV set, 
but a computer monitor will display it in all its glory. (HD is 
another story)
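
If you want to put rough numbers on that, here's a quick sketch in 
Python (approximate NTSC figures, nothing MythTV-specific, just 
arithmetic):

    # Rough sketch: how analog bandwidth caps horizontal detail.
    # The 52.6 us active line time and 4:3 aspect ratio are
    # approximate NTSC values, used only for illustration.
    ACTIVE_LINE_US = 52.6      # visible part of one scan line (us)
    ASPECT = 4.0 / 3.0         # picture width / height

    def horizontal_detail(bandwidth_mhz):
        # One cycle of the highest frequency carries two picture
        # elements, so samples/line = 2 * bandwidth * line time.
        samples = 2 * bandwidth_mhz * ACTIVE_LINE_US
        tv_lines = samples / ASPECT   # "TV lines per picture height"
        return samples, tv_lines

    for mhz in (4.2, 5.0, 80.0):
        samples, tvl = horizontal_detail(mhz)
        print("%5.1f MHz -> ~%4.0f samples/line, ~%4.0f TV lines"
              % (mhz, samples, tvl))

The point is just that a ~4 MHz broadcast signal only carries a few 
hundred distinct picture elements per line, while the monitor can 
resolve far more than the source contains, so every flaw is on 
display.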

Hmmm... Try capturing at a higher resolution setting like 640x480 or 
even 720x480 and see if the blurriness improves. Yes, I know... you 
can't get more resolution out of a signal than the originator put 
into it, but try it. The truth lies somewhere between 480x480 and 
640x480 for most SD signals. There's no point in recording at too 
high a resolution setting, but 480x480 is marginal. The pixelation 
_should_ be worse at the higher resolution settings, so:

Next, try a high-ish bitrate like 6000 or even 8000 (temporarily) and 
see if the pixelation improves. If it does, start setting the bitrate 
down a little at a time until you just start to see the pixelation 
come back, then bump it back up a little.
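
To see why resolution and bitrate have to move together, here's 
another quick Python sketch. I'm assuming those bitrate numbers are 
kbit/s, and the sample values are illustrative, not gospel:

    # Rough figure of merit: bits available per pixel per frame.
    # The fewer bits per pixel, the more detail the encoder has to
    # throw away, which shows up as blockiness/pixelation.
    FPS = 29.97   # NTSC frame rate

    def bits_per_pixel(width, height, bitrate_kbps):
        return (bitrate_kbps * 1000.0) / (width * height * FPS)

    for (w, h) in ((480, 480), (640, 480), (720, 480)):
        for kbps in (2200, 4500, 6000, 8000):
            print("%dx%d at %4d kbps -> %4.2f bits/pixel/frame"
                  % (w, h, kbps, bits_per_pixel(w, h, kbps)))

The same bitrate spread over more pixels means fewer bits per pixel, 
which is why bumping the resolution without bumping the bitrate tends 
to make pixelation worse, and why the two get tuned together.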

With these settings you're trading off resolution against file size. 
The bitrate and the capture resolution should be "high enough" but no 
higher. The higher the bitrate, the bigger the file that will be 
created. Set it REALLY high and you'll be recording an awful lot of 
duplicate bits for nothing. But, if the bitrate is too low (for the 
particular resolution), you get pixelation. 
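
For the file-size side of the trade-off, the arithmetic is just 
bitrate times time (again assuming those numbers are kbit/s):

    # Approximate recording size for one hour at a given bitrate.
    def gb_per_hour(bitrate_kbps):
        # kbit/s -> bits per hour -> bytes -> gigabytes
        return bitrate_kbps * 1000.0 * 3600 / 8 / 1e9

    for kbps in (2200, 4500, 6000, 8000):
        print("%4d kbps -> roughly %.1f GB per hour"
              % (kbps, gb_per_hour(kbps)))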

There's a chart on the WEB somewhere that explains all this 
graphically, but I can't remember where. YMMV anyway.

Just for the record, my own system records files that are needlessly 
large. In other words, I err on the side of better picture and larger 
files, but with almost 3 terrabytes of storage it's not an issue. 
I'll select more sane settings when I upgrade from .19 to .20 in 
another week or two  :-)

All this so-called "advice" assumes that the problem is related to 
resolution and bitrate, rather than hardware or drivers or signal. If 
it's not, then I'm just flapping my gums... err... fingers, and I'm 
completely up-in-the-night.

At least by going through all this you'll come away with a better 
understanding of how these settings interact than anyone could 
explain to you or than you could get from reading about it, even if 
it doesn't fix the problem.

Marvin




