[mythtv-users] [mythtv] Best quality SD TV picture from DVB-T source
lists at glidos.net
Mon Sep 22 08:45:45 UTC 2008
>> Very few people seem to understand the full details of these
>> issues. I thought maybe on the dev list I might have some luck.
> I admire your commitment to quality.
I wish I didn't care. :-) If the TV wasn't so good, and
I hadn't gotten used to getting a quite striking clarity
(albeit SD) from my DVRs, I don't think I'd be worrying about it.
As it is, it seems a shame not to get the best out of the TV.
> I must admit I have no knowledge
> on this matter although it's an interesting problem. On my mythfrontend
> (0.21-fixes) there is a de-interlacer called 'no deinterlacing' (or
> something like that). The description reads that you need X.org set to
> an interlaced video mode at the same resolution as the source
> material. I imagine this will be the best quality (esp. wrt CPU usage)
> you can get.
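For reference, the interlaced-mode approach usually means defining a modeline matching the source. The one below is a commonly quoted PAL 576i modeline (720x576 interlaced at 50 Hz, 13.875 MHz pixel clock) for xorg.conf; I haven't verified it against this particular TV, so treat the timings as a starting point and check them against what your display accepts:

```
Section "Monitor"
    Identifier "TV"
    # 720x576 interlaced at 50 Hz (PAL); pixel clock in MHz,
    # then horizontal and vertical timing values
    ModeLine "720x576i" 13.875  720 744 808 864  576 580 585 625  -HSync -VSync Interlace
EndSection
```

With X running in that mode and MythTV set to "no deinterlacing", each source field should map directly to a display field, which is why it avoids both combing and deinterlacer CPU cost.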
That makes a lot of sense. I think that's what I get from Minimyth
by default, which would fit with the fact that I get combs when
displaying to a flat panel (although no combs at all when displaying
via TV out). I'm confused, though, by a report of someone using
bobdeint and by that means getting correct synchronisation. It was
just one post, and I wasn't able to follow it up.
> My opinion is that a PC based device will never have
> quality equivalent to a (good) DVR. I'm guessing the DVR will have
> hardware that clocks the video output device and sound output device
> using clocks extracted from the source material so everything is
> perfectly synchronized. This is not possible with a PC.
That's odd. I'd have thought it was possible, although the other
way around: i.e., set up the audio and video systems to be very
close to the correct rates, and then just let them consume the
video at the rate they desire (adjusting the streams every
now and then to keep them in sync).
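To make that concrete, here's a toy sketch of the idea (names and structure are mine, not MythTV's): let the display consume frames at its own clock rate, and whenever the display position drifts more than half a frame from where the source clock says we should be, drop or repeat a single frame to pull them back into sync.

```python
def resync(frames, source_fps, display_fps, max_drift=0.5):
    """Return the frame shown on each display refresh.

    Whenever the display clock drifts more than `max_drift` frames
    from the source clock, one frame is dropped or repeated.
    """
    shown = []
    src_pos = 0.0                    # where the source clock says we should be
    step = source_fps / display_fps  # source frames consumed per display refresh
    idx = 0                          # frame actually being shown
    while idx < len(frames):
        shown.append(frames[idx])
        src_pos += step
        drift = src_pos - (idx + 1)
        if drift > max_drift:        # display running slow: drop a frame
            idx += 2
        elif drift < -max_drift:     # display running fast: repeat this frame
            pass                     # idx unchanged, so it is shown again
        else:
            idx += 1
    return shown
```

If the two clocks are already very close, the drop/repeat corrections are rare (roughly one per second per frame-per-second of mismatch), which is presumably why the occasional "messing with the streams" is barely visible in practice.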
> Good luck, let me know how you went.