[mythtv-users] Very slight jitter on high power machine

Brad Templeton brad+myth at templetons.com
Tue Dec 14 10:00:06 UTC 2004


On Tue, Dec 14, 2004 at 01:14:18AM -0800, Jarod Wilson wrote:
> On Tuesday 14 December 2004 00:37, Brad Templeton wrote:
> > On Mon, Dec 13, 2004 at 11:13:46PM -0800, Jarod Wilson wrote:
> > > > Well, not quite since my TV, like most, is only 720 lines, so I was
> > > > never using it all, but you get the idea.
> > >
> > > I don't think most TVs are 720 lines.
> >
> > Indeed, almost all HDTVs sold today are 720 lines native resolution.
> 
> When looking strictly at new sets, perhaps that is the case. There are plenty 
> of 1080i sets around that only do 540 lines. Like mine. I haven't looked at 
> new sets much since I got mine about 3 years ago, when 720p sets were rare.

Were they allowed to call them HDTV sets?  I know for a while with plasma
TVs there was some very misleading labeling, suggesting they were HDTV
when they were really EDTV.  My understanding is that, at least today, you
can't call it an HDTV unless it has 720 lines.  (Also, you can't call it an
HDTV unless it has an ATSC receiver, which is being mandated in all sets
even though few people actually want one -- they either want a cable/sat
set-top box or a PVR.)

Computer CRTs have, for a decade, been able to display 1200 lines (though,
being 4:3, only 1600 dots across) in both progressive and interlaced forms,
today commonly at 85 Hz and beyond.
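
(As a rough sanity check on those numbers, here's a quick Python sketch
comparing pixel rates; the 25% blanking overhead is just an assumed round
figure, not an exact VESA timing:)

# Rough pixel-rate comparison: a 1600x1200@85Hz computer CRT vs. broadcast HD.
def active_pixel_rate(width, height, refresh_hz):
    # Pixels of actual picture delivered per second (ignores blanking).
    return width * height * refresh_hz

crt       = active_pixel_rate(1600, 1200, 85)   # ~163 Mpixel/s of picture
crt_clock = crt * 1.25                          # ~204 MHz dot clock with assumed blanking
hd_720p   = active_pixel_rate(1280, 720, 60)    # ~55 Mpixel/s
hd_1080i  = active_pixel_rate(1920, 540, 60)    # ~62 Mpixel/s (540 lines per field)

print("CRT 1600x1200@85: %.0f Mpixel/s (~%.0f MHz dot clock)" % (crt / 1e6, crt_clock / 1e6))
print("720p/60:          %.0f Mpixel/s" % (hd_720p / 1e6))
print("1080i/60:         %.0f Mpixel/s" % (hd_1080i / 1e6))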
> 
> > There are some very expensive ones that truly have 1080 lines.
> Those would be the new 1080p sets.

Actually, there are two things here: how many lines your screen has, and
what bandwidth of incoming signal it can accept.  You can sell a screen
with 1080 lines that only accepts 1080i on input, or the same screen could
come with electronics able to accept 1080p.


I actually think they should have a 1080p/30Hz mode.  It is effectively the
same as 1080i, and is the same bandwidth, but would have no interlacing
artifacts.  It would be ideal for movies (which are 24 fps anyway) and many
other things shot at 30 fps.
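
(A quick sanity check on that bandwidth claim, as a Python sketch -- it
just multiplies out the active pixel counts and ignores blanking and audio:)

# 1080i/60 delivers 540 lines per field at 60 fields/s; 1080p/30 delivers
# 1080 lines per frame at 30 frames/s -- the raw pixel rate is identical.
def pixels_per_second(width, lines_per_pass, passes_per_sec):
    return width * lines_per_pass * passes_per_sec

i60 = pixels_per_second(1920, 540, 60)    # interlaced: half the lines, twice the rate
p30 = pixels_per_second(1920, 1080, 30)   # progressive: all the lines, half the rate

print(i60 == p30)                                 # True
print("%.1f Mpixel/s either way" % (i60 / 1e6))   # ~62.2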

In the past, 30 Hz progressive made no sense, since with analog TV that
would either require an overly long phosphor persistence or would flicker
like the dickens.   However, with digital TV, you can control those things
internally.   Digital displays (like flat panels) do this already.
Indeed, 720p is 60 Hz, which on an analog monitor, as we know, would
cause annoying flicker.    On CRTs the flicker comes from the time when the
pixel is dimming before the next frame.  DLPs and LCDs don't have
the same persistence issue.  I presume modern CRTs use digital
processing to drive even faster phosphors and run internally at higher than
60 Hz.

1080i/60Hz makes little sense because if something is moving fast you
get interlace artifacts, so you would probably rather have the
30 fps progressive shot.


