[mythtv-users] High end, state of the art Myth Frontend

Andre Newman mythtv-list at dinkum.org.uk
Thu Sep 19 10:18:34 UTC 2013


On 19 Sep 2013, at 02:10, Jeff Siddall <news at siddall.name> wrote:

> On 09/18/2013 04:17 PM, Joseph Fry wrote:
>> On Wed, Sep 18, 2013 at 4:07 PM, Jeff Siddall <news at siddall.name> wrote:
>>> On 09/18/2013 03:42 PM, Eric Sharkey wrote:
>>>> 
>>>> On Wed, Sep 18, 2013 at 3:36 PM, Jeff Siddall <news at siddall.name> wrote:
>>>>> 
>>>>> But at a given bitrate I am becoming more convinced that interlacing
>>>>> is a good tradeoff.  The loss of resolution with 720 is irreversible
>>>>> whereas deinterlacing done well can result in a very watchable, and
>>>>> potentially much sharper, image.
>>>> 
>>>> 
>>>> You can scale up an image from 720 to 1080.  The result is no less
>>>> fake than what you get by deinterlacing.  Going from 1080p to either
>>>> 1080i or 720p represents irreversible information loss.
>>> 
>>> 
>>> Yes, I agree, you can scale an image up (and that is ultimately what
>>> happens when a 720p image is shown full screen on a native 1080 display).
>>> 
>>> However, I disagree that 1080i represents irreversible information loss.
>>> With 1080i60 you get two subsequent 540-line fields that, when combined,
>>> result in a full-resolution 1080-line image.  If there is no motion in
>>> the image then no information is lost.
>> 
>> But that's not what happens.
>> 
>> They may use a 1080p/60 camera to shoot the footage, but then they
>> take half the lines of each frame and create 1080i/60 from it,
>> effectively losing half of each frame.
> 
> Well, yes, but (and this is a big but) they don't lose the same half in two subsequent fields.  One contains even and one contains odd lines.  So if your content isn't moving then after two fields you have all 1080 lines -- something you simply can't get in 720p.
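
To make the even/odd point concrete, here is a minimal sketch of a weave deinterlacer in C++ (the names and layout are my own illustration, not MythTV's code; it assumes two 8-bit luma fields stored as flat arrays). For static content it rebuilds the full 1080-line frame exactly:

    #include <algorithm>
    #include <cstddef>
    #include <cstdint>
    #include <vector>

    // Weave: interleave a 540-line top field (the even source lines) with
    // a 540-line bottom field (the odd lines) into one 1080-line frame.
    // If nothing moved between the fields, this reconstructs the original
    // progressive frame exactly.
    std::vector<std::uint8_t> weave(const std::vector<std::uint8_t>& top,
                                    const std::vector<std::uint8_t>& bottom,
                                    std::size_t width, std::size_t fieldHeight)
    {
        std::vector<std::uint8_t> frame(width * fieldHeight * 2);
        for (std::size_t y = 0; y < fieldHeight; ++y) {
            // Even output line comes from the top field, odd from the bottom.
            std::copy_n(&top[y * width],    width, &frame[(2 * y) * width]);
            std::copy_n(&bottom[y * width], width, &frame[(2 * y + 1) * width]);
        }
        return frame;
    }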
> 
>> What you're suggesting is that they are creating 1080p/30 but only
>> transmitting half the image at a time... which would be pointless...
>> they could transmit 1080p/30 in the same bandwidth as 1080i/60... no
>> need to interlace it.
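
As an aside, the raw sample rates really are identical: 1080i60 carries 1920 x 540 x 60 = 62,208,000 samples/s and 1080p30 carries 1920 x 1080 x 30 = 62,208,000 samples/s, so uncompressed the two formats occupy exactly the same bandwidth.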
> 
> Yes, theoretically they could transmit 1080p30, but in practice 1080i60 is deinterlaced to 60 frames per second with half the lines in each frame being interpolated.  This is why the quality of the deinterlacer has such a large impact on the final video.
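
What that interpolation looks like in its simplest form is a "bob" deinterlacer. A sketch for a single 8-bit plane (again my own illustrative code, not MythTV's); every odd output line here is invented from its vertical neighbours:

    #include <algorithm>
    #include <cstddef>
    #include <cstdint>
    #include <vector>

    // Bob: expand one 540-line field into a full 1080-line frame by
    // averaging each missing line from the field lines above and below.
    // Run per field this yields 60 full frames/s, but half of every
    // frame's lines are interpolated rather than real.
    std::vector<std::uint8_t> bob(const std::vector<std::uint8_t>& field,
                                  std::size_t width, std::size_t fieldHeight)
    {
        std::vector<std::uint8_t> frame(width * fieldHeight * 2);
        for (std::size_t y = 0; y < fieldHeight; ++y) {
            const std::uint8_t* cur  = &field[y * width];
            const std::uint8_t* next = &field[std::min(y + 1, fieldHeight - 1) * width];
            // Real line goes straight through.
            std::copy_n(cur, width, &frame[(2 * y) * width]);
            // Missing line: average of the neighbouring field lines.
            std::uint8_t* out = &frame[(2 * y + 1) * width];
            for (std::size_t x = 0; x < width; ++x)
                out[x] = std::uint8_t((cur[x] + next[x] + 1) / 2);
        }
        return frame;
    }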
> 
> See here for more info:
> 
> http://en.wikipedia.org/wiki/Interlacing
> 
> Also, I incorrectly stated they switched to 720p30 when they actually switched to 720p60.
> 
> Bottom line is that the 1080i picture is significantly sharper, especially for low-motion content, but the tradeoff is that it can suffer from motion artifacts related to deinterlacing.
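
A better deinterlacer makes that tradeoff per pixel: weave where nothing moved, bob where something did. A crude sketch, reusing the weave() and bob() sketches above, with a threshold picked purely for illustration:

    #include <cstddef>
    #include <cstdint>
    #include <cstdlib>
    #include <vector>

    // Crude motion-adaptive choice, per pixel: keep the woven (full
    // resolution) pixel where this frame matches the previous one, and
    // fall back to the interpolated bob pixel where it doesn't.
    std::vector<std::uint8_t> motionAdaptive(const std::vector<std::uint8_t>& woven,
                                             const std::vector<std::uint8_t>& prevWoven,
                                             const std::vector<std::uint8_t>& bobbed)
    {
        const int kThreshold = 16;  // arbitrary; real deinterlacers tune this
        std::vector<std::uint8_t> out(woven.size());
        for (std::size_t i = 0; i < woven.size(); ++i) {
            int motion = std::abs(int(woven[i]) - int(prevWoven[i]));
            out[i] = (motion < kThreshold) ? woven[i] : bobbed[i];
        }
        return out;
    }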

To put it another way, 1080i gains and loses resolution depending on the speed of motion, which I find rather odd: it's extremely high resolution in one direction and anywhere from OK to very poor in the other. I theorise that if people weren't used to interlacing from the SD days, 1080i would look extremely odd.

So some of the time it's 1920x1080, which is great, but at other times it's effectively 1920x540, which is very unbalanced (tall thin pixels), and in reality different parts of the image can have square and rectangular pixels at the same time.

As I am accustomed and trained to look for problems in broadcast pictures and their component signals, I find 1080i sports rather odd to watch. The effect is emphasised once you have seen video that doesn't do such odd things and then go back to regular 1080i.

Even with 720p60, at least things are consistently sharp (OK, not very) in both directions, with most common motion delivering moderate blur in a consistent manner.

Bottom line: interlacing is a 1950s video compression system, and in MythTV's context it takes much more CPU time and graphics I/O to "de-compress" interlacing than it does to decompress H.264!

Andre

