[mythtv-users] VGA -SCART and interlace (2x)
John
reidjr at btconnect.com
Tue Sep 22 19:25:01 UTC 2009
Paul Gardiner wrote:
> John wrote:
>> Thanks for the reply. What I was doing was moving between
>> Auto Detect, Interlaced and Progressive under the menu. The picture
>> looks the same with Interlaced (2x) and Progressive (no interlacer)
>> set. Same effect when I choose None as the deinterlacer under the TV
>> settings menu.
>>
>> Looks like I need another graphics card,
>
> I don't think you've quite got the point I was making.
You too :-)
What I see is perfect "still" quality. The problem is interlacing
artefacts on movement, just as you have clearly described. What I was
trying to say was that there is absolutely no reduction in interlacing
artefacts between Interlaced (2x) and None. So doubling the field rate
and repeating the fields does not lead to a constant synchronization
between the recorded and the displayed interlacing. The effect is
disconcerting, and is clearly noticeable on all scene changes, as well
as on movement.
VGA to SCART gives excellent quality, and the quality is also acceptable
on movement when using high-quality deinterlacers, but the whole point is
that we shouldn't need to deinterlace to display it :-)
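To make the symptom concrete, here is a tiny toy model (names and
structure are purely illustrative, not MythTV code) of why showing each
field at double rate only looks right when the field parity stays locked
to the display's own top/bottom scan cycle. A one-field phase slip puts
every field on the wrong line set, which shows up as combing on movement
and scene changes:

```python
# Hypothetical sketch: "Interlaced (2x)" delivers the two fields of each
# frame one after another at twice the frame rate, instead of weaving or
# deinterlacing them.

def fields_from_frames(frames):
    """Split each (top_field, bottom_field) frame into its two fields,
    in display order, as Interlaced (2x) would present them."""
    for top, bottom in frames:
        yield ("top", top)
        yield ("bottom", bottom)

def displayed(fields, phase):
    """The TV scans top, bottom, top, bottom...  phase=0 means the first
    delivered field lands on a top scan; phase=1 models a parity slip."""
    scan = ["top", "bottom"]
    out = []
    for i, (parity, data) in enumerate(fields):
        expected = scan[(i + phase) % 2]
        out.append((data, parity == expected))  # False -> field on wrong lines
    return out

frames = [("T0", "B0"), ("T1", "B1")]
locked = displayed(fields_from_frames(frames), phase=0)
slipped = displayed(fields_from_frames(frames), phase=1)
# With phase 0, every field lands on its matching scan (no artefacts);
# with phase 1, every field is drawn on the wrong line set (combing).
```

If the video card's output timing never guarantees which field lands on
which scan, the parity can sit permanently (or intermittently) in the
slipped state, which would match the "no reduction in artefacts" you
describe.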
best regards
john