[mythtv-users] Done with ATI

Jarod Wilson jarod at wilsonet.com
Wed Aug 19 13:33:21 UTC 2009


On Aug 19, 2009, at 7:40 AM, Paul Gardiner wrote:

> Stephen Shelton wrote:
>> The only problem I had was getting the card to display a 1080i output.
>
> I'm a bit confused about how people use the different display modes
> these days. I'd have imagined there are only two good ways to work:
> either let the TV do the scaling and deinterlacing, or have MythTV do
> it.
>
> If MythTV is doing it, then it makes sense to use the Advanced 2X (HD)
> deinterlacer for 1080i content, but I wouldn't have thought you'd want
> to output a 1080i display mode. You'd be better off outputting 1080p,
> and probably using that for everything, if it's the TV's native
> resolution.
>
> If you want the TV to do the work, then it makes sense to use a 1080i
> display mode for 1080i content, but then you don't want MythTV using
> a deinterlacer (other than perhaps the "Interlaced" one). You'd also
> want to use a 1080i display mode only for 1080i content, switching to
> a 720p display mode for 720p content.
>
> Is that right? There doesn't seem to be any definitive advice on this.
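
Pretty much, yes. If you wanted to drive the mode switching yourself,
outside MythTV, the mapping is mechanical enough to script with xrandr.
A rough Python sketch; the output name and the exact mode strings below
are assumptions you'd replace with whatever `xrandr -q` reports for
your card:

import subprocess

# Assumed mapping from (content height, interlaced?) to an xrandr mode
# name. Mode names vary by driver, so check `xrandr -q` first.
MODES = {
    (1080, True):  "1920x1080i",
    (1080, False): "1920x1080",
    (720,  False): "1280x720",
}

def switch_mode(height, interlaced, output="HDMI-0"):
    """Switch the given output to a display mode matching the content."""
    mode = MODES.get((height, interlaced), "1920x1080")
    subprocess.run(["xrandr", "--output", output, "--mode", mode],
                   check=True)

That said, I've never bothered with mode switching myself.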

I've been outputting nothing but a 1080p signal to my TV for the past
four-plus years, without using a deinterlacer in MythTV. MythTV handles
the scaling of everything that needs it just fine, the TV handles the
deinterlacing, and everything looks quite good. (And the bulk of my
content is 1080i MPEG-2.)
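
Before chasing playback settings, though, it's worth confirming the
driver actually exposes the modes you think it does, which, given the
subject line, may be half the battle with ATI. A quick Python sketch
that scrapes `xrandr -q` for the mode names on one output; "HDMI-0"
below is a placeholder for whatever output name xrandr reports:

import subprocess

def modes_for(output):
    """Return the mode names xrandr lists under the given output.

    Assumes the stock `xrandr -q` text layout: a non-indented header
    line per output, followed by indented lines whose first token is
    a mode name such as "1920x1080" or "1920x1080i".
    """
    out = subprocess.run(["xrandr", "-q"], capture_output=True,
                         text=True, check=True).stdout
    modes, in_block = [], False
    for line in out.splitlines():
        if not line.startswith(" "):
            in_block = line.startswith(output)
        elif in_block and line.split():
            modes.append(line.split()[0])
    return modes

print(modes_for("HDMI-0"))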


-- 
Jarod Wilson
jarod at wilsonet.com