[mythtv-users] What's easier to convert/decode? 720p or 1080i?

Art Morales bioart at gmail.com
Wed Mar 16 15:25:18 UTC 2005


Good to hear of another Samsung DLP user :)  I assume the DVI
interface will take a 1080i signal (since I can select that to be
output from my cable box).  Now that I know the multiple resolutions
are possible, I'll throw a few cycles that way and see if I get
anything good :)

Thanks for the info!

Art


On Wed, 16 Mar 2005 07:26:38 -0500, john roberts <homepagez at lycos.com> wrote:
> 
> Something I know about! ;)
> 
> I have the Samsung DLP - and here is what I found.
> 
> The DVI interface will only do 1280x720p - it will NOT do 1920x1080i.  Because of this I moved to using the VGA interface which WILL do 1920x1080i.
> 
> Why?  Because - as you said - the TV did a much better job of de-interlacing the 1080i content.  But - you do lose signal quality when using the VGA (analog) connection.
> 
> I've recently moved back to the DVI cable - cuz I just can't give up on a pure digital connection. :(  I'm still working out if Bob/other de-interlace methods have gotten any better.
> 
> And yes - XRANDR is what changes the video mode the card is sending to the TV.  Make sure you have this option compiled in (settings.pro).  Once done - you will see a "new" screen under TV -> Appearance which lets you do exactly what you describe: "If 1080i then... if 720p then..."
> 
> Have fun - let me know if you get anywhere with your DLP.  DVI is the way to go if you can get the 1080i content to deinterlace well in software (or just don't care that much about the 1080i content).
> 
> -John
>
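For anyone trying the VGA route John describes: X needs modelines for the HD
resolutions before it can offer them.  The two below are just the standard
ATSC/SMPTE timings for 720p60 and 1080i60, so treat them as a starting point -
a particular Samsung DLP may want the porch values tweaked.

    # In the Monitor section of XF86Config / xorg.conf:
    ModeLine "1280x720"   74.25  1280 1390 1430 1650   720  725  730  750  +HSync +VSync
    ModeLine "1920x1080i" 74.25  1920 2008 2052 2200  1080 1084 1094 1125  Interlace +HSync +VSync

Then list both mode names on the Modes line of your Screen section so the
server (and XRANDR) will actually offer them.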
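And a minimal sketch of the mode switch John mentions, assuming XRANDR support
is compiled into both your X server and MythTV.  The mode names are only
examples - they have to match modes your server actually advertises:

    # list the modes the X server currently knows about
    xrandr -q

    # drop to 1280x720 before playing back 720p recordings
    xrandr -s 1280x720

    # switch to 1920x1080 for the 1080i material
    xrandr -s 1920x1080

The "new" screen under TV -> Appearance does this switch for you automatically
when playback starts and stops, so you normally only need xrandr by hand to
test which modes your TV will sync to.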

