[mythtv-users] HD Woes

Steven Adeff adeffs.mythtv at gmail.com
Mon Jan 29 19:20:52 UTC 2007


On 1/29/07, Nate Crosno <nate at crosno.net> wrote:
> Steven Adeff wrote:
> >
> > Assuming you're using the 9xxx series drivers,
> > try removing your 1080i modeline and use
> > Modes "1280x720_60" "1920x1080_60i"
> >
> > this should use ModePool versions which will have the proper refresh rate.
> > Then turn on Xv Vsync in nvidia-settings and use the Standard decoder.
> >
>
> I am indeed using the latest nvidia driver. Sorry, I forgot to add that!
> I have a very hard time figuring out the right amount of info to
> post without putting in so much that no one will bother to read through
> the whole thing.

Too much is usually a good thing; the difference is in how you format
it. If you have 100 thoughts in one paragraph, I'm going to skip the
paragraph. If you have 100 thoughts broken up into 50+ paragraphs of 1
or 2 related thoughts each, I can easily gloss over the information I
don't need. I assume most people's minds work this way? (so how you
formatted this email is good =)

> Try as I might, I could never get 1080i working over DVI.  When I
> started X in verbose mode ("startx -- -logverbose 5" or something like
> that), there was no 1080i mode listed in the modepool.  After seeing
> your post, I thought maybe the fact that I had a Modeline of the same
> name in there was keeping it from showing, so I commented out that
> modeline and tried again -- still no go.  In my log, I get:
> No valid modes for "1920x1080_60i"; removing.
>
> The 1080i modeline that I showed before was one that I built using the
> EDID information from the xorg log:
>
> (--) NVIDIA(0): DPMS Capabilities            :
> (--) NVIDIA(0): Prefer first detailed timing : Yes
> (--) NVIDIA(0): Supports GTF                 : No
> (--) NVIDIA(0): Maximum Image Size           : 0mm x 0mm
> (--) NVIDIA(0): Valid HSync Range            : 15.0 kHz - 46.0 kHz
> (--) NVIDIA(0): Valid VRefresh Range         : 59 Hz - 61 Hz
> (--) NVIDIA(0): EDID maximum pixel clock     : 80.0 MHz
> (--) NVIDIA(0):
> (--) NVIDIA(0): Established Timings:
> (--) NVIDIA(0):   640  x 480  @ 60 Hz
> (--) NVIDIA(0):
> (--) NVIDIA(0): Detailed Timings:
> (--) NVIDIA(0):   1920 x 1080 @ 60 Hz
> (--) NVIDIA(0):     Pixel Clock      : 74.25 MHz
> (--) NVIDIA(0):     HRes, HSyncStart : 1920, 2008
> (--) NVIDIA(0):     HSyncEnd, HTotal : 2052, 2200
> (--) NVIDIA(0):     VRes, VSyncStart : 540, 542
> (--) NVIDIA(0):     VSyncEnd, VTotal : 547, 562
> (--) NVIDIA(0):     H/V Polarity     : +/+
> (--) NVIDIA(0):     Extra            : Interlaced
>
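As a quick sanity check on those detailed timings: the pixel clock
divided by the horizontal and vertical totals gives the field rate,
which should land inside the EDID's 59-61 Hz VRefresh range. For an
interlaced mode the EDID vertical numbers describe a single field:

```shell
# Field rate = pixel clock / (HTotal * VTotal-per-field),
# using the values from the log excerpt above.
awk 'BEGIN { printf "%.2f Hz\n", 74250000 / (2200 * 562) }'
```

That comes out to about 60.05 Hz per field (~30 Hz frames), so the
mode itself fits within the EDID limits.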
>
> This motherboard also has component out, and after much tinkering, I got
> that to do 1080i out. I had to set the desktop to 1920x1080 in order for
> it to fill the screen (with overscan).  I then tried playing a 1080i
> clip in Myth with 'Standard' and 'libmpeg2' and without deinterlacing.
> This had the same CPU usage as before and played back with pauses every
> couple seconds.  Also, the content still looked like it needed
> deinterlacing.  Am I not understanding this correctly? I thought at
> 1080i, I should have deinterlacing turned off?  I was hoping to take
> load off the Myth box by having the TV scaler do some of the work.
>
> Here are some xorg.conf details from that test:
>
> Section "Device"
>          Identifier  "Card0"
>          Driver      "nvidia"
>          Option      "NoLogo" "true"
>          Option      "coolbits" "1"
>          Option      "RenderAccel" "true"
>          VendorName  "All"
>          BoardName   "All"
>          Option      "XvmcUsesTextures" "false"
>          Option      "NvAgp" "0"
>          Option      "UseEvents" "True"
>          Option      "UseDisplayDevice" "TV"
>          Option      "TVOutFormat" "COMPONENT"
>          Option      "TVStandard" "HD1080i"
>                      #It wouldn't work without the above line
>          Option      "UseEDID" "FALSE"
>          Option      "TVOverScan" "0"
> EndSection
>
> Section "Screen"
>          Identifier      "Screen0"
>          Device  "Card0"
>          Monitor "Monitor0"
>          DefaultColorDepth 24
> ...
>   Modes  "1920x1080"
> ...
> EndSection
>
> I admit that I did not search enough for info about component settings.
> I found a few more things to try... mostly taking out the "TVOutFormat"
> line above and double-checking the various 'vblank' settings.
>
> In the meantime, I throw myself at the mercy of the list and hope for a
> kind soul to offer up any other possible ideas.
>
> Time to go do some real work and stop messing with the TV for a while.

Don't use

          Option      "UseEDID" "FALSE"

as that effectively disables the ModePool; use Option "ModeValidation"
replacements as needed. You can get more information from the driver
manual linked from the driver download page.
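For example, something like the following in the Device section (a
sketch only; which tokens you actually need depends on which check the
X log reports failing, per the ModeValidation list in the driver
README):

          Option      "ModeValidation" "NoMaxPClkCheck, NoEdidMaxPClkCheck"

Those two tokens relax the pixel-clock checks; there are others
(NoVertRefreshCheck, NoHorizSyncCheck, etc.) for the sync-range checks.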

You may not even need it with the 9xxx series. I had a ModeValidation
line with the 8xxx drivers, but upon upgrading to 9xxx I found I no
longer needed it once I used

Option         "ExactModeTimingsDVI" "true"

which seems to let the driver choose from the ModePool properly,
though this may just be my TV...
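Put together, the relevant pieces of xorg.conf from this thread would
look roughly like this (an untested sketch; keep the rest of your
sections as they are):

Section "Device"
         ...
         Option      "ExactModeTimingsDVI" "true"
EndSection

Section "Screen"
         ...
         Modes  "1280x720_60" "1920x1080_60i"
         ...
EndSection

with no hand-built 1080i Modeline, so the driver fills the ModePool
from the EDID on its own.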

-- 
Steve
Before you ask, read the FAQ!
http://www.mythtv.org/wiki/index.php/Frequently_Asked_Questions
then search the Wiki, and this list,
http://www.gossamer-threads.com/lists/mythtv/
Mailinglist etiquette -
http://www.mythtv.org/wiki/index.php/Mailing_List_etiquette

