[mythtv-users] HD Woes (was: Upgrade Options for $150)
adeffs.mythtv at gmail.com
Mon Jan 29 14:22:51 UTC 2007
On 1/29/07, Nate Crosno <nate at crosno.net> wrote:
> Nate Crosno wrote:
> > Well, I've placed my order. After searching around for several hours on
> > various sites, I'm still going with the Jetway board because it was the
> > only 939 board I could find with DVI out, plus it has component out,
> > which I'd like to try. It also had the nvidia chipsets. After reviewing
> > some processor performance charts (with a grain of salt), I opted for an
> > Athlon 64 3400 at 2.2Ghz and a cheap heatsink/fan combo. According to a
> > cpu chart that tested transcoding a DVD, it looks like that will be
> > about a 10% bump in performance over the 3200 I listed before. The
> > damage came to $137 out the door.
> > AMD Athlon 64 3400+ Venice 2.2GHz Socket 939
> > http://www.newegg.com/Product/Product.asp?Item=N82E16819103023
> > JetWay J939GT3-PTD Socket 939 NVIDIA GeForce 6100 Micro ATX
> > http://www.newegg.com/Product/Product.asp?Item=N82E16813153047
> So here's the status -- the Jetway board is fine so far, and everything
> is working great except 1080i. XvMC never worked well for me, but with
> it turned on, I can just barely get 1080i to play in Myth, but it takes
> ALL the cpu. With XvMC off, 1080i pauses every 2-4 seconds.
> I'm using "1280x720_60" for my xorg.conf mode. 720p will play in Myth at
> about 50% cpu with XvMC on and up to 60% with libmpeg2. As a
> comparison, Xine only uses about 8% with xvmc for 720p and 45% for 1080i
> (with xvmc) and 60% using "-V xv".
> I've also tried overclocking as high as the MB will go (2.9ghz), just to
> see if it made a difference, but it does not.
> I've tried every permutation of OpenGL Vsync in myth and nvidia
> settings, plus the various buffering, interlacing, etc options I can
> think of, but nothing seems to make much of a difference one way or
> another. I'm stuck! The system is still usable, but no more so than
> before the upgrade. I just have to use the same xine work-arounds for
> 1080i as I did before, but the whole point of the upgrade was to make
> those go away!
> Can anyone with a similar setup send me detailed info about their
> configuration? At this point, I am doubtful that throwing more money at
> it in the form of a faster processor would even help much. I think I
> have something configured wrong, but cannot put my finger on it.
> P.S. More info:
> # uname -a
> Linux localhost 2.6.16-gentoo-r6 #1 SMP Mon May 8 13:07:34 PDT 2006 i686
> AMD Athlon(tm) 64 Processor 3400+ AuthenticAMD GNU/Linux
> Section "Monitor"
> Identifier "Monitor0"
> HorizSync 15.0 - 46.0
> VertRefresh 59.0 - 61.0
> Modeline "1920x1080_60i" 74.25 1920 2008 2052 2200 540 542 547 562 +hsync +vsync interlace
> Section "Device"
> Identifier "Card0"
> Driver "nvidia"
> Option "NoLogo" "true"
> Option "coolbits" "1"
> Option "RenderAccel" "true"
> VendorName "All"
> BoardName "All"
> Option "XvmcUsesTextures" "false"
> Option "NvAgp" "0"
> Option "UseEvents" "True"
> Option "ConnectedMonitor" "DFP"
> Option "UseDisplayDevice" "DFP"
> Option "FlatPanelProperties" "Scaling = native"
> Option "IgnoreDisplayDevices" "TV, CRT"
> Option "ExactModeTimingsDVI" "1"
> Option "UseEDIDDpi" "0"
> Option "UseEDIDFreqs" "0"
> Section "Screen"
> Identifier "Screen0"
> Device "Card0"
> Monitor "Monitor0"
> DefaultColorDepth 24
> SubSection "Display"
> Depth 24
> Modes "1280x720_60"
Assuming you're using the 9xxx-series drivers, try removing your 1080i modeline and using
Modes "1280x720_60" "1920x1080_60i"
This should pick up the ModePool versions, which will have the proper refresh rate.
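As a sketch, the Display subsection of your Screen section would then look something like this (assuming the rest of your xorg.conf stays as quoted above):

```
SubSection "Display"
    Depth 24
    Modes "1280x720_60" "1920x1080_60i"
EndSubSection
```

With no matching custom Modeline present, the driver falls back to its internally validated ModePool timings for "1920x1080_60i".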
Then turn on Xv Vsync in nvidia-settings and use the Standard decoder.
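For reference, you can sanity-check the refresh rate implied by the quoted 1080i modeline from its pixel clock and totals. This assumes the vertical values in the modeline are per field (which they appear to be, since 540 is half of 1080):

```python
# Field rate implied by the quoted modeline:
#   Modeline "1920x1080_60i" 74.25 1920 2008 2052 2200 540 542 547 562 ... interlace
pixel_clock_hz = 74.25e6  # 74.25 MHz, first number in the modeline
htotal = 2200             # last horizontal timing value
vtotal = 562              # last vertical timing value (per field, interlaced)

field_rate_hz = pixel_clock_hz / (htotal * vtotal)
print(f"field rate: {field_rate_hz:.2f} Hz")  # comes out just over 60 Hz
```

It lands slightly above 60 Hz rather than exactly on it, which is the sort of off-by-a-little timing the ModePool versions avoid.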
Before you ask, read the FAQ!
then search the Wiki, and this list,
Mailinglist etiquette -