NVidiaProprietaryDriver
Introduction

NVIDIA does not publish the hardware documentation that programmers would need in order to write effective open source drivers for its products. Instead, NVIDIA provides its own binary graphics drivers for X.Org. This closed-source driver is referred to as the NVidia Proprietary Video Driver.

This document describes how to use the NVIDIA Proprietary Video driver. The latest driver can be downloaded from the NVIDIA download site.


  • Version history from NVIDIA and older drivers
  • Download the latest driver from NVIDIA

To determine your current version try

$ cat /proc/driver/nvidia/version
NVRM version: NVIDIA UNIX x86 Kernel Module  173.14.09  Wed Jun  4 23:43:17 PDT 2008
GCC version:  gcc version 4.1.2 20070925 (Red Hat 4.1.2-33)

Resolutions supported by the driver depend on the card in use, but most cards have an onboard TV encoder (the IC that drives the TV-Out connector(s)) which only supports a few lower resolutions, all with a 4:3 aspect ratio:
- 1024x768
- 800x600
- 720x576 (actually a very good resolution for European TVs using the PAL TV standard)
- 720x480 (NTSC)
- 640x480

Before you begin

Is your card supported?

Check whether your card is supported by the driver; a list of supported hardware can be found in Appendix A of the README file. To extract the README, 'cd' to the directory where you downloaded the driver, 'chmod' the ".run" file to make it executable, and run it with the '-x' option (e.g. 'NVIDIA-Linux-x86-190.18-pkg1.run -x'). This extracts the contents without running the included installer script.
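For example, a minimal extraction sequence might look like this (the download directory and the .run filename are only examples; substitute the version you actually downloaded):

$ cd ~/Downloads                                  # wherever you saved the .run file
$ chmod +x NVIDIA-Linux-x86-190.18-pkg1.run
$ ./NVIDIA-Linux-x86-190.18-pkg1.run -x           # extract only, do not install
$ less NVIDIA-Linux-x86-190.18-pkg1/README.txt    # extracted directory name may differ slightly; Appendix A lists supported hardware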

Make sure you check the hardware list from the latest version of the driver (as an example, here is Appendix A from version 100.14.11). Also check the specs of your card: some integrated cards do not support dual screen (PC screen and TV-Out).

There are no published plans by NVidia to support XvMC in the 8xxx series, which may make hardware-assisted high-definition playback problematic.[1] Instead, the newer cards support the VDPAU hardware acceleration method. You must choose between XvMC and VDPAU, depending on what your card supports. The latest drivers support both methods of acceleration, so XvMC users still benefit from the latest enhancements elsewhere in the code. For further information see the VDPAU entry in this wiki.
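If you are unsure whether VDPAU is actually available on your card, a quick check (assuming the vdpauinfo utility is installed, which is not covered here) is:

$ vdpauinfo | head    # the first lines should name the NVIDIA VDPAU driver and API version if acceleration is available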

How is your TV connected? (TV-Out)

The type of output your video card can do, and the type of inputs your display device can handle are primarily what dictates what you should use to connect them. From highest- to lowest-quality, the order of consideration is: HDMI, DVI (both of which are digital), VGA, Component, S-Video and finally Composite (all of the rest are analog). If you want the gory details, see Highly Technical Details

  • HDMI (digital)
  • DVI (digital)
  • VGA (analog)
  • Component (analog)
  • S-Video (analog)
  • Composite (analog)

What's your Television Broadcast Standard

Depending on your country, your television broadcast standard is either NTSC (National Television Standards Committee) or PAL (Phase Alternating Line).


TVStandards

PAL-B used in Australia, Belgium, Denmark, Finland, Germany, Guinea, Hong Kong, India, Indonesia, Italy, Luxembourg, Malaysia, The Netherlands, New Zealand, Norway, Portugal, Singapore, Spain, Sweden, and Switzerland

PAL-D used in China and North Korea

PAL-G used in Denmark, Finland, Germany, Italy, Luxembourg, Malaysia, The Netherlands, Norway, Portugal, Spain, Sweden, and Switzerland

PAL-H used in Belgium

PAL-I used in Hong Kong and The United Kingdom

PAL-K1 used in Guinea

PAL-M used in Brazil

PAL-N used in France, Paraguay, and Uruguay

PAL-NC used in Argentina

NTSC-J used in Japan

NTSC-M used in Canada, Chile, Colombia, Costa Rica, Ecuador, Haiti, Honduras, Mexico, Panama, Puerto Rico, South Korea, Taiwan, United States of America, and Venezuela

Installing the NVIDIA Driver

Some distributions include the NVidia driver in their package management system, but this is usually NOT the current version. If your distribution does not ship the driver, or you have other reasons to install the latest driver from NVidia, everything you need to get this working is in the README file that comes with the driver; read the chapter "Configuring TV-Out". A full description of the installation is set out on the VDPAU page of this wiki. The short version is that you switch to a non-graphical run level (run level 3 in Fedora), run the installer package (the '.run' file), which builds the required kernel module, then reboot into run level 5 (the graphical X level) and enjoy.
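As a rough sketch of that manual install (run as root; the .run filename is only an example, and on some distributions you should stop the display manager service instead of changing run levels):

init 3                                   # leave the graphical run level
sh NVIDIA-Linux-x86-190.18-pkg1.run      # builds and installs the kernel module
init 5                                   # return to X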


openSUSE

openSUSE is a distribution which has packages for the driver. Please see the openSUSE NVIDIA page on how to install the driver.

Debian GNU/Linux

Debian also has packages available to install the driver. Note that these are in the non-free section. The following command gives an overview of the available packages:

root@mast:~# apt-cache search nvidia | grep ^nvidia
nvidia-xconfig - The NVIDIA X Configuration Tool
nvidia-cg-toolkit - NVIDIA Cg Toolkit installer
nvidia-kernel-common - NVIDIA binary kernel module common files
nvidia-settings - Tool of configuring the NVIDIA graphics driver
nvidia-glx - NVIDIA binary XFree86 4.x driver
nvidia-glx-dev - NVIDIA binary XFree86 4.x / Xorg driver development files
nvidia-glx-legacy - NVIDIA binary Xorg driver (legacy version)
nvidia-glx-legacy-dev - NVIDIA binary Xorg driver development files
nvidia-kernel-2.6-486 - NVIDIA binary kernel module for 2.6 series compiled for 486
nvidia-kernel-2.6-686 - NVIDIA binary kernel module for 2.6 series compiled for 686
nvidia-kernel-2.6-k7 - NVIDIA binary kernel module for 2.6 series compiled for k7
nvidia-kernel-2.6.18-4-486 - NVIDIA binary kernel module for Linux 2.6.18-4-486
nvidia-kernel-2.6.18-4-686 - NVIDIA binary kernel module for Linux 2.6.18-4-686
nvidia-kernel-2.6.18-4-k7 - NVIDIA binary kernel module for Linux 2.6.18-4-k7
nvidia-kernel-2.6.18-5-486 - NVIDIA binary kernel module for Linux 2.6.18-5-486
nvidia-kernel-2.6.18-5-686 - NVIDIA binary kernel module for Linux 2.6.18-5-686
nvidia-kernel-2.6.18-5-k7 - NVIDIA binary kernel module for Linux 2.6.18-5-k7
nvidia-kernel-legacy-2.6-486 - NVIDIA binary kernel module for 2.6 series compiled for 486
nvidia-kernel-legacy-2.6-686 - NVIDIA binary kernel module for 2.6 series compiled for 686
nvidia-kernel-legacy-2.6-k7 - NVIDIA binary kernel module for 2.6 series compiled for k7
nvidia-kernel-legacy-2.6.18-4-486 - NVIDIA binary kernel module for Linux 2.6.18-4-486 (legacy version)
nvidia-kernel-legacy-2.6.18-4-686 - NVIDIA binary kernel module for Linux 2.6.18-4-686 (legacy version)
nvidia-kernel-legacy-2.6.18-4-k7 - NVIDIA binary kernel module for Linux 2.6.18-4-k7 (legacy version)
nvidia-kernel-legacy-2.6.18-5-486 - NVIDIA binary kernel module for Linux 2.6.18-5-486 (legacy version)
nvidia-kernel-legacy-2.6.18-5-686 - NVIDIA binary kernel module for Linux 2.6.18-5-686 (legacy version)
nvidia-kernel-legacy-2.6.18-5-k7 - NVIDIA binary kernel module for Linux 2.6.18-5-k7 (legacy version)
nvidia-kernel-legacy-source - NVIDIA binary kernel module source (legacy version)
nvidia-kernel-source - NVIDIA binary kernel module source

You need to install the packages nvidia-glx, nvidia-kernel-common, the kernel module package matching your running kernel, and (for easy configuration) nvidia-xconfig. Issue the following command:

apt-get install nvidia-glx nvidia-kernel-common nvidia-kernel-`uname -r` nvidia-xconfig
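After installation you can confirm that the kernel module loads and report the driver version (a quick sanity check, not part of the original instructions):

modprobe nvidia
cat /proc/driver/nvidia/version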

Fedora

For installation on a system with Fedora, see the installation instructions.

Common problems and solutions

Here is a test image for your TV-Out: the mythTV Test image on this wiki.

Blue line(s) surrounding picture

One common symptom is the following: the NVidia driver may give one or more blue lines surrounding the display during TV or DVD playback. This is mostly seen on widescreen TV displays. The 'xvattr' command can be used to solve the problem:

xvattr -a XV_COLORKEY -v 66048

Add the command somewhere in the Xorg startup sequence. This may be done in various ways depending on your Linux distribution (need more distribution-specific information on this item!)

Solution for Ubuntu and Debian GNU/Linux 4.0

Create the following file:

/etc/X11/Xsession.d/98custom_disable-blueline

and add the following contents:

#!/bin/sh
xvattr -a XV_COLORKEY -v 66048

Restart your graphical environment and you're done!

Solution for openSUSE

Add the following line to /etc/X11/xinit/xinitrc after "Add your own lines here":

#disable-blueline:
xvattr -a XV_COLORKEY -v 66048

More on openSUSE and NVidia: http://www.suse.de/~sndirsch/nvidia-installer-HOWTO.html

Black-and-White output

There are at least five possible causes:

TVStandard

Make sure the TVStandard option is set to the one valid for your country. Many nVidia cards default to the American TVStandard 'NTSC'. For a MythTV box in the Netherlands, use the following option in the 'Device' section of xorg.conf:

Option "TVStandard" "PAL-B"

If you're in the UK, use:

Option "TVStandard" "PAL-I"

Many other variations of PAL and other standards are supported. For a complete list, see the nVidia "readme" for your driver version (see the nVidia Linux driver download page).
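For context, a minimal 'Device' section containing the option might look like this (the Identifier string here is only an example; keep whatever identifier your xorg.conf already uses):

Section "Device"
    Identifier "NVidia TV-Out"
    Driver     "nvidia"
    Option     "TVStandard" "PAL-B"
EndSection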

Vertical Refresh Frequency

Some televisions will show a black-and-white image if you attempt to display video using an unexpected vertical refresh frequency, e.g. showing PAL-I format video at 60Hz instead of 50Hz. This is because the TV may use the vertical refresh frequency to decide whether it should be decoding PAL (usually 50Hz) or NTSC (60Hz). Either change the TVStandard (see above) to match the frequency, or change the frequency to match the video standard.

TVOutFormat

Some televisions support multiple video formats on a single SCART or S-Video socket, but without auto-detection of the signal type. If you are seeing a sharp black-and-white image instead of a colour image, the nVidia card is probably outputting a Composite signal while your TV is expecting S-Video. Either switch the TV to Composite or "AV" mode (not recommended), or set the following option in xorg.conf:

Option "TVOutFormat"  "SVIDEO"

Conversely, to force Composite output (if your TV does not support S-Video), use:

Option "TVOutFormat"  "COMPOSITE"

S-Video generally has far superior image quality to composite video, especially from nVidia cards. Annoyingly, some nVidia card/TV combinations will default to Composite mode at boot-time, meaning that your boot sequence will be in black and white until xorg.conf is loaded. (If anyone knows a fix for this, using different cables perhaps, please insert it here!)

Television does not support S-Video

Some televisions do not support a S-Video signal on a SCART input. This results in only the luminance signal being used. A hack exists which merges the luminance and chroma signals on the SCART connector which essentially creates a composite signal that the television uses. This is performed by connecting pins 15 (chrominance in) and 20 (luminance in) on the SCART connector. This signal is unfiltered and is at best equal in quality to a direct composite connection. See this thread for further information.

Cable Connections

Obvious perhaps, but make sure that the ends of your cables are firmly in place! (especially if you're using SCART/RGB sockets)

Annoying NVIDIA logo?

To disable the nVidia logo splash screen that is displayed when X is first initialized, you may use the following command to automatically edit your xorg.conf correctly:

nvidia-xconfig --no-logo

For older driver versions or installations without nvidia-xconfig, you will have to edit xorg.conf manually and add this line to the 'Device' section:

Option "NoLogo" "True"

Small (unreadable) fonts or too big fonts?

See The FAQ "font size" entry and Specifying DPI for NVidia Cards.

NVidia cards and no picture when the box is switched on before the TV

See this link if you have this problem.

System instability

The first port of call if your graphics card is unstable is the README that is included with your Nvidia driver package. There may be an online version of the README linked from the Nvidia Linux driver page for your system's processor architecture.

You should also have a look at the Linux forums which are linked to from the same page.

Try reducing the AGP speed (this seems to be a problem with boards using VIA KT333 and KT400 chipsets, and also earlier VIA chipsets). Follow the instructions in the README for configuring the AGP rate. With an MSI KT4 Ultra (MS-6590) and an NVidia GeForce 6200A video card, the default AGP speed of 8x made the system unstable; once it was set to 4x the system became rock steady. The BIOS changes weren't tried; they may also have helped and avoided the need for the edit/recompile.

It's possible to perform initial testing by disabling AGP entirely using the NvAGP option in your X11 config file. Check the APPENDIX F, AGP section in the README for more details.
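As a minimal sketch of that test, the option goes in the 'Device' section of your X11 config file; a value of 0 disables the driver's AGP support, as described in the README:

Option "NvAGP" "0"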

As noted in the NVidia release notes, there are known issues with some VIA chipsets as found on Athlon XP mainboards; the KT266 and KT333 chipsets are known to have problems. The solution described in the release notes did not solve my instability problems. However, booting the system with the 'noapic' option appended to the kernel boot line was enough to get my system stable. --Michel 14:20, 26 March 2007 (UTC)

Choppy video/High CPU Usage

Symptoms:

  • Choppy video
  • CPU usage near 100%
    • Xorg using the most CPU time
    • mythfrontend using the second most CPU time
    • other processes using negligible CPU time
  • Sync to VBlank set in nvidia-settings

Solution: Add the line

Option "UseEvents" "True"

to the Screen or Device section of /etc/X11/xorg.conf.
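For example, a 'Screen' section with the option added might look like the sketch below (the Identifier and Device names are only examples; keep the rest of your existing section and just add the Option line):

Section "Screen"
    Identifier "Screen0"
    Device     "Device0"
    Option     "UseEvents" "True"
EndSection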

From NVIDIA docs:

Option "UseEvents" "boolean":

Enables the use of system events in some cases when the X driver is waiting for the hardware. The X driver can briefly spin through a tight loop when waiting for the hardware. With this option the X driver instead sets an event handler and waits for the hardware through the ‘poll()’ system call. Default: the use of the events is disabled.

In other words, the driver goes from a "busy" wait by default to a less CPU-intensive system call (poll()/ppoll(), which wait for an event on a file descriptor).

However, this can cause problems on some hardware so if X starts getting flaky and crashing, set UseEvents back to false.

Note: As of driver 177.80, this problem persists, and is not fixable, on the integrated GeForce 8200 GPU. The issue may or may not be fixed in later revisions.

On driver 180+ on the 8200 GPU, the problem appears to be livelock caused by the screen refresh rate being lower than the playback rate. For example, a 720p HDTV stream or a 2x time-stretched SDTV stream will want to play back at 59.94 FPS, but by default the driver's modeline results in a 59.84Hz refresh rate. While this may be a driver bug that causes the X process to eat so much CPU, the issue can be mostly avoided by increasing the screen refresh rate to or beyond the playback rate. This seems to affect all vsync'd playback, including mplayer. Ideally you would want to run at 119.88Hz, so that a 720p stream could be time-stretched 2x without issues. Any refresh rate that is not a multiple of the actual frame rate will result in jitter whenever there is motion on screen.

On an 8200 with 180+ drivers the default mode line is:

ModeLine     "1280x768_0" 81.0 1280 1328 1440 1688 768 769 772 802 +hsync -vsync

Changing the pixel clock to 81.2 MHz as follows:

ModeLine     "1280x768_0" 81.2 1280 1328 1440 1688 768 769 772 802 +hsync -vsync

Results in a vertical sync rate of 59.98Hz.
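As a quick sanity check, the refresh rate follows directly from the modeline numbers: vertical refresh = pixel clock / (horizontal total x vertical total). With the modeline above, 81,200,000 / (1688 x 802) is roughly 59.98 Hz, while the original 81.0 MHz clock gives 81,000,000 / (1688 x 802), roughly 59.84 Hz.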

Half vertical resolution on video playback/Blurry OSD

This appears to be a known problem when using interlaced modelines with nVidia's implementation of the XVideo extension. Workarounds are to run mythfrontend with XVideo disabled by setting the NO_XV environment variable to 1 (i.e. NO_XV=1 mythfrontend) if you have CPU to burn, or to put up with using a software deinterlacer (I recommend Linear Blend, but your tastes may differ).

This problem appears to have been resolved in recent combinations of video card, Xorg, nVidia driver and MythTV (e.g. 7600GT, 256.53 drivers, MythTV 0.24)

Analog audio does not work with HDMI TV input

Configuring Analog Sound DVI to HDMI

HDMI audio Nvidia cards

The following link describes how to enable and troubleshoot HDMI audio on an NVidia card: ftp://download.nvidia.com/XFree86/gpu-hdmi-audio-document/gpu-hdmi-audio.html

HDMI audio on GT210 and GT220 cards

Problems have been reported with getting sound to work through HDMI on the G210, GT220 and similar cards (and this may apply to any HDMI output). At the end of a long thread on the mythtv-users list about this problem, a potential fix was posted (which should apply to any distro). Extensive discussions of progress on HDMI audio for these chips are on the XBMC forum and wiki, including a patch for ALSA 1.0.22.1.

Got this to work on Ubuntu 9.04 with MythTV 0.22. Someone on another site/forum said to modify /etc/modprobe.d/alsa-base.conf and add the following line (the probe_mask value assumes your NVidia card is card 1):

options snd-hda-intel enable_msi=0 probe_mask=0xffff,0xfff2
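If you are not sure which ALSA card number the NVidia HDMI output has, listing the sound devices will show it (these are standard ALSA tools, not part of the original instructions):

$ cat /proc/asound/cards    # card numbers and names
$ aplay -l                  # playback devices; note the card,device pair for the HDMI output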

Then create a file named .asoundrc in your home directory with the following contents:

pcm.!default {
    type asym
    playback.pcm {
        type plug
        slave.pcm "hw:1,3"    # ALSA card 1, device 3: the NVidia HDMI output in this example
    }
}
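A quick way to confirm that sound now reaches the HDMI output (speaker-test is part of alsa-utils; this check is an assumption, not from the original post):

$ speaker-test -c 2 -t wav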

A potentially different patch has also been submitted on alsa-devel that may find its way into the next version of alsa. See the discussion thread surrounding this post. Note that additional patches in the thread are necessary to enable GT220 support and proper compiling.

Note: In order for NVidia HDMI to negotiate sound with your device, ensure "UseEDID" is NOT set to "False" in your xorg.conf. Instead use "ModeValidation" "NoEdidModes" and "UseEdidFreqs" "False" for the same effect.

Mythfrontend 'Watch Recordings' Alpha Blends with Desktop Background

If the 'Watch Recordings' screen blends with the desktop background, set the following environment variable:

export XLIB_SKIP_ARGB_VISUALS=1

Then start mythfrontend.

Configure the monitor!

The NVidiaProprietaryDriver supports monitors as well as TV screens (using TV-Out). To use a (wide-screen) TV as a monitor, see XorgConfMonitorSectionForTV. If the basic TV configuration appears correct but the image does not fit exactly on your television set, you may need to adjust the overscan settings.

User experience with NVidia cards

  • NVIDIA GeForce2 MX AGP: works (Michel: Nov 2006 )
  • NVIDIA GeForce4 MX440 AGP: works (Michel: Nov 2006 )
  • NVIDIA FX5200 AGP: works (Michel: Nov 2006 )
  • NVIDIA FX5700LE AGP: works (MarcT: Jan 2008 )
  • NVIDIA 6200LE PCI-e: works (Moosylog: Jan 2007 )
  • NVIDIA 6600GT PCI-e: works (michael573114: Jan 2009 )
  • NVIDIA Riva TNT2 AGP: defunct (Michel: Nov 2006 )
  • NVIDIA Riva TNT AGP (also known as AGP-V3400): defunct (Michel: Nov 2006 )
  • NVIDIA 6200 PCI: works but TV geometry is by default at 1024x768 and overscan is 25%. (mythtv0x7c1: July 2008)
  • NVIDIA GeForce2 GTS/Pro: works, but there is overscan. After using the nvtv tool [2] the result is fine. (Aorie: Feb 2009 )
  • NVIDIA 8200 on an ASUS M3N78-VM board with Mythbuntu 10.04: required an additional kernel boot parameter, "vmalloc=192M", to assign more memory to drivers; see the full parameter list here: https://bugs.launchpad.net/ubuntu/+source/linux/+bug/354633/comments/32
  • NVIDIA 7600GT PCIe: works, including low-dotclock PAL output via VGA connector (cowbutt: Jan 2011 )
  • NVIDIA GT240 PCIe: works, but WON'T output low-dotclock PAL via VGA connector, so unsuitable for VGA-to-SCART use (cowbutt: Jan 2011 )
