NVidiaProprietaryDriver

From MythTV Official Wiki
 
  
 
Latest revision as of 17:40, 7 April 2018

Introduction

NVIDIA does not provide documentation for its hardware, which programmers would need in order to write appropriate and effective open source drivers for NVIDIA's products. Instead, NVIDIA provides its own binary graphics driver for X.Org. This closed-source driver is referred to as the NVidia Proprietary Video Driver.

This document describes how to use the NVIDIA Proprietary Video driver. The latest driver can be downloaded from the NVIDIA download site


Version history from NVIDIA and older drivers: www.nvidia.com/object/linux_display_archive.html

Download the latest driver from NVIDIA: www.nvidia.com/object/unix.html

To determine your current version try

$ cat /proc/driver/nvidia/version
NVRM version: NVIDIA UNIX x86 Kernel Module  173.14.09  Wed Jun  4 23:43:17 PDT 2008
GCC version:  gcc version 4.1.2 20070925 (Red Hat 4.1.2-33)
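If a script needs just the version number, the NVRM line can be parsed. A minimal sketch, using the sample output above as a stand-in; on a live system you would read /proc/driver/nvidia/version directly:

```shell
# Pull the driver version out of the NVRM line.
# Sketch: uses the sample text from above rather than the live /proc file.
line='NVRM version: NVIDIA UNIX x86 Kernel Module  173.14.09  Wed Jun  4 23:43:17 PDT 2008'
version=$(printf '%s\n' "$line" | awk '{print $8}')   # 8th whitespace-separated field
echo "$version"
```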

Resolutions supported by the driver depend on the card in use, but most cards have an onboard TV encoder (the IC that drives the TV-Out connector(s)) which supports only a few lower resolutions, all in 4:3 aspect ratio:
- 1024x768
- 800x600
- 720x576 (actually a very good resolution for European TVs using the PAL TV standard!)
- 720x480 (NTSC)
- 640x480

Before you begin

Is your card supported?

Check whether your card is supported by the driver; a list of supported hardware can be found in Appendix A of the README file. To extract the README, 'cd' to the directory where you downloaded the driver, 'chmod' the ".run" file to make it executable, and run 'NVIDIA-Linux-x86-190.18-pkg1.run -x'. This extracts the contents without running the included installer script.
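The extraction steps above can be sketched as a short shell sequence. The installer name is the example from the text; substitute the file you actually downloaded (the commands that need the real file are shown commented out):

```shell
# Sketch of the README extraction described above.
installer=NVIDIA-Linux-x86-190.18-pkg1.run
# chmod +x "$installer"   # make the .run file executable
# sh "$installer" -x      # -x extracts the contents without installing anything
# The contents are unpacked into a directory named after the installer:
extract_dir=${installer%.run}
echo "$extract_dir"
```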

Make sure you check the hardware list from the latest version of the driver (as an example, here is Appendix A from version 100.14.11). Also check the specs of your card; some integrated cards do not support dual screen (PC screen and TV-out).

There are no published plans by Nvidia to support XvMC in the 8xxx series, which may make hardware-assisted high-def playback problematic.[1] Instead, the newer cards support the VDPAU hardware-assisted acceleration method. Whether you use XvMC or VDPAU is a binary choice that depends on what your card supports. The latest drivers support both methods of acceleration, so XvMC users still benefit from the latest enhancements elsewhere in the code. For further information see the VDPAU entry in this wiki.

How is your TV connected? (TV-Out)

The type of output your video card can do, and the type of inputs your display device can handle are primarily what dictates what you should use to connect them. From highest- to lowest-quality, the order of consideration is: HDMI, DVI (both of which are digital), VGA, Component, S-Video and finally Composite (all of the rest are analog). If you want the gory details, see Highly Technical Details

[Connector photos, from highest to lowest quality: HDMI (digital), DVI (digital), VGA (analog), Component (analog), S-Video (analog), Composite (analog)]

What's your Television Broadcast Standard

Depending on your country, you use NTSC (National Television Standards Committee) or PAL (Phase Alternating Line) as your television broadcast standard.


TVStandards

PAL-B used in Australia, Belgium, Denmark, Finland, Germany, Guinea, Hong Kong, India, Indonesia, Italy, Luxembourg, Malaysia, The Netherlands, New Zealand, Norway, Portugal, Singapore, Spain, Sweden, and Switzerland

PAL-D used in China and North Korea

PAL-G used in Denmark, Finland, Germany, Italy, Luxembourg, Malaysia, The Netherlands, Norway, Portugal, Spain, Sweden, and Switzerland

PAL-H used in Belgium

PAL-I used in Hong Kong and The United Kingdom

PAL-K1 used in Guinea

PAL-M used in Brazil

PAL-N used in France, Paraguay, and Uruguay

PAL-NC used in Argentina

NTSC-J used in Japan

NTSC-M used in Canada, Chile, Colombia, Costa Rica, Ecuador, Haiti, Honduras, Mexico, Panama, Puerto Rico, South Korea, Taiwan, United States of America, and Venezuela

Installing the NVIDIA Driver

Some distributions ship the NVidia driver in their package management system; however, this is usually NOT the current version. If your distribution does not ship the driver, or you have other reasons to install the latest driver from NVidia, everything you need to get this working is in the README file that comes with the driver; read the chapter Configuring TV-Out. A full description of the installation is set out on the VDPAU page of this wiki. The short version: switch to a non-graphical run level (run level 3 in Fedora) and run the installer package (the '.run' file), which builds the required kernel module. Reboot into run level 5 (the X graphical level) and enjoy.


openSUSE

openSUSE is a distribution which has packages for the driver. Please see the opensuse NVIDIA page on how to install the driver.

Debian GNU/Linux

Debian also has packages available to install the driver. Note that these are in the non-free section. The following command gives an overview of the available packages:

root@mast:~# apt-cache search nvidia | grep ^nvidia
nvidia-xconfig - The NVIDIA X Configuration Tool
nvidia-cg-toolkit - NVIDIA Cg Toolkit installer
nvidia-kernel-common - NVIDIA binary kernel module common files
nvidia-settings - Tool of configuring the NVIDIA graphics driver
nvidia-glx - NVIDIA binary XFree86 4.x driver
nvidia-glx-dev - NVIDIA binary XFree86 4.x / Xorg driver development files
nvidia-glx-legacy - NVIDIA binary Xorg driver (legacy version)
nvidia-glx-legacy-dev - NVIDIA binary Xorg driver development files
nvidia-kernel-2.6-486 - NVIDIA binary kernel module for 2.6 series compiled for 486
nvidia-kernel-2.6-686 - NVIDIA binary kernel module for 2.6 series compiled for 686
nvidia-kernel-2.6-k7 - NVIDIA binary kernel module for 2.6 series compiled for k7
nvidia-kernel-2.6.18-4-486 - NVIDIA binary kernel module for Linux 2.6.18-4-486
nvidia-kernel-2.6.18-4-686 - NVIDIA binary kernel module for Linux 2.6.18-4-686
nvidia-kernel-2.6.18-4-k7 - NVIDIA binary kernel module for Linux 2.6.18-4-k7
nvidia-kernel-2.6.18-5-486 - NVIDIA binary kernel module for Linux 2.6.18-5-486
nvidia-kernel-2.6.18-5-686 - NVIDIA binary kernel module for Linux 2.6.18-5-686
nvidia-kernel-2.6.18-5-k7 - NVIDIA binary kernel module for Linux 2.6.18-5-k7
nvidia-kernel-legacy-2.6-486 - NVIDIA binary kernel module for 2.6 series compiled for 486
nvidia-kernel-legacy-2.6-686 - NVIDIA binary kernel module for 2.6 series compiled for 686
nvidia-kernel-legacy-2.6-k7 - NVIDIA binary kernel module for 2.6 series compiled for k7
nvidia-kernel-legacy-2.6.18-4-486 - NVIDIA binary kernel module for Linux 2.6.18-4-486 (legacy version)
nvidia-kernel-legacy-2.6.18-4-686 - NVIDIA binary kernel module for Linux 2.6.18-4-686 (legacy version)
nvidia-kernel-legacy-2.6.18-4-k7 - NVIDIA binary kernel module for Linux 2.6.18-4-k7 (legacy version)
nvidia-kernel-legacy-2.6.18-5-486 - NVIDIA binary kernel module for Linux 2.6.18-5-486 (legacy version)
nvidia-kernel-legacy-2.6.18-5-686 - NVIDIA binary kernel module for Linux 2.6.18-5-686 (legacy version)
nvidia-kernel-legacy-2.6.18-5-k7 - NVIDIA binary kernel module for Linux 2.6.18-5-k7 (legacy version)
nvidia-kernel-legacy-source - NVIDIA binary kernel module source (legacy version)
nvidia-kernel-source - NVIDIA binary kernel module source

You need to install the packages nvidia-glx and nvidia-kernel-common, the kernel module package for your running kernel, and, for easy configuration, nvidia-xconfig. Issue the following command:

apt-get install nvidia-glx nvidia-kernel-common nvidia-kernel-`uname -r` nvidia-xconfig
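The backtick substitution in the command above builds the kernel-specific package name. A small sketch of how it expands, using one of the kernel strings from the package list above as an example:

```shell
# How the `uname -r` substitution composes the package name.
# Example kernel string taken from the package list above; on a live
# system you would use: kernel=$(uname -r)
kernel=2.6.18-5-686
pkg="nvidia-kernel-${kernel}"
echo "$pkg"
```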

Ubuntu/Mythbuntu

You may see a recommendation elsewhere to add a PPA to apt to allow installation of the NVidia driver from an Ubuntu repository. Be aware that in Ubuntu 16.04, this repository assumes the presence of the Mir package, which Canonical was developing in preparation for the Unity 8 desktop; development of Mir was dropped in 2017, though. If you are running an earlier version of Ubuntu, or a flavor of Ubuntu that does not use the Unity desktop (including Mythbuntu), installation of the drivers from that repository may not work. You may be better off using the "official" NVidia drivers from the NVidia website or the standard NVidia proprietary drivers package in Ubuntu.

With some versions of Ubuntu, including at least the Ubuntu 18.04 beta (as of April 2018), the easiest way to install the NVidia proprietary drivers is to do so via the official Ubuntu repositories. This approach works on at least one Ubuntu 18.04 beta installation, upgraded from a Mythbuntu 14.04 installation:

First, verify the availability of appropriate hardware and software by typing ubuntu-drivers devices. You should see output summarizing the video devices on the system, including available drivers, as in:

$ ubuntu-drivers devices
== /sys/devices/pci0000:00/0000:00:02.0/0000:01:00.0 ==
modalias : pci:v000010DEd0000128Bsv00003842sd00003710bc03sc00i00
vendor   : NVIDIA Corporation
model    : GK208B [GeForce GT 710]
driver   : nvidia-driver-390 - distro non-free recommended
driver   : xserver-xorg-video-nouveau - distro free builtin

This example shows an NVidia GeForce GT 710 card with two drivers available: the xserver-xorg-video-nouveau driver, which is the default, and the proprietary nvidia-driver-390, which supports VDPAU. Note that the latter is described in the output as recommended; it can therefore be installed with the following command:
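The "recommended" flag can also be picked out mechanically, which is handy in provisioning scripts. A sketch against the sample output above; on a real system you would pipe `ubuntu-drivers devices` instead of the stored sample:

```shell
# Extract the recommended driver package from `ubuntu-drivers devices` output.
# Sketch: feeds the sample lines from above rather than running the command.
output='driver   : nvidia-driver-390 - distro non-free recommended
driver   : xserver-xorg-video-nouveau - distro free builtin'
recommended=$(printf '%s\n' "$output" | awk '/recommended/ {print $3}')
echo "$recommended"
```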

$ sudo ubuntu-drivers autoinstall

You must then reboot the computer to activate the new driver.

Additional options, including details on using a PPA or installing the drivers directly from NVidia, are available here.

Fedora

For installation on a Fedora system, see the installation instructions.

Common problems and solutions

Here is a test image for your TV-Out

mythTV Test image

click image to enlarge

Blue line(s) surrounding picture

One common symptom is the following: the NVidia driver may (or may not) give one or more blue lines surrounding the display during TV or DVD playback. This is something which is mostly seen on widescreen TV displays. The 'xvattr' command can be used to solve the problem:

xvattr -a XV_COLORKEY -v 66048

Add the command somewhere in the Xorg startup sequence. This may be done in various ways depending on your Linux distribution (need more distribution-specific information on this item!)

Solution for Ubuntu and Debian GNU/Linux 4.0

Create the following file:

/etc/X11/Xsession.d/98custom_disable-blueline

and add the following contents:

#!/bin/sh
xvattr -a XV_COLORKEY -v 66048

Restart your graphical environment and you're done!

Solution for openSUSE

Add the following line to /etc/X11/xinit/xinitrc after "Add your own lines here":

#disable-blueline:
xvattr -a XV_COLORKEY -v 66048

More on openSUSE and NVidia: http://www.suse.de/~sndirsch/nvidia-installer-HOWTO.html

Black-and-White output

There are several possible causes:

TVStandard

Make sure the TVStandard option is set to the one valid for your country. Many nVidia cards default to the American TV standard 'NTSC'. For a MythTV box in the Netherlands, use the following option in the 'Device' section of xorg.conf:

Option "TVStandard" "PAL-B"

If you're in the UK, use:

Option "TVStandard" "PAL-I"

Many other variations of PAL and other standards are supported. For a complete list, see the nVidia "readme" for your driver version (see the nVidia Linux driver download page).
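Putting the above together, a minimal 'Device' section might look like the following sketch (the Identifier is illustrative; only the Driver and TVStandard lines matter here, and the standard should match your region):

```
Section "Device"
    Identifier "NVidia TV-Out"
    Driver     "nvidia"
    Option     "TVStandard" "PAL-B"
EndSection
```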

Vertical Refresh Frequency

Some televisions will show a black-and-white image if you attempt to display video using an unexpected vertical refresh frequency, e.g. showing PAL-I format video at 60Hz instead of 50Hz. This is because the TV may use the vertical refresh frequency to decide whether it should be decoding PAL (usually 50Hz) or NTSC (60Hz). Either change the TVStandard (see above) to match the frequency, or change the frequency to match the video standard.

Television does not support S-Video

Some televisions do not support an S-Video signal on a SCART input, with the result that only the luminance signal is used. A hack exists which merges the luminance and chrominance signals on the SCART connector, essentially creating a composite signal that the television can use: connect pins 15 (chrominance in) and 20 (luminance in) on the SCART connector. This signal is unfiltered and is at best equal in quality to a direct composite connection. See this thread for further information.

Cable Connections

Obvious perhaps, but make sure that the ends of your cables are firmly in place! (especially if you're using SCART/RGB sockets)

Annoying NVIDIA logo?

To disable the nVidia logo splash screen that is displayed when X is first initialized, you may use the following command to automatically edit your xorg.conf correctly:

nvidia-xconfig --no-logo

For older driver versions, or installations without nvidia-xconfig, manually edit xorg.conf and add this line to the 'Device' section:

Option "NoLogo" "True"

Small (unreadable) fonts or too big fonts?

See The FAQ "font size" entry and Specifying DPI for NVidia Cards.

Nvidia-cards and no picture when box is on before the TV

See this link if you have this problem.

System instability

The first port of call if your graphics card is unstable is the README that is included with your Nvidia driver package. There may be an online version of the README linked from the Nvidia Linux driver page for your system's processor architecture.

You should also have a look at the Linux forums which are linked to from the same page.

Try reducing the AGP speed (this seems to be a problem on boards with VIA KT333 and KT400 chipsets, and also earlier VIA chipsets). Follow the instructions in the README for configuring the AGP rate. With an MSI KT4 Ultra (MS-6590) and an Nvidia GeForce 6200A video card, the default AGP speed of 8x made the system unstable; once it was set to 4x, the system became rock steady. BIOS changes weren't tried; they might also have helped and avoided the need for the edit/recompile.

It's possible to perform initial testing by disabling AGP entirely using the NvAGP option in your X11 config file. Check the APPENDIX F, AGP section in the README for more details.

As noted in the NVidia release notes there are known issues with some VIA chipsets as found on Athlon XP mainboards; The KT266 and KT333 chipsets are known to have problems. The solution described in the releasenotes did not solve my instability problems. However, booting the system with the 'noapic' option appended to the kernel boot line was enough to get my system stable. --Michel 14:20, 26 March 2007 (UTC)
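On a GRUB 2 system, the 'noapic' flag goes on the default command line in /etc/default/grub (followed by running update-grub); older setups from that era appended it to the kernel line in menu.lst instead. A sketch of the edit on a sample line (the existing flags shown are placeholders):

```shell
# Append 'noapic' just inside the closing quote of the default command line.
line='GRUB_CMDLINE_LINUX_DEFAULT="quiet splash"'
echo "$line" | sed 's/"$/ noapic"/'
# -> GRUB_CMDLINE_LINUX_DEFAULT="quiet splash noapic"
```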

Choppy video/High CPU Usage

Symptoms:

  • Choppy video
  • CPU usage near 100%
    • Xorg using the most CPU time
    • mythfrontend using the second most CPU time
    • other processes using negligible CPU time
  • Sync to VBlank set in nvidia-settings

Solution: Add the line

Option "UseEvents" "True"

to the Screen or Device section of /etc/X11/xorg.conf.

From NVIDIA docs:

Option "UseEvents" "boolean":

Enables the use of system events in some cases when the X driver is waiting for the hardware. The X driver can briefly spin through a tight loop when waiting for the hardware. With this option the X driver instead sets an event handler and waits for the hardware through the ‘poll()’ system call. Default: the use of the events is disabled.

This replaces the default busy-wait with a less CPU-intensive system call (poll()/ppoll(), which wait for an event on a file descriptor).

However, this can cause problems on some hardware so if X starts getting flaky and crashing, set UseEvents back to false.

Note: As of driver 177.80, this problem persists, and is not fixable, on the integrated GeForce 8200 GPU. The issue may or may not be fixed in later revisions.

On driver 180+ on the 8200 GPU, the problem appears to be livelock caused by the screen refresh rate being lower than the playback rate. For example, a 720p HDTV stream or a 2x time-stretched SDTV stream wants to play back at 59.94 FPS, but by default the driver modeline results in a 59.84Hz refresh rate. While the X process eating so much CPU may be a driver bug, the issue can mostly be avoided by increasing the screen refresh rate to or beyond the playback rate. This seems to affect all vsync'd playback, including mplayer. Ideally you would run at 119.88Hz, so that a 720p stream could be time-stretched 2x without issues. Any refresh rate that is not a multiple of the actual frame rate will result in jitter whenever there is motion on screen.

On an 8200 with 180+ drivers the default mode line is:

ModeLine     "1280x768_0" 81.0 1280 1328 1440 1688 768 769 772 802 +hsync -vsync

Changing the pixel clock to 81.2 MHz as follows:

ModeLine     "1280x768_0" 81.2 1280 1328 1440 1688 768 769 772 802 +hsync -vsync

Results in a vertical sync rate of 59.98Hz.
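The quoted rates follow directly from the modeline arithmetic: vertical refresh = pixel clock / (htotal × vtotal), using the totals 1688 and 802 from the modelines above (rounding explains the small difference from the 59.84Hz figure):

```shell
# Pixel clock in Hz divided by total pixels per frame (1688 * 802 = 1353776).
awk 'BEGIN { printf "%.2f Hz\n", 81.0e6 / (1688 * 802);
             printf "%.2f Hz\n", 81.2e6 / (1688 * 802) }'
# -> 59.83 Hz
# -> 59.98 Hz
```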

Half vertical resolution on video playback/Blurry OSD

This appears to be a known problem when using interlaced modelines and nVidia's implementation of the XVideo extension. Workarounds are to run mythfrontend with XVideo disabled using the NO_XV environment variable set to 1 (i.e. NO_XV=1 mythfrontend) if you have CPU to burn, or put up with using a software deinterlacer (I recommend Linear Blend, but your tastes may differ).

This problem appears to have been resolved in recent combinations of video card, Xorg, nVidia driver and MythTV (e.g. 7600GT, 256.53 drivers, MythTV 0.24)

Analog audio does not work with HDMI TV input

Configuring Analog Sound DVI to HDMI

HDMI audio Nvidia cards

The following link describes how to enable and troubleshoot HDMI audio on an Nvidia video card: ftp://download.nvidia.com/XFree86/gpu-hdmi-audio-document/gpu-hdmi-audio.html

HDMI audio on GT210 and GT220 cards

Problems have been reported with getting sound to work through HDMI on the G210, GT220 etc series cards (at least: this may apply to any HDMI output). At the end of a long thread on the mythtv-users list about this problem a potential fix was posted (which should apply to any distro). Extensive discussions of progress for HDMI audio on these chips are on the XBMC forum and wiki, including a patch for alsa 1.0.22.1.

Got this to work this weekend on Ubuntu 9.04 with MythTV 0.22. Someone in another site/forum said to modify /etc/modprobe.d/alsa-base.conf and add this line to it:

options snd-hda-intel enable_msi=0 probe_mask=0xffff,0xfff2 (if your Nvidia card is card 1)

And to create this file in your home directory: .asoundrc with this info:

pcm.!default {
    type asym
    playback.pcm {
        type plug
        slave.pcm "hw:1,3"
    }
}
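The "hw:1,3" above assumes the NVidia HDMI PCM is card 1, device 3; confirm the numbers on your own system first (the command comes from alsa-utils):

```shell
# Show available ALSA playback devices; the NVidia HDMI PCM appears as
# something like "card 1: NVidia ..., device 3: ... HDMI ...".
aplay -l 2>/dev/null || echo "no ALSA playback devices found"
```

Once the numbers are confirmed, `aplay -D hw:1,3 some.wav` plays a test file directly to that device, bypassing the .asoundrc routing.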

A potentially different patch has also been submitted on alsa-devel that may find its way into the next version of alsa. See the discussion thread surrounding this post. Note that additional patches in the thread are necessary to enable GT220 support and proper compiling.

Note: In order for NVidia HDMI to negotiate sound with your device, ensure "UseEDID" is NOT set to "False" in your xorg.conf. Instead use "ModeValidation" "NoEdidModes" and "UseEdidFreqs" "False" for the same effect.

Mythfrontend 'Watch Recordings' Alpha Blends with Desktop Background

export XLIB_SKIP_ARGB_VISUALS=1

Then start mythfrontend.

Configure the monitor!

The NVidiaProprietaryDriver supports monitors as well as TV screen (using TV-Out). To use a (wide-screen) TV as monitor, see XorgConfMonitorSectionForTV. If the basic TV configuration appears correctly, but the image does not fit exactly on your television set, you may need to adjust the overscan settings.

User experience with NVidia cards

  • NVIDIA GeForce2 MX AGP: works (Michel: Nov 2006 )
  • NVIDIA GeForce4 MX440 AGP: works (Michel: Nov 2006 )
  • NVIDIA FX5200 AGP: works (Michel: Nov 2006 )
  • NVIDIA FX5700LE AGP: works (MarcT: Jan 2008 )
  • NVIDIA 6200LE PCI-e: works (Moosylog: Jan 2007 )
  • NVIDIA 6600GT PCI-e: works (michael573114: Jan 2009 )
  • NVIDIA Riva TNT2 AGP: defunct (Michel: Nov 2006 )
  • NVIDIA Riva TNT AGP (also known as AGP-V3400): defunct (Michel: Nov 2006 )
  • NVIDIA 6200 PCI: works but TV geometry is by default at 1024x768 and overscan is 25%. (mythtv0x7c1: July 2008)
  • NVIDIA GeForce2 GTS/Pro: works, but there is overscan. After using the nvtv tool [2] the result is fine. (Aorie: Feb 2009 )
  • NVIDIA 8200 on an ASUS M3N78-VM board with Mythbuntu 10.04 required the additional kernel boot parameter "vmalloc=192M" to assign more memory to drivers; see the full parameter list at https://bugs.launchpad.net/ubuntu/+source/linux/+bug/354633/comments/32
  • NVIDIA 7600GT PCIe: works, including low-dotclock PAL output via VGA connector (cowbutt: Jan 2011 )
  • NVIDIA GT240 PCIe: works, but WON'T output low-dotclock PAL via VGA connector, so unsuitable for VGA-to-SCART use (cowbutt: Jan 2011 )
