- 1 Introduction
- 2 Drivers
- 3 Obtaining XvMC capable drivers for your chipset
- 4 Compiling/Installing the driver
- 5 Configuring xorg.conf
- 6 Enabling the chipset library
- 7 Checking your installation
- 8 MythTV XvMC Support
- 9 Hardware acceleration processes not yet supported by XvMC
- 10 Alternatives
- 11 External Links
(Current as of May 1, 2008)
X-Video Motion Compensation, or XvMC, is a standardized API for the X Window System which allows video programs to offload the motion compensation and iDCT (Inverse Discrete Cosine Transform) portions of MPEG-2 decoding to the GPU, through the use of XvMC-enabled drivers. XvMC can greatly reduce CPU utilization when playing back MPEG-2 video; in theory it should also reduce bus bandwidth requirements. Savings are most dramatic with slow CPUs or when playing HDTV. SDTV benefits as well, but most modern CPUs can already play back SDTV without XvMC. MythTV can be compiled to use XvMC for native playback, and external players such as Xine and MPlayer can also be compiled to support XvMC.
Currently, only MPEG-2 is supported. This is usually a hardware limitation, but in some chips it can also be a device driver limitation. As an exception, one of the newer VIA Unichrome chips is capable of XvMC-accelerated MPEG-4 playback, but it is not supported by the official UniChrome driver at the time of writing; instead you have to turn to the openChrome project drivers (which include an updated, forked version of libXvMC).
A new video acceleration API is being developed, in an effort led by Intel. This new API supports more complete offload (VLD) as well as iDCT+MC, and can support acceleration of MPEG-4 ASP (H.263), MPEG-4 AVC (H.264), and VC-1/WMV3, as well as MPEG-2. The website for this effort is located at: http://www.freedesktop.org/wiki/Software/vaapi. XvMC-vld is also available on Via Unichrome-Pro chipsets, through the openchrome driver.
Some users have also provided a chart of example CPU savings.
Each hardware video card capable of XvMC video acceleration requires a software driver to enable these features. Below is a list of the recommended software drivers and driver versions for each supported hardware type.
| Chipset | XvMC support | Driver source | Notes | Driver name |
|---------|--------------|---------------|-------|-------------|
| NVIDIA | Yes | www.nvidia.com | Any recent version of the driver is acceptable; see the Nvidia page for more details. Be careful though: some nVidia cards do not support XvMC. The only cards that do are those that support nVidia "PureVideo", which is not the case for most motherboard GPUs such as the GeForce 7 series. GeForce 8 and later do not support XvMC (driver limitation). | nvidia |
| NVIDIA | No | Included with Xorg distribution | Included version | nv |
| Unichrome | Yes | Included in Xorg 6.9 and above distributions | Included version | openchrome |
| Intel | Yes | Included in Xorg distribution | Included version | i810 or intel |
| ATI | No | N/A | At this time, there is no XvMC support for ATI cards in Linux drivers | N/A |
| S3 | Yes | s3graphics.com | MPEG-2 HW acceleration (only supports Red Hat Enterprise Linux WS release 4) on some Chrome 20 GPUs made by S3 | s3g |
Obtaining XvMC capable drivers for your chipset
In order for MythTV to utilize the XvMC feature of your video chipset, you need to install the appropriate XvMC-enabled driver for your distribution. Drivers are presently available for NVidia, Via Unichrome and Intel video chipsets. There are presently (May, 2008) no XvMC-enabled ATI drivers. The installation process will vary by chipset and distribution, but some general recommendations can be made.
There are three main steps: installing the driver (including compiling it in some cases), revising xorg.conf and XvMCConfig to enable XvMC, and, finally, setting the playback profile in MythTV (see the Playback_profiles page).
For many distributions, a separate XvMCW (Wrapper) library package can be installed. This is the library that MythTV will use to hook into the XvMC support in the driver. It provides a common interface for applications wishing to use XvMC.
yum install libXvMC
or the equivalent will do the trick. This may already be installed by your distribution.
For NVIDIA chipsets, the open source nv driver is provided in all distributions. The nv driver does not enable XvMC. NVidia provides a closed source binary driver installer package on their website which provides an XvMC-enabled driver. Clicking the 'Archive' link below the first listed driver will reveal not only the older "legacy" drivers, but also other more recent drivers. Download and follow the instructions there for executing the package and installing the modules.
The Envy project provides binary packages for Ubuntu and Debian to quickly install and configure nvidia drivers and XvMC support. It also will install the binary ATI drivers which do not, however, support XvMC.
VIA Unichrome Chipsets
For VIA Unichrome chipsets, go to Openchrome.org to download the source code or for links to binary driver packages. Note that not all features of the Unichrome chipsets, especially TV-out, are presently working, but XvMC appears to work on all chipsets using video out (VGA/DVI).
Intel chipsets are often considered 'low-powered' but in fact the drivers have been deficient in enabling the capabilities of the hardware. For Intel chipsets, as of May 1, 2008, the intel driver which is included in most distributions does NOT enable XvMC, except on i810 and i815 chipsets. XvMC is now available through the xf86-video-intel driver as of version 2.3.0 on all Intel chipsets and is presently available as a source tar from:
or through git from
$ git-clone git://anongit.freedesktop.org/git/xorg/driver/xf86-video-intel
Compiling/Installing the driver
The nvidia installer binary must be run from runlevel 3. Edit the first uncommented line in /etc/inittab to read:

id:3:initdefault:

and reboot. (Reverse the edit after the installation.) The installer binary will install the driver in /usr/lib/xorg/modules/drivers and the accompanying libraries in /usr/lib.
To get the latest code using anonymous subversion, cd to a scratch space and use:
svn co http://svn.openchrome.org/svn/trunk openchrome
'cd' into the newly created openchrome sub-directory, and do the 'configure && make && make install' cycle. The openchrome.org website has some good documentation. The openchrome users list is a good resource as well.
1. Requirements for Compilation: (Revised May 9, 2008)
The INTEL_GRAPHICS site lists some requirements for building from the tarball or from the git source. The site is a good reference! What is not explained is that you will require a quite recent git build of drm and mesa, from x.org, as even libdrm-2.4.0-0.3.fc9 is not be sufficiently new enough. You will likely have to build drm and mesa from git sources, and install new (testing) versions of other programs. Note that if you are on Fedora, it appears that Fedora 9 will have the new programs and libraries. You may be able to avoid building drm and mesa by using packages from the Fedora 9 Preview Fedora 9 Preview Packages. Due to an apparent regression in the git version of the intel driver, I strongly suggest that you try to build from the tarball version 2.3.0. Note that you may still have the problem referred to in section 5.b) and you will have to revise the drm.h file.
From the Intel Linux Graphics Page:
In order to test or use the latest Intel X driver, you typically don't need to upgrade other components of the graphics stack, like Mesa or the DRM drivers. In order to build the driver, you'll need several development packages installed (list taken from the Fedora build requirements for the driver):
- autoconf
- automake
- libtool
- hwdata (for PCI IDs)
- xorg-x11-server-Xorg >= 18.104.22.168-6
- xorg-x11-server-sdk >= 22.214.171.124-6
- libXvMC-devel
- mesa-libGL-devel >= 6.5-9
- libdrm-devel >= 2.0-1
Packages meeting these requirements are available for Fedora in the 'Everything' repository and can be installed with yum. The first 4 are likely already installed if you are running Mythtv. You will also need git which is available from the usual places. Note that the first sentence of the quoted section is not true.
In addition you will need to find and install at least
which can be found, at present, at Fedora 9 Preview Packages
If you have any of the problems described below, check out the wiki at Xorg Development which describes a smooth set of steps to building drm and mesa using the git sources.
In Fedora, to get the 'Everything' repo change line 4 in your /etc/yum.repos.d/fedora.repo file to read:
In Ubuntu, you may instead need to enable the 'universe' or 'multiverse' repositories.
2. Getting the Driver Source: (Revised May 9, 2008)
'cd' to a scratch space and either download the tarball from:
or get the driver source by running:
$ git-clone git://anongit.freedesktop.org/git/xorg/driver/xf86-video-intel
Unpack the tarball (if required) and 'cd' into the xf86-video-intel directory which has been created. If you downloaded the tarball, run ./configure --prefix=/usr.
If you downloaded with git, run './autogen.sh'. autogen.sh creates and then runs 'configure'.
3. Problem Workaround
If configure fails with:
./configure: line 20860: syntax error near unexpected token `XINERAMA,'
./configure: line 20860: `XORG_DRIVER_CHECK_EXT(XINERAMA, xineramaproto)'
Then you are missing the xorg-x11-proto-devel file (in Fedora) or the xorg-server-devel or source files, in other distributions. If you are sure you followed the instructions given above, you can run 'ldconfig', or if necessary, attempt to re-install the rpms, as these files should be there.
If you cannot otherwise fix the problem, you can avoid the error with a workaround. Open 'configure' with your favourite editor, find (note the line number!) and comment out the following lines:
# Checks for extensions
#XORG_DRIVER_CHECK_EXT(XINERAMA, xineramaproto)
#XORG_DRIVER_CHECK_EXT(RANDR, randrproto)
#XORG_DRIVER_CHECK_EXT(RENDER, renderproto)
#XORG_DRIVER_CHECK_EXT(XF86DRI, xextproto x11)
#XORG_DRIVER_CHECK_EXT(DPMSExtension, xextproto)
Those lines should not throw an error if you have the full xorg sources properly installed. And sometimes, things do not quite go right!
4. Saving Your Old Drivers:
To save your old drivers in case of problems (we all know that you will *not* have problems if you do this; problems only arise when you cannot reverse out of the trouble!), run:
$ cd /usr/lib/xorg/modules/drivers
$ mkdir old
$ mv ivch.so ivch.la ch7xxx.so ch7xxx.la ch7017.so ch7017.la tfp410.so tfp410.la \
     sil164.so sil164.la intel_drv.so intel_drv.la i810_drv.so ./old
To save the old libraries:
$ cd /usr/lib
$ mkdir old
$ mv libI810XvMC* libIntelXvMC* ./old
5. Compile Time:
By default configure wants to build (and install) in /usr/local/, but we want everything in /usr so that xorg works as usual. So re-run configure as follows, whichever version you are building.
$ ./configure --prefix=/usr
$ make
$ sudo make install
a) If configure fails with an error 'No package Xorg-server found', you need to install the requirements listed in Section 1, above.
b) If make fails with an error like:
In file included from /usr/local/include/xf86drm.h:39,
                 from intel_xvmc.h:42,
                 from intel_xvmc.c:27:
/usr/include/drm/drm.h:650: error: expected specifier-qualifier-list before ‘uint64_t’
then there is a problem with libdrm. You will need to add a line "#include <inttypes.h>" to the /usr/include/drm/drm.h file. *Even the latest git-clone of drm seems to require this fix.* I made this change, then updated to a git source build and the error reappeared.
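The header edit described above can be scripted. This is a sketch, not part of the original instructions: it assumes GNU sed (the one-line `1i text` form and in-place `-i` editing are GNU extensions), and it is written as a function so it can be tried on a copy of the file first.

```shell
# Sketch of the drm.h workaround described above.
# Assumes GNU sed; keeps a .bak backup of the original header.
add_inttypes() {
  # $1 = path to the header to patch (normally /usr/include/drm/drm.h)
  sed -i.bak '1i #include <inttypes.h>' "$1"
}

# Typical usage (as root):
#   add_inttypes /usr/include/drm/drm.h
```

The backup file lets you reverse the change if a later libdrm update makes it unnecessary.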
c) If make fails with an error like:
i830_driver.c: In function ‘I830LeaveVT’:
i830_driver.c:3194: error: ‘DRM_BO_MEM_TT’ undeclared (first use in this function)
i830_driver.c:3194: error: (Each undeclared identifier is reported only once
i830_driver.c:3194: error: for each function it appears in.)
i830_driver.c: In function ‘I830EnterVT’:
i830_driver.c:3231: error: ‘DRM_BO_MEM_TT’ undeclared (first use in this function)
i830_driver.c: In function ‘I830CloseScreen’:
i830_driver.c:3359: error: ‘DRM_BO_MEM_TT’ undeclared (first use in this function)
make: *** [i830_driver.lo] Error 1
then the problem is that the version of libdrm on your system is too old. You can try a re-install/update of libdrm and libdrm-devel, then ldconfig. If that does not work, you can try installing even newer versions from the Fedora 9 Preview repository referred to above. If all else fails go to Xorg Development and download the latest drm source, and build it following the straightforward and clear instructions given on that page. If you get an error message referring to dri2proto, your xorg-x11-proto-devel version is too old and you will need to find a more recent version on the Fedora 9 Preview repo.
The 'make install' step will install the intel_drv.so driver in /usr/lib/xorg/modules/drivers and the libIntelXvMC.so.1.0.0 library in /usr/lib. The libI810XvMC.so library will be installed in the same place, or may be replaced by a symlink to the libIntelXvMC library.
Check that the driver was put into the correct place. If you are unsure, look in /usr/local/lib/xorg/modules/drivers; if the new driver is there, you forgot to (re-)run configure with --prefix=/usr, since it should be in /usr/lib/xorg/modules/drivers/.
Check that '/usr/lib/xorg/modules/drivers' and '/usr/lib/xorg/extensions' are listed in /etc/ld.so.conf (and add them if necessary).
Run ldconfig one last time to update the dynamic linker's cache with the new libraries.
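As a quick sanity check after installation, a small script along these lines can confirm that the files landed where this section expects. This is only a sketch; the paths are the ones used above (adjust /usr/lib for 64-bit library layouts).

```shell
# Sketch: verify the intel driver and XvMC library are where 'make install'
# should have put them. Paths are parameters so the check is reusable.
check_xvmc_install() {
  drv_dir="$1"   # e.g. /usr/lib/xorg/modules/drivers
  lib_dir="$2"   # e.g. /usr/lib
  [ -e "$drv_dir/intel_drv.so" ] || { echo "driver missing in $drv_dir"; return 1; }
  ls "$lib_dir"/libIntelXvMC.so* >/dev/null 2>&1 || { echo "XvMC library missing in $lib_dir"; return 1; }
  echo "XvMC driver and library found"
}

# Typical usage:
#   check_xvmc_install /usr/lib/xorg/modules/drivers /usr/lib
```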
Each chipset driver has specific configuration options to enable XvMC.
Appendix B to the Nvidia Driver README details the available xorg.conf switches.
Use the 'nvidia-settings' or 'nvidia-config' control panel program to disable the Video Texture Adapter and/or Video Blitter Adapter "Sync to VBlank" checkboxes (on the "X Server XVideo Settings" page).
Revise /etc/X11/xorg.conf as follows:
Section "Device"
    Identifier "Videocard0"
    Driver "nvidia"
    Option "UseEvents" "true"
    Option "XvmcUsesTextures" "false"   # necessary for color Chromakey OSD
    Option "NVAGP" "1"                  # some users report 2 or 3 works better
EndSection
The following option appears to be required for XvMC use with nvidia chipsets, especially when using the Chromakey OSD; it turns off the 'Compiz eye-candy'. (The 'Extensions' section is part of xorg.conf and replaces the SubSections that were used to pass Options to a specific module. Since modules are now 'automagically' loaded through extmod, the "Modules" section has disappeared; 'Extensions' is the replacement for that capability.)
Section "Extensions"
    Option "Composite" "Disabled"
EndSection
Enabling the UseEvents option can sometimes bring out an issue with the nvidia driver resulting in a blue line around the screen. To fix this simply run the following somewhere during your Xorg startup:
xvattr -a XV_COLORKEY -v 66048
Refer to this section on the Nvidia proprietary driver page for more information.
Many Via based motherboards and especially the EPIA motherboards have video *and* TV outputs. The TV output can be sent directly to a standard NTSC or PAL (CRT-style) TV. These motherboards have a setting in the BIOS which must be set to enable the dual outputs. In addition, the dual outputs must be enabled in xorg.conf. XvMC is enabled by default on chipsets which support it, but depends upon certain extensions, especially 'dri'. Revise xorg.conf as follows:
Section "Module"
    Load "dbe"
    Load "extmod"
    Load "fbdevhw"
    Load "glx"
    Load "record"
    Load "freetype"
    Load "type1"
    Load "dri"
EndSection

Section "Extensions"
    Option "Composite" "Enable"
EndSection

Section "Device"
    Identifier "CardTV"
    Driver "via"
    VendorName "VIA"   # Optional
    BoardName "VIA Technologies, Inc. S3 Unichrome Pro VGA Adapter"
    Option "Active Device" "CRT TV"
    Option "TVType" "NTSC"
    Option "TVOutput" "S-Video"
    Option "EnableAGPDMA" "1"
EndSection

Section "DRI"
    Group "video"
    Mode 0666
EndSection
The 'intel' man page which was installed with the driver contains a full listing of the switches now available. Revise /etc/X11/xorg.conf as follows:
Section "Device"
    Identifier "Videocard0"
    Driver "intel"
    Option "XvMCSurfaces" "7"    # for i810/i815 chipsets only, otherwise
                                 # XvMC is disabled
    Option "XvMC" "true"         # enable XvMC on chipsets after i815;
                                 # default is disabled
    Option "UseEvents" "true"
    ## The following 2 options should be used together, and may or may not
    ## make a difference to CPU load
    Option "AccelMethod" "XAA"   # The default is "EXA"
    Option "CacheLines" "xxxx"   # Allows the user to change the amount of graphics
                                 # memory used for 2D acceleration and video when
                                 # XAA acceleration is enabled.
                                 # Decreasing leaves more for 3D textures. Increasing
                                 # can improve 2D performance at the expense of 3D.
                                 # xxxx = 8192/16384/32768 etc.
    # Note: AccelMethod XAA does not appear to work with XvMC enabled. Testing required.
EndSection
Note that the intel driver, like the latest nvidia driver, loads all the required extensions by default and does not require a 'Module' section in xorg.conf. The Intel Linux graphics page states that xorg.conf requires:
Section "DRI"
    Mode 0666
EndSection
Because all extensions are loaded by default without a "Modules" section, this section is not required (DRI is enabled without it), but it may actually do something....
Enabling the chipset library
Each chipset provides an interface between the XvMC library and the actual driver. This is usually handled by the XvMCW wrapper library, which reads a text configuration file to find the correct XvMC library. The wrapper configuration file is usually found at /etc/X11/XvMCConfig; you may have to create this file yourself, as it is sometimes not pre-installed even though it is needed.
This file contains a single line consisting of the driver library name:
libXvMCVIA.so or libXvMCVIAPro.so (depending on the chipset)
Note that the Intel driver requires the absolute pathname of the library. If you used a package (atrpms, livna) based driver installation, you might also be able to find the library name by doing
rpm -ql <driver-package> | grep -i xvmc
and look for library names that include the XvMC text.
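Creating the one-line XvMCConfig file can be done by hand or with a small helper like this sketch (the file path and the example library names come from the text above; pick the library for your chipset):

```shell
# Sketch: write the single-line XvMC wrapper config described above.
write_xvmc_config() {
  lib="$1"   # e.g. libXvMCVIA.so (chipset-specific; Intel needs an absolute path)
  cfg="$2"   # normally /etc/X11/XvMCConfig
  printf '%s\n' "$lib" > "$cfg"
}

# Typical usage (as root):
#   write_xvmc_config libXvMCVIA.so /etc/X11/XvMCConfig
```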
Checking your installation
You will have to reboot. To check if XvMC is properly enabled, run:
$ cat /var/log/Xorg.0.log | grep Motion
and you should see:
(II) Loading extension XVideo-MotionCompensation
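The log check above can be wrapped in a tiny function, which is handy if you want to script the verification (the log path and extension string are exactly the ones quoted in this section):

```shell
# Sketch: return success if the Xorg log shows the XvMC extension loading.
xvmc_loaded() {
  # $1 = path to the Xorg log (normally /var/log/Xorg.0.log)
  grep -q 'Loading extension XVideo-MotionCompensation' "$1"
}

# Typical usage:
#   xvmc_loaded /var/log/Xorg.0.log && echo "XvMC enabled"
```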
MythTV XvMC Support
Packages and Distributions
Most packages and distributions of MythTV should have XvMC enabled by default. You can check your version by running:

$ mythfrontend --version

and looking for
Options compiled in: linux release using_oss using_alsa using_jack using_backend using_dbox2 using_dvb using_firewire using_frontend using_hdhomerun using_iptv using_ivtv using_joystick_menu using_lirc using_v4l using_x11 using_xrandr using_xv using_xvmc using_xvmcw using_xvmc_vld using_bindings_perl using_bindings_python using_opengl using_ffmpeg_threads using_libavc_5_3 using_live
If you see options like
using_xvmcw then your Myth system is prepared for XvMC.
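A scripted version of this check might look like the following sketch, which simply scans the "Options compiled in" line for the using_xvmcw flag mentioned above:

```shell
# Sketch: test a mythfrontend version string for the XvMC wrapper flag.
has_xvmc() {
  # $1 = output of 'mythfrontend --version' (or just its options line)
  echo "$1" | grep -q 'using_xvmcw'
}

# Typical usage:
#   has_xvmc "$(mythfrontend --version)" && echo "XvMC ready"
```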
Compiling from source
When you run configure, if you have the proper XvMC development files installed (libXvMC-devel, libXvMCW-devel) , you should see this type of output from the configure script:
# Video Output Support
x11 support          yes
xrandr support       yes
xv support           yes
XvMC support         yes
XvMC VLD support     yes
XvMC pro support     yes
XvMC OpenGL sup.     no
XvMC libs            -lXvMCW
The last 'libs' line tells you which libraries MythTV is compiling against. In this case, it is the XvMC wrapper mentioned above; for some, it may be the nvidia libraries. Regardless, you need to see 'yes' for your XvMC type before building MythTV.
If installing MythTV on Gentoo Linux, make sure VIDEO_CARDS is properly configured in /etc/make.conf to include, "i810", "nvidia" or "via". The "xvmc" USE flag is also required. These flags can usually be set on a per package basis in /etc/portage/package.use, for example:
$ cat /etc/portage/package.use
media-tv/mythtv xvmc
Or globally for all packages in /etc/make.conf via USE="xvmc"
XvMC usage is controlled using the Playback profiles feature of MythTV as of version 0.21. In previous versions, you simply set the Preferred MPEG-2 Decoder to one of the XvMC methods appropriate for your chipset. With 0.21, this is a more flexible configuration allowing for different decoders depending on the content. For XvMC, the Bob deinterlacer provides the best deinterlacing but with some performance penalty.
Sometimes the OpenGL VSync feature can affect XvMC use and performance. This can be enabled or disabled in the frontend configuration if compiled into your MythTV system.
If your on-screen display (OSD) is greyscale, then XvMC is working and you are using the softblend OSD. If you are experiencing performance issues related to the OSD, disabling the OSD fade will improve this greatly. You can also try using the Chromakey OSD.
The chromakey OSD will provide a color OSD and for many chipsets, this also uses far less CPU resources, making playback smoother.
Using other applications
If you use other applications for playback in MythVideo, such as mplayer or xine, each has its own configuration options for using XvMC.
MPlayer includes an XvMC output plugin. To use it, try this:
mplayer -vo xvmc -vc ffmpeg12mc
(assuming you are running mplayer with XvMC support compiled in)
- Note - unichrome support in mplayer requires a patch.
Xine has two plugins for XvMC support. The first is named xvmc, as you would expect; however, to take advantage of XvMC acceleration on nVidia graphics cards using the nVidia binary drivers, there is a separate plugin named xxmc (xx instead of xv).
To use Xine with XvMC support for most graphics cards use:
xine -V xvmc
To use Xine with XvMC support for nVidia graphics cards use:
xine -V xxmc
(assuming, of course, that xine has been compiled with XvMC support.)
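The plugin choice described above can be captured in a small helper. This sketch just maps the card type to the plugin name given in this section (xxmc for nVidia binary drivers, xvmc otherwise):

```shell
# Sketch: choose the xine video-out plugin per the rules above.
xine_vo_for() {
  case "$1" in
    nvidia) echo xxmc ;;
    *)      echo xvmc ;;
  esac
}

# Typical usage:
#   xine -V "$(xine_vo_for nvidia)" movie.mpg
```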
Hardware acceleration processes not yet supported by XvMC
XvMC has the potential to do more than it supports today. Modern graphics processors (the DirectX 9 generation and newer) can accelerate more than just motion compensation and iDCT, and they can support more video formats than the MPEG-2 that XvMC handles today. NVIDIA's PureVideo technology and ATI's AVIVO technology support hardware-accelerated video decoding for MPEG-2, MPEG-4 AVC (H.264), and WMV3 (often referred to as WMV9 because Microsoft released it at the same time as Windows Media Player 9). Unfortunately, neither NVIDIA nor ATI has yet opened up those technologies or their API documentation to the open source community, and no publicly known attempt to reverse engineer access to those APIs has been made.
As a temporary (or permanent) solution, shader and/or Cg code could be written for these processes and added to the XvMC library, to gain some of the additional advantages that GPU-assisted video decoding brings and that Shader Model 3.0 based GPUs can support today. (Ideally, FFmpeg would be used as a reference codec and player/utility.) Both NVIDIA and ATI have extensive tools that make developing and maintaining shaders easier than it has been in the past.
Processes that can be accelerated for most video formats
- Motion compensation (mo comp)
- Inverse Discrete Cosine Transform (iDCT)
- Inverse Telecine 3:2 and 2:2 pull-down correction
- Inverse modified discrete cosine transform (iMDCT)
- in-loop deblocking
- intra-frame prediction
- inverse quantization (IQ)
- Variable Length Decoding (VLD)
- More commonly known as slice level acceleration
- Spatial-Temporal De-Interlacing
- Plus automatic interlace/progressive source detection
- Bitstream processing (CAVLC/CABAC)
There are alternative methods for hardware assisted MPEG-2 playback of standard definition video: