[mythtv-users] Intel hd 4000 support

Stephen Worthington stephen_agent at jsw.gen.nz
Mon May 13 17:49:24 UTC 2013


On Mon, 13 May 2013 10:16:02 -0400, you wrote:

>On 05/13/2013 03:13 AM, Stephen Worthington wrote:
>> On Mon, 13 May 2013 14:54:23 +1000, you wrote:
>>
>>> On 13 May 2013 13:18, Rajil Saraswat wrote:
>>>
>>>> Hi
>>>> I am in the market looking for an ultrabook which is easy to carry around.
>>>> Most ultrabooks use the Intel HD 4000 graphics chip.  How well does MythTV
>>>> support it?  Does it provide hardware acceleration similar to VDPAU?
>>>> Tx
>>>>
>>> MythTV supports VAAPI hardware-accelerated decoding, but it is
>>> nowhere near as powerful and flexible as VDPAU (especially with
>>> regard to deinterlacing).
>>>
>>> The thing to keep in mind is that most laptops modern enough to have
>>> an HD4000 are likely fast enough to software-decode anything you
>>> throw at them to start with.
>> I found that was not quite true - CPU power alone was not quite
>> enough.
>>
>> I have had an MSI GT70 laptop since late last year: Core i7-3610QM
>> CPU, 12 GiBytes RAM, builtin HD4000 video.  There are few laptops with
>> more CPU power than that.  When playing back HD video on CPU alone, I
>> would often get a tearing effect towards the top of the screen.  I
>> never really diagnosed the problem, but I think it was only on 1080i
>> H.264 HD video.
>
>Tearing--as opposed to dropped frames or "stuttery" playback--is 
>generally caused by either bad video drivers or improper video driver 
>configuration, and has nothing to do with the CPU's ability to keep up 
>with video decoding demands.  All of our video rendering methods (Xv, 
>VA-API, OpenGL, and VDPAU) rely on the video card drivers to provide 
>proper video sync, which will prevent tearing.  Most drivers require 
>the user to explicitly enable sync--and often separately for Xv vs 
>OpenGL/VDPAU.
>
>That said, running at 2.3GHz, your CPU may not be able to keep up with 
>high-bitrate H.264 decoding in software (so even if you fixed the driver 
>configuration, you'd likely have dropped frames or other problems).  The 
>3.3GHz turbo might do better, but I don't know that it could maintain 
>that long enough for real playback.  However, you shouldn't have any 
>problems with MPEG-2 playback.

There is no sign at all of dropped frames or anything like that.  On
an i7 CPU, 2.33 GHz gives you a lot more CPU than the speed suggests.
My thought was that the tearing was a result of the deinterlacing, as
it seemed to only happen on interlaced HD H.264 video.  It could
easily be a bug in the deinterlacing software somewhere, but I tried
all the options I could find and none of them made any difference. The
tearing was just in a small strip close to the top of the screen, and
it came and went with screen content, as I would expect for a
deinterlace problem.

Should anyone have any useful suggestions about this, I would be happy
to test them out.  All I have to do is adjust a couple of settings in
mythfrontend to switch between VAAPI and CPU modes.
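
One driver-level thing that might be worth checking (my suggestion, not
something mentioned in this thread) is the Intel driver's "TearFree"
option, which makes the driver sync its output and can eliminate exactly
this kind of tearing strip.  A minimal xorg.conf sketch, assuming the
xf86-video-intel driver - the "Intel Graphics" identifier is just a
placeholder:

```
Section "Device"
    Identifier "Intel Graphics"
    Driver     "intel"
    # Ask the driver to avoid tearing by syncing to vblank
    Option     "TearFree" "true"
EndSection
```

Whether this helps will depend on the driver version; it costs a little
extra memory bandwidth for the shadow buffer it uses.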

>>    SD and MPEG2 recordings certainly did not do it.  I
>> did try changing various options, but never found anything better. And
>> the colours were always a little off, a bit dark, and there was no way
>> to adjust the colours - mythfrontend did not provide the colour
>> settings I get on my main Nvidia based MythTV box.
>
>Again, the drivers need to provide support for color adjustment (and/or 
>"studio mode" and such).
>
>It's quite possible the Intel drivers lack support for these things (or 
>have them only partially implemented).  I have heard quite a few 
>comments about the poor state of the Intel drivers (and it seems Intel 
>doesn't care much about F/LOSS users--though they're starting to care 
>more about "closed" Linux-based systems due to the mobile 
>bubble^H^H^H^H^H^Hboom, but that probably won't help us too much).

The latest Intel drivers on the 01.org site seem to show a bit more
interest on Intel's part - the installer they provide did a good job
of installing a matching set of packages that worked and got VAAPI
going.

>>    The results were
>> certainly viewable, but just a little annoying at times.
>>
>> When I tried VAAPI it worked for playing one program, but locked up
>> the display on exit from VAAPI - to get things going again I had to
>> swap to another TTY and restart lightdm and log in again.  But playing
>> the program produced a rather better result - correct colours, no
>> tearing.  And the colour settings were available, although I did not
>> then need them.
>>
>> Then (a couple of months ago?) I ran across a post on this list saying
>> that Intel had produced an installer that installed the latest VAAPI
>> drivers and all the right matching packages, and they had VAAPI
>> working as a result.  So I tried that and now have VAAPI running
>> without any problems, although I have yet to test it in real life as I
>> have not been away from home with the laptop since then.
>>
>> So I would certainly recommend trying VAAPI if you have problems with
>> just using CPU alone:
>>
>>
>> https://01.org/linuxgraphics/downloads/2013/intel-linux-graphics-installer
>>
>> BTW My GT70 also has an Nvidia (GT650?) GPU, but current software to
>> access it from Linux (http://bumblebee-project.org) does not allow
>> VDPAU to work.  I am hoping that will change in the not too distant
>> future - rumour has it that there are changes coming in new kernels to
>> help.
>
>Why not use the proprietary nvidia drivers? Do they not support your GPU?
>
>IMHO, VDPAU is the right tool for video playback on GNU/Linux and the 
>right implementation of it is in the nvidia proprietary drivers.  
>Regardless of AMD's recent announcement, their support for VDPAU is 
>limited (but, we hope will improve over time).  And, as far as F/LOSS 
>drivers go, well, they are works in progress.
>
>Mike

The GT70, like most recent laptops with Nvidia support, uses an Nvidia
Optimus setup.  This has the lower-power Intel HD4000 hardware in use
normally, and just powers up the Nvidia hardware when it is needed, on
request from the software.  Output from both GPUs is merged on screen
somehow - I am not clear on the details.  This works well in Windows
where there is proper driver support for it.  The Nvidia setup program
allows you to specify which programs will have the Nvidia GPU turned
on for them.

On Linux, I have the Nvidia drivers installed and can use them via
Bumblebee, which is the software that supports switching on and using
the Nvidia hardware.  I can run programs outputting to the Nvidia
hardware, which has its display overlayed on top of the HD4000 output.
You do that by using an "optirun" command in front of the program you
want to run on the Nvidia hardware.  So if I run "glxspheres", I will
get 40 frames per second or so running on the HD4000 GPU, and if I run
"optirun glxspheres" I will get 66 fps or so running on the Nvidia
GPU.
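
Since typing "optirun" in front of everything gets old, here is a tiny
wrapper sketch (my own, not part of Bumblebee) that uses optirun when it
is installed and otherwise just runs the program normally on the
integrated GPU:

```shell
#!/bin/sh
# run_gpu: run a command on the Nvidia GPU via Bumblebee's optirun
# if optirun is on the PATH, otherwise fall back to running it
# directly on the integrated HD4000.
run_gpu() {
    if command -v optirun >/dev/null 2>&1; then
        optirun "$@"
    else
        "$@"
    fi
}

# Example: "run_gpu glxspheres" instead of remembering optirun.
run_gpu echo "GPU wrapper ran"
```

On a machine without Bumblebee this degrades gracefully, so the same
command line works on both the laptop and a desktop.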

But Bumblebee does not support VDPAU:

  https://github.com/Bumblebee-Project/Bumblebee/issues/36

So while I can do "optirun mythfrontend.real", and it will work, as
soon as it uses VDPAU for output I get nothing, and if I remember
correctly it will log an error about GLX or something like that.

There is no way to just use the Nvidia drivers directly as the Nvidia
hardware is off by default and needs to be turned on, and the BIOS
does not support swapping the hardware so that Nvidia is on and HD4000
is off.  Nor does Bumblebee or any other software I have found.  I
would love to be able to just enable the Nvidia GPU as the default and
run the Nvidia drivers only, but it is not possible.

