[mythtv-users] OpenGL renderer with Intel graphics

Mark Kendall mark.kendall at gmail.com
Fri Aug 27 03:51:23 UTC 2010


On 27 August 2010 11:03, George Nassas <gnassas at mac.com> wrote:
> On 2010-08-26, at 8:32 PM, Mark Kendall wrote:
>
>> If you want to post the full output of mythfrontend -v
>> playback and glxinfo, I can try and take a look. My gut feeling is
>> that the hardware just isn't powerful enough
>
> I put a zip of some logs here. The DailyShow.txt file is from playing a 720p
> ATSC recording and the TED.txt is one of the recent TED Talks, I think
> they're lowish res. glxinfo and glxgears output is there too. I understand
> that glxgears is not representative of anything but it's fun to watch.
> Do I have the scenario correct? OpenGL gets a decoded frame from myth and
> all it has to do is scale it and stamp it to the screen? That doesn't sound
> very taxing but what do I know.
> Anyway, thanks for looking.

George - I can't be sure (those logs aren't from -v playback and are
incomplete), but my guess is that the programmable hardware on that
gpu just isn't fast enough.

The preferred approach in the opengl renderer, when they are
available, is to use fragment programs to do the colour space
conversion. The conversion is already optimised to an extent: the cpu
does a little repacking before the frame is handed to the gpu (which
also allows some quality improvements for interlaced material), but
the renderer has no way of knowing how fast (or slow) the gpu is.

On your system the correct opengl support for fragment programs is
there, so the renderer will be using them - they just aren't fast
enough. There is, however, an alternative fallback on Intel cards.
If you try adding opengloptions=nofrag to the filter section of your
playback profile, it should drop back to using a specific
video-related extension. If you don't see any improvement, post
another log.
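
To be concrete, with no other filters in use, the filter entry for
that profile would read just:

  opengloptions=nofrag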

regards

Mark

