[mythtv-users] Very variable CPU usage from X using bob deinterlace

junk junk at giantblob.com
Mon Jan 29 14:14:33 UTC 2007


Steven Adeff wrote:
> On 1/28/07, junk <junk at giantblob.com> wrote:
>   
>> Paul Bender wrote:
>>     
>>> junk wrote:
>>>
>>>       
>>>> Hi,
>>>>
>>>> I'm experiencing very high and very variable CPU usage from X during DVB
>>>> playback with the 'bob' deinterlacer - sometimes X uses a few percent of
>>>> the CPU, sometimes it uses nearly 90 percent. I do have cpufreq turned
>>>> on with the ondemand governor but this is not enough to explain the
>>>> variation I'm seeing (as much as a hundred fold difference in load).
>>>> There is no apparent difference in playback quality when the load
>>>> changes and the load changes up and down while watching a single
>>>> program. Switching deinterlacing off or choosing a different
>>>> deinterlacing algorithm results in consistent CPU load.
>>>>
>>>> Does anyone else see this? Is it expected behaviour? I understand that
>>>> bob is CPU intensive, but it seems odd that it sometimes takes
>>>> practically no CPU and other times it nearly swamps my machine.
>>>>
>>>> My setup is:
>>>>    AMD Athlon 64 3000+ (1800MHz), 2GB memory, NForce motherboard (6500,
>>>> I think?), using built in TV Out to PAL TV
>>>>    Gentoo 2006.1
>>>>    Linux 2.6.19.2, (vanilla kernel, not gentoo kernel)
>>>>    modular X.Org server (from stable portage)
>>>>    recent proprietary NVidia drivers (emerge ~x86)
>>>>    recent MythTV (from SVN a few days ago)
>>>>    'Standard' output (not XvMC, as it seems to crash X and/or the frontend)
>>>>
>>>> Thanks,
>>>> -- jeek
>>>>
>>>>         
>>> Yes, I have seen this behavior. Adding
>>>
>>> Option "UseEvents" "true"
>>>
>>> to the nvidia device section in xorg.conf solved the problem.
>>>       
>
> UseEvents needs the new 9xxx series drivers.
>
> The other reason could be that your modeline is using a refresh rate
> that doesn't match the video's refresh rate. This will cause X to
> spike in CPU every time it has to mash frames to get things to work
> right. (mash = I don't know exactly what it's doing, but it doesn't
> like it and is doing something to fix things.)
>
>   
Yeah - I'm running the recent NVidia drivers and my video frame rate == 
TV refresh rate == 50Hz (PAL) with OpenGL vblank sync on, which all 
seems to work well - CPU usage is reasonably low, doesn't spike, and 
playback appears smooth. I'm not sure the interlaced output from bob is 
actually correct, but its frame rate feels much smoother than the 'no 
interlace' option and the picture is sharper than blend etc.
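For anyone digging this out of the archives later: the UseEvents change 
Paul suggested lives in the nvidia Device section of xorg.conf, roughly 
like the sketch below (the Identifier is just an example - keep whatever 
your existing section already uses):

    Section "Device"
        Identifier  "nvidia0"
        Driver      "nvidia"
        # sleep on events instead of busy-waiting; needs the 9xxx drivers
        Option      "UseEvents" "true"
    EndSection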
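And two quick sanity checks on the refresh rate side, assuming the usual 
tools are around - xrandr to confirm the active mode really is 50Hz, and 
the driver's __GL_SYNC_TO_VBLANK variable if you want to force OpenGL 
vblank sync for a single run instead of setting it in nvidia-settings or 
the frontend:

    xrandr -q                            # the active mode should show 50Hz
    __GL_SYNC_TO_VBLANK=1 mythfrontend   # force sync-to-vblank for this run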

-- jeek

