[mythtv-users] high cpu utilisation with bob deinterlacer

Tom Dexter digitalaudiorock at gmail.com
Sun Feb 22 16:16:29 UTC 2009


On Sat, Feb 21, 2009 at 6:20 PM, Adam <adam at fastmail.com.au> wrote:
> Hi,
> Does anyone know why some of the deinterlacers (such as Bob) cause
> extremely high Xorg CPU usage when playing live TV or a recording
> inside MythTV?
>
> My system is as follows: Intel Core 2 Duo 2.4GHz, 2GB RAM, GeForce
> 9600GT with HDMI out, Hauppauge Nova-T-500 DVB-T card.
>
> When playing back a 1080i stream (with the Bob deinterlacer) the CPU
> utilisation is typically:
> mythfrontend - ~50%
> X - ~95%
>
> If I use the linear deinterlacer the CPU usage is more like:
> mythfrontend - ~45%
> X - ~5%
>
> Although live TV does play back perfectly with the Bob deinterlacer, I'm
> not happy about the high CPU usage that results when watching TV.
>
> I'm running mythtv-0.21_p18314-r1, 32-bit Gentoo with kernel 2.6.27-r8,
> Xfce 4.4.3 and nvidia-drivers-177.82. Since experiencing this problem I
> have tried adding the UseEvents option to my xorg.conf under the nvidia
> Device section, and I have tried upgrading my nvidia drivers to 180.29.
>
> I look forward to hearing any suggestions.

That is an odd one.  I'm running almost exactly the same versions of
everything under Gentoo, except that I'm still on the 2.6.27-gentoo-r7
kernel and I use Fluxbox rather than Xfce.  I'm using the 177.82
nVidia drivers.  On my Dell 4700 frontend with its 3 GHz P4, the
combined usage of mythfrontend and X is typically about 60% when
playing 1080i.  X itself rarely uses much CPU at all.

I have always had UseEvents on, by the way.  I recall that without it,
X would frequently sit at almost 100%.  Also note that I recently
disabled OpenGL vsync in MythTV and got RTC timing working instead,
as OpenGL vsync was causing locking issues that got much worse with
the 177.82 drivers.
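
For reference, UseEvents is just a boolean option in the nvidia Device
section of xorg.conf.  Something like this (the Identifier here is
only an example; keep whatever name your Device section already uses):

    Section "Device"
        Identifier "nvidia0"            # example name only
        Driver     "nvidia"
        Option     "UseEvents" "True"
    EndSection

As for RTC, getting it working usually just means letting MythTV use
the full 1024 Hz interrupt rate, which needs root:

    echo 1024 > /proc/sys/dev/rtc/max-user-freq

To make that survive a reboot you can set dev.rtc.max-user-freq = 1024
in /etc/sysctl.conf instead.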

Tom

