[mythtv] still getting jitter/speed pulsing

Kyle Rose krose+mythtv at krose.org
Fri Dec 31 20:13:54 UTC 2004


I was wrong about setting Bob (the deinterlacer) correctly getting rid
of the jittery playback: it was the lack of a proper control case that
led me to believe it was gone.

The difference appears to be that the jitter happens only in the
presence of both Xv output and the new ffmpeg code.  I don't know how
the two can possibly be related, which is part of the reason I'm
having a hard time figuring out where to start debugging this.

I added a few Jitterometers in AVSync, and found a potentially
interesting difference between X11 and Xv output.
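
For reference, here's a rough, self-contained sketch of what each of
those meters computes (this is not the actual Jitterometer class in
libmythtv, and the names are illustrative): it records the interval
between successive calls at one point in the AV sync loop and reports
the mean and standard deviation in microseconds, the std dev as a
percentage of the mean (the 'sd0fmean' field in the dumps below), and
the implied fps.

// Illustrative per-stage jitter meter -- not MythTV's real Jitterometer,
// just a sketch of what the numbers in the dumps below represent.
#include <chrono>
#include <cmath>
#include <cstdio>
#include <string>
#include <vector>

class JitterMeterSketch
{
  public:
    JitterMeterSketch(const std::string &name, size_t cycles = 300)
        : m_name(name), m_cycles(cycles) {}

    // Call once per pass through the instrumented point, e.g. just before
    // the vsync wait or right after the frame is pushed to the display.
    void RecordCycle(void)
    {
        auto now = std::chrono::steady_clock::now();
        if (m_havePrev)
        {
            m_samples.push_back(
                std::chrono::duration<double, std::micro>(now - m_prev).count());
            if (m_samples.size() >= m_cycles)
            {
                Report();
                m_samples.clear();
            }
        }
        m_prev = now;
        m_havePrev = true;
    }

  private:
    void Report(void) const
    {
        double mean = 0.0;
        for (double s : m_samples)
            mean += s;
        mean /= m_samples.size();

        double var = 0.0;
        for (double s : m_samples)
            var += (s - mean) * (s - mean);
        double sd = std::sqrt(var / m_samples.size());

        // sd0fmean is the std dev as a percentage of the mean; intervals are
        // in microseconds, so the implied fps is just 1e6 / mean.
        printf("'%s' sd0fmean = '%.2f', mean = '%.2f', std. dev. = '%.2f', fps = '%.2f'\n",
               m_name.c_str(), 100.0 * sd / mean, mean, sd, 1e6 / mean);
    }

    std::string m_name;
    size_t m_cycles;
    bool m_havePrev = false;
    std::chrono::steady_clock::time_point m_prev;
    std::vector<double> m_samples;
};

Usage is just one instance per instrumentation point ('prepare', 'vsync',
'vout', and one per full video_output pass) with RecordCycle() called each
time through.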

X11 (normal)
------------
'prepare1' sd0fmean = '12.87', mean = '8.21', std. dev. = '1.06', fps = '121802.68'
'vsync1' sd0fmean = '11.00', mean = '4.67', std. dev. = '0.51', fps = '214132.76'
'vout1' sd0fmean = '17.59', mean = '14254.70', std. dev. = '2506.94', fps = '70.15'
'prepare2' sd0fmean = '10.90', mean = '7.83', std. dev. = '0.85', fps = '127713.92'
'vsync2' sd0fmean = '12.76', mean = '3.81', std. dev. = '0.49', fps = '262467.19'
'vout2' sd0fmean = '13.14', mean = '15888.00', std. dev. = '2088.11', fps = '62.94'
'video_output' sd0fmean = '2.15', mean = '33318.28', std. dev. = '716.07', fps = '30.01'

Xv (abnormal)
-------------
'prepare1' sd0fmean = '9.16', mean = '7.82', std. dev. = '0.72', fps = '127877.24'
'vsync1' sd0fmean = '103.59', mean = '12139.34', std. dev. = '12575.42', fps = '82.38'
'vout1' sd0fmean = '9.75', mean = '3117.61', std. dev. = '304.00', fps = '320.76'
'prepare2' sd0fmean = '12.86', mean = '7.06', std. dev. = '0.91', fps = '141643.06'
'vsync2' sd0fmean = '25.75', mean = '12002.08', std. dev. = '3090.27', fps = '83.32'
'vout2' sd0fmean = '10.26', mean = '2886.44', std. dev. = '296.08', fps = '346.45'
'video_output' sd0fmean = '43.54', mean = '33651.20', std. dev. = '14652.30', fps = '29.72'

Note that the first vsync varies a *lot* under Xv.  I would expect the
vsync wait to take longer with Xv, since the frames actually reach the
screen quicker and mythfrontend therefore spends more time idle; but the
standard deviation is as large as the mean itself (sd0fmean over 100%,
versus roughly 11% for vsync1 under X11), which suggests something funky
is going on there.

What I don't understand is why vsync would start acting up just
because I started using Xv output: I'm using GL sync in both cases.
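
For context, this is roughly what the GL sync wait amounts to, via the
GLX_SGI_video_sync extension (a sketch of the general shape, not
MythTV's vsync.cpp verbatim); the time spent blocked in the wait call is
what the vsync meters above are measuring:

// Rough sketch of the "GL sync" retrace wait (GLX_SGI_video_sync).  Not
// MythTV's vsync.cpp verbatim, just the general shape; assumes a current
// GLX context on a driver that exposes the extension.
#include <GL/glx.h>

typedef int (*PFNGLXGETVIDEOSYNCSGI)(unsigned int *);
typedef int (*PFNGLXWAITVIDEOSYNCSGI)(int, int, unsigned int *);

static PFNGLXGETVIDEOSYNCSGI  getVideoSync  = nullptr;
static PFNGLXWAITVIDEOSYNCSGI waitVideoSync = nullptr;

bool InitGLSync(void)
{
    getVideoSync = (PFNGLXGETVIDEOSYNCSGI)
        glXGetProcAddressARB((const GLubyte *)"glXGetVideoSyncSGI");
    waitVideoSync = (PFNGLXWAITVIDEOSYNCSGI)
        glXGetProcAddressARB((const GLubyte *)"glXWaitVideoSyncSGI");
    return getVideoSync && waitVideoSync;
}

// Block until the next vertical retrace.  The time spent blocked in here
// is what the 'vsync1'/'vsync2' meters above measure, so anything that
// changes where in the frame period we arrive at this wait shows up as
// vsync jitter.
void WaitForRetrace(void)
{
    unsigned int frameCount = 0;
    getVideoSync(&frameCount);
    // divisor=2 / remainder=(count+1)%2 waits for the counter's parity to
    // flip, i.e. exactly the next retrace.
    waitVideoSync(2, (frameCount + 1) % 2, &frameCount);
}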

Any thoughts?

Cheers,
Kyle

