[mythtv-users] simultaneous viewing

Michael T. Dean mtdean at thirdcontact.com
Thu Apr 12 16:22:53 UTC 2012


On 04/12/2012 08:08 AM, Russell Gower wrote:
> On 12 Apr 2012, at 12:37, tortise wrote:
>>> Ideally I would like two new features in mythfrontend: 1. add functionality to broadcast a playback timestamp for the currently playing file. 2. add functionality to allow "slave" frontends to optionally synchronise to this timestamp (with a configurable variance to allow for syncing audio). The first should be fairly easy; I'm not sure about the second. I would be willing to have a stab at implementing this if it were likely to be accepted by the devs.
>> Damn, should have read the whole thread first, apologies.  I think any reasonable dev is unlikely to accept that proposition. The app needs to be written and tested; I think it would work, but one can only expect the devs to consider something concrete.  Similarly, using the frontend network interface it's possible (I think) to write something independent of the core code; so long as it complies with the protocols it could remain independent.  I think that something tested and working for a number of people is likely to get a favourable hearing from the devs, but I can't expect them to commit to anything before they see it.
>>
>> An obvious issue to me is how many frontends will be coordinated slaves and how will they be controlled in the various combinations?
> Agreed, I wouldn't expect anything to be accepted until well proven; what I would be looking for is agreement that a well-implemented solution would be considered rather than turned down point blank.
>
> I'm not sure how an external application could provide the required timing for near frame-accurate synchronisation.
>
> I've not done any real design work yet, but my initial thoughts are that a frontend operating normally would use IP multicasting to announce its name, the file name of the currently playing recording, and its current position within the file; this would take place every couple of seconds.
>
> All frontends would listen for the multicasts and provide a playback option for each of them; once playing, they would use the timestamp portion of the multicast to remain synchronised.
>
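The announcement scheme described in the quote above could be sketched roughly as follows. This is purely illustrative and not MythTV code; the multicast group, port, and JSON message format are all invented for the example:

```python
# Hypothetical sketch of the proposed announcement protocol: each playing
# frontend multicasts (name, recording file, position) every couple of
# seconds.  Group address, port, and payload layout are assumptions.
import json
import socket
import time

GROUP, PORT = "239.255.42.42", 6549  # arbitrary site-local multicast group

def make_announcement(frontend_name, filename, position_ms):
    """Serialize one announcement as JSON bytes."""
    return json.dumps({
        "frontend": frontend_name,
        "file": filename,
        "position_ms": position_ms,
        "sent_at": time.time(),  # sender wall clock, for latency estimates
    }).encode("utf-8")

def send_announcement(payload):
    """Send one announcement datagram to the multicast group."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    # Keep the TTL at 1 so announcements stay on the local network.
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, 1)
    sock.sendto(payload, (GROUP, PORT))
    sock.close()
```

Listening frontends would join the same group, decode each datagram, and offer a "join this playback" option per announcing frontend.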

During playback, timing will drift from "absolute" independently on 
different systems depending on what's happening.  Therefore, you would 
need a very well-designed approach for synchronizing playback on 
multiple devices, including a protocol for control and a protocol for 
playback sync.
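To make the drift problem concrete: a slave frontend cannot simply seek whenever it falls out of step (that would be visibly jarring), so it would need some correction loop that nudges playback speed toward the master's announced position. The following is an illustrative sketch only; the 30 ms dead band and 1% speed trim are invented tuning values, not anything from MythTV:

```python
# Illustrative drift-correction loop for a "slave" frontend.  Numbers
# below (dead band, trim limit, convergence window) are made-up values.

DEAD_BAND_MS = 30   # ignore drift smaller than this (roughly one frame)
MAX_TRIM = 0.01     # never change playback speed by more than 1%

def speed_adjustment(master_pos_ms, local_pos_ms):
    """Return a playback-speed multiplier (1.0 = normal speed)."""
    drift = master_pos_ms - local_pos_ms  # positive: we are behind
    if abs(drift) <= DEAD_BAND_MS:
        return 1.0                        # close enough; leave speed alone
    # Spread the correction over roughly two seconds of playback,
    # clamped so the pitch/speed change stays inaudible.
    trim = drift / 2000.0
    trim = max(-MAX_TRIM, min(MAX_TRIM, trim))
    return 1.0 + trim
```

A real implementation would also have to account for network latency between the master's announcement and its arrival, which is exactly the sort of thing the purpose-built protocols below already handle.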

There are protocols specifically designed for this, such as those used 
by Sonos, the SlimServer UDP protocol, and Logitech's Squeezebox (though 
Squeezebox may just try to start playback at the same time and ignore 
synchronization), or even RTSP (which was used by the "Cidero controller" 
for this purpose*).  If we were to accept something to allow this, we 
would want to use some "standard" protocol that's also likely to 
interoperate with other devices.  (However, it would obviously also need 
to be an open protocol that we're allowed to use.)

Most likely the best approach would be to implement the design so that 
it works for audio-only playback to start, then extend it to cover 
video, too (which would likely follow nicely, since video playback 
timing is based on audio timing).

In 2009, Daniel K discussed the idea a bit on IRC with a couple other 
devs/users, and he might be able to give you more information about what 
we would and wouldn't want.  Feel free to send a quick message to the 
-dev list proposing the idea (and the more research you can do before 
then--such as regarding possible protocols and their 
capabilities/limitations--the better).  I'd suggest you try to keep the 
initial post short to make it more likely that the right people will 
read it/reply (so it doesn't seem like so much work to send a reply).

Mike

*Though I can't find it on any existing web site today, this seems to 
be the document that was referenced:

http://repo.or.cz/w/learning-java-upnp.git/blob/HEAD:/cidero/doc/html/multiRendererSync.html

which documents the approach used by:

http://repo.or.cz/w/learning-java-upnp.git/blob/HEAD:/cidero/doc/html/mediaController.html

Though the approach documented there sounds like one that would not work 
well on MythTV--where even within one user's network, various frontends 
are sufficiently different that timing/sync would drift significantly.
