[mythtv-users] Way out idea on watching same thing in multiple rooms

Gareth Glaccum gareth.glaccum at btopenworld.com
Wed Jan 6 22:15:47 UTC 2010


----- Original Message ----- 
From: "Tortise" <tortise at paradise.net.nz>
>Add in one more FE, a bit of VOIP and some file downloads and I can see a 
>"1G" LAN intermittently falling over at those levels for VOIP and Video. 
>"1G" LANs are in my experience much less than 1G.  I've not yet seen a 1G 
>NIC get anything near that through it. Even 100M would raise my eyebrows.
>While "10G" networks are coming around home network gear tends to lag and 
>not lead in available capability.
>Chris Pinkham early in this thread described an alternative method to 
>achieve a similar thing.  I think multicast and Chris's method both have 
>good and bad points.
>
>Wired network latency seems irrelevant.  Wireless is well known to have 
>problems with multicast which will be relevant for some.  (I personally try 
>to avoid wireless as much as possible - this is just another reason why.)
>
>Would Chris's method avoid different-length frontend buffers?  If I start a
>bookmarked recording there also seems to be a buffering ("Wait please...")
>pause, certainly a start pause, so in that respect it may have little
>advantage.  Synchronisation is a good idea,
>would it work smoothly on slave frontends?
>Which method would be better?  The answer may be related to the number of
>frontends - the more there are, the better multicast looks.  If one is making
>a scalable piece of kit, instead of doing it unicast and then later on
>revising it for multicast, would it be better to just do it with multicast?

I think that there are two problems. One is how to get the data there, for
which multicast is a good solution: in theory the same data is sent once and
delivered only to the clients that have joined the group (although on badly
configured networks it degenerates into a broadcast and everyone gets it).
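
As a rough sketch of the receiving side (this is not MythTV code, and the
group address and port are made up for illustration), joining a multicast
stream in Python looks something like this:

    # Join a multicast group and read the stream off the wire.
    import socket
    import struct

    GROUP = "239.255.42.42"   # example multicast group, not a MythTV default
    PORT = 5004               # example port

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", PORT))

    # Tell the kernel (and, via IGMP, the switches) that we want this group;
    # a switch without IGMP snooping will flood it to every port instead.
    mreq = struct.pack("4sl", socket.inet_aton(GROUP), socket.INADDR_ANY)
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)

    while True:
        data, _addr = sock.recvfrom(65536)
        # hand 'data' to the player's input buffer here

The point is that however many frontends join, the sender only puts one copy
of the stream on the wire.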

The second is, assuming all the data is there, how do we get the end devices,
THE VIEWSCREEN and SPEAKERS, to be in sync, given an unknown/random delay
occurring between data storage and final output.

Chris Pinkham's idea/concept is probably the way things need to happen.
A primary frontend sends commands to the slave frontends. I don't know how
far he got, but I think some way of testing the synchronisation of the
streams at the point of output is required (and fed back into a stream
position fudge factor).
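
For illustration only (this is not Chris's code; the hosts, port and message
format are invented, and it assumes the machines' clocks are already in step,
e.g. via NTP), that feedback loop might look roughly like this in Python:

    # Primary tells slaves where it is; each slave works out its own offset.
    import json
    import socket
    import time

    SLAVES = [("slave1.local", 6600), ("slave2.local", 6600)]  # hypothetical

    def broadcast_position(position_ms):
        """Primary side: send current playback position plus a send timestamp."""
        msg = json.dumps({"pos_ms": position_ms, "sent_at": time.time()}).encode()
        s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        for host, port in SLAVES:
            s.sendto(msg, (host, port))

    def correction_ms(msg, local_position_ms):
        """Slave side: how far (ms) our output is behind (+) or ahead (-)."""
        report = json.loads(msg)
        network_delay_ms = (time.time() - report["sent_at"]) * 1000.0
        expected_ms = report["pos_ms"] + network_delay_ms
        return expected_ms - local_position_ms

Each slave would then feed that correction into its player as the fudge
factor, skipping or stretching a little until it settles at zero.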


