
We address a synchronization problem known as multipoint, or inter-destination, synchronization in multiparty conferencing systems: synchronizing the playout of the same streams at different receivers so that playout occurs at the same time, achieving fairness among the receivers. Consider a tele-teaching application in which a teacher multicasts a video sequence (a stored content stream, such as a documentary or film) and occasionally comments on the video during the session (a live content stream). Network quizzes are another example: the same multimedia question must appear at the same time to all participants to guarantee fair play. In the tele-teaching example, simultaneous playout matters for both the stored and the live content streams. Even if only the video stream (documentary or film) is sent, each video MDU (frame) should be played simultaneously at all receivers (students), so that the students can then discuss the video content with one another. Figure 1 shows the playout of the jumping-ball sequence, synchronized at all receivers.

[Figure 1: Playout of the jumping-ball sequence, synchronized at all receivers.]

When the source sends more than one simultaneous stream and the streams have temporal relationships, one of them serves as the master stream on which multipoint synchronization is based. In our case, where the source sends audio and video streams, audio is the master stream. At each receiver we must also ensure inter-stream synchronization, i.e., synchronization between the audio and video streams, maintaining the temporal relationships between their media data units.

Our Proposed Approach

1) We adaptively calculate the expected playout instant of each audio frame, which is the same for all receivers. Since different receivers receive the same frame at different times, playing it out at the same instant requires shrinking or extending silent zones while keeping the audio data unchanged.
2) Based on the expected playout instant of each audio frame, we calculate the expected playout instants of the video frames using their inter-media relationships.
3) We play the audio and video frames at their expected playout instants.
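The three steps above can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes timestamps in milliseconds on a globally agreed clock, a fixed audio frame period, and a simple constant offset capturing the audio/video inter-media relationship; all function and variable names are hypothetical.

```python
# Sketch of the proposed multipoint synchronization.
# Assumptions (illustrative only): milliseconds on a shared clock,
# a fixed audio frame period, and a constant audio/video offset.

AUDIO_FRAME_PERIOD_MS = 20  # assumed spacing between audio frames


def expected_audio_playout(first_playout_ms: int, frame_index: int) -> int:
    """Step 1: expected playout instant of an audio frame. It is identical
    at every receiver because it depends only on the agreed start time and
    the frame index, not on when the frame arrives locally."""
    return first_playout_ms + frame_index * AUDIO_FRAME_PERIOD_MS


def silence_adjustment_ms(arrival_ms: int, expected_ms: int) -> int:
    """Step 1 (continued): how much a silent zone must be extended
    (positive result) or shrunk (negative result) so that playout hits
    the expected instant without altering the audible data."""
    return expected_ms - arrival_ms


def expected_video_playout(audio_playout_ms: int, av_offset_ms: int) -> int:
    """Step 2: derive a video frame's playout instant from the temporally
    related audio frame; av_offset_ms models the inter-media relationship
    (e.g. the capture-time difference between the two frames)."""
    return audio_playout_ms + av_offset_ms


# Example: audio frame 5, session playout agreed to start at t = 1000 ms.
a5 = expected_audio_playout(1000, 5)   # same value at every receiver
v5 = expected_video_playout(a5, 0)     # video frame aligned with audio
adj = silence_adjustment_ms(1092, a5)  # frame arrived 8 ms early here
```

Step 3 is then simply scheduling each frame at its computed instant; a receiver where `adj` is positive inserts that much silence before the frame, while a negative `adj` would require trimming an earlier silent zone.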
