US7386782B2 - Method for synchronizing a multimedia file - Google Patents

Method for synchronizing a multimedia file

Info

Publication number
US7386782B2
US7386782B2 US10/380,288 US38028803A
Authority
US
United States
Prior art keywords
data file
data
synchronization command
file
event
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related, expires
Application number
US10/380,288
Other versions
US20040098365A1 (en)
Inventor
Christophe Comps
Daniel Boudet
Xavier Sarremejean
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Alcatel Lucent SAS
Original Assignee
Alcatel SA
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Alcatel SA filed Critical Alcatel SA
Assigned to ALCATEL reassignment ALCATEL ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: BOUDET, DANIEL, COMPS, CHRISTOPHE, SARREMEJEAN, XAVIER
Publication of US20040098365A1 publication Critical patent/US20040098365A1/en
Application granted granted Critical
Publication of US7386782B2 publication Critical patent/US7386782B2/en
Assigned to CREDIT SUISSE AG reassignment CREDIT SUISSE AG SECURITY AGREEMENT Assignors: ALCATEL LUCENT
Assigned to ALCATEL LUCENT reassignment ALCATEL LUCENT RELEASE BY SECURED PARTY (SEE DOCUMENT FOR DETAILS). Assignors: CREDIT SUISSE AG
Adjusted expiration legal-status Critical
Expired - Fee Related legal-status Critical Current

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/4302Content synchronisation processes, e.g. decoder synchronisation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/40Information retrieval; Database structures therefor; File system structures therefor of multimedia data, e.g. slideshows comprising image and additional audio data
    • G06F16/43Querying
    • G06F16/438Presentation of query results
    • G06F16/4387Presentation of query results by the use of playlists
    • G06F16/4393Multimedia presentations, e.g. slide shows, multimedia albums

Abstract

The invention concerns a method for synchronizing data in a multimedia document (50), said document comprising at least two separate computer files (track1, track2) called the first file, the second file, etc., in which method: data of a first type, of a second type, etc. is stored in the first file, in the second file, etc., respectively, said data being grouped in the form of at least one event-related command characterizing an event, said event being either important or unimportant; and at least one synchronization command is inserted in each file; characterized in that said synchronization command is inserted before each event-related command characterizing an important event.

Description

The present invention relates to a method of synchronizing different types of data in a multimedia file. It applies, for example, to portable systems such as mobile radio terminals, pocket computers or any other equipment that can have multimedia capabilities and for which the size of multimedia files and the computation power needed to process them constitute a problem.
There are very many monomedia files, i.e. files relating to only one particular type of data, such as the JPEG (Joint Photographic Experts Group) format for storing pictures or the RTF (Rich Text Format) format for storing text.
The expression “multimedia file” generally refers to integrating different types of data (such as pictures, sound and text) in the same file. Each type of data is contained in a given track. Each track is organized in the form of a series of commands. Each track is scanned by a microprocessor. Each microprocessor executes, at the same time as the others, commands from one track or simultaneous commands from more than one track and can present the data, via different interfaces, to a user of equipment with multimedia capabilities. The interfaces can be a screen for text and picture data and a loudspeaker for audio data. The user therefore sees text and pictures whilst hearing sounds.
The problem is therefore to match the text to the music and the pictures, i.e. to synchronize the different types of data contained in the same multimedia file.
Each microprocessor, associated with a track containing one type of data, uses an oscillator. Each oscillator produces a signal with a frequency slightly different from those of the other oscillators. Also, the software executed by each processor can be based on different operating systems, which drift with time in dissimilar ways. Thus two microprocessors that begin to read their respective tracks at the same time are eventually no longer synchronized with each other. For example, if the microprocessor for the sound data track is lagging behind the microprocessor for the text data track, the text of a phrase will be displayed before the sung phrase is heard.
The prior art solution is temporal synchronization.
In the FIG. 1 example, the microprocessor μp1, which reads the track 1 containing sound, sends synchronization data every 3 μs to the microprocessor μp2, which reads the track 2 containing text. The synchronization data can optionally be stored in the multimedia file.
Thus the microprocessor μp2 verifies every 3 μs whether its clock is synchronized to that of the first microprocessor μp1. If the microprocessor μp2 finds that it is in advance of the other one, it calculates the time difference and stops reading track 2 for that period of time. It then restarts in synchronism with the microprocessor μp1. It is apparent that the better the synchronization required, the greater the quantity of synchronization data that has to be sent and the more frequently it has to be sent.
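Purely as an illustration (the names and API below are invented for this sketch, not taken from the patent), the prior-art scheme amounts to the slave reader comparing its own elapsed time with the master's at every periodic synchronization message and pausing whenever it finds itself ahead:
```python
import time

SYNC_PERIOD_S = 3e-6  # the 3 microsecond period of the patent's example (illustrative)

class SlaveTrackClock:
    """Clock of the microprocessor reading the text track in the prior-art scheme."""

    def __init__(self) -> None:
        self._start = time.monotonic()

    def elapsed(self) -> float:
        return time.monotonic() - self._start

    def on_sync(self, master_elapsed: float) -> None:
        """Called every SYNC_PERIOD_S with the master's elapsed playback time."""
        lead = self.elapsed() - master_elapsed
        if lead > 0:
            # The slave is ahead of the master: it stops reading its track for the
            # time difference, even if a phrase is being displayed at that moment.
            time.sleep(lead)
```
The drawback is visible in the sketch itself: the finer the required synchronization, the more often on_sync must be called and the more synchronization data must be stored or exchanged.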
In mobile terminals there are severe file size constraints. The available memory is limited for reasons of overall size and battery life. What is more, multimedia files must be downloadable from a server center in a reasonable time, which is directly dependent on the file size.
Storing recurrent synchronization data is costly in terms of memory, and the flow of data exchanged is burdened with many transfers of synchronization data.
The above solution also has a further and major disadvantage: the synchronization data can reach the microprocessor μp2 while it is in the middle of displaying a phrase. The display of the phrase is then stopped short, and the user does not receive the impression of fluid presentation of data.
The object of the present invention is to reduce the size of multimedia files, to optimize the quantity of data exchanged, and to provide optimum synchronization.
To this end, the invention provides a method of synchronizing data in a multimedia document (50) comprising at least two separate data files (track1, track2) referred to as the first file, the second file, etc., in which method:
    • data of a first type, of a second type, etc. is stored in the first file, in the second file, etc., respectively, and grouped into the form of at least one event-related command characterizing an important event or an unimportant event, and
    • at least one synchronization command is inserted into each file, which method is characterized in that said synchronization command is inserted before each event-related command characterizing an important event.
The method is advantageously characterized in that the important event corresponds to a command to display a text, a command to display a picture, or a command to reproduce a sound.
The invention also provides a device for synchronizing data in a multimedia file containing at least one track in which said data is stored and at least one synchronization command in each track, said device having first means for reading the data of each track and second means enabling the first means to communicate with each other, the data communicated between said first means concerning the occurrence of a synchronization command. The device is characterized in that one of the first data reading means is designated as having the highest priority and forces the other first means to synchronize with it.
The invention and its advantages will become clearer in the course of the following description with reference to the accompanying drawings.
FIG. 1, already described, represents the synchronization of a multimedia file in the prior art.
FIG. 2 is a diagram of a multimedia file conforming to the invention.
FIG. 3 is a detailed view of tracks in a multimedia file.
The data in a multimedia file according to the invention can comprise either time values or sound, text or picture coding values. The time values can represent a note duration, an image display time, a track start or end time, or a waiting time between two events. According to the invention, the tracks of the multimedia file also include synchronization commands related to the various events included in the track (note, picture, text, etc.).
FIG. 2 is a diagram showing the structure of a multimedia file according to the invention.
The multimedia file 50 includes a header 55 and tracks 60, 70 and 80. According to the invention, a multimedia file can include a number of tracks from 1 to n and FIG. 2 merely represents one example of this kind of file.
The header 55 includes data that is common to all of the tracks and is not described in detail here.
Each track of the file 50 can contain a single type of data. For example, track 60 can be a MIDI (Musical Instrument Digital Interface) format track for sound, track 70 can contain a sequence of pictures, and track 80 can contain sequences of texts. The different tracks may be intended to be scanned by microprocessors and presented simultaneously to a user. The different microprocessors therefore scan the tracks at the same time.
Each track 60, 70 and 80 has a respective header 65, 75 and 85. Each header contains an indicator of the type of data contained in the track. Thus the microprocessor able to read MIDI data knows from this indicator which track to read.
Each track also contains data organized in the form of commands which are executed sequentially by the microprocessor (for example to display a picture or a text).
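Purely as an illustration (the patent does not define a byte-level encoding, and the type names below are invented), the structure described so far can be modelled with a few container types: a common file header, then one track per data type, each track carrying a header indicator followed by a sequence of fields:
```python
from dataclasses import dataclass, field
from enum import Enum, auto
from typing import Dict, List, Union

class TrackType(Enum):
    """Indicator stored in each track header (65, 75, 85)."""
    MIDI = auto()      # sound, e.g. track 60
    PICTURE = auto()   # e.g. a JPEG sequence, track 70
    TEXT = auto()      # text messages, track 80

@dataclass
class EventCommand:
    """An event-related command characterizing an important event:
    play a note, display a picture, display a syllable."""
    name: str                    # e.g. "Nf1", "JPEG1", "TEXT1"

@dataclass
class TimeField:
    """A time value: a note duration, an image display time,
    or a waiting time between two events."""
    name: str                    # e.g. "Nd1", "JPEGd1", "D1", "TEXTD0"
    duration_ms: int = 0

@dataclass
class SyncCommand:
    """A synchronization command SYNCHi, inserted before an important event
    and repeated identically in every track."""
    index: int

Command = Union[EventCommand, TimeField, SyncCommand]

@dataclass
class Track:
    track_type: TrackType                         # from the track header
    commands: List[Command] = field(default_factory=list)

@dataclass
class MultimediaFile:
    header: Dict[str, str]                        # data common to all tracks (header 55)
    tracks: List[Track] = field(default_factory=list)
```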
FIG. 3 shows one example of a structure of three tracks contained in a multimedia file.
In this example:
    • Track 60 or track 1 contains only MIDI sound data. The sound data could consist of sampled sounds (speech, miscellaneous noises such as applause or microphone noise, etc.).
    • Track 70 or track 2 contains only data corresponding to sequences of JPEG images. This data could equally be video data.
    • Track 80 or track 3 contains only data corresponding to text messages.
Each track has a Start field for starting presentation to the user and an End field for ending presentation to the user.
Track 1 contains data relating to sound. A first field Nf1 represents the frequency of a first note and a second field Nd1 represents its duration. Likewise, the fields Nf2 and Nd2 define a second note. The field D1 represents a waiting time before presenting the subsequent notes of the track.
The fields Nf3 and Nd3 respectively represent the frequency and the duration of a third note.
Thus fields defining a note or a waiting time can follow on from each other in track 1.
Track 2 contains data corresponding to sequences of JPEG images. In this example, two JPEG images represented by the fields JPEG1 and JPEG2 must be presented to the user for a given time represented by the field JPEGd1 for the image JPEG1 and the field JPEGd2 for the image JPEG2. The fields JPEGD0, JPEGD1 and JPEGD2 represent waiting times before or between images.
Track 3 contains data corresponding to text messages. In this example two syllables represented by the fields TEXT1 and TEXT2 must be presented to the user. The fields TEXTD0 and TEXTD1 represent waiting times before a text.
The synchronization commands are represented by fields SYNCHi for i from 1 to n.
The synchronization commands are not temporal commands, as in the prior art, but are instead dependent on a specific event. Thus the fields SYNCHi are not present in the tracks at regular time intervals.
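Since every synchronization command SYNCHi inserted in one file is repeated identically in the other files (see also claims 3 and 6), a simple consistency check over the container types sketched above could look like this (an illustrative helper, not part of the patent):
```python
from typing import List

def check_synchronization_commands(tracks: List[Track]) -> None:
    """Verify that every track carries the same SYNCH indices, in the same order."""
    if not tracks:
        return
    sequences = [
        [c.index for c in track.commands if isinstance(c, SyncCommand)]
        for track in tracks
    ]
    reference = sequences[0]
    for track, sequence in zip(tracks, sequences):
        if sequence != reference:
            raise ValueError(
                f"track of type {track.track_type.name} carries SYNCH sequence "
                f"{sequence}, expected {reference}"
            )
```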
In the FIG. 3 example, musical notes included in the track 1 data must not be interrupted. Synchronizing the three tracks must not entail interrupting the music heard by the user. The microprocessor dedicated to reading this track is considered to be the master.
It forces the other microprocessors in charge of the other tracks, referred to as slaves, to synchronize with it.
Some notes must correspond to the display of an image or a syllable.
In this example, the first fields Nf1 and Nd1 correspond to a first note. The second note, corresponding to the second fields Nf2 and Nd2, must be heard at the moment the first picture is displayed, corresponding to the field JPEG1 of track 2. Then, after a waiting time corresponding to the field D1, the third note, corresponding to the third fields Nf3 and Nd3, must be heard at the moment that the first syllable is displayed, corresponding to the field TEXT1 of track 3. Finally, the fourth note, corresponding to the fourth fields Nf4 and Nd4, must be heard at the moment at which the second picture, corresponding to the field JPEG2 in track 2, and the second syllable, corresponding to the field TEXT2 in track 3, are displayed simultaneously. The resulting placement of the synchronization commands is listed below and sketched in code after the lists.
Thus the first synchronization command, which corresponds to the field SYNCH1, is:
    • between the fields Nd1 and Nf2 in track 1,
    • between the fields JPEGD0 and JPEG1 in track 2,
    • between the fields TEXTD0 and TEXTD1 in track 3.
The second synchronization command, which corresponds to the field SYNCH2, is:
    • between the fields D2 and Nf3 in track 1,
    • between the fields JPEGD1 and JPEGD2 in track 2,
    • between the fields TEXTD1 and TEXT1 in track 3.
The third synchronization command, which corresponds to the field SYNCH3, is:
    • between the fields Nd3 and Nf4 in track 1,
    • between the fields JPEGD2 and JPEG2 in track 2,
    • between the fields TEXT1 and TEXT2 in track 3.
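Putting the three lists above together, and reusing the container types sketched earlier, the FIG. 3 example can be written out as follows (the exact field order is inferred from the description and from the listed SYNCH positions, so it is only an approximation of the figure; the Start and End fields are omitted):
```python
# Track 1 (sound, the priority track): notes must never be interrupted.
track1 = Track(TrackType.MIDI, [
    EventCommand("Nf1"), TimeField("Nd1"),
    SyncCommand(1),
    EventCommand("Nf2"), TimeField("Nd2"),
    TimeField("D1"), TimeField("D2"),      # waiting time(s) before the third note
    SyncCommand(2),
    EventCommand("Nf3"), TimeField("Nd3"),
    SyncCommand(3),
    EventCommand("Nf4"), TimeField("Nd4"),
])

# Track 2 (JPEG pictures): SYNCH1 gates JPEG1 and SYNCH3 gates JPEG2;
# SYNCH2 falls between two waiting fields because no picture changes at that moment.
track2 = Track(TrackType.PICTURE, [
    TimeField("JPEGD0"),
    SyncCommand(1),
    EventCommand("JPEG1"), TimeField("JPEGd1"),
    TimeField("JPEGD1"),
    SyncCommand(2),
    TimeField("JPEGD2"),
    SyncCommand(3),
    EventCommand("JPEG2"), TimeField("JPEGd2"),
])

# Track 3 (text): SYNCH2 gates TEXT1 and SYNCH3 gates TEXT2;
# SYNCH1 falls between two waiting fields because no text is shown at that moment.
track3 = Track(TrackType.TEXT, [
    TimeField("TEXTD0"),
    SyncCommand(1),
    TimeField("TEXTD1"),
    SyncCommand(2),
    EventCommand("TEXT1"),
    SyncCommand(3),
    EventCommand("TEXT2"),
])

document = MultimediaFile(header={"common": "data shared by all tracks"},
                          tracks=[track1, track2, track3])
```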
When the multimedia document is presented to the user, the microprocessors scan all the tracks at the same time. Two situations arise, according to whether the slave microprocessors are lagging behind or in advance of the master microprocessor. Each slave microprocessor can receive data concerning the synchronization commands from the master microprocessor.
The master microprocessor, which is dedicated to track 1, reaches the first synchronization command, corresponding to the field SYNCH1, and sends first synchronization data to the other microprocessors.
Two situations arise:
    • If, at the moment it receives data, the slave microprocessor dedicated to track i, which is lagging behind, has not yet encountered the field SYNCH1 in track i, it continues to scan its file without executing the commands encountered, in order to reach the field SYNCH1 as quickly as possible. It then resumes execution of commands encountered after the field SYNCH1.
    • If, before receiving the data, the slave microprocessor dedicated to track i had already reached the field SYNCH1, it stops reading the fields of track i until it receives the first synchronization data sent by the master microprocessor. It then resumes reading the fields of track i and executes the commands described.
Thus each important command, i.e. a command whose execution must not be interrupted, is represented by a given field preceded by a field representing a synchronization command. The synchronization command is at the same place in all the other tracks. Because of this, different tracks are resynchronized before any important command, if necessary.
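The master/slave behaviour described above can be summarised in a short sketch (a hypothetical API reusing the Track and SyncCommand types from the earlier sketch, not the patent's implementation): the master broadcasts each SYNCHi it reaches, a lagging slave skips execution until it reaches the same SYNCHi, and a slave that arrives at a SYNCHi first waits for the master's announcement.
```python
import threading

class SlaveTrackReader:
    """Reads one slave track and realigns itself on the master's SYNCH announcements."""

    def __init__(self, track: Track):
        self.track = track
        self._announced = set()          # SYNCH indices already reached by the master
        self._cv = threading.Condition()

    def on_master_sync(self, index: int) -> None:
        """Called when the master reaches SYNCHi and sends its synchronization data."""
        with self._cv:
            self._announced.add(index)
            self._cv.notify_all()

    def run(self) -> None:
        pending = [c.index for c in self.track.commands if isinstance(c, SyncCommand)]
        expected = pending.pop(0) if pending else None   # next SYNCH ahead of this reader
        for command in self.track.commands:
            if isinstance(command, SyncCommand):
                with self._cv:
                    # Arrived before the master: stop reading until it announces SYNCHi.
                    self._cv.wait_for(lambda: command.index in self._announced)
                expected = pending.pop(0) if pending else None
            else:
                with self._cv:
                    lagging = expected is not None and expected in self._announced
                if lagging:
                    # The master has already passed the SYNCH we have not reached yet:
                    # scan without executing, to reach that field as quickly as possible.
                    continue
                self.execute(command)

    def execute(self, command) -> None:
        ...  # display a picture or a syllable, or honour a waiting time


class MasterTrackReader:
    """Reads the priority track (track 1 here) and broadcasts every SYNCHi it reaches."""

    def __init__(self, track: Track, slaves: list):
        self.track = track
        self.slaves = slaves

    def run(self) -> None:
        for command in self.track.commands:
            if isinstance(command, SyncCommand):
                for slave in self.slaves:
                    slave.on_master_sync(command.index)
            else:
                self.execute(command)    # notes are never interrupted

    def execute(self, command) -> None:
        ...  # reproduce a note or honour a waiting time
```
In a real terminal each reader runs on its own microprocessor; the threading primitives here merely stand in for the exchange of synchronization data between them.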
Thus the invention synchronizes data in a multimedia file without overloading the memory or the tracks with unnecessary synchronization data, thereby restricting transfers of synchronization data between the microprocessors, and, most importantly, without the execution of an important command being stopped in the middle.

Claims (14)

1. A method of synchronizing data in a multimedia document which comprises at least first and second data files, the method comprising:
storing data of a first type in the first data file and data of a second type in the second data file, wherein the data of the first type and the data of the second type are stored sequentially in a plurality of fields of the first data file and the second data file, respectively, and at least one event-related command, which identifies an event which is executed during reproduction of the multimedia document, is stored in a field of the first data file and the second data file, and
inserting at least one synchronization command into the first data file and the second data file, wherein the at least one synchronization command is inserted before the at least one event-related command which is stored in the field of the first data file and the second data file and each synchronization command which is included in the first data file is repeated identically in the second data file of the multimedia document,
designating one of the first data file and the second data file as a priority file,
reading the first data file, and
reading the second data file,
wherein if the first data file is designated as the priority file and a synchronization command is encountered in the first data file prior to encountering a corresponding synchronization command in the second data file, reading the plurality of fields of the second data file in sequence without executing event related commands which are stored in the plurality of fields of the second data file, and
wherein if the corresponding synchronization command is encountered in the second data file, execution of event related commands in the second data file resumes.
2. The method according to claim 1, wherein the event, which is identified by the at least one event related command, is one of a command to display a text message, a command to display a picture, and a command to reproduce a sound.
3. The method according to claim 1, wherein a synchronization command which is included in the first data file is repeated identically in the second data file of the multimedia document such that execution of a first synchronization command which is included in the first data file at the beginning of the first file is concomitant with execution of the first synchronization command which is included in the second data file at the beginning of the second data file.
4. A method according to claim 3, wherein:
the first data file and the second data file are read simultaneously and successive event-related commands in the first data file and the second data file are executed.
5. The method according to claim 1, wherein each synchronization command is uniquely identified.
6. The method according to claim 1, wherein each synchronization command of each event to be executed simultaneously is numbered identically from one file to another.
7. The method according to claim 1, wherein the first data file is a first track of the multimedia document and the second data file is a second track of the multimedia document.
8. The method according to claim 1, wherein the first data file and the second data file are separate data files.
9. The method according to claim 1, further comprising:
if the second data file is designated as the priority file, and a synchronization command is encountered in the second data file prior to encountering a corresponding synchronization command in the first data file, reading the plurality of fields of the first data file in sequence without executing event related commands which are stored in the plurality of fields of the first data file,
wherein execution of event related commands in the first data file resumes after the corresponding synchronization command is encountered in the first data file.
10. The method according to claim 1, wherein if the first data file is designated as a priority file and a synchronization command is encountered in the second data file prior to encountering a corresponding synchronization command in the first data file, reading of the plurality of fields of the second data file is stopped and reading of the plurality of fields of the first data file in sequence and execution of event-related commands, which are stored in the plurality of fields of the first data file, is continued,
wherein reading of the second data file and execution of event related commands in the second data file resumes after the corresponding synchronization command is encountered in the first data file.
11. The method according to claim 1, wherein if the second data file is designated as the priority file and a synchronization command is encountered in the first data file prior to encountering a corresponding synchronization command in the second data file, reading of the plurality of fields of the first data file is stopped and reading of the plurality of fields of the second data file in sequence and execution of event-related commands, which are stored in the plurality of fields of the second data file, is continued,
wherein reading of the first data file and execution of event related commands in the first data file resumes after the corresponding synchronization command is encountered in the second data file.
12. The method according to claim 1, wherein the event related commands are read in sequence without skipping fields of the first data file if the synchronization command is encountered in the first data file prior to encountering the corresponding synchronization command in the second data file.
13. A device for implementing a method of synchronizing data in a multimedia document which comprises at least first and second data files, the method comprising:
storing data of a first type in the first data file and data of a second type in the second data file, wherein the data of the first type and the data of the second type are stored sequentially in a plurality of fields of the first data file and the second data file, respectively, and at least one event-related command, which identifies an event which is executed during reproduction of the multimedia document, is stored in a field of the first data file and the second data file,
reading the first data file, and
reading the second data file,
inserting at least one synchronization command into the first data file and the second data file, wherein the at least one synchronization command is inserted before the at least one event-related command which is stored in the field of the first data file and the second data file and each synchronization command which is included in the first data file is repeated identically in the second data file of the multimedia document,
wherein if a synchronization command is encountered in the first data file prior to encountering a corresponding synchronization command in the second data file, the first data file is designated as a priority file and the plurality of fields of the second data file are read in sequence without executing event related commands which are stored in the plurality of fields of the second data file, and
wherein if the corresponding synchronization command is encountered in the second data file, execution of event related commands in the second data file resumes,
the device including reading means for reading data in said first data file and said second data file and communication means for communicating data with said reading means concerning reading of a synchronization command in a file.
14. The device according to claim 13, wherein the event related commands are read in sequence without skipping fields of the first data file if the synchronization command is encountered in the first data file prior to encountering the corresponding synchronization command in the second data file.
US10/380,288 2000-09-14 2001-09-13 Method for synchronizing a multimedia file Expired - Fee Related US7386782B2 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
FR00/11713 2000-09-14
FR0011713A FR2814027B1 (en) 2000-09-14 2000-09-14 METHOD FOR SYNCHRONIZING A MULTIMEDIA FILE
PCT/FR2001/002844 WO2002023912A1 (en) 2000-09-14 2001-09-13 Method for synchronising a multimedia file

Publications (2)

Publication Number Publication Date
US20040098365A1 US20040098365A1 (en) 2004-05-20
US7386782B2 (en) 2008-06-10

Family

ID=8854288

Family Applications (1)

Application Number Title Priority Date Filing Date
US10/380,288 Expired - Fee Related US7386782B2 (en) 2000-09-14 2001-09-13 Method for synchronizing a multimedia file

Country Status (6)

Country Link
US (1) US7386782B2 (en)
EP (1) EP1189446A1 (en)
JP (1) JP4703095B2 (en)
CN (1) CN100396099C (en)
FR (1) FR2814027B1 (en)
WO (1) WO2002023912A1 (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100178036A1 (en) * 2009-01-12 2010-07-15 At&T Intellectual Property I, L.P. Method and Device for Transmitting Audio and Video for Playback
US20110306397A1 (en) * 2010-06-11 2011-12-15 Harmonix Music Systems, Inc. Audio and animation blending
US20140344528A1 (en) * 2013-05-17 2014-11-20 Nvidia Corporation Techniques for assigning priorities to memory copies
US9358456B1 (en) 2010-06-11 2016-06-07 Harmonix Music Systems, Inc. Dance competition game
US9981193B2 (en) 2009-10-27 2018-05-29 Harmonix Music Systems, Inc. Movement based recognition and evaluation
US10220303B1 (en) 2013-03-15 2019-03-05 Harmonix Music Systems, Inc. Gesture-based music game
US10357714B2 (en) 2009-10-27 2019-07-23 Harmonix Music Systems, Inc. Gesture-based user interface for navigating a menu

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7159039B1 (en) * 2000-02-28 2007-01-02 Verizon Laboratories Inc. Systems and methods for providing in-band and out-band message processing
JP4765475B2 (en) * 2005-08-17 2011-09-07 ソニー株式会社 Information signal processing apparatus and processing method
US7764713B2 (en) * 2005-09-28 2010-07-27 Avaya Inc. Synchronization watermarking in multimedia streams

Citations (46)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5333299A (en) * 1991-12-31 1994-07-26 International Business Machines Corporation Synchronization techniques for multimedia data streams
US5471576A (en) * 1992-11-16 1995-11-28 International Business Machines Corporation Audio/video synchronization for application programs
US5487167A (en) * 1991-12-31 1996-01-23 International Business Machines Corporation Personal computer with generalized data streaming apparatus for multimedia devices
US5602356A (en) * 1994-04-05 1997-02-11 Franklin N. Eventoff Electronic musical instrument with sampling and comparison of performance data
US5642171A (en) * 1994-06-08 1997-06-24 Dell Usa, L.P. Method and apparatus for synchronizing audio and video data streams in a multimedia system
US5675511A (en) * 1995-12-21 1997-10-07 Intel Corporation Apparatus and method for event tagging for multiple audio, video, and data streams
US5680639A (en) * 1993-05-10 1997-10-21 Object Technology Licensing Corp. Multimedia control system
US5701511A (en) * 1995-08-02 1997-12-23 Microsoft Corporation Redbook audio sequencing
US5737531A (en) * 1995-06-27 1998-04-07 International Business Machines Corporation System for synchronizing by transmitting control packet to omit blocks from transmission, and transmitting second control packet when the timing difference exceeds second predetermined threshold
US5751280A (en) 1995-12-11 1998-05-12 Silicon Graphics, Inc. System and method for media stream synchronization with a base atom index file and an auxiliary atom index file
US5754783A (en) * 1996-02-01 1998-05-19 Digital Equipment Corporation Apparatus and method for interleaving timed program data with secondary data
US5768607A (en) 1994-09-30 1998-06-16 Intel Corporation Method and apparatus for freehand annotation and drawings incorporating sound and for compressing and synchronizing sound
US5794018A (en) * 1993-11-24 1998-08-11 Intel Corporation System and method for synchronizing data streams
US5808987A (en) * 1990-03-30 1998-09-15 Hitachi, Ltd. Information reproducing system for reproducing interleaved image data and voice data from a recording medium in accordance with a control program reproducing from the recording medium
US5822537A (en) * 1994-02-24 1998-10-13 At&T Corp. Multimedia networked system detecting congestion by monitoring buffers' threshold and compensating by reducing video transmittal rate then reducing audio playback rate
US5826102A (en) * 1994-12-22 1998-10-20 Bell Atlantic Network Services, Inc. Network arrangement for development delivery and presentation of multimedia applications using timelines to integrate multimedia objects and program objects
US5861880A (en) * 1994-10-14 1999-01-19 Fuji Xerox Co., Ltd. Editing system for multi-media documents with parallel and sequential data
US5902949A (en) * 1993-04-09 1999-05-11 Franklin N. Eventoff Musical instrument system with note anticipation
US6006241A (en) * 1997-03-14 1999-12-21 Microsoft Corporation Production of a video stream with synchronized annotations over a computer network
US6016166A (en) * 1998-08-31 2000-01-18 Lucent Technologies Inc. Method and apparatus for adaptive synchronization of digital video and audio playback in a multimedia playback system
US6148139A (en) * 1993-10-29 2000-11-14 Time Warner Entertainment Co., L.P. Software carrier with operating commands embedded in data blocks
US6173317B1 (en) * 1997-03-14 2001-01-09 Microsoft Corporation Streaming and displaying a video stream with synchronized annotations over a computer network
US6177928B1 (en) * 1997-08-22 2001-01-23 At&T Corp. Flexible synchronization framework for multimedia streams having inserted time stamp
US6195701B1 (en) * 1994-03-16 2001-02-27 International Business Machines Corporation Method and apparatus for synchronization and scheduling of multiple data streams and real time tasks
US20010014891A1 (en) * 1996-05-24 2001-08-16 Eric M. Hoffert Display of media previews
US6288990B1 (en) * 1997-10-21 2001-09-11 Sony Corporation Reproducing apparatus, recording apparatus, and recording medium
US6334026B1 (en) * 1998-06-26 2001-12-25 Lsi Logic Corporation On-screen display format reduces memory bandwidth for time-constrained on-screen display systems
US6349286B2 (en) * 1998-09-03 2002-02-19 Siemens Information And Communications Network, Inc. System and method for automatic synchronization for multimedia presentations
US6415135B1 (en) * 1995-06-12 2002-07-02 Oy Nokia Ab Transmission protocol for file transfer in a DAB system
US20020116361A1 (en) * 2000-08-15 2002-08-22 Sullivan Gary J. Methods, systems and data structures for timecoding media samples
US6449653B2 (en) * 1997-03-25 2002-09-10 Microsoft Corporation Interleaved multiple multimedia stream for synchronized transmission over a computer network
US6453355B1 (en) * 1998-01-15 2002-09-17 Apple Computer, Inc. Method and apparatus for media data transmission
US20020159519A1 (en) * 2000-10-20 2002-10-31 Tabatabai Ali J. Delivery of multimedia descriptions using access units
US6480902B1 (en) * 1999-05-25 2002-11-12 Institute For Information Industry Intermedia synchronization system for communicating multimedia data in a computer network
US6490553B2 (en) * 2000-05-22 2002-12-03 Compaq Information Technologies Group, L.P. Apparatus and method for controlling rate of playback of audio data
US6512778B1 (en) * 1998-01-15 2003-01-28 Apple Computer, Inc. Method and apparatus for media data transmission
US6564263B1 (en) * 1998-12-04 2003-05-13 International Business Machines Corporation Multimedia content description framework
US6611537B1 (en) * 1997-05-30 2003-08-26 Centillium Communications, Inc. Synchronous network for digital media streams
US6631522B1 (en) * 1998-01-20 2003-10-07 David Erdelyi Method and system for indexing, sorting, and displaying a video database
US6665835B1 (en) * 1997-12-23 2003-12-16 Verizon Laboratories, Inc. Real time media journaler with a timing event coordinator
US20040017389A1 (en) * 2002-07-25 2004-01-29 Hao Pan Summarization of soccer video content
US20040103372A1 (en) * 1997-12-22 2004-05-27 Ricoh Company, Ltd. Multimedia visualization and integration environment
US6744763B1 (en) * 1998-01-15 2004-06-01 Apple Computer, Inc. Method and apparatus for media data transmission
US6771703B1 (en) * 2000-06-30 2004-08-03 Emc Corporation Efficient scaling of nonscalable MPEG-2 Video
US6792615B1 (en) * 1999-05-19 2004-09-14 New Horizons Telecasting, Inc. Encapsulated, streaming media automation and distribution system
US6871006B1 (en) * 2000-06-30 2005-03-22 Emc Corporation Processing of MPEG encoded video for trick mode operation

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH05242208A (en) * 1992-03-03 1993-09-21 Nippon Telegr & Teleph Corp <Ntt> Device and method for editing multimedium scenario
JPH06150625A (en) * 1992-11-02 1994-05-31 Fujitsu Ltd Sound synchronized movie reproducing system
JPH077727A (en) * 1993-06-16 1995-01-10 Matsushita Electric Ind Co Ltd Multi-medium information transmission reception system
US5799315A (en) * 1995-07-07 1998-08-25 Sun Microsystems, Inc. Method and apparatus for event-tagging data files automatically correlated with a time of occurence in a computer system
JPH09326886A (en) * 1996-06-03 1997-12-16 Nippon Telegr & Teleph Corp <Ntt> Transmitter or storage device for content of multimedia information

Patent Citations (47)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5808987A (en) * 1990-03-30 1998-09-15 Hitachi, Ltd. Information reproducing system for reproducing interleaved image data and voice data from a recording medium in accordance with a control program reproducing from the recording medium
US5487167A (en) * 1991-12-31 1996-01-23 International Business Machines Corporation Personal computer with generalized data streaming apparatus for multimedia devices
US5333299A (en) * 1991-12-31 1994-07-26 International Business Machines Corporation Synchronization techniques for multimedia data streams
US5471576A (en) * 1992-11-16 1995-11-28 International Business Machines Corporation Audio/video synchronization for application programs
US5902949A (en) * 1993-04-09 1999-05-11 Franklin N. Eventoff Musical instrument system with note anticipation
US5680639A (en) * 1993-05-10 1997-10-21 Object Technology Licensing Corp. Multimedia control system
US6148139A (en) * 1993-10-29 2000-11-14 Time Warner Entertainment Co., L.P. Software carrier with operating commands embedded in data blocks
US5794018A (en) * 1993-11-24 1998-08-11 Intel Corporation System and method for synchronizing data streams
US5822537A (en) * 1994-02-24 1998-10-13 At&T Corp. Multimedia networked system detecting congestion by monitoring buffers' threshold and compensating by reducing video transmittal rate then reducing audio playback rate
US6195701B1 (en) * 1994-03-16 2001-02-27 International Business Machines Corporation Method and apparatus for synchronization and scheduling of multiple data streams and real time tasks
US5602356A (en) * 1994-04-05 1997-02-11 Franklin N. Eventoff Electronic musical instrument with sampling and comparison of performance data
US5642171A (en) * 1994-06-08 1997-06-24 Dell Usa, L.P. Method and apparatus for synchronizing audio and video data streams in a multimedia system
US5768607A (en) 1994-09-30 1998-06-16 Intel Corporation Method and apparatus for freehand annotation and drawings incorporating sound and for compressing and synchronizing sound
US5861880A (en) * 1994-10-14 1999-01-19 Fuji Xerox Co., Ltd. Editing system for multi-media documents with parallel and sequential data
US5826102A (en) * 1994-12-22 1998-10-20 Bell Atlantic Network Services, Inc. Network arrangement for development delivery and presentation of multimedia applications using timelines to integrate multimedia objects and program objects
US6415135B1 (en) * 1995-06-12 2002-07-02 Oy Nokia Ab Transmission protocol for file transfer in a DAB system
US5737531A (en) * 1995-06-27 1998-04-07 International Business Machines Corporation System for synchronizing by transmitting control packet to omit blocks from transmission, and transmitting second control packet when the timing difference exceeds second predetermined threshold
US5701511A (en) * 1995-08-02 1997-12-23 Microsoft Corporation Redbook audio sequencing
US5751280A (en) 1995-12-11 1998-05-12 Silicon Graphics, Inc. System and method for media stream synchronization with a base atom index file and an auxiliary atom index file
US5675511A (en) * 1995-12-21 1997-10-07 Intel Corporation Apparatus and method for event tagging for multiple audio, video, and data streams
US5754783A (en) * 1996-02-01 1998-05-19 Digital Equipment Corporation Apparatus and method for interleaving timed program data with secondary data
US20010014891A1 (en) * 1996-05-24 2001-08-16 Eric M. Hoffert Display of media previews
US6230172B1 (en) * 1997-01-30 2001-05-08 Microsoft Corporation Production of a video stream with synchronized annotations over a computer network
US6006241A (en) * 1997-03-14 1999-12-21 Microsoft Corporation Production of a video stream with synchronized annotations over a computer network
US6173317B1 (en) * 1997-03-14 2001-01-09 Microsoft Corporation Streaming and displaying a video stream with synchronized annotations over a computer network
US6449653B2 (en) * 1997-03-25 2002-09-10 Microsoft Corporation Interleaved multiple multimedia stream for synchronized transmission over a computer network
US6611537B1 (en) * 1997-05-30 2003-08-26 Centillium Communications, Inc. Synchronous network for digital media streams
US6177928B1 (en) * 1997-08-22 2001-01-23 At&T Corp. Flexible synchronization framework for multimedia streams having inserted time stamp
US6288990B1 (en) * 1997-10-21 2001-09-11 Sony Corporation Reproducing apparatus, recording apparatus, and recording medium
US20040103372A1 (en) * 1997-12-22 2004-05-27 Ricoh Company, Ltd. Multimedia visualization and integration environment
US6665835B1 (en) * 1997-12-23 2003-12-16 Verizon Laboratories, Inc. Real time media journaler with a timing event coordinator
US6453355B1 (en) * 1998-01-15 2002-09-17 Apple Computer, Inc. Method and apparatus for media data transmission
US6512778B1 (en) * 1998-01-15 2003-01-28 Apple Computer, Inc. Method and apparatus for media data transmission
US6744763B1 (en) * 1998-01-15 2004-06-01 Apple Computer, Inc. Method and apparatus for media data transmission
US6631522B1 (en) * 1998-01-20 2003-10-07 David Erdelyi Method and system for indexing, sorting, and displaying a video database
US6334026B1 (en) * 1998-06-26 2001-12-25 Lsi Logic Corporation On-screen display format reduces memory bandwidth for time-constrained on-screen display systems
US6016166A (en) * 1998-08-31 2000-01-18 Lucent Technologies Inc. Method and apparatus for adaptive synchronization of digital video and audio playback in a multimedia playback system
US6349286B2 (en) * 1998-09-03 2002-02-19 Siemens Information And Communications Network, Inc. System and method for automatic synchronization for multimedia presentations
US6564263B1 (en) * 1998-12-04 2003-05-13 International Business Machines Corporation Multimedia content description framework
US6792615B1 (en) * 1999-05-19 2004-09-14 New Horizons Telecasting, Inc. Encapsulated, streaming media automation and distribution system
US6480902B1 (en) * 1999-05-25 2002-11-12 Institute For Information Industry Intermedia synchronization system for communicating multimedia data in a computer network
US6490553B2 (en) * 2000-05-22 2002-12-03 Compaq Information Technologies Group, L.P. Apparatus and method for controlling rate of playback of audio data
US6771703B1 (en) * 2000-06-30 2004-08-03 Emc Corporation Efficient scaling of nonscalable MPEG-2 Video
US6871006B1 (en) * 2000-06-30 2005-03-22 Emc Corporation Processing of MPEG encoded video for trick mode operation
US20020116361A1 (en) * 2000-08-15 2002-08-22 Sullivan Gary J. Methods, systems and data structures for timecoding media samples
US20020159519A1 (en) * 2000-10-20 2002-10-31 Tabatabai Ali J. Delivery of multimedia descriptions using access units
US20040017389A1 (en) * 2002-07-25 2004-01-29 Hao Pan Summarization of soccer video content

Non-Patent Citations (19)

* Cited by examiner, † Cited by third party
Title
Agarwal, Nipun, et al., "Synchronization of Distributed Multimedia Data in an Application-Specific Manner", Multimedia '94, San Francisco, CA, Oct. 1994, pp. 141-148. *
Auffret, Gwendal, et al., "Audiovisual-based Hypermedia Authoring: Using Structured Representations for Efficient Access to AV Documents", Hypertext '99, Darmstadt, Germany, Feb. 1999, pp. 169-178 [ACM 1-58113-064-3/99/2]. *
Bulterman, Dick, "Embedded Video in Hypermedia Documents: Supporting Integration and Adaptive Control", ACM Transactions on Information Systems, vol. 13 No. 4, (C) 1995, pp. 440-470 [ACM 1046-8188/95/1000-0440]. *
Chen Herng-Yow et al, "Multisync: A Synchronization Model for Multimedia Systems", IEEE Journal on Selected Areas in Communications, IEEE, Inc. NY, vol. 14, No. 1, 1996, pp. 238-248, XP000548825.
Chen, Herng-Yow, et al., "An RTP-based Synchronized Hypermedia Live Lecture System for Distance Education", Multimedia '99, Orlando, FL, Oct. 1999, pp. 91-99. *
Courtiat, Jean-Pierre, et al., "Towards a New Multimedia Synchronization Mechanism and its Formal Specification", Multimedia '94, San Francisco, CA, Oct. 1994, pp. 133-140. *
Ehley, Lynnae, et al., "Evaluation of Multimedia Synchronization Techniques", Proc. of the Intern'l Conf. on Multimedia Computing and Systems, Boston, MA, May 15-19, 1994, pp. 514-519. *
F. Kretz et al, "Coded Representation of Multimedia and Hypermedia Information Objects: Towards the MHEG Standard", Signal Processing: Image Communication, Elsevier Science Publishers, Amsterdam, NL, vol. 4, No. 2, pp. 113-128, XP000273158 (c) 1992.
Gringeri, Steven, et al., "Robust Compression and Transmission of MPEG-4 Video", Multimedia '99, Orlando, FL, Oct. 1999, pp. 113-120. *
Hac, Anna, et al., "Synchronization in Multimedia Data Retrieval", International Journal of Network Management, vol. 7, (C) 1997, pp. 33-62. *
Herman, Ivan, et al., "A Standard Model For Multimedia Synchronization: PREMO synchronization Objects", Multimedia Systems, vol. 6, No. 2, Mar. 1998, pp. 88-101. *
Hürst, W., et al., "A Synchronization Model for Recorded Presentations and Its Relevance for Information Retrieval", Multimedia '99, Orlando, FL, Oct. 1999, pp. 333-342. *
Jacobs, Martin, et al., "Specification of Synchronization in Multimedia Conferencing Services Using the TINA Lifecycle Model", Distrib. Syst. Engng., vol. 3, (C) 1996, pp. 185-196. *
Li, Lain, et al., "MPEG-2 Coded- and Uncoded- Stream Synchronization Control for Real-Time Multimedia Transmission and Presentation over B-ISDN", Multimedia '94, San Francisco, CA, Oct. 1994, pp. 239-246. *
Li, Li, et al., "Real-time Synchronization Control in Multimedia Distributed Systems", ACM SIGCOMM Computer Communication Review, vol. 22, issue 3, Jul. 1992, pp. 79-87 (plus citation page). *
M. D. Eyles, "Generic Aspects of Multimedia Presentation", BT Technology Journal, GB, BT Laboratories, vol. 13, No. 4, pp. 32-43, XP000538878 (c) 1995.
Mukhopadhyay, Sugata, et al., "Passive Capture and Structuring of Lectures", ACM Multimedia '99, Orlando, FL, Oct. 1999, pp. 477-487. *
Steinmetz, Ralf, "Synchronization Properties in Multimedia Systems", IEEE Journal on Selected Areas in Communications, vol. 8, Issue 3, Apr. 1990, pp. 401-412. *
Yuang, Maria C., et al., "DMTS: A Distributed Multimedia Teleworking System", Multimedia Tools and Applications, vol. 7, No. 3, Nov. 1998, pp. 227-240. *

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10650862B2 (en) 2009-01-12 2020-05-12 At&T Intellectual Property I, L.P. Method and device for transmitting audio and video for playback
US8731370B2 (en) 2009-01-12 2014-05-20 At&T Intellectual Property I, L.P. Method and device for transmitting audio and video for playback
US9237176B2 (en) 2009-01-12 2016-01-12 At&T Intellectual Property I, Lp Method and device for transmitting audio and video for playback
US20100178036A1 (en) * 2009-01-12 2010-07-15 At&T Intellectual Property I, L.P. Method and Device for Transmitting Audio and Video for Playback
US9981193B2 (en) 2009-10-27 2018-05-29 Harmonix Music Systems, Inc. Movement based recognition and evaluation
US10421013B2 (en) 2009-10-27 2019-09-24 Harmonix Music Systems, Inc. Gesture-based user interface
US10357714B2 (en) 2009-10-27 2019-07-23 Harmonix Music Systems, Inc. Gesture-based user interface for navigating a menu
US9358456B1 (en) 2010-06-11 2016-06-07 Harmonix Music Systems, Inc. Dance competition game
US8702485B2 (en) 2010-06-11 2014-04-22 Harmonix Music Systems, Inc. Dance game and tutorial
US20110306397A1 (en) * 2010-06-11 2011-12-15 Harmonix Music Systems, Inc. Audio and animation blending
US10220303B1 (en) 2013-03-15 2019-03-05 Harmonix Music Systems, Inc. Gesture-based music game
US9483423B2 (en) * 2013-05-17 2016-11-01 Nvidia Corporation Techniques for assigning priorities to memory copies
US20140344528A1 (en) * 2013-05-17 2014-11-20 Nvidia Corporation Techniques for assigning priorities to memory copies

Also Published As

Publication number Publication date
CN1457601A (en) 2003-11-19
WO2002023912A1 (en) 2002-03-21
CN100396099C (en) 2008-06-18
JP2004509427A (en) 2004-03-25
JP4703095B2 (en) 2011-06-15
FR2814027A1 (en) 2002-03-15
FR2814027B1 (en) 2003-01-31
US20040098365A1 (en) 2004-05-20
EP1189446A1 (en) 2002-03-20

Similar Documents

Publication Publication Date Title
US7337175B2 (en) Method of storing data in a multimedia file using relative timebases
CN100502473C (en) Apparatus and method for coordinating synchronization of video and captions
US7386782B2 (en) Method for synchronizing a multimedia file
EP2242043A1 (en) Information processing apparatus with text display function, and data acquisition method
KR100902013B1 (en) Tiled-display system and synchronization method in the system
JP2004007140A (en) Voice reproducing device and voice reproduction control method to be used for the same device
JP2006172432A (en) System and method for converting compact media format files to synchronized multimedia integration language
US7631326B2 (en) Synchronization mechanism for multimedia captioning and audio description
CN111819621B (en) Display device and multi-display system
EP0929044A3 (en) Rich text medium displaying method and picture information providing system
EP3203468A1 (en) Acoustic system, communication device, and program
EP1414019A2 (en) A system for forming synchronous information of melody and image, and a system for synchronously producing melody and image
JP3317140B2 (en) Data transmission method
JP5552993B2 (en) MXF processing equipment
US20040241632A1 (en) Karaoke service method and system by mobile device
JPH11134804A (en) Image and audio synchronization system
JP2000222381A (en) Album preparation method and information processor and information outputting device
JP2000253312A (en) Method and system for producing program with subtitle
JP2000235564A (en) Remote cooperative training method and recording medium recording remote cooperative training program for lecturer's terminal and trainee's terminal
EP3609191A1 (en) Virtual reality viewing system, reproduction synchronizing method, and virtual reality viewing program
JP2005222431A (en) Cooperative work system
JPS6378680A (en) Video output device
JP2000011612A (en) Content reproduction-synchronization system, method therefor and recording medium
JP2004242026A (en) Video and audio reproduction device and video and audio reproduction system
JPH06139133A (en) Input/output device

Legal Events

Date Code Title Description
AS Assignment

Owner name: ALCATEL, FRANCE

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:COMPS, CHRISTOPHE;BOUDET, DANIEL;SARREMEJEAN, XAVIER;REEL/FRAME:014311/0893

Effective date: 20030220

FEPP Fee payment procedure

Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Free format text: PAYER NUMBER DE-ASSIGNED (ORIGINAL EVENT CODE: RMPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

FPAY Fee payment

Year of fee payment: 4

AS Assignment

Owner name: CREDIT SUISSE AG, NEW YORK

Free format text: SECURITY AGREEMENT;ASSIGNOR:LUCENT, ALCATEL;REEL/FRAME:029821/0001

Effective date: 20130130

Owner name: CREDIT SUISSE AG, NEW YORK

Free format text: SECURITY AGREEMENT;ASSIGNOR:ALCATEL LUCENT;REEL/FRAME:029821/0001

Effective date: 20130130

AS Assignment

Owner name: ALCATEL LUCENT, FRANCE

Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:CREDIT SUISSE AG;REEL/FRAME:033868/0001

Effective date: 20140819

REMI Maintenance fee reminder mailed
LAPS Lapse for failure to pay maintenance fees
STCH Information on status: patent discontinuation

Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362

FP Lapsed due to failure to pay maintenance fee

Effective date: 20160610