WO1998029835A1 - Remote platform independent dynamic multimedia engine - Google Patents

Remote platform independent dynamic multimedia engine

Info

Publication number
WO1998029835A1
Authority
WO
WIPO (PCT)
Prior art keywords
file
multimedia
multimedia presentation
story
files
Prior art date
Application number
PCT/US1995/013433
Other languages
French (fr)
Inventor
Gerard Kunkel
Michael Heydt
Jerry Cross
Jason Nocks
Howard Portugal
Alan Mcglade
Original Assignee
Starnet, Incorporated
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Starnet, Incorporated
Priority to AU40035/95A
Publication of WO1998029835A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/40Information retrieval; Database structures therefor; File system structures therefor of multimedia data, e.g. slideshows comprising image and additional audio data
    • G06F16/48Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/40Information retrieval; Database structures therefor; File system structures therefor of multimedia data, e.g. slideshows comprising image and additional audio data

Definitions

  • the present invention relates to the preparation and presentation of multimedia presentations and, in particular, to a multimedia engine for the preparation and presentation of such multimedia presentations.
  • the present invention comprises a system and process for constructing a continuous multimedia presentation from source data. While there are a number of systems known in the art for creating multimedia presentations and for replaying them with similar hardware, the multimedia engine of the present invention operates on a 24-hour basis assembling multimedia presentations based upon supplied data. This raw data is in the form of component data files, including image files, video files, audio files and text files. The multimedia engine dynamically constructs the presentation based upon this data as instructed by a multimedia scripting language. This language instructs the multimedia engine where, when, and how to assemble a multimedia presentation. The language further describes when and how to stitch together multiple presentations. The intent of this invention is to create a low cost means of presenting multimedia information in a continuous form. The novel multimedia engine may be used in cable television systems where the cost of creating programming content for the viewer is of great concern. Using this system and method allows for the creation of programming content dynamically and inexpensively.
  • Another feature of the present invention allows the multimedia presentation to be organized based on scripting files that are transmitted to the multimedia engine, for instance via a satellite feed, or based on function calls that are either inputted at the site of the multimedia engine or are inputted remotely, typically using a telephone line and a modem.
  • the versatility of the present invention allows a single multimedia engine design to operate, for numerous cable television systems, (1) products that promote pay-per-view broadcasting, (2) products that operate a news outlet that can be facilely tailored to highlight local content news, (3) television programming guides and (4) advertising programming. These various programming formats can be controlled from a single remote site.
  • a multimedia presentation system for providing a multimedia presentation has a playback processor for receiving and storing information.
  • the received stored information includes video information, graphics information, audio information and text information.
  • the received stored information further includes multimedia information.
  • a multimedia engine accesses and merges the stored information in accordance with the multimedia information to provide the multimedia presentation.
  • the invention is directed to a multimedia presentation system capable of dynamically constructing a multimedia presentation comprising: a receiving and storing means for receiving and storing commands and data files, which commands comprise script files or function calls; and a multimedia engine comprising a renderer and an application programming interface or a script manager, which multimedia engine comprises: a translating means for translating human-readable script files or human-readable function calls into commands usable by the renderer; and an executing means for executing the translated commands to create a multimedia presentation comprising audio and video components for broadcast, cable transmission or display.
  • the presentation system of the invention advantageously provides a multimedia engine that has both an application programming interface and a script manager.
  • the presentation system of the invention preferably stores and processes segment files, story files, template files, advertisement files, video files and audio files as described hereinbelow.
  • the invention is directed to a multimedia presentation system capable of dynamically constructing a multimedia presentation for cable transmission to a plurality of cable television users comprising: a receiving and storing means for receiving and storing commands and data files, which commands comprise script files or function calls; at least one segment file, at least one story file and at least one template file stored in the receiving and storing means, wherein said at least one segment file encodes the name of said at least one story file, and said at least one story file encodes text and the name of said at least one template file; and a multimedia engine comprising a renderer, an application programming interface and a script manager, which multimedia engine comprises: a translating means of translating human-readable script files or human-readable function calls into commands usable by the renderer; and an executing means of executing the translated commands to create a multimedia presentation comprising audio and video components for cable transmission.
  • Advertisement file - lists the advertisements that are scheduled for a defined period of time, typically a day.
  • Application programming interface (API) - the code module in the multimedia engine that translates the human-readable function calls into instructions that are capable of being executed by the multimedia engine.
  • Dynamically constructing a multimedia presentation - constructing the multimedia presentation as the script and data files are being input from the receiving and storing means, for instance within about 10 seconds of the input of the script and data files.
  • the script and data files already within the receiving and storing means can be substituted before the scheduled broadcast, cable transmission or display time of a multimedia presentation, so long as the substitution occurs before the time and date that the segment, as defined by the segment file, in which the script and data files are to be presented is scheduled for broadcast, cable transmission or display.
  • Function call or command - an instruction, with zero or more arguments, to perform some task.
  • Hunt mode - one of the multimedia engine's modes of operation, in which the engine autonomously searches for, then parses, a sequence of segment files to produce a multimedia presentation.
  • MPEG - a compressed digital audio/video bitstream format developed by the Motion Pictures Expert Group, a part of the International Standards Organization (ISO).
  • Message file - lists all of the messages, such as public interest information, together with their start times and durations, which are scheduled for broadcast during a given period of time, such as a day.
  • Metafile - a set of translated commands which are to be executed by the renderer.
  • Multimedia - pertaining to or employing one or more media, including (but not limited to): full-motion video, audio, computer graphics, text and still images.
  • Multimedia engine - a computer programmed to read in command information (which can, for example, be in the form of scripting files or function calls) and various data files and to output a multimedia presentation, for example, for display or broadcast.
  • Page - consists of a collection of audio and visual elements which begin to be presented in a multimedia presentation at the same time.
  • Rasterizer - takes an abstract command, e.g., "put a line from x to y", and converts it into the actual pixel information that will appear on a screen, for instance in a multimedia presentation.
  • Renderer - is made up of a graphics renderer, graphics rasterizer, font rasterizer, video decoder interface, graphics hardware interface, digital video subsystem and graphics generator/video overlay card, or the functional equivalents of these elements.
  • the renderer further includes one or both of an audio decoder interface and an audio card, or the functional equivalents of these elements.
  • Scripting file - a file containing commands which define the format, appearance, timing and content of a multimedia presentation that will be outputted by the multimedia engine.
  • the scripting files also contain references to any other files that may be needed for presentations, such as audio or video files.
  • Script manager - the code module in the multimedia engine that translates script files into commands, which can be organized into metafiles, which commands or metafiles are sent to the renderer. The script manager performs the following functions: parsing; hunting; control; and timing.
  • Segment file - a list of the story and advertisement files that defines the content for a given period of time in a multimedia presentation.
  • Story file - a file which defines a sequence of multimedia events which make up a portion of a multimedia presentation.
  • Tags - an identifier for an element in a template file or a story file which defines what kind of content will be presented in a page: text, audio file, video file, image file or graphical element such as line, box, and the like.
  • Tags mark the start of a line in a script file and are indicated by an opening "<" bracket and a closing ">" bracket.
  • Template file - a file which defines the size, position, style, color, layout, and general appearance of the visual elements which appear on a page in a multimedia presentation.
  • each of the elements in a template file has a tag, which matches a tag for that element in the story file.
  • Fig. 1 is a block diagram representation of a data delivery uplink and downlink system having a cable television distribution headend downlink including the multimedia engine of the present invention
  • Fig. 2 is a more detailed block diagram representation of the cable system headend of the data delivery uplink and downlink system of Fig. 1
  • Fig. 3 is a block diagram representation of the hierarchy of the textual script files which are applied to the playback computer memory of the multimedia engine of Fig. 1 ;
  • Fig. 4 is a block diagram representation of the relationship of types of multimedia scripting language textual files which are applied to the playback computer memory of the multimedia engine of Fig. 1 ;
  • Fig. 5 is a block diagram representation of a parsing process for the parsing of textual script files which are applied to the playback computer memory of the multimedia engine of Fig. 1 ;
  • Fig. 6 is a block diagram representation of a textual script file line parsing process for the parsing of the individual lines of textual script files which are applied to the playback computer memory of the multimedia engine of Fig. 1 ;
  • Fig. 7 is a block diagram representation of a segment event generation process for generating the segment events of a segment file which is applied to the playback computer memory of the multimedia engine of Fig. 1;
  • Fig. 8 is a block diagram representation of a story event generation process for generating the story events of a story file which is applied to the playback computer memory of the multimedia engine of Fig. 1 ;
  • Fig. 9 is a block diagram representation of the determination and opening of an advertising list file and the retrieving of an ad record from an advertising list which is applied to the playback computer memory of the multimedia engine of Fig. 1 ;
  • Fig. 10 is a block diagram representation of an advertising event generation process for generating the advertising events within an advertising record which is applied to the playback computer memory of the multimedia engine of Fig. 1 ;
  • Fig. 11 is a block diagram representation of a process control overview of the multimedia engine of Fig. 1 ; and Fig. 12 is a block diagram representation of the queue events generation for a page of a multimedia presentation.
  • Referring to Fig. 1, there is shown a data delivery uplink and downlink system 100 including a cable television downlink site 134 (referred to as the headend).
  • the cable television distribution headend 134 receives and processes multimedia information (including full motion video files, image files, audio files and script files) which is transmitted from a remote uplink site 101 within the system 100.
  • the transmission of the multimedia information from the remote uplink site 101 to the cable television distribution headend 134 is performed by way of an uplink dish 102 at the remote programming uplink site 101 and a satellite transponder 104.
  • Broadcast multimedia scripting data included within the multimedia information for use in multimedia presentations may be transmitted in this manner to a plurality of cable television distribution headends 134, 136.
  • Each of the cable television headends 134, 136 which receives the broadcast multimedia scripting data includes a receiver/demodulator unit 108, a communications server computer 112 and at least one playback computer 124 in order to provide multimedia presentations according to a predetermined broadcast multimedia scripting language for the users of a cable television system such as the cable television users 135, 142.
  • the broadcast multimedia scripting data from the remote programming uplink site 101 is received by a satellite receive dish 106 at the cable television headend 134 and passed to the receiver/demodulation unit 108.
  • the broadcast multimedia data may be provided in accordance with any one of a number of multimedia scripting languages understandable to those skilled in the art which are suitable for indicating how to merge and present received information to provide a multimedia presentation.
  • the scripting language allows for story, segment, advertisement and template files to be organized as described in the Appendix and hereinbelow. Since the broadcast multimedia scripting data is transmitted by way of the satellite transponder 104, the differing cable television network headends 134, 136 may receive and present the same or different broadcast multimedia scripting data simultaneously to their cable television network users 135, 142.
  • the broadcast multimedia scripting data is received by the cable television headend 134 from the satellite receive dish 106 in an analog form in the preferred embodiment of the cable television network headend 134. It is applied by way of a coaxial transmission cable 107 to the receiver/demodulator unit 108.
  • the receiver/demodulator unit 108 converts the analog signal data received by way of the transmission cable 107 into a digital data stream representing the broadcast multimedia scripting data from the remote uplink site 101.
  • the digital data stream formed by the receiver/demodulator unit 108 is applied to a transmission cable 110.
  • the digital data stream of the transmission cable 110 is applied to the communications server (computer) 112.
  • the communications server 112 receives the digital data stream and stores it within the communications server 112.
  • the data stored in this manner is distributed later within the cable television headend 134.
  • the communications server 112 also communicates with a conventional telephone system 118 through a telephone modem line 116 which is coupled to conventional telephone system lines 119.
  • the telephone modem line 116 allows the communications server 112 to send diagnostic information to the remote programming uplink site 101 by way of the telephone system lines 119.
  • the remote uplink site 101 may dial into the communications server 112 of the cable television network headend 134 by way of the telephone system lines 119 and a telephone modem line 120. This allows the remote uplink site 101 to perform remote diagnostics of the cable television distribution headend 134.
  • When new broadcast multimedia scripting data is stored within the communications server 112, it is moved across a bidirectional peer-to-peer network cable 122 within the cable television headend 134 to the playback computer 124.
  • As described in more detail hereinbelow, the playback computer 124 is provided with its own playback computer disk for storage of the data received in this manner.
  • the playback computer 124 places the new data into predetermined subdirectories on the disk. This permits the data within the playback computer data storage to be accessed and merged as required by the multimedia engine 128 according to the received scripting information in order to assemble and provide multimedia presentations for the cable television network users 135.
  • When the multimedia engine 128 provides such a multimedia presentation, the playback computer 124 outputs analog audio signals to the cable television network users 135 by way of the audio presentation line 130. Additionally, the playback computer 124 outputs video signals to the cable users 135 of the cable television headend 134 by way of the video presentation line 132.
  • the cable users 142 of the cable television headend 136 within the data delivery uplink and downlink system 100 may receive a multimedia presentation in a similar manner by way of the presentation lines 138, 140.
  • Referring now to FIG. 2, there is shown a more detailed block diagram representation of the cable television headend 134.
  • the broadcast multimedia scripting data transmitted from the remote programming uplink site 101 is received by a satellite receive dish 106 at the cable television headend 134 and passed to the receiver/demodulation unit 108 as previously described.
  • the data is then transmitted by way of the transmission cable 110 to the communications server 112.
  • the communications server 112 receives the digital data by way of a high-speed transport adapter 210 and writes it to a hard disk drive 216 within the communications server 112.
  • the data written to the hard disk drive 216 in this manner is controlled by a disk drive controller 214 and stored for later distribution within the cable television headend 134.
  • Also present within the communications server 112 is a modem 212, such as a modem that transmits at 14,400 baud or better.
  • the modem 212 is used for telecommunication of information such as status information by way of the telephone line 116.
  • a hardware watchdog adapter 218 is provided in the communications server 112 to detect any system lock-up errors.
  • System lock-up errors are defined as errors in which operation of the software fails to communicate with the hardware watchdog adapter 218 in a predetermined manner at predetermined intervals. When the software fails to do this, an error is indicated and the error condition is determined by the hardware watchdog adapter 218. When this occurs, the watchdog adapter 218 resets the computer system.
  • a video graphics array (VGA) display adapter 220 is also provided within the communications server 112.
  • the VGA display adapter 220 permits a user of the cable network headend 134 to couple a conventional video monitor (not shown) to the communications server 112 in order to view the operations of the system.
  • a network interface card 222 is provided within the communications server 112 to permit peer-to-peer communications between the communications server 112 and any other computers within the cable network headend 134.
  • the peer-to-peer network interface card 222 permits communication between the communications server 112 and the playback computer 124.
  • When new broadcast multimedia scripting data is received and stored by the communications server 112, some of the new data is moved across the peer-to-peer network cable 122 within the cable television headend 134 to the playback computer 124.
  • the data is transmitted through the network interface card 224 within the playback computer 124.
  • the playback computer 124 may be provided with a separate playback computer hard disk 242 for storage of the broadcast data received by way of the communications server 112. Storage of data on the playback computer disk 242 is controlled by a disk drive controller 240. The playback computer 124 places the new data into predetermined subdirectories on the hard disk 242. Alternately, if the data is MPEG encoded it may be stored on a digital video hard disk 238 by way of an MPEG digital video playback decoder board 234, in accordance with a packing list which may be included within the transmitted broadcast data. Storage on disks 238, 242 permits the new data within the playback computer 124 to be accessed as required by the multimedia engine 128 in order to assemble and provide multimedia presentations for the cable television network users 135.
  • the hardware watchdog 226 of the playback computer 124 performs in substantially the same manner as previously described with respect to the hardware watchdog adapter 218 of the communications server 112.
  • Audio signals for presentation to the cable television users 135 may be stored on either the playback computer disk 238 or the local hard disk 242 depending on whether they are MPEG encoded.
  • the playback computer 124 outputs analog audio signals to the cable television network users 135 by way of a digital audio board 228 if the digital audio data is accessed from the playback computer disk 242. If the digital audio is embedded within an MPEG digital video stream, the audio signal is first output via the MPEG digital audio playback board 236, and is then routed to the input of the digital audio board 228 in analog form.
  • a mixer (not shown) within the digital audio board 228 combines the analog audio signals to create the final audio out signal in analog form.
  • Video output from the playback computer 124 may operate in substantially the same way as the audio output.
  • the playback computer 124 outputs analog video signals to the cable television network users 135 by way of a video overlay board 230 (which may be an NTSC video overlay board [for video consistent with North American broadcast standards] or a PAL video overlay board [for video consistent with European broadcast standards] or another board designed to accommodate another broadcast standard such as a digital or analog high definition television standard).
  • Video stored in the receiving and storing means of the multimedia engine and video output of the multimedia engine preferably has the resolution of NTSC (512 x 486 pixels) or PAL, or better.
  • the multimedia engine software for performing the operations of the multimedia engine 128 by the playback computer 124 is stored on the playback computer disk 242.
  • the multimedia engine 128 operating on the playback computer 124 may be adapted to constantly run in a hunt mode wherein the multimedia engine 128 hunts or searches for the arrival of new data.
  • the information files which make up this new data can be formatted according to the multimedia scripting language specification set forth herein in the Appendix.
  • There are four types of information files and multimedia scripting files which may be stored within the local storage of the playback computer 124 and processed by the multimedia engine 128.
  • the four types are: textual script files, bit mapped graphic image files, digitized audio files and video files.
  • the four types of textual script files are: segment files, story files, template files and advertisement files.
  • the segment files contain a basic sequence of events for a presentation of a multimedia sequence by the multimedia engine 128. They reference two other types of textual script files: story files and advertisement files.
  • the opening line of the segment file recites the time of day and the date when the presentation elements defined by the segment file should be broadcast.
  • Each line in the segment file recites a relative start time (e.g., "00:00:00" for the start time of the segment, or "00:01:15" for 1 minute, 15 seconds after the start time of the segment) and a duration (e.g., "00:15" for 15 seconds and "01:00" for 1 minute).
  • Story files contain all of the details for executing the individual audio and video transitions and events which represent a series of pages or a single page of a presentation.
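  • SAMPLE story file (a hedged partial reconstruction; the original example referred to below is not preserved in this text, so the file names shown are hypothetical and the pipe-delimited layout is assumed from the generalized <TAG>|StartTime|Duration|Device|Other format described hereinbelow; only the first of the three pages is shown in full):
  • <page>|00:00:00|00:06
  • <tmp_>|00:00:00|00:06||main.tpl
  • <trns>|00:00:00|00:06||fade
  • <back>|00:00:00|00:06||backdrop.tga
  • <hed_>|00:00:00|00:06||Top Stories
  • <txt_>|00:00:01|00:05||Sample headline text
  • <wav_>|00:00:00|00:06||intro.wav
  • <box_>|00:00:02|00:04
  • <page>|00:00:06|00:07
  • <page>|00:00:13|00:04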
  • the lines tagged "<page>" indicate the start of each page and indicate the time length of the page (6, 7, 4 seconds respectively for the three pages exemplified above).
  • the lines tagged "<tmp_>" identify the template file that will be used for a time defined on the line.
  • the line tagged "<trns>" identifies a protocol for fading into a page.
  • the lines tagged "<back>" and "<img_>" identify image files encoding images that will be broadcast as part of the multimedia presentation for prescribed periods of time that are defined on the corresponding line.
  • the image files identified in the "<back>" lines are for full-screen images.
  • the lines tagged "<wav_>" identify audio files that will be played for a period of time defined on the line.
  • the lines tagged "<hed_>" and "<txt_>" identify text that will be overlaid on the broadcast images for the periods of time prescribed on the corresponding line.
  • the lines tagged "<box_>" indicate that a box should be overlaid on the broadcast images for the period of time indicated on the line.
  • the template file for the page will have a corresponding tagged "<box_>" line that will include a definition of the box geometry.
  • Template files contain the geometry for rendering a page in a multimedia presentation.
  • Each of the tags ("<txt1>", "<img1>", "<hed1>", etc.) contained in a template file corresponds to a matching tag on the corresponding page in a story file.
  • SAMPLE template file <Ver>2.0
  • <txt1>260119012001200111241241 hvb
  • a single advertisement file lists all of the advertisements which are scheduled for an individual day of the year.
  • the multimedia engine 128 proceeds sequentially through the advertisement file and keeps track of which advertisements have been presented.
  • SAMPLE advertisement file filename: 0216.AD ; advertising play list
  • 01/28/94 creationdate 01/28/93 createdfor: NEWSCHANNEL systemid: 2442, Coatesville 00:10
  • the entries following the time entry (which indicates the time of day at which to play an advertisement) are one or more video file or story file names separated by pipe symbols "|".
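  • An illustrative entry of this form (the 00:10 time appears in the sample above; the file names and exact layout are assumptions, not taken from the patent):
  • 00:10 promo1.mpg|promo2.mpg|fallback.sty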
  • the generalized format of textual script files stored within the local playback computer disk of the playback computer 124 within the multimedia engine 128 may be as follows:
  • The general form is <TAG>|StartTime|Duration|Device|Other, in which "|" designates a delineation between fields.
  • the prefix <TAG>, e.g. "<box_>", "<txt_>", etc.
  • StartTime indicates the starting time of the textual script file, relative to the start time for a segment file, in the conventional hours, minutes, seconds format, HH:MM:SS
  • Duration indicates the time duration or dwell in the conventional minutes, seconds, format, MM:SS, during which the object remains active.
  • Device refers to the physical devices which may be referenced in the textual script files.
  • the physical devices may include a host computer, a digital video subsystem, an external tape deck and a laser disc player.
  • the Other field may include additional information required to complete the current scripting line format. For instance, in the case of a digital video file reference, the Other entry is the name of the digital video file.
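  • Putting these fields together, an illustrative line (not taken from the patent; the audio file name is hypothetical and the device field is left empty) might read:
  • <wav_>|00:00:10|00:30||theme.wav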
  • Referring now to FIG. 3, there is shown a block diagram file representation 300 of the hierarchical relationship of the four types of textual script files within the playback computer 124 of the multimedia engine 128.
  • Fig. 4 shows a block diagram representation 400 illustrating the relationship between the various scripting files.
  • the textual script files are one of four types of information files which may be stored within the playback computer 124.
  • the application program 302 creates a number of segment files such as the segment files 304 for use in providing multimedia presentations within the multimedia engine 128.
  • the application program 302 may be any entity operated by any user of the multimedia engine 128 which is able to specify a multimedia presentation for assembly and presentation by the multimedia engine 128.
  • the segment files 304 shown in the block diagram representation 300 may contain any number of story files 306.
  • a story file 306 included within a segment file 304 is a collection of story events wherein each of the story events has a relative start time and a predetermined duration.
  • Each story file 306 referenced within the segment file 304 must have a reference to a specific time at which the story file 306 is to be presented by the multimedia engine 128 to the cable users 135.
  • a story file 306 is thus a logical sequence of multimedia events with a specific application within a multimedia presentation by the multimedia engine 128.
  • the application may be, for example, a news story, a movie promotion, an event promotion, or any multimedia information presentation.
  • the story events within the story files 306 contain the information required for the multimedia engine 128 to execute the audio and visual transitions within the presentation of a multimedia story by applying signals to the audio line 130 and the video line 132 of the cable television headend 134 as previously described.
  • the following is an example of a number of story files 306 which may be associated with each other to form a portion of a segment file 304 by the multimedia engine 128:
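  • A reconstruction of this example (the original listing is not preserved in this text; the pipe-delimited layout follows the generalized field format described above, with the times, durations and story file names taken from the discussion below):
  • <stry>|00:00:00|01:00||highlite.sty
  • <stry>|00:01:00|00:15||schedule.sty
  • <stry>|00:01:15|00:15||howtoord.sty
  • <stry>|00:01:30|00:30||upnexttv.sty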
  • the prefix <stry> at the beginning of each line identifies the file types of the four entries in this segment file 304 as story files 306.
  • the various fields within the story files 306 are delineated by pipe symbols "|".
  • the time values 00:00:00 after the <stry> prefix indicate the starting times of the story files 306 relative to the starting time for the segment file 304. For example, 00:00:00 in this field identifies the first story file 306 as the one within the segment file 304 which begins at time zero.
  • the 01:00 in the duration field, delineated by a single pipe symbol, indicates that the first story file 306 has a duration of one minute.
  • the device fields in this example are empty.
  • the last field indicates that the name of the first story file 306 is "highlite.sty".
  • the second story file 306 named "schedule.sty" has 00:01:00 in its second field, indicating that it starts one minute into the segment file 304 represented by this example.
  • the story file 306 named "schedule.sty” lasts for fifteen seconds as indicated by 00:15 in the duration field.
  • the story file 306 named "highlite.sty” starts at time zero which is the beginning of the presentation of the segment file 304 and runs for one minute.
  • the story file 306 named "schedule.sty” begins at one minute and runs for fifteen seconds.
  • the story file 306 named "howtoord.sty” begins at one minute and fifteen seconds and runs for another fifteen seconds, and the story file 306 named "upnexttv.sty” begins at one minute and thirty seconds and runs for thirty seconds.
  • the block diagram file representation 300 of textual script files stored within the playback computer 124 also sets forth advertising list files 308.
  • the advertising files 308 are sequential lists of advertising insertion lines and other advertising identifications. Advertising files 308 are used to insert advertising from a separate device or from a story file 306 into the multimedia presentation assembled and prepared by the multimedia engine 128.
  • the advertisement information to be presented to the cable users 135 in this manner during a multimedia presentation is accessed from the local memory within the playback computer 124 by the multimedia engine 128 as required in order to properly insert it into the presentation.
  • the accessing of advertising information by the playback computer 124 in accordance with an advertising file 308 occurs when the multimedia engine 128 encounters an <advr> prefix within a segment file 304 as shown in the following example:
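  • A hedged sketch of such a segment file line, following the generalized field format described above (the start time and duration are illustrative; samplead.ad is the advertisement file discussed in the following paragraphs, although the file name actually used is also described as being derived from the current month and day):
  • <advr>|00:02:00|00:30||samplead.ad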
  • an advertisement file whose file name is described by the current month and day (as detailed below), is referenced.
  • the contents of samplead.ad are read and scanned for a time pointer.
  • This time pointer or time reference may point to a digital video filename or to another story file 306.
  • the advertising content may be scheduled separately from the rest of the presentation. This permits multimedia content to be managed by an editorial group which is separate from those managing the advertising material. This kind of separation is common in the art of commercial print publishing. It also permits the advertisement information to be used independently of the advertisement information time slots of the multimedia presentation.
  • the file name format for advertising list files 308 is of the form MMDD.ad in order to indicate the month and day of playback.
  • a list of advertising list files 308 may be of the form:
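  • Such a list might read, for example (an illustrative sketch following the MMDD.ad convention; only 1231.ad is named in this text):
  • 0101.ad
  • 0102.ad
  • ...
  • 1231.ad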
  • the last filename "1231.ad" is an advertising list file 308 which is played on December 31.
  • Fig. 5 shows a block diagram representation of a script file parsing method 500 for parsing various textual script files within the multimedia engine 128 of the present invention.
  • the script file parsing method 500 may be used to parse segment files 304, story files 306 and template files 310.
  • If the textual script file being parsed by the script file parsing method 500 is a story file 306, as determined in decision block 502, execution proceeds by way of decision node 504 to the script file line parsing process 600 for parsing of the individual file lines within the story file 306. Execution of the script file line parsing process 600 is described hereinbelow with respect to Fig. 6.
  • the pages of the parsed story file 306 are identified.
  • the appropriate records within the local storage of the playback computer 124 are associated with the various identified pages as shown in block 512.
  • the template files 310 are then loaded, opened, parsed, and converted into metafile objects as described with the other text files (block 524). If the textual script file being parsed by the script file parsing method 500 is a segment file 304, as determined in decision block 502, execution of the script file parsing method 500 proceeds from decision block 502 by way of decision node 506.
  • the segment file 304 being parsed is then processed by the file line parsing process 600 as previously described. Execution exits the file line parsing process 600 by way of exit pathway 614 and reenters the script file parsing process 500 at block 518. The duration of the segment file 304 is then calculated in block 518. All of the story files 306 within the segment file 304 being parsed by the parsing process 600 are identified, loaded and updated as shown in blocks 520, 522. If the textual script file being parsed by the script file parsing method 500 is a template file 310, as determined in decision block 502, execution of the script file parsing method 500 proceeds from the decision block 502 by way of the decision node 508. The template files 310 are then opened and parsed as shown in block 600. They are converted into metafile objects with other text files as shown in block 524 in a manner described hereinbelow.
  • Referring now to Fig. 6, there is shown a block diagram representation of the script file line parsing process 600.
  • the script file line parsing process 600 may be used for the parsing of individual lines within the various textual script files. This parsing may occur within the parser block 1106 (Fig. 11) of the presentation process overview 1100 operating within the multimedia engine 128 as further described hereinbelow.
  • the script file line parsing process 600 may be used to separate the lines of textual script files including segment files 304, story files 306, and template files 310 for processing by the script file parsing method 500. Execution of the script file line parsing process 600 begins with a textual file being opened as shown in block 602.
  • Fig. 7 shows a block diagram representation of the segment event generation process 700 for generating segment events to prepare multimedia presentations by the multimedia engine 128.
  • the first step in generating the events for a segment 304 by the process 700 is preparation of an advertising record for the segment 304 as shown in block 702 if required. Only applications that take advantage of dynamic advertising insertion use this feature. In most cases block 702 is not executed.
  • the next step is the reading of a record as shown in block 708. If decision block 712 determines that the file is not done, the record is checked to see if it is a story record in decision block 732. If not, it is an advertising record and an advertisement event for the multimedia event queue is generated in block 748. The process continues with the next record being read in block 708 and checked in decision block 712.
  • If the next record read is a story record, the process for generating story events 800 is executed as described hereinbelow.
  • the next line is checked to see if it is the end of the file as shown in decision block 712. Once the end of the file is reached, as determined by block 712, a determination is made whether the system is currently in hunt mode as shown in decision block 716. If the software is in hunt mode it calculates the specified offset event times to run as shown in block 720 and exits at terminal 724. If the software is not in hunt mode the multimedia engine 128 adds a stop event to the queue in block 728 and exits the operation at terminal 724.
  • Fig. 8 shows the story event generation process 800 for generating events corresponding to a story file 306 by the multimedia engine 128 of the present invention.
  • a determination is made in decision block 804 whether the story file 306 being processed by the multimedia engine 128 is present within the directory of the playback computer 124. If the story file 306 is not present in the playback computer memory 124, the events necessary for a predetermined default presentation are generated as shown in block 820 and execution by the multimedia engine 128 exits the story event generation process 800 at terminal block 822.
  • the story event generation process 800 begins sequentially processing the pages within the story file 306. In order to do this, a record is read in block 806 and a determination is made in decision 808 whether the end of the story file 306 has been encountered. If the end of the story file 306 has been encountered, a predetermined end story event is added to the output of the story by the event generation process 800 as shown in block 824. A determination is then made in decision block 828 whether a stop is specified within the story file 306 being processed. If a stop has not been specified execution of the story event generation process 800 ends at terminal block 822.
  • If a stop has been specified, a stop event is scheduled as shown in block 832 and execution exits the story event generation process 800 by way of terminal block 822. If the end of the story file 306 being processed by the story event generation process 800 is not encountered, as determined in decision block 808, a determination is made whether the page of the file being processed should be displayed. This page display determination is made in decision block 812. (For instance, the multimedia engine may decide to skip a page if needed to stay in synch with timing indicated in the script file.) If the page should not be displayed, execution proceeds to decision 808 where another determination is made whether the end of the story file 306 has been encountered.
  • FIG. 9 shows the segment event generation process 900 for determining whether a segment file 304 being processed by the multimedia engine 128 contains an advertising list file 308 and for processing an advertising list file 308 when it is present.
  • a determination is made in decision block 902 whether the segment file 304 contains an advertising list file 308. If the segment 304 does not contain an advertising list file 308, execution exits the segment processing process 900 by way of terminal block 904. If the segment file 304 does contain an advertising list file 308, a determination of the advertising file name is made, as shown in block 908.
  • the record for the current HH:MM is obtained in block 924.
  • the record obtained in block 924 is returned by the segment processing process 900, as shown in the block 928 and the process 900 is exited by way of terminal block 904.
  • Fig. 10 shows the advertising event generation process 1000 for generating advertising events within the multimedia engine 128 of the present invention.
  • Execution of the advertising event generation process 1000 begins with a determination whether the advertising record to be processed is present in memory. This determination uses the advertising record process 900 described hereinabove. This determination is indicated in decision block 1004. If the event is not present a default static screen event is scheduled as shown in block 1008 and execution exits the advertising event generation process 1000 by way of terminal block 1010.
  • the advertising record that is returned via record generation 900 is a series of file pointers that point to one or more digital video files or story files in memory. One advertisement is selected from the record returned. The first time the multimedia engine 128 accesses an advertisement in this method a round-robin counter is set. Upon subsequent calls to this record the counter is incremented and the next advertisement is played. When the end of the series is encountered the round-robin counter is reset to one and the first advertisement is replayed.
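  • The following is a minimal sketch in C, not taken from the patent disclosure, of the round-robin selection just described; the structure and function names are illustrative assumptions:

    #include <stddef.h>

    /* One advertising record: a series of pointers to digital video or
       story files, played round-robin as described above. */
    typedef struct {
        const char *files[16];  /* file names pointed to by the record */
        int count;              /* how many entries the record holds */
        int next;               /* round-robin counter; 0 until first access */
    } AdRecord;

    /* Return the next advertisement to play, restarting at the first
       entry once the end of the series has been reached. */
    const char *next_advertisement(AdRecord *rec)
    {
        if (rec->count == 0)
            return NULL;        /* nothing scheduled in this record */
        if (rec->next == 0)
            rec->next = 1;      /* first access: the counter is set */
        const char *file = rec->files[rec->next - 1];
        if (++rec->next > rec->count)
            rec->next = 1;      /* end of series: reset and replay the first ad */
        return file;
    }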
  • As shown in the process control overview 1100 of Fig. 11, an application programming interface 1102 provides an interface for applications to allow them to control the multimedia engine 128.
  • the application programming interface 1102 applies commands and scripting information from the external applications (not shown) to the various primary control subsystems of the multimedia engine 128. Commands can be applied, for example, to the parsing subsystem 1106, the hunting subsystem 1108, the control subsystem 1114 and directly to the graphics rendering subsystem 1118.
  • the application programming interface 1102 permits a consistent view of the multimedia engine 128 for the external applications (i.e., a computer program whose purpose is to control the multimedia engine).
  • the parsing subsystem 1106 reads the passive operation data files 1104 and generates the metafiles 1110 which represent the non-timing related items in the passive operation data files (PODF) or script files 1104.
  • a metafile 1110 is a set of records which represent a series of calls by the application programming interface 1102 which are to be made to the graphics rendering subsystem 1118. The records of the metafile 1110 are played sequentially when all of the corresponding calls of the application programming interface 1102 are made. The metafiles 1110 are then scheduled into the timing subsystem 1116.
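  • A minimal sketch in C, not taken from the patent disclosure, of a metafile as a sequence of renderer calls replayed in order; the record layout and names are illustrative assumptions:

    #include <stddef.h>

    /* One record: a single call to be made to the graphics rendering
       subsystem when the metafile is played. */
    typedef struct {
        int  call_id;    /* which rendering function to invoke */
        int  args[8];    /* packed numeric arguments for the call */
        char text[128];  /* optional text or file name argument */
    } MetaRecord;

    typedef struct {
        MetaRecord *records;  /* the translated commands */
        size_t      count;
    } Metafile;

    /* Play a metafile: issue each recorded call to the renderer in turn. */
    void play_metafile(const Metafile *mf,
                       void (*render_call)(const MetaRecord *))
    {
        for (size_t i = 0; i < mf->count; i++)
            render_call(&mf->records[i]);
    }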
  • the hunting subsystem 1108 may be requested to identify a set of passive operation data files or PODF's 1104 which should be run at a particular time.
  • the hunting subsystem 1108 checks the system time and identifies the PODF's 1104 which have time encoded filenames corresponding to the current system time.
  • the parsing subsystem 1106 schedules the set of files for presentation by the multimedia engine 128.
  • External applications may also use the application programming interface 1102 to directly access the graphics rendering subsystem 1118 as previously described. This permits an external application to directly control the output of a presentation. This direct control allows an alternate means of controlling the multimedia presentation rather than the scripted format provided by the parsing subsystem 1106 and hunting subsystem 1108 (generally, the components of the script manager 1150) of the process overview 1100.
  • External applications using the multimedia engine 128 by way of the application programming interface 1102 may control the timing of the multimedia engine 128. This is done by an external application (not shown) having a timing loop for calling a polling function in the application programming interface 1102.
  • This polling function in the application programming interface 1102 distributes the polling into the control subsystem 1114 which controls the timing subsystem 1116 and the hunting subsystem 1108.
  • the poll is transferred to the timing subsystem 1116. If the timing subsystem 1116 identifies an event as requiring action, it returns that event to the control subsystem 1114.
  • a metafile 1110 is then retrieved from the event and it is passed into the graphics rendering subsystem 1118 for processing. If the timing subsystem 1116 is then empty, either the hunting subsystem 1108 is called to schedule another set of files through hunting or playback is stopped.
  • An external application can also directly stop the presentation by way of the application programming interface 1102.
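  • A minimal sketch in C, not taken from the patent disclosure, of such an external timing loop; all of the function names below are illustrative assumptions standing in for the application programming interface 1102, the control subsystem 1114, the timing subsystem 1116 and the hunting subsystem 1108:

    /* Hypothetical engine entry points; -1 from smme_poll_timing means
       that no event currently requires action. */
    extern int  smme_poll_timing(void);
    extern void smme_render_event(int event_id);  /* retrieve the event's metafile and pass it to the renderer */
    extern int  smme_timing_empty(void);
    extern int  smme_hunt_schedule(void);         /* hunt for the next set of files; 0 if none found */
    extern void smme_sleep_ms(int milliseconds);

    void external_application_loop(void)
    {
        int running = 1;
        while (running) {
            int event = smme_poll_timing();   /* poll is distributed to the control and timing subsystems */
            if (event >= 0)
                smme_render_event(event);     /* an event requires action: play its metafile */
            if (smme_timing_empty()) {
                if (!smme_hunt_schedule())    /* schedule another set of files through hunting */
                    running = 0;              /* nothing to hunt: playback is stopped */
            }
            smme_sleep_ms(50);                /* the external application's timing loop interval */
        }
    }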
  • the graphics rendering subsystem 1118 operates in a plurality of modes.
  • the graphics rendering subsystem 1118 supports a set of functions which directly control multimedia presentation hardware within the multimedia engine 128. These functions can be called by external applications by way of the application programming interface 1102, or by a metafile 1110.
  • the graphics rendering subsystem 1118 provides a number of features within the multimedia engine 128. It may provide a hardware independent application program interface which external applications can use. This is possible because all of the functions in the graphics rendering subsystem 1118 are propagated to a similar function in the hardware abstraction layer 1128. It provides high level control of the font rasterizer 1122, the graphics rasterizer 1120, and the digital video decoder 234 to provide a logical multimedia presentation. This logical presentation is coordinated by preventing the user from requesting an illogical sequence of actions, by providing control of transitions from complete graphical presentations to video presentations, and by handling exceptional cases such as video playback failure.
  • the graphics rasterizer 1120 of the process overview 1100 is called by the graphics rendering subsystem 1118 to provide raster display operations. This includes drawing lines and polygons as well as displaying graphics files 1124.
  • the graphics rasterizer 1120 loads the graphics files 1124 if required and calls the graphics hardware interface 1134 section of the hardware abstraction layer 1128 to draw an item on the graphics hardware (graphics generator/video overlay card) 230.
  • the font rasterizer 1122 is called by the graphics rendering subsystem 1118 to provide anti-aliased font-based raster display operations.
  • the font rasterizer 1122 is provided with a font specification, text content, and position and provides a font from a font file 1126. It also performs anti-aliasing and calls the graphics hardware interface 1134 section of the hardware abstraction layer 1128 to display the desired text in the appropriate font.
  • the font rasterizer 1122 also permits caching of font rasterization to provide greater performance.
  • the graphics renderer 1118 directly calls functions in the hardware abstraction layer 1128 for controlling the digital video decoder 234 and the digital video file storage of disk 238.
  • the digital video decoder 234 includes its own application program interface which is abstracted by the video decoder interface 1132 of the hardware abstraction layer 1128. Through the video decoder interface 1132, the graphics rendering subsystem 1118 can request that the video decoder 234 play a particular video file. The video decoder interface 1132 first verifies the existence of the digital video file on the disk 238, and if the file exists, starts the video decoder 234 playback of that file. The video decoder 234 may send asynchronous messages to the multimedia engine 128. These messages are directed to the external application, whose responsibility it is to return them to the playback computer 124 by way of the application programming interface 1102.
  • These messages are asynchronous and are relayed by the application programming interface 1102 to the control module 1114 and then to the graphics rendering subsystem 1118.
  • the control module 1114 may monitor these messages in the case of serious errors on the video decoder 234 which require the halting or modification of the presentation in the timing subsystem 1116.
  • the graphics rendering subsystem 1118 receives an asynchronous message from the video decoder 234 it applies the message to the video decoder interface 1132 which determines how to handle the message. Based upon the decision made by the video decoder interface 1132, the graphics rendering subsystem 1118 may make modifications in the presentation.
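  • A minimal sketch in C, not taken from the patent disclosure, of returning such an asynchronous decoder message to the engine; the message structure and function names are illustrative assumptions:

    /* An asynchronous message reported by the video decoder 234. */
    typedef struct {
        int code;      /* decoder-specific message code */
        int is_error;  /* nonzero for serious errors */
    } DecoderMessage;

    extern void control_monitor_message(const DecoderMessage *msg);  /* control module 1114: may halt or modify the presentation */
    extern void renderer_handle_message(const DecoderMessage *msg);  /* graphics rendering subsystem 1118: consults the video decoder interface 1132 */

    /* Called by the external application to hand a decoder message back
       through the application programming interface 1102. */
    void api_return_decoder_message(const DecoderMessage *msg)
    {
        control_monitor_message(msg);
        renderer_handle_message(msg);
    }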
  • Referring now to FIG. 12, there is shown a block diagram representation 1200 of a method for generating queue events for a page of a story file 306 within the multimedia engine 128 of the present invention. Entry into the method of block diagram representation 1200 is by way of decision block 812 and path 813 of the story event process 800.
  • Decision block 812 determines whether a page of a story is to be displayed. If the page is to be displayed, its queue events must be generated.
  • the story page is applied to block 1202 which determines the time of the page in the story, and possibly the position of the story within a segment.
  • Process block 1204 then creates an empty metafile 1110. Records are added to the metafile 1110 to correspond to the items on the story page.
  • a record of the page is read in block 1206.
  • Decision block 1208 determines whether the file is finished. If the file is finished, block 1226 creates an event in the event queue for the time identified by block 1202 and process 1228 places the metafile 1110 inside the event created by process 1226. The procedure of diagram 1200 is then terminated.
  • decision block 1210 determines if the story page record represents a change of a template 310. If it is, block 1216 attempts to select the specified template. Process control is then returned to decision block 1208 from block 1216. If the story page record is not a change of template, execution proceeds to decision block 1212. Decision block 1212 determines if the story page record requires a template 310. If it does, control passes to block 1218, otherwise it passes to block 1214. Block 1218 looks up the corresponding template record specified by the story page record. If the record does not exist, the record is ignored and execution proceeds from block 1220 to block 1224 and block 1208.
  • If the template record does exist, control process 1222 merges the story page and template information into a single record to be placed within the metafile 1110.
  • Block 1214 takes the output of block 1212 and/or block 1222 and creates a record within the metafile created by block 1204 which represents the record identified or created by block 1212 or block 1222. Control then passes back to decision block 1208.
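  • A minimal sketch in C, not taken from the patent disclosure, of the queue event generation for one story page described above for Fig. 12; all of the types and helper functions are illustrative assumptions standing in for blocks 1202 through 1228:

    #include <stddef.h>

    typedef struct StoryPage StoryPage;
    typedef struct PageRecord PageRecord;
    typedef struct TemplateRecord TemplateRecord;
    typedef struct Metafile Metafile;
    typedef struct Event Event;
    typedef struct EventQueue EventQueue;

    extern long        page_start_time(const StoryPage *p);                    /* block 1202 */
    extern Metafile   *metafile_create(void);                                   /* block 1204 */
    extern PageRecord *page_read_record(StoryPage *p);                          /* blocks 1206/1208; NULL at end */
    extern int         record_is_template_change(const PageRecord *r);          /* block 1210 */
    extern void        select_template(const PageRecord *r);                    /* block 1216 */
    extern int         record_needs_template(const PageRecord *r);              /* block 1212 */
    extern TemplateRecord *lookup_template_record(const PageRecord *r);         /* block 1218 */
    extern PageRecord *merge_with_template(PageRecord *r, TemplateRecord *t);   /* block 1222 */
    extern void        metafile_add_record(Metafile *m, PageRecord *r);         /* block 1214 */
    extern Event      *event_create(EventQueue *q, long when);                  /* block 1226 */
    extern void        event_set_metafile(Event *e, Metafile *m);               /* block 1228 */

    void generate_page_events(StoryPage *page, EventQueue *queue)
    {
        long when = page_start_time(page);   /* time of the page in the story/segment */
        Metafile *mf = metafile_create();    /* empty metafile to be filled with page items */

        PageRecord *rec;
        while ((rec = page_read_record(page)) != NULL) {
            if (record_is_template_change(rec)) {
                select_template(rec);        /* change of template: select it and continue */
                continue;
            }
            if (record_needs_template(rec)) {
                TemplateRecord *t = lookup_template_record(rec);
                if (t == NULL)
                    continue;                /* no matching template record: ignore this record */
                rec = merge_with_template(rec, t);
            }
            metafile_add_record(mf, rec);    /* record the merged item in the metafile */
        }
        Event *ev = event_create(queue, when);
        event_set_metafile(ev, mf);          /* the metafile is placed inside the queued event */
    }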
  • SMSL - StarNet Multimedia Scripting Language
  • This definition includes what the StarNet Multimedia Engine (SMME) does, how it does it, the input data necessary for its operation and the output it produces as a result of that ongoing operation.
  • SMME - StarNet Multimedia Engine
  • the SMME is a software command interpreter which runs in a product-specific computer at a cable head end and produces output consisting of still pictures, full motion video, rendered text and graphics and sound, which is in turn fed to the cable system for broadcasting.
  • the SMSL is the multimedia scripting language invented by Gerard Kunkel of StarNet and refined by Michael Heydt of StarNet.
  • the language addresses all of the capabilities of the SMME in a simple collection of file formats, all stored as ASCII text files.
  • the SMSL was originally conceived as an all-purpose multimedia scripting language for use in passive or interactive television. As such, the language will be expanded during the fall of 1994 to address the new interactive needs of television set-top boxes (STBs). Of note will be the client-server nature of data access to feed the engine, and the expanded graphics capabilities of the new dedicated graphics chips to do graphic sprites. Other operator needs will be identified as the API for the set-top box is disclosed. In the meantime, the general Windows 3.1 API will be used as a model for additional operators.
  • the SMME takes as input the SMSL files, which consist of presentation sequencing information, and data files in the form of graphic images, digital video sequences and digitized sound. Some of the control information that the SMME receives directs it to access several different types of playback devices in order to produce the necessary output. These playback devices include, and are currently limited to: digital video subsystems (DVSS from Scientific Atlanta), digital video adapters (Optibase PC Motion), video tape recorders (VTRs) and laser disk players.
  • SMMEFRNT.EXE is a software application that interfaces between data files and the engine itself.
  • SMMEFRNT is a software front end that allows a user to define one of two operating modes for multimedia playback. There are other uses for the SMMEFRNT application that are not relevant here. For more detail see the document StarNet Multimedia Engine Front End Application.
  • the SMME takes as input 4 fundamentally different types of information: textual script files, bit mapped graphic images, digitized sound, and digital or analog full-motion video. This information comes from four different storage devices: hard disk, digital video subsystem (which may actually be a hard disk attached to another device), video tape and laser disk. This document is responsible for describing the contents of only the textual script files.
• The Segment file contains the basic sequence of events for the presentation of a multimedia sequence. It references two other types of files: Stories (.STY) and the Advertisements file (.AD).
• The CONTROL files (.CTL) are used only in the News Channel product and have no bearing on the operation of the SMME.
• The term Story comes from the News Channel project (see other documentation) and is defined as a logical sequence of multimedia events with a specific application.
• The application may be a news story, a movie promotion, an event promotion, an advertisement, etc.
• The Story file contains all of the details for executing the individual audio and visual transitions and events which represent a Story.
• The SEGMENT file is a collection of Stories and Advertisements.
• Advertisement Files: There is only one Advertisements file, which lists all of the Advertisements which are scheduled for a certain amount of time.
• The SMME keeps track of which ads have been played and simply runs sequentially through the file.
• The template file contains the geometry for rendering a page in the multimedia presentation.
  • Each of the tags contained in the TPL file corresponds to a matching tag in the STORY file.
• Each of the SMME textual control files is written using the language specification called the StarNet Multimedia Scripting Language (SMSL). This language borrows the constructs of a number of existing languages: C, SGML, and the Windows INI file format.
• SMSL StarNet Multimedia Scripting Language
• <TAG> one of the tags described in the SMEPDL section of this document.
• The SEGMENT file is a list of events that the multimedia engine will play.
• The SEGMENT file points to STORY files that contain the actual data for display.
• The SEGMENT file is a higher-level file that simply sequences STORY files. This relationship is illustrated in the following diagram.
• The name of the SEGMENT file is very important to the correct operation of the SMME.
• The DOS eight-character filename is used to represent the exact minute of the day/month/year that the SEGMENT is to begin operation.
• The STORY file is a collection of events and pointers.
• The events listed in the STORY file occupy one line per event.
• Each event reference must have a relative starting time and a relative duration (dwell).
• The starting time is relative to the start of the STORY sequence.
• The duration is relative to the start of the individual event.
• Each event is labeled with a tag enclosed in less-than and greater-than symbols (< >).
• Barker Story File
  • DESCRIPTION Signifies the creation of a new menu of choices.
• The subsequent menu items are listed in series following the initial menu item.
• The engine assumes base 0 for both the menu and item numbering scheme.
• DESCRIPTION Specifies the template file to use for building a menu list in the interactive presentation.
• DESCRIPTION Specifies the background bitmap file to use for building a menu list in the interactive presentation.
• DESCRIPTION Signifies a change of pages in the presentation. It is used primarily as a device for the News Channel but can be applied to any product that requires tracking the changing of pages.
  • DESCRIPTION Denotes a line of text to be rendered to the display.
  • DESCRIPTION Denotes a line of text to be rendered to the display.
  • DESCRIPTION Signifies the inclusion of a TARGA raster file. The actual file is referenced by the trailing filename.
  • DESCRIPTION Specifies the template file to be used for the current page.
• The actual file is referenced by the trailing filename. This specification should come immediately after the <page> tag.
• DESCRIPTION Signifies the inclusion of a headline. The actual text is referenced by the last text entry following the transition designation.
• DESCRIPTION Signifies the inclusion of a time stamp on the video display. The location, color, font and size are determined by the corresponding reference in the template file. There are different types of display format for time. The following list describes the available formats.
  • DESCRIPTION Signifies the inclusion of a digital video clip file.
• The actual file is referenced by the trailing filename or by the starting and ending address on the device.
• DESCRIPTION Signifies the inclusion of a clock on the video display which will be updated at the specified interval.
• The location, color, font and size are determined by the corresponding reference in the template file.
• DESCRIPTION Signifies the inclusion of a text item which will be flashed on and off on the screen at the defined intervals. On time specifies the duration that the text will be displayed on the screen, and off time specifies the time interval that the item will be removed from the screen before it is displayed again.
  • DESCRIPTION Signifies the inclusion of a story within a segment file.
• The filename points to a complete story file that will in turn describe the individual multimedia sequences.
• DESCRIPTION Signifies the inclusion of an advertisement in the sequence of events in the segment file.
• The filename points to the advertisement playlist supplied to the system. That playlist contains all of the information on where to find the advertisement and what device to get the ad from.
• The last entry in this command line is the frame weight. It is measured in pixels.
• The color of the frame is determined by the current foreground color as set by the <colr> tag. If the box is filled the fill color is determined by the current background color.
  • DESCRIPTION Signifies the inclusion of a message line. The text for this message is obtained from the message file on disk.
  • DESCRIPTION Specifies a gradient fill. This is a global reference that can be applied, but must also be restored when necessary. The following types may be applied to any filled box.
• DESCRIPTION Signifies the inclusion of a line of text. This operator is identical to the txt operator.
• DESCRIPTION Signifies the inclusion of an image.
• DESCRIPTION Signifies the inclusion of a video clip. If the type field is 0, the video clip will be played full screen and all other items will be rendered on top of the video. If the type field is 1, the video will be played in the window specified by (x1,y1)-(x2,y2).
• The devices entry within the story and segment files refers to the physical device attached to the computer system.
• The four-letter acronym tells the Multimedia Engine which device will be used to access data. Below is a description of all of the device types supported by the Engine.
• comp
• DESCRIPTION Specifies that the data for this instruction is available on the host computer. It is assumed that the drive and directory are the same as the location of the Engine. If the application using the Engine has a separate set of directories set up by its INI file, then that drive and directory designation is used. Comp is the default device and should never actually be seen in any of these files. If the device parameter is null but the tag actually uses the device parameter, the engine will assume comp.
• dvsn
• DESCRIPTION Specifies that the data for this instruction is available on the digital video subsystem.
• The Scientific Atlanta Playback board is the source.
• The actual board designation is determined by the numeric value contained in the tag. For instance, if the tag specified dvs1 then the first SA Playback board will be used.
• tape
• DESCRIPTION Specifies that the data for this instruction is available on an attached laser disc player.
• The transitions entry within the story and segment files refers to the type of video transition that is to be used between video pages. This transition is called by the Engine using the Targa transitional effects that are resident on the Targa board. A limited set of transitions exists for this display board, and a smaller number of transitions have been engineered into the Engine.
• The duration of effects in the engine is determined in milliseconds, not frames.
  • DESCRIPTION Specifies that the transition will be a fade.
• The numeric value contained in this tag instructs the Engine as to the duration of the fade from one page to the next. The value is measured in milliseconds. Therefore a value of 1000 would request a fade of one second.
• DESCRIPTION Specifies that the transition will be a wipe.
• The numeric value contained in this tag instructs the Engine as to the type of wipe from one page to the next.
• The available wipe types are requested from the following table of wipe types.
• DESCRIPTION Specifies that the transition will be a diagonal wipe.
• The numeric value contained in this tag instructs the Engine as to the type of wipe from one page to the next and its duration.
• The first n denotes which wipe type from the following table will be used.
• D1nnnn wipe diagonally from top left to lower right across nnnn milliseconds
• D2nnnn wipe diagonally from lower left to upper right across nnnn milliseconds
• D3nnnn wipe diagonally from upper right to lower left across nnnn milliseconds
• Inline formatting allows the individual lines of the STORY file to address multiple character-rendering formats. For instance, the News Channel may require the use of italic or underlined text in a single line of text. This is prevalent in headlines when emphasis is required or when the name of a publication is used within the sentence. All of the inline formatting commands are identified by the backslash. This was used to help keep the Lenfest Multimedia Language in line with some of the other common languages, such as C.
  • DESCRIPTION Specifies that the current point size should be changed to nnn points.
• The numeric value contained in this tag instructs the Engine as to the point size. Therefore a value of 030 would request a change in point size to 30 points.
• The value nnn is fixed in length to 3 characters.
• DESCRIPTION Specifies that the current font should be changed to fontname. Since the fontname is of undetermined length there must be an accompanying \f to mark the end of the fontname.
• The default font is the current font.
• DESCRIPTION Specifies that the current leading should be changed to nnn points.
• The numeric value contained in this tag instructs the Engine as to the leading size. Therefore a value of 030 would request a change in leading to 30 points.
• The value nnn is fixed in length to 3 characters.
• DESCRIPTION Specifies that a font attribute is being requested.
• The numeric value contained in this tag instructs the Engine as to the type of attribute change.
• The value nn is fixed in length to 2 characters.
• The attribute values can be added together to create a composite attribute type such as 06, bold-italic, or 10, bold-underline.
• DESCRIPTION Specifies that the red component in RGB of the current font color is being requested to change. The numeric value contained in this tag instructs the Engine as to the new red value, ranging from 0 to 255 with 0 being the darkest and 255 the brightest. The value nnn is fixed in length to 3 characters.
• DESCRIPTION Specifies that the green component in RGB of the current font color is being requested to change.
• The numeric value contained in this tag instructs the Engine as to the new green value, ranging from 0 to 255 with 0 being the darkest and 255 the brightest.
• The value nnn is fixed in length to 3 characters.
• DESCRIPTION Specifies that the blue component in RGB of the current font color is being requested to change.
• The numeric value contained in this tag instructs the Engine as to the new blue value, ranging from 0 to 255 with 0 being the darkest and 255 the brightest.
• The value nnn is fixed in length to 3 characters.
• The advertising list file is used to insert advertising, from a separate device or as a story file, into the multimedia presentation.
• The reason for having a separate list is that the insertion of advertising can be more, or less, dynamic than the multimedia presentation.
• The News Channel requires news programming to be fluid throughout the day. It cannot be burdened by managing the trafficking of advertising.
• The advertising list file is a sequential list of advertising insertion times and other IDs. This file should be read during the multimedia presentation as needed to properly insert the next ad in the list whenever the <advr> tag is placed in the SEGMENT file for that presentation.
• The creation of advertising playlists is handled by the ADLIST.EXE application.
• The application functions very much like the SEGMENT.EXE application which combines stories to create segments.
• ADLIST combines individual advertisements into the advertisement list file and manages the files across multiple days.
• The advertising play list must be delivered to the Playback Chassis for each day of service.
• The filename format of this file is as follows:
• MMDD.AD, which represents January 1 through December 31 of the current year. This file must reside in the following directory:
• The barcode can be replaced by a Story filename if the user wants to play back a multimedia advertisement rather than a digital video file. It would appear as follows:
• The local creation system must follow a tree directory structure that is referenced in the ADLIST.INI file.
• The tree should be as follows:
• The ADLIST.INI file is as follows:
• The ADLIST.INI file will be maintained by the ADLIST.EXE software. This application will read and write the .AD files for each site. It will also be the list manager for the available MPEG spots and relative .STY files composed as advertisements.
• The message list file is used to insert a dynamic message from a separate text file into the multimedia presentation.
• The message list file is a sequential list of message insertion times and other IDs. This file should be read during the multimedia presentation as needed to properly insert the next message in the list whenever the <msgl> tag is placed in the STORY file for that presentation.
• MSGLIST.EXE The creation of message playlists is handled by the MSGLIST.EXE application.
• The application functions very much like the ADLIST.EXE application.
• MSGLIST combines individual messages into the message list file and manages the files across multiple days.
• The message play list must be delivered to the Playback Chassis for each day of service.
• The filename format of this file is as follows:
• The creation of an activities log file is extremely important to the overall operations of the system.
• The name of the log file is determined by the system ID. This way the person viewing the log file that comes back from the field will know which machine it has come from.
• The information needed in the log file is as follows:
• The LS acronym stands for Load Segment.
• The PS acronym stands for Play Story.
• The creation of advertising entries within the log file is important from a billing perspective and from a basic system maintenance perspective.
• The log file should report back the basic success or failure of an advertising insertion.
• A successful insertion is referenced as a 1 and a failure as a 0. It is the first entry item within an entry line. The remainder of the line is exactly the same as the advertising INI file. This will allow for simple searches and retrieval of valuable diagnostic information.
• Advertisement logging will occur in the multimedia engine's log file and will be split out if necessary to a separate log using another program. For example: adl | 00:00:00 | 30 | 234 | 2345 | Gillette
• This file is used to describe the system profile for any one of the StarNet PC-based products.
• systemid: the system id is used by communications to uniquely identify inbound communications.
• The system name is the text name for the system.
• The system phone data is the phone number for dialing into that machine.
• The system phone voice is the phone number for calling the system contact. The system phone contact is the name of the system representative. The time zone is measured in number of hours offset from GMT.
parentoperator=TCI
systemoperator=Suburban Cable
contact1=John Smith
phone1=(215) 555-1212
contact2=Jane Jones
phone2=(215) 555-1212
systemaddress=Coatesville, PA
timezone=0
[directories]
template=c:\template
message=c:\message
• The starnet product line identifies which service is on this system. The starnet phone home number is the number that this machine will call when a problem occurs. The mpeg line tells the system if MPEG files are to be used. ppvnumber=(215) 555-1212
• Days of the week are listed as integer values that correspond to a bit-mask in the reading and writing software. This is done to conserve space (a brief illustration follows below).
• The individual integer values for days of the week are:
• The different definitions per channel are separated by piping symbols.
• The time is expressed as a single integer; it represents the half-hour time period of the day (for example, the value equal to 12:30pm).
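To illustrate the day-of-week bit-mask mentioned above: each day is given a power-of-two value so that any combination of days fits into one integer. The specific values below (Sunday = 1 through Saturday = 64) are an assumed, conventional assignment used only for illustration; the actual values used by the reading and writing software are those listed in the StarNet documentation. A minimal sketch in C:

#include <stdio.h>

/* Assumed power-of-two values for the days of the week; the real
 * assignment used by the reading and writing software may differ. */
enum {
    DAY_SUN = 1, DAY_MON = 2, DAY_TUE = 4, DAY_WED = 8,
    DAY_THU = 16, DAY_FRI = 32, DAY_SAT = 64
};

int main(void)
{
    /* Monday + Wednesday + Friday packed into one integer. */
    int days = DAY_MON | DAY_WED | DAY_FRI;

    printf("mask = %d\n", days);                       /* 2 + 8 + 32 = 42 */
    printf("runs on Wednesday? %s\n", (days & DAY_WED) ? "yes" : "no");
    return 0;
}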

Abstract

A multimedia presentation system for providing a multimedia presentation has a playback processor (124) for receiving and storing information. The received and stored information includes video information, graphics information, audio information and text information. The received and stored information further includes multimedia information. A multimedia engine (128) accesses and merges the stored information in accordance with the multimedia information to provide the multimedia presentation. The multimedia information includes scripting information (302) and segments (304) which may include stories (306) or advertisements (308). The stories include story events having event starting times and event durations. The stories also include video files (316) and audio files (314). Additionally, the stories include templates (310) which are used for rendering images by positioning them.

Description

REMOTE PLATFORM INDEPENDENT DYNAMIC MULTIMEDIA ENGINE
The present invention relates to the preparation and presentation of multimedia presentations and, in particular, to a multimedia engine for the preparation and presentation of such multimedia presentations.
The present invention comprises a system and process for constructing a continuous multimedia presentation from source data. While there are a number of systems known in the art for creating multimedia presentations and for replaying them with similar hardware, the multimedia engine of the present invention operates on a 24-hour basis assembling multimedia presentations based upon supplied data. This raw data is in the form of component data files, including image files, video files, audio files and text files. The multimedia engine dynamically constructs the presentation based upon this data as instructed by a multimedia scripting language. This language instructs the multimedia engine where, when, and how to assemble multimedia presentation. The language further describes when and how to stitch together multiple presentations. The intent of this invention is to create a low cost means of presenting multimedia information in a continuous form. The novel multimedia engine may be used in cable television systems where the cost of creating programming content for the viewer is of great concern. Using this system and method allows for the creation of programming content dynamically and inexpensively.
Another feature of the present invention allows the multimedia presentation to be organized based on scripting files that are transmitted to the multimedia engine, for instance via a satellite feed, or based on function calls that are either inputted at the site of the multimedia engine or are inputted remotely, typically using a telephone line and a modem.
The versatility of the present invention allows for a single multimedia engine design to operate, for numerous cable television systems, (1) products that promote pay-per-view broadcasting, (2) products that operate a news outlet that can be facilely tailored to highlight local content news, (3) television programming guides and (4) advertising programming. These various programming formats can be controlled from a single remote site.
Summary of the Invention
A multimedia presentation system for providing a multimedia presentation has a playback processor for receiving and storing information. The received and stored information includes video information, graphics information, audio information and text information. The received and stored information further includes multimedia information. A multimedia engine accesses and merges the stored information in accordance with the multimedia information to provide the multimedia presentation.
The invention is directed to a multimedia presentation system capable of dynamically constructing a multimedia presentation comprising: a receiving and storing means for receiving and storing commands and data files, which commands comprise script files or function calls; and a multimedia engine comprising a renderer and an application programming interface or a script manager, which multimedia engine comprises: a translating means for translating human-readable script files or human-readable function calls into commands usable by the renderer; and an executing means for executing the translated commands to create a multimedia presentation comprising audio and video components for broadcast, cable transmission or display. The presentation system of the invention advantageously provides a multimedia engine that has both an application programming interface and a script manager. The presentation system of the invention preferably stores and processes segment files, story files, template files, advertisement files, video files and audio files as described hereinbelow.
In another embodiment, the invention is directed to a multimedia presentation system capable of dynamically constructing a multimedia presentation for cable transmission to a plurality of cable television users comprising: a receiving and storing means for receiving and storing commands and data files, which commands comprise script files or function calls; at least one segment file, at least one story file and at least one template file stored in the receiving and storing means, wherein said at least one segment file encodes the name of said at least one story file, and said at least one story file encodes text and the name of said at least one template file; and a multimedia engine comprising a renderer, an application programming interface and a script manager, which multimedia engine comprises: a translating means of translating human-readable script files or human-readable function calls into commands usable by the renderer; and an executing means of executing the translated commands to create a multimedia presentation comprising audio and video components for cable transmission.
Definitions
The following terms shall have, for the purposes of this application, the meaning set forth below. In particular, for the purpose of interpreting the claims, the term definitions shall control over any assertion of a contrary meaning based on other text found herein:
Advertisement file - lists the advertisements that are scheduled for a defined period of time, typically a day.
Application programming interface (API) - the code module in the multimedia engine that translates the human-readable function calls into instructions that are capable of being executed by the multimedia engine.
Dynamically constructing a multimedia presentation - constructing the multimedia presentation as the script and data files are being input from the receiving and storing means, for instance, within about 10 seconds of the input of the script and data files. In a preferred embodiment, the script and data files already within the receiving and storing means can be substituted before the scheduled broadcast, cable transmission or display time of a multimedia presentation, so long as the substitution occurs before the time and date that the segment, as defined by the segment file, in which the script and data files are to be presented is scheduled for broadcast, cable transmission or display.
Function call, call or command - an instruction, with zero or more arguments, to perform some task.
Hunt mode - one of the multimedia engine's modes of operation, in which the engine autonomously searches for, then parses, a sequence of segment files to produce a multimedia presentation.
MPEG - a compressed digital audio/video bitstream format developed by the Motion Pictures Expert Group, a part of the International Standards Organization (ISO).
Message file - lists all of the messages, such as public interest information, together with their start times and durations, which are scheduled for broadcast during a given period of time, such as a day.
Metafile - a set of translated commands which are to be executed by the renderer.
Multimedia - pertaining to or employing one or more media, including (but not limited to): full-motion video, audio, computer graphics, text and still images.
Multimedia engine - a computer programmed to read in command information (which can, for example, be in the form of scripting files or function calls) and various data files and to output a multimedia presentation, for example, for display or broadcast.
Page - consists of a collection of audio and visual elements which begin to be presented in a multimedia presentation at the same time.
Parsing - taking a stream of information and breaking it out into the commands and arguments which a program can understand.
Rasterizer - a program or subroutine which takes an abstract command, e.g., "put a line from x to y", and converts it into the actual pixel information that will appear on a screen, for instance in a multimedia presentation.
Renderer - is made up of a graphics renderer, graphics rasterizer, font rasterizer, video decoder interface, graphics hardware interface, digital video subsystem and graphics generator/video overlay card, or the functional equivalents of these elements. Preferably, the renderer further includes one or both of an audio decoder interface and an audio card, or the functional equivalents of these elements.
Scripting file - a file containing commands which define the format, appearance, timing and content of a multimedia presentation that will be outputted by the multimedia engine. The scripting files also contain references to any other files that may be needed for presentations, such as audio or video files.
Script manager - the code module in the multimedia engine that translates script files into commands, which can be organized into metafiles, which commands or metafiles are sent to the renderer - the script manager performs the following functions: parsing; hunting; control; and timing.
Segment file - a list of the story and advertisement files that defines the content for a given period of time in a multimedia presentation.
Story file - a file which defines a sequence of multimedia events which make up a portion of a multimedia presentation.
Tags - an identifier for an element in a template or a story file which defines what kind of content will be presented in a page: text, audio file, video file, image file or graphical element such as line, box, and the like. Tags mark the start of a line in a script file and are indicated by an opening "<" bracket and a closing ">" bracket.
Template file - a file which defines the size, position, style, color, layout, and general appearance of the visual elements which appear on a page in a multimedia presentation. In a preferred embodiment, each of the elements in a template file has a tag, which matches a tag for that element in the story file.
Brief Description of the Drawings
The foregoing summary, as well as the following detailed description of preferred embodiments of the invention, will be better understood when read in conjunction with the appended drawings. For the purpose of illustrating the invention, there is shown in the drawings embodiments which are presently preferred. It should be understood, however, that the invention is not limited to the precise arrangements and instrumentalities shown. In the drawings:
Fig. 1 is a block diagram representation of a data delivery uplink and downlink system having a cable television distribution headend downlink including the multimedia engine of the present invention; Fig. 2 is a more detailed block diagram representation of the cable system headend of the data delivery uplink and downlink system of Fig. 1 ; Fig. 3 is a block diagram representation of the hierarchy of the textual script files which are applied to the playback computer memory of the multimedia engine of Fig. 1 ;
Fig. 4 is a block diagram representation of the relationship of types of multimedia scripting language textual files which are applied to the playback computer memory of the multimedia engine of Fig. 1 ;
Fig. 5 is a block diagram representation of a parsing process for the parsing of textual script files which are applied to the playback computer memory of the multimedia engine of Fig. 1 ; Fig. 6 is a block diagram representation of a textual script file line parsing process for the parsing of the individual lines of textual script files which are applied to the playback computer memory of the multimedia engine of Fig. 1 ;
Fig. 7 is a block diagram representation of a segment event generation process for generating the segment events of a segment file which is applied to the playback computer memory of the multimedia engine of Fig. 1;
Fig. 8 is a block diagram representation of a story event generation process for generating the story events of a story file which is applied to the playback computer memory of the multimedia engine of Fig. 1 ; Fig. 9 is a block diagram representation of the determination and opening of an advertising list file and the retrieving of an ad record from an advertising list which is applied to the playback computer memory of the multimedia engine of Fig. 1 ; Fig. 10 is a block diagram representation of an advertising event generation process for generating the advertising events within an advertising record which is applied to the playback computer memory of the multimedia engine of Fig. 1 ;
Fig. 11 is a block diagram representation of a process control overview of the multimedia engine of Fig. 1 ; and Fig. 12 is a block diagram representation of the queue events generation for a page of a multimedia presentation.
Detailed Description Referring to the drawings, wherein the same reference numerals are used to designate the same elements throughout, there is shown in Fig. 1 a data delivery uplink and downlink system 100 including a cable television downlink site 134 (referred to as the headend). The cable television distribution headend 134 receives and processes multimedia information (including full motion video files, image files, audio files and script files) which is transmitted from a remote uplink site 101 within the system 100. The transmission of the multimedia information from the remote uplink site 101 to the cable television distribution headend 134 is performed by way of an uplink dish 102 at the remote programming uplink site 101 and a satellite transponder 104. Broadcast multimedia scripting data included within the multimedia information for use in multimedia presentations may be transmitted in this manner to a plurality of cable television distribution headends 134, 136.
Each of the cable television headends 134, 136 which receives the broadcast multimedia scripting data includes a receiver/demodulator unit 108, a communications server computer 112 and at least one playback computer 124 in order to provide multimedia presentations according to a predetermined broadcast multimedia scripting language for the users of a cable television system such as the cable television users 135, 142. The broadcast multimedia scripting data from the remote programming uplink site 101 is received by a satellite receive dish 106 at the cable television headend 134 and passed to the receiver/demodulation unit 108.
It will be understood that the broadcast multimedia data may be provided in accordance with any one of a number of multimedia scripting languages understandable to those skilled in the art which are suitable for indicating how to merge and present received information to provide a multimedia presentation. Preferably, the scripting language allows for story, segment, advertisement and template files to be organized as described in the Appendix and hereinbelow. Since the broadcast multimedia scripting data is transmitted by way of the satellite transponder 104 the differing cable television network headends 134, 136 may receive and present the same or different broadcast multimedia scripting data simultaneously to their cable television network users 135, 142.
The broadcast multimedia scripting data is received by the cable television headend 134 from the satellite receive dish 106 in an analog form in the preferred embodiment of the cable television network headend 134. It is applied by way of a coaxial transmission cable 107 to the receiver/demodulator unit 108. The receiver/demodulator unit 108 converts the analog signal data received by way of the transmission cable 107 into a digital data stream representing the broadcast multimedia scripting data from the remote uplink site 101. The digital data stream formed by the receiver/demodulator unit 108 is applied to a transmission cable 110. The digital data stream of the transmission cable 110 is applied to the communications server (computer) 112. The communications server 112 receives the digital data stream and stores it within the communications server 112. The data stored in this manner is distributed later within the cable television headend 134. The communications server 112 also communicates with a conventional telephone system 118 through a telephone modem line 116 which is coupled to conventional telephone system lines 119. The telephone modem line 116 allows the communications server 112 to send diagnostic information to the remote programming uplink site 101 by way of the telephone system lines 119. In addition, the remote uplink site 101 may dial into the communications server 112 of the cable television network headend 134 by way of the telephone system lines 119 and a telephone modem line 120. This allows the remote uplink site 101 to perform remote diagnostics of the cable television distribution headend 134. When new broadcast multimedia scripting data is stored within the communications server 112 it is moved across a bidirectional peer-to-peer network cable 122 within the cable television headend 134 to the playback computer 124. As described in more detail hereinbelow, the playback computer 124 is provided with its own playback computer disk for storage of the data received in this manner. The playback computer 124 places the new data into predetermined subdirectories on the disk. This permits the data within the playback computer data storage to be accessed and merged as required by the multimedia engine 128 according to the received scripting information in order to assemble and provide multimedia presentations for the cable television network users 135. When the multimedia engine 128 provides such a multimedia presentation, the playback computer 124 outputs analog audio signals to the cable television network users 135 by way of the audio presentation line 130. Additionally, the playback computer 124 outputs video signals to the cable users 135 of the cable television headend 134 by way of the video presentation line 132. The cable users 142 of the cable television headend 136 within the data delivery uplink and downlink system 100 may receive a multimedia presentation in a similar manner by way of the presentation lines 138, 140.
Referring to Fig. 2, there is shown a more detailed block diagram representation of cable television headend 134. The broadcast multimedia scripting data transmitted from the remote programming uplink site 101 is received by a satellite receive dish 106 at the cable television headend 134 and passed to the receiver/demodulation unit 108 as previously described. As also previously described, the data is then transmitted by way of the transmission cable 110 to the communications server 112. The communications server 112 receives the digital data by way of a high-speed transport adapter 210 and writes it to a hard disk drive 216 within the communications server 112. The data written to the hard disk drive 216 in this manner is controlled by a disk drive controller 214 and stored for later distribution within the cable television headend 134. Also present within the communications server 112 is a modem
212, such as a modem that transmits at 14,400 baud or better. The modem 212 is used for telecommunication of information such as status information by way of the telephone line 116.
A hardware watchdog adapter 218 is provided in the communications server 112 to detect any system lock-up errors. System lock-up errors are defined as errors in which operation of the software fails to communicate with the hardware watchdog adapter 218 in a predetermined manner at predetermined intervals. When the software fails to do this, an error is indicated and the error condition is determined by the hardware watchdog adapter 218. When this occurs, the watchdog adapter 218 resets the computer system.
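The servicing pattern implied by this arrangement is a periodic strobe from the application software. The sketch below illustrates only that pattern; the watchdog_strobe() call and the five-second interval are hypothetical placeholders, since the actual adapter interface is not described here.

#include <time.h>

/* Hypothetical placeholder for the adapter-specific I/O that signals
 * "the software is still running".  The real call depends on the
 * watchdog hardware and is not specified in this document. */
static void watchdog_strobe(void)
{
    /* e.g., a write to the adapter's I/O port would go here */
}

int main(void)
{
    time_t last_strobe = 0;

    for (;;) {
        /* ...normal engine or server work happens here... */

        /* Strobe the watchdog at a predetermined interval (5 s assumed).
         * If the software locks up and stops strobing, the adapter
         * resets the computer, as described above. */
        if (time(NULL) - last_strobe >= 5) {
            watchdog_strobe();
            last_strobe = time(NULL);
        }
    }
}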
A video graphics array (VGA) display adapter 220 is also provided within the communications server 112. The VGA display adapter 220 permits a user of the cable network headend 134 to couple a conventional video monitor (not shown) to the communications server 112 in order to view the operations of the system.
A network interface card 222 is provided within the communications server 112 to permit peer-to-peer communications between the communications server 112 and any other computers within the cable network headend 134. For example, the peer-to-peer network interface card 222 permits communication between the communications server 112 and the playback computer 124. When new broadcast multimedia scripting data is received and stored by the communications server 112 some of the new data is moved across the peer-to-peer network cable 122 within the cable television headend 134 to the playback computer 124. The data is transmitted through the network interface card 224 within the playback computer 124.
The playback computer 124 may be provided with a separate playback computer hard disk 242 for storage of the broadcast data received by way of the communications server 112. Storage of data on the playback computer disk 242 is controlled by a disk drive controller 240. The playback computer 124 places the new data into predetermined subdirectories on the hard disk 242. Alternately, if the data is MPEG encoded it may be stored on a digital video hard disk 238 by way of an MPEG digital video playback decoder board 234, in accordance with a packing list which may be included within the transmitted broadcast data. Storage on disks 238, 242 permits the new data within the playback computer 124 to be accessed as required by the multimedia engine 128 in order to assemble and provide multimedia presentations for the cable television network users 135.
Also in the playback computer 124 is a hardware watchdog 226. The hardware watchdog 226 of the playback computer 124 performs in substantially the same manner as previously described with respect to the hardware watchdog adapter 218 of the communications server 112.
Audio signals for presentation to the cable television users 135 may be stored on either the playback computer disk 238 or the local hard disk 242 depending on whether they are MPEG encoded. When the multimedia engine 128 provides a multimedia presentation, the playback computer 124 outputs analog audio signals to the cable television network users 135 by way of a digital audio board 228 if the digital audio data is accessed from the playback computer disk 242. If the digital audio is embedded within an MPEG digital video stream, the audio signal is first output via the MPEG digital audio playback board 236, and is then routed to the input of the digital audio board 228 in analog form. A mixer (not shown) within the digital audio board 228 combines the analog audio signals to create the final audio out signal in analog form.
Video output from the playback computer 124 may operate in substantially the same way as the audio output. When the multimedia engine 128 provides a multimedia presentation where the digital video data is accessed from the playback computer disk 242, the playback computer 124 outputs analog video signals to the cable television network users 135 by way of a video overlay board 230 (which may be an NTSC video overlay board [for video consistent with North American broadcast standards] or a PAL video overlay board [for video consistent with European broadcast standards] or another board designed to accommodate another broadcast standard such as a digital or analog high definition television standard). These files are normally stored as still frame bit maps that get displayed by the multimedia engine 128 for use as backgrounds. If the video is stored as full-motion video in MPEG format on the MPEG digital video disk drive 238 it is accessed by way of the MPEG digital video playback board 234 and applied to the video overlay board 230 in analog form. Video stored in the receiving and storing means of the multimedia engine and video output of the multimedia engine preferably has the resolution of NTSC (512 x 486 pixels) or PAL, or better.
The multimedia engine software for performing the operations of the multimedia engine 128 by the playback computer 124 is stored on the playback computer disk 242. The multimedia engine 128 operating on the playback computer 124 may be adapted to constantly run in a hunt mode wherein the multimedia engine 128 hunts or searches for the arrival of new data. The information files which make up this new data can be formatted according to the multimedia scripting language specification set forth herein in the Appendix.
There are four types of information files and multimedia scripting files which may be stored within the local storage of the playback computer 124 and processed by the multimedia engine 128. The four types are: textual script files, bit mapped graphic image files, digitized audio files and video files. Within the category of textual script files there are four types which may be read and interpreted by the multimedia engine 128. The four types of textual script files are: segment files, story files, template files and advertisement files. The segment files contain a basic sequence of events for a presentation of a multimedia sequence by the multimedia engine 128. They reference two of the four other types of textual script files: story files and advertisement files. SAMPLE segment file: ;c:\mmdata\segment\09012125.seg - 09/01/94 - 09:25PM
<stry>00:00:00|01:00|0\Ev121000.sty
<stry>00:01:00|00:15|0\H2121260.sty
<stry>00:01:15|00:30|0\Sp121261.sty
<stry>00:01:45|00:30|0\HT121263.sty
<stry>00:02:15|00:30|0\Lo121271.sty
<stry>00:02:45|01:00|0\Ev121023.sty
<stry>00:03:45|00:30|0\Sp121283.sty
<stry>00:04:15|00:30|0\HT121291.sty
<stry>00:04:45|00:15|0\Sy121293.sty
Note that the opening line of the segment file recites the time of day and the date when the presentation elements defined by the segment file should be broadcast. Each line in the segment file recites a relative start time (e.g., "00:00:00" for the start time of the segment, or "00:01:15" for 1 minute, 15 seconds after the start time of the segment) and a duration (e.g., "00:15" for 15 seconds and "01:00" for 1 minute).
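Because the segment filename encodes its broadcast minute (09012125.seg above corresponds to September 1 at 9:25 PM), an engine hunting for work can compute the name it expects for a given minute and check whether that file has arrived. The following C sketch shows only that derivation, assuming the MMDDHHMM reading of the eight characters illustrated by the sample; the directory path and the hunting logic are omitted.

#include <stdio.h>
#include <time.h>

/* Build the expected segment filename for a broadcast time, following
 * the MMDDHHMM.seg convention of the sample above
 * (09012125.seg -> September 1, 21:25, i.e. 9:25 PM). */
static void segment_filename(const struct tm *t, char *buf, size_t len)
{
    snprintf(buf, len, "%02d%02d%02d%02d.seg",
             t->tm_mon + 1, t->tm_mday, t->tm_hour, t->tm_min);
}

int main(void)
{
    time_t now = time(NULL);
    struct tm *t = localtime(&now);
    char name[16];

    segment_filename(t, name, sizeof name);
    /* A hunt-mode engine would look for this file in its segment
     * directory (e.g., c:\mmdata\segment\) and parse it if present. */
    printf("expected segment file: %s\n", name);
    return 0;
}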
Story files contain all of the details for executing the individual audio and video transitions and events which represent a series of pages or a single page of a presentation.
SAMPLE Story file:
; FileName: C:\Windows\NTVWork\0003552.STY
; Date & Time Modified: 08-02-1994 17:24:26
<page>00:00:00|00:06|1
<tmp1>00:00:00|00:06|d.tpl
<trns>00:00:00|00:06|F0500
<back>00:00:00|00:17|bkapna.tga
<wav1>00:00:00|00:17|00035521.WAV
<hed1>00:00:00|00:06||Altman\napologizes
<txt1>00:00:00|00:06||WASHINGTON - The Treasury's number two official is apologizing to senators who've questioned his honesty in Whitewater.
<box1>00:00:00|00:06|
<img1>00:00:00|00:06|00035521.TGA
<page>00:00:06|00:07|2
<tmp1>00:00:06|00:07|d.tpl
<hed1>00:00:06|00:07||Altman\napologizes
<txt1>00:00:06|00:07||In prepared testimony, Roger Altman says he never meant to mislead the Senate Banking Committee. \n He also says he didn't try to hinder
<box1>00:00:06|00:07|
<img1>00:00:06|00:07|00035521.TGA
<page>00:00:13|00:04|3
<tmp1>00:00:13|00:04|d.tpl
<hed1>00:00:13|00:06||Altman\napologizes
<txt1>00:00:13|00:06||the probe of an Arkansas savings and loan with ties to the Clintons.
<box1>00:00:13|00:06|
<img1>00:00:13|00:06|00035521.TGA
In the above example, the lines tagged "<page>" indicate the start of each page and indicate the time length of the page (6, 7 and 4 seconds respectively for the three pages exemplified above). The lines tagged "<tmp_>" identify the template file that will be used for a time defined on the line. The line tagged "<trns>" identifies a protocol for fading into a page. The lines tagged "<back>" and "<img_>" identify image files encoding images that will be broadcast as part of the multimedia presentation for prescribed periods of time that are defined on the corresponding line. The image files identified in the "<back>" lines are for full-screen images. The lines tagged "<wav_>" identify audio files that will be played for a period of time defined on the line. The lines tagged "<hed_>" and "<txt_>" identify text that will be overlaid on the broadcast images for the periods of time prescribed on the corresponding line. The lines tagged "<box_>" indicate that a box should be overlaid on the broadcast images for the period of time indicated on the line. The template file for the page will have a corresponding tagged "<box_>" line that will include a definition of the box geometry.
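The page timing in the sample above is entirely relative: pages begin 0, 6 and 13 seconds into the story and dwell for 6, 7 and 4 seconds. Absolute presentation times follow by adding the story's own start offset within its segment. A minimal arithmetic sketch in C, assuming for illustration that this story starts 60 seconds into its segment:

#include <stdio.h>

int main(void)
{
    /* Page start offsets and dwells (seconds) from the story sample above. */
    int page_start[] = { 0, 6, 13 };
    int page_dwell[] = { 6, 7, 4 };
    /* Assumed for illustration: this story begins 60 seconds into its segment. */
    int story_offset = 60;

    for (int i = 0; i < 3; i++)
        printf("page %d: plays from t=%ds to t=%ds within the segment\n",
               i + 1, story_offset + page_start[i],
               story_offset + page_start[i] + page_dwell[i]);
    return 0;
}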
Template files contain the geometry for rendering a page in a multimedia presentation. Each of the tags ("<txt1 >", "<img1 >", "<hed1 >", etc.) contained in a template file corresponds to a matching tag on the corresponding page in a story file. SAMPLE template file <Ver>2.0
<txt1>260|190|200|200|1|24|24|hvb|0|0|0|0|0|0|0
<img1>46|102|204|284|0|0|0|0
<hed1>260|90|200|84|0|42|42|uvbli|0|0|0|0|0|0|0
<box1>44|100|208|288|0|1|0|0|0|0|0|0|0
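Because every tagged element in a story page has a matching tag in the template, rendering a page amounts to looking up each story tag in the loaded template to obtain its geometry. The C sketch below illustrates that lookup with a deliberately simplified structure; the field layout is reduced to a bounding box and is not the full template record format.

#include <stdio.h>
#include <string.h>

/* Simplified template entry: the tag name plus a bounding box.  The real
 * template records carry additional style fields (font, colors, etc.). */
struct tpl_entry {
    const char *tag;          /* e.g. "txt1", "img1", "hed1", "box1" */
    int x1, y1, x2, y2;       /* element geometry on the page        */
};

/* Find the template entry whose tag matches a tag from the story file. */
static const struct tpl_entry *find_geometry(const struct tpl_entry *tpl,
                                             int count, const char *tag)
{
    for (int i = 0; i < count; i++)
        if (strcmp(tpl[i].tag, tag) == 0)
            return &tpl[i];
    return NULL;              /* no matching tag in this template */
}

int main(void)
{
    /* Geometry loosely based on the sample template above. */
    struct tpl_entry tpl[] = {
        { "img1", 46, 102, 204, 284 },
        { "box1", 44, 100, 208, 288 },
    };
    const struct tpl_entry *e = find_geometry(tpl, 2, "img1");

    if (e)
        printf("img1 renders at (%d,%d)-(%d,%d)\n", e->x1, e->y1, e->x2, e->y2);
    return 0;
}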
A single advertisement file lists all of the advertisements which are scheduled for an individual day of the year. The multimedia engine 128 proceeds sequentially through the advertisement file and keeps track of which advertisements have been presented.
SAMPLE advertisement file:
; filename: 0216.AD
; advertising play list 01/28/94
creationdate: 01/28/93
createdfor: NEWSCHANNEL
systemid: 2442, Coatesville
00:10|17|18|19|20
00:20|21|22|24|25
00:30|17|18|19|20
00:40|21|22|24|25
00:50|17|18|19|20
23:50|17|18|19|20
In the above example, the entries following the time entry, which indicates the time of day at which to play an advertisement, are one or more video files or story file names separated by pipe symbols " | ".
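Each advertising entry therefore pairs an insertion time with pipe-separated identifiers, which may name digital video files or story files. A minimal C sketch of splitting one such entry; the identifiers are treated as opaque strings:

#include <stdio.h>
#include <string.h>

int main(void)
{
    /* One line from an advertisement file: insertion time followed by
     * pipe-separated ad identifiers (video files or story file names). */
    char line[] = "00:10|17|18|19|20";
    char *field = strtok(line, "|");
    int n = 0;

    printf("insertion time: %s\n", field);
    while ((field = strtok(NULL, "|")) != NULL)
        printf("  ad entry %d: %s\n", ++n, field);
    return 0;
}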
The generalized format of textual script files stored within the local playback computer disk of the playback computer 124 within the multimedia engine 128 may be as follows:
<TAG>StartTime | Duration | Device | Other
wherein the pipe symbol "|" designates a delineation between fields, the prefix <TAG> (e.g. "<box_>", "<txt_>", etc.) indicates the object type within the textual script file, StartTime indicates the starting time of the textual script file, relative to the start time for a segment file, in the conventional hours, minutes, seconds format, HH:MM:SS, and Duration indicates the time duration or dwell in the conventional minutes, seconds format, MM:SS, during which the object remains active. Also within the generalized textual script file format, Device refers to the physical devices which may be referenced in the textual script files. The physical devices may include a host computer, a digital video subsystem, an external tape deck and a laser disc player. The Other field may include additional information required to complete the current scripting line format. For instance, in the case of a digital video file reference, the Other entry is the name of the digital video file. Referring to Fig. 3, there is shown a block diagram file representation 300 of the hierarchical relationship of the four types of textual script files within the playback computer 124 of the multimedia engine 128. Referring to Fig. 4, there is shown a block diagram representation 400 illustrating the relationship between the various scripting files. As previously described, the textual script files are one of four types of information files which may be stored within the playback computer 124. The application program 302 creates a number of segment files such as the segment files 304 for use in providing multimedia presentations within the multimedia engine 128. The application program 302 may be any entity operated by any user of the multimedia engine 128 which is able to specify a multimedia presentation for assembly and presentation by the multimedia engine 128.
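To make the generalized line layout described above concrete, the C sketch below splits a single script line into its tag, start time, duration, device and trailing field. It is an illustration of the field order only, not the engine's actual parser; note that the simple strtok() split used here expects a non-empty device field, whereas the engine treats an empty device as comp.

#include <stdio.h>
#include <string.h>

/* Fields of one generalized script line: <TAG>StartTime|Duration|Device|Other */
struct script_line {
    char tag[8];        /* e.g. "stry", "advr", "wav1"            */
    char start[16];     /* relative start time, HH:MM:SS          */
    char duration[8];   /* dwell, MM:SS                           */
    char device[8];     /* comp, dvsn, tape, disc (default: comp) */
    char other[128];    /* e.g. a filename                        */
};

/* Illustrative parser: take the tag between '<' and '>', then split the
 * remainder on the '|' field delimiters. */
static int parse_line(const char *text, struct script_line *out)
{
    char buf[256];
    const char *close = strchr(text, '>');

    if (text[0] != '<' || close == NULL)
        return -1;
    memset(out, 0, sizeof *out);
    snprintf(out->tag, sizeof out->tag, "%.*s", (int)(close - text - 1), text + 1);

    strncpy(buf, close + 1, sizeof buf - 1);
    buf[sizeof buf - 1] = '\0';

    char *p = strtok(buf, "|");
    if (p) strncpy(out->start, p, sizeof out->start - 1);
    if ((p = strtok(NULL, "|")) != NULL) strncpy(out->duration, p, sizeof out->duration - 1);
    if ((p = strtok(NULL, "|")) != NULL) strncpy(out->device, p, sizeof out->device - 1);
    if ((p = strtok(NULL, "|")) != NULL) strncpy(out->other, p, sizeof out->other - 1);
    return 0;
}

int main(void)
{
    struct script_line ln;

    /* Device "0" here stands in for a device/subdirectory designation. */
    if (parse_line("<stry>00:01:00|00:15|0|schedule.sty", &ln) == 0)
        printf("tag=%s start=%s dwell=%s device=%s other=%s\n",
               ln.tag, ln.start, ln.duration, ln.device, ln.other);
    return 0;
}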
The segment files 304 shown in the block diagram representation 300 may contain any number of story files 306. A story file 306 included within a segment file 304 is a collection of story events wherein each of the story events has a relative start time and a predetermined duration. Each story file 306 referenced within the segment file 304 must have a reference to a specific time at which the story file 306 is to be presented by the multimedia engine 128 to the cable users 135. A story file 306 is thus a logical sequence of multimedia events with a specific application within a multimedia presentation by the multimedia engine 128. The application may be, for example, a news story, a movie promotion, an event promotion, or any multimedia information presentation. The story events within the story files 306 contain the information required for the multimedia engine 128 to execute the audio and visual transitions within the presentation of a multimedia story by applying signals to the audio line 130 and the video line 132 of the cable television headend 134 as previously described. The following is an example of a number of story files 306 which may be associated with each other to form a portion of a segment file 304 by the multimedia engine 128:
<stry>00:00:00|01:00| |highlite.sty
<stry>00:01:00|00:15| |schedule.sty
<stry>00:01:15|00:15| |howtoord.sty
<stry>00:01:30|00:30| |upnexttv.sty
The prefix <stry> at the beginning of each line identifies the file types of the four entries in this segment file 304 as story files 306. The various fields within the story files 306 are delineated by pipe symbols "|" as previously described. The time values 00:00:00 after the <stry> prefix indicate the starting times of the story files 306 relative to the starting time for segment file 304. For example, 00:00:00 in this field identifies the first story file 306 as the one within the segment file 304 which begins at time zero. The 01:00 in the duration field, delineated by a single pipe symbol, indicates that the first story file 306 has a duration of one minute. The device fields in this example are empty. The last field indicates that the name of the first story file 306 is "highlite.sty".
The second story file 306, named "schedule.sty", has 00:01:00 in its second field, indicating that it starts one minute into the segment file 304 represented by this example. The story file 306 named "schedule.sty" lasts for fifteen seconds as indicated by 00:15 in the duration field.
Thus, in the multimedia presentation represented by the segment file 304 of this example, the story file 306 named "highlite.sty" starts at time zero which is the beginning of the presentation of the segment file 304 and runs for one minute. The story file 306 named "schedule.sty" begins at one minute and runs for fifteen seconds. The story file 306 named "howtoord.sty" begins at one minute and fifteen seconds and runs for another fifteen seconds, and the story file 306 named "upnexttv.sty" begins at one minute and thirty seconds and runs for thirty seconds. In addition to the story files 306, the block diagram file representation 300 of textual script files stored within the playback computer 124 also sets forth advertising list files 308. The advertising files 308 are sequential lists of advertising insertion lines and other advertising identifications. Advertising files 308 are used to insert advertising from a separate device or from a story file 306 into the multimedia presentation assembled and prepared by the multimedia engine 128. The advertisement information to be presented to the cable users 135 in this manner during a multimedia presentation is accessed from the local memory within the playback computer 124 by the multimedia engine 128 as required in order to properly insert it into the presentation. The accessing of advertising information by the playback computer 124 in accordance with an advertising file 308 occurs when the multimedia engine 128 encounters a <advr> prefix within a segment file 304 as shown in the following example:
<stry>00:00:00|01:00| |highlite.sty
<stry>00:01:00|00:15| |schedule.sty
<stry>00:01:15|00:15| |howtoord.sty
<stry>00:01:30|00:30| |upnexttv.sty
<advr>00:02:00|00:15| |
. . .
In this example after the four story files 306 are processed and displayed as previously described, an advertisement file, whose file name is described by the current month and day (as detailed below), is referenced. The contents of samplead.ad are read and scanned for a time pointer. This time pointer or time reference may point to a digital video filename or to another story file 306. By separating advertising references into independent files identified in the advertising file 308, the advertising content may be scheduled separately from the rest of the presentation. This permits multimedia content to be managed by an editorial group which is separate from those managing the advertising material. This kind of separation is common in the art of commercial print publishing. It also permits the advertisement information to be used independently of the advertisement information time slots of the multimedia presentation.
The file name format for advertising list files 308 is of the form MMDD.ad in order to indicate the month and day of playback. For example, a list of advertising list files 308 may be of the form:
0101.ad
0102.ad
0103.ad
1231.ad
The last filename "1231.ad" is an advertising list file 308 which is played on
December 3lst of the current year. Fig. 5 shows a block diagram representation of a script file parsing method 500 for parsing various textual script files within the multimedia engine 128 of the present invention. The script file parsing method 500 may be used to parse segment files 304, story files 306 and template files 310.
Therefore, within the script file parsing method 500 a determination is made at decision block 502 as to what type of textual script file is being parsed by the parsing method 500.
If the textual script file being parsed is a story file 306, execution of the script file parsing method 500 proceeds by way of decision node 504 of decision block 502 to the script file line parsing process 600 for parsing of the individual file lines within the story file 306. Execution of the script file line parsing process 600 is described hereinbelow with respect to Fig. 6.
When execution exits the script file line parsing process 600 by way of the exit pathway 614, it reenters the file parsing method 500 at block 510. In block 510, the pages of the parsed story file 306 are identified. The appropriate records within the local storage of the playback computer 124 are associated with the various identified pages as shown in block 512. In block 514 a determination is made as to which template files 310 are included within the story file 306 in order to present the story file 306. The template files 310 are then loaded, opened, parsed, and converted into metafile objects as described with other text files (block 524). If the textual script file being parsed by the script file parsing method 500 is a segment file 304, as determined in decision block 502, execution of the script file parsing method 500 proceeds from decision block 502 by way of decision node 506. The segment file 304 being parsed is then processed by the file line parsing process 600 as previously described. Execution exits the file line parsing process 600 by way of exit pathway 614 and reenters the script file parsing process 500 at block 518. The duration of the segment file 304 is then calculated in block 518. All of the story files 306 within the segment file 304 being parsed by the parsing process 600 are identified, loaded and updated as shown in blocks 520, 522. If the textual script file being parsed by the script file parsing method 500 is a template file 310, as determined in decision block 502, execution of the script file parsing method 500 proceeds from the decision block 502 by way of the decision node 508. The template files 310 are then opened and parsed as shown in block 600. They are converted into metafile objects with other text files as shown in block 524 in a manner described hereinbelow.
Referring to Fig. 6, there is shown a block diagram representation of the script file line parsing process 600. As previously described, the script file line parsing process 600 may be used for the parsing of individual lines within the various textual script files. This parsing may occur within the parser block 1106 (Fig. 11) of the presentation process overview 1100 operating within the multimedia engine 128 as further described hereinbelow. The script file line parsing process 600 may be used to separate the lines of textual script files, including segment files 304, story files 306, and template files 310, for processing by the script file parsing method 500. Execution of the script file line parsing process 600 begins with a textual file being opened as shown in block 602. In decision block 604 a determination is made whether the last line of the textual file has been parsed by the line parsing process 600. If the current line is not an end-of-file marker, a line of the file is read as shown in block 606. If the last line of the file has been parsed, the file is closed as shown in block 610. After the parsed file is closed, execution of the script file line parsing process 600 exits by way of the exit pathway 614.
If the end of the textual script file being parsed by the line parsing process 600 has not been encountered, as determined in decision block 604, the items between the pipe delineation characters in the current file line of the textual script file are separated as shown in block 608. Execution of the file line parsing process 600 then returns to decision block 604 where another line of the textual script file is examined to determine whether it is an end-of-file marker.
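A minimal sketch of the separation performed in block 608, splitting a line in place at each pipe character; the function name and the in-place strategy are assumptions for illustration only:

#include <stdio.h>

/* Illustrative only: split one script-file line at each pipe character,
   preserving empty fields between adjacent pipes.                       */
static int split_fields(char *line, char *fields[], int max_fields)
{
    int count = 0;
    char *p = line;

    fields[count++] = p;               /* first field starts at the line head */
    while (*p != '\0' && count < max_fields) {
        if (*p == '|') {
            *p = '\0';                 /* terminate the previous field        */
            fields[count++] = p + 1;   /* next field begins after the pipe    */
        }
        p++;
    }
    return count;
}

int main(void)
{
    char line[] = "<stry>00:00:00|00:30||sample.sty";
    char *fields[8];
    int n = split_fields(line, fields, 8);
    for (int i = 0; i < n; i++)
        printf("field %d: %s\n", i, fields[i]);
    return 0;
}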
Fig. 7 shows a block diagram representation of the segment event generation process 700 for generating segment events to prepare multimedia presentations by the multimedia engine 128. The first step in generating the events for a segment 304 by the process 700 is preparation of an advertising record for the segment 304, as shown in block 702, if required. Only applications that take advantage of dynamic advertising insertion use this feature. In most cases block 702 is not executed. The next step is the reading of a record as shown in block 708. If decision block 712 determines that the file is not done, the record is checked in decision block 732 to see if it is a story record. If not, it is an advertising record, and an advertisement event for the multimedia event queue is generated as shown in block 748. The process continues with the next record being read in block 708 and checked in decision block 712. If the next record read is a story record, the story event generation process 800 is executed as described hereinbelow. Upon completion of story event generation, the next line is checked to see if it is the end of the file as shown in decision block 712. Once the end of the file is reached, as determined by block 712, a determination is made whether the system is currently in hunt mode as shown in decision block 716. If the software is in hunt mode it calculates the specified offset event times to run, as shown in block 720, and exits at terminal 724. If the software is not in hunt mode the multimedia engine 128 adds a stop event to the queue in block 728 and exits the operation at terminal 724.
Fig. 8 shows the story event generation process 800 for generating events corresponding to a story file 306 by the multimedia engine 128 of the present invention. Within the story event generation process 800 a determination is made in decision block 804 whether the story file 306 being processed by the multimedia engine 128 is present within the directory of the playback computer 124. If the story file 306 is not present on the playback computer 124, the events necessary for a predetermined default presentation are generated as shown in block 820 and execution by the multimedia engine 128 exits the story event generation process 800 at terminal block 822.
If the story file 306 does exist in the playback computer 124, as determined in decision block 804, the story event generation process 800 begins sequentially processing the pages within the story file 306. In order to do this, a record is read in block 806 and a determination is made in decision block 808 whether the end of the story file 306 has been encountered. If the end of the story file 306 has been encountered, a predetermined end story event is added to the output of the story by the event generation process 800 as shown in block 824. A determination is then made in decision block 828 whether a stop is specified within the story file 306 being processed. If a stop has not been specified, execution of the story event generation process 800 ends at terminal block 822. If a stop is specified, as determined in decision block 828, a stop event is scheduled as shown in block 832 and execution exits the story event generation process 800 by way of terminal block 822. If the end of the story file 306 being processed by the story event generation process 800 is not encountered, as determined in decision block 808, a determination is made whether the page of the file being processed should be displayed. This page display determination is made in decision block 812. (For instance, the multimedia engine may decide to skip a page if needed to stay in synch with timing indicated in the script file.) If the page should not be displayed, execution proceeds to decision block 808 where another determination is made whether the end of the story file 306 has been encountered. If the page should be displayed, as determined in decision block 812, execution proceeds by way of path 813 to block 1200 (hereinafter described) where the queue events for the page are generated.

Fig. 9 shows the segment event generation process 900 for determining whether a segment file 304 being processed by the multimedia engine 128 contains an advertising list file 308 and for processing an advertising list file 308 when it is present. Within the segment event generation process 900 a determination is made in decision block 902 whether the segment file 304 contains an advertising list file 308. If the segment 304 does not contain an advertising list file 308, execution exits the segment processing process 900 by way of terminal block 904. If the segment file 304 does contain an advertising list file 308, a determination of the advertising file name is made, as shown in block 908. A determination is then made in decision block 912 whether the indicated advertising list file 308 can be opened. If no advertising list file 308 can be opened, an advertisement file directory within the local memory of the playback computer 124 is scanned by the event generation process 900 to determine the most recently used advertising list file 308, as shown in block 916. If the advertising list file 308 found in block 916 is not open, as determined in decision block 920, execution exits the segment processing process 900 by way of terminal block 904.
If an advertising list file 308 is open, as determined by either decision block 912 or decision block 920, the record for the current HH:MM is obtained in block 924. The record obtained in block 924 is returned by the segment processing process 900, as shown in the block 928 and the process 900 is exited by way of terminal block 904.
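Under the assumption that each advertising list record begins with an HH:MM insertion time (the format given in the appendix), the lookup of block 924 might look like the following sketch; the function name and the simple prefix-matching rule are illustrative only:

#include <stdio.h>
#include <string.h>
#include <time.h>

/* Illustrative only: scan an advertising list file (records of the form
   HH:MM|barcode|barcode|...) for the record whose HH:MM matches the
   current system time, as in block 924.                                 */
static int find_current_ad_record(const char *path, char *record, size_t size)
{
    char now[6];
    time_t t = time(NULL);
    strftime(now, sizeof(now), "%H:%M", localtime(&t));

    FILE *fp = fopen(path, "r");
    if (fp == NULL)
        return 0;
    while (fgets(record, (int)size, fp) != NULL) {
        if (record[0] == ';')                /* skip comment lines      */
            continue;
        if (strncmp(record, now, 5) == 0) {  /* HH:MM prefix matches    */
            fclose(fp);
            return 1;
        }
    }
    fclose(fp);
    return 0;
}

int main(void)
{
    char record[256];
    if (find_current_ad_record("1229.AD", record, sizeof(record)))
        printf("%s", record);
    return 0;
}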
Fig. 10 shows the advertising event generation process 1000 for generating advertising events within the multimedia engine 128 of the present invention. Execution of the advertising event generation process 1000 begins with a determination whether the advertising record to be processed is present in memory. This determination uses the advertising record process 900 described hereinabove, and is indicated in decision block 1004. If the record is not present, a default static screen event is scheduled as shown in block 1008 and execution exits the advertising event generation process 1000 by way of terminal block 1010.
If the advertising record to be processed by the event generation process 1000 does exist within memory, as determined in decision block 1004, a determination is made in block 1012 as to which advertising event should be played. This determination is made in a round-robin manner. The advertising record that is returned via record generation 900 is a series of file pointers that point to one or more digital video files or story files in memory. One advertisement is selected from the record returned. The first time the multimedia engine 128 accesses an advertisement in this method a round-robin counter is set. Upon subsequent calls to this record the counter is incremented and the next advertisement is played. When the end of the series is encountered the round-robin counter is reset to one and the first advertisement is replayed.
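A small sketch of that round-robin selection; only the wrap-to-one behaviour is taken from the description above, while the data layout and names are hypothetical:

#include <stdio.h>

/* Illustrative only: pick the next advertisement from a record's series
   of file pointers in round-robin order.  The counter is set on first
   use and wraps back to one past the end of the series.                */
static const char *next_advertisement(const char *ads[], int count, int *counter)
{
    if (*counter < 1 || *counter > count)
        *counter = 1;                  /* first use, or wrap to the start */
    return ads[(*counter)++ - 1];
}

int main(void)
{
    const char *ads[] = { "12345", "23456", "SAMPLE.STY" };
    int counter = 0;                   /* not yet initialised             */
    for (int i = 0; i < 5; i++)
        printf("%s\n", next_advertisement(ads, 3, &counter));
    return 0;
}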
A determination is then made at decision block 1016 whether the advertising event being processed is a story file 306. If the event is a story file 306, the story events for the story file 306 are generated as shown in block 1020. After the story events are generated in block 1020 the advertising event generation process 1000 is complete and execution exits by way of terminal block 1010. If the advertising event being processed by the event generation process 1000 is not a story file 306, as determined in decision block 1016, video file events are scheduled by the multimedia engine 128 as shown in block 1024. After scheduling of the video file events in block 1024, execution exits the advertising event generation process 1000 by way of terminal block 1010.

Referring to Fig. 11, there is shown a process overview 1100 describing the operations of the multimedia engine 128 of the present invention. Within the process overview 1100 an application programming interface 1102 provides an interface for applications to allow them to control the multimedia engine 128. The application programming interface 1102 applies commands and scripting information from the external applications (not shown) to the various primary control subsystems of the multimedia engine 128. Commands can be applied, for example, to the parsing subsystem 1106, the hunting subsystem 1108, the control subsystem 1114 and directly to the graphics rendering subsystem 1118. The application programming interface 1102 permits a consistent view of the multimedia engine 128 for the external applications (i.e., computer programs whose purpose is to control the multimedia engine).
Through the application programming interface 1102 external applications can issue commands to the parsing subsystem 1106 to parse input files and schedule multimedia events based upon the files. The parsing subsystem 1106 reads the passive operation data files 1104 and generates the metafiles 1110 which represent the non-timing related items in the passive operation data files (PODF) or script files 1104. A metafile 1110 is a set of records which represent a series of application programming interface 1102 calls which are to be made to the graphics rendering subsystem 1118. When a metafile 1110 is played, its records are processed sequentially and all of the corresponding application programming interface 1102 calls are made. The metafiles 1110 are then scheduled into the timing subsystem 1116.
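The patent does not define the internal layout of a metafile; purely as an illustration it can be pictured as an ordered list of deferred rendering calls. All names and fields below are hypothetical:

#include <stdio.h>

/* Illustrative only: a metafile as an ordered series of records, each
   standing for one graphics rendering call to be replayed in sequence. */
typedef enum { OP_DRAW_TEXT, OP_DRAW_IMAGE, OP_PLAY_VIDEO } MetaOp;

typedef struct {
    MetaOp op;          /* which rendering call to make                  */
    int    x, y;        /* placement parameters for the call             */
    char   arg[64];     /* text content or file name, as appropriate     */
} MetaRecord;

typedef struct {
    MetaRecord records[32];
    int        count;
} Metafile;

/* Replaying the metafile makes the corresponding calls in order. */
static void play_metafile(const Metafile *mf)
{
    for (int i = 0; i < mf->count; i++)
        printf("call %d: op=%d at (%d,%d) arg=%s\n",
               i, mf->records[i].op, mf->records[i].x, mf->records[i].y,
               mf->records[i].arg);
}

int main(void)
{
    Metafile mf = { { { OP_DRAW_TEXT, 40, 290, "PHILADELPHIA - The Phils sweep the division" } }, 1 };
    play_metafile(&mf);
    return 0;
}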
Also through the application programming interface 1102, external applications may request that the hunting subsystem 1108 identify a set of passive operation data files or PODF's 1104 which should be run at a particular time. The hunting subsystem 1108 checks the system time and identifies the PODF's 1104 which have time-encoded filenames corresponding to the current system time. When the set of data files 1104 has been identified, the parsing subsystem 1106 schedules the set of files for presentation by the multimedia engine 128.
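A sketch of that time match, assuming the MMDDHHMM.SEG naming convention given in the appendix; the helper name and single-candidate comparison are illustrative only, since the actual directory-scanning logic is not described:

#include <stdio.h>
#include <string.h>
#include <time.h>

/* Illustrative only: build the segment file name that encodes the current
   system time (month, day, hour, minute), as the hunting subsystem 1108
   does when looking for time-encoded file names.                          */
static void current_segment_name(char *name, size_t size)
{
    time_t t = time(NULL);
    strftime(name, size, "%m%d%H%M.SEG", localtime(&t));
}

int main(void)
{
    char wanted[16];
    current_segment_name(wanted, sizeof(wanted));
    printf("looking for %s\n", wanted);
    printf("01080155.SEG %s\n",
           strcmp("01080155.SEG", wanted) == 0 ? "matches" : "does not match");
    return 0;
}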
External applications may also use the application programming interface 1102 to directly access the graphics rendering subsystem 1118 as previously described. This permits an external application to directly control the output of a presentation. This direct control allows an alternate means of controlling the multimedia presentation rather than the scripted format provided by the parsing subsystem 1106 and hunting subsystem 1108 (generally, the components of the script manager 1150) of the process overview 1100.
External applications using the multimedia engine 128 by way of the application programming interface 1102 may control the timing of the multimedia engine 128. This is done by an external application (not shown) having a timing loop for calling a polling function in the application programming interface 1102. This polling function in the application programming interface 1102 distributes the polling into the control subsystem 1114 which controls the timing subsystem 1116 and the hunting subsystem 1108. When a poll in the control subsystem 1114 occurs, the poll is transferred to the timing subsystem 1116. If the timing subsystem 1116 identifies an event as requiring action, it returns that event to the control subsystem 1114. A metafile 1110 is then retrieved from the event and it is passed into the graphics rendering subsystem 1118 for processing. If the timing subsystem 1116 is then empty, either the hunting subsystem 1108 is called to schedule another set of files through hunting or playback is stopped. An external application can also directly stop the presentation by way of the application programming interface 1102.
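The external timing loop can be pictured as a poll-and-dispatch cycle. The function below is a stand-in written for illustration; it is not the actual application programming interface 1102:

#include <stdio.h>

/* Illustrative only: an external application drives the engine by calling
   a polling function on a timer; the poll returns a due event (if any),
   whose metafile is then handed to the graphics rendering subsystem.      */
typedef struct { int due; const char *metafile; } Event;

/* Stand-in for the engine's poll: returns 1 and fills *ev when an event
   in the timing subsystem has come due, 0 otherwise.                      */
static int engine_poll(int tick, Event *ev)
{
    if (tick % 3 == 0) {               /* pretend an event is due every 3 ticks */
        ev->due = tick;
        ev->metafile = "page metafile";
        return 1;
    }
    return 0;
}

int main(void)
{
    for (int tick = 0; tick < 10; tick++) {   /* the application's timing loop */
        Event ev;
        if (engine_poll(tick, &ev))
            printf("tick %d: render %s\n", ev.due, ev.metafile);
    }
    return 0;
}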
Among the components within the renderer 1151 is the graphics rendering subsystem 1118, which operates in a plurality of modes. The graphics rendering subsystem 1118 supports a set of functions which directly control multimedia presentation hardware within the multimedia engine 128. These functions can be called by external applications by way of the application programming interface 1102, or by a metafile 1110.
The graphics rendering subsystem 1118 provides a number of features within the multimedia engine 128. It may provide a hardware independent application program interface which external applications can use. This is possible because all of the functions in the graphics rendering subsystem 1118 are propagated to a similar function in the hardware abstraction layer 1128. It provides high level control of the font rasterizer 1122, the graphics rasterizer 1120, and the digital video decoder 234 to provide a logical multimedia presentation. This logical presentation is coordinated by preventing the user from requesting an illogical sequence of actions, by providing control of transitions from complete graphical presentations to video presentations, and by handling exceptional cases such as video playback failure.
The graphics rasterizer 1120 of the process overview 1100 is called by the graphics rendering subsystem 1118 to provide raster display operations. This includes drawing lines and polygons as well as displaying graphics files 1124. The graphics rasterizer 1120 loads the graphics files 1124 if required and calls the graphics hardware interface 1134 section of the hardware abstraction layer 1128 to draw an item on the graphics hardware (graphics generator/video overlay card) 230.
The font rasterizer 1122 is called by the graphics rendering subsystem 1118 to provide anti-aliased font-based raster display operations. The font rasterizer 1122 is provided with a font specification, text content, and position and provides a font from a font file 1126. It also performs anti-aliasing and calls the graphics hardware interface 1134 section of the hardware abstraction layer 1128 to display the desired text in the appropriate font. The font rasterizer 1122 also permits caching of font rasterization to provide greater performance.

The graphics renderer 1118 directly calls functions in the hardware abstraction layer 1128 for controlling the digital video decoder 234 and the digital video file storage on disk 238. The digital video decoder 234 includes its own application program interface which is abstracted by the video decoder interface 1132 of the hardware abstraction layer 1128. Through the video decoder interface 1132, the graphics rendering subsystem 1118 can request that the video decoder 234 play a particular video file. The video decoder interface 1132 first verifies the existence of the digital video file on the disk 238, and if the file exists, starts the video decoder 234 playback of that file. The video decoder 234 may send asynchronous messages to the multimedia engine 128. These messages are directed to the external application, whose responsibility it is to return them to the playback computer 124 by way of the application programming interface 1102. These messages are asynchronous and are relayed by the application programming interface 1102 to the control module 1114 and then to the graphics rendering subsystem 1118. The control module 1114 may monitor these messages in the case of serious errors on the video decoder 234 which require the halting or modification of the presentation in the timing subsystem 1116. When the graphics rendering subsystem 1118 receives an asynchronous message from the video decoder 234 it applies the message to the video decoder interface 1132 which determines how to handle the message. Based upon the decision made by the video decoder interface 1132, the graphics rendering subsystem 1118 may make modifications in the presentation.
Referring to Fig. 12, there is shown a block diagram representation 1200 of a method for generating queue events for a page of a story file 306 within the multimedia engine 128 of the present invention. Entry of the method of block diagram representation 1200 is by way of decision block 812 and path 813 of the story event process 800. As previously described, decision block 812 determines whether a page of a story is to be displayed. If the page is to be displayed its queue events must be generated. Thus the story page is applied to block 1202, which determines the time of the page in the story, and possibly the position of the story within a segment. Process block 1204 then creates an empty metafile 1110. Records are added to the metafile 1110 to correspond to the items on the story page.
A record of the story page is read in block 1206. Decision block 1208 determines whether the file is finished. If the file is finished, block 1226 creates an event in the event queue for the time identified by block 1202 and block 1228 places the metafile 1110 inside the event created by block 1226. The procedure of diagram 1200 is then terminated.
If a record exists in the story page, execution of process 1200 proceeds to decision block 1210. Decision block 1210 determines whether the story page record represents a change of a template 310. If it does, block 1216 attempts to select the specified template. Process control is then returned to decision block 1208 from block 1216. If the story page record is not a change of template, execution proceeds to decision block 1212. Decision block 1212 determines whether the story page record requires a template 310. If it does, control passes to block 1218; otherwise it passes to block 1214. Block 1218 looks up the corresponding template record specified by the story page record. If the record does not exist, the record is ignored and execution proceeds from block 1220 to block 1224 and block 1208. If the record does exist, block 1222 merges the story page and template information into a single record to be placed within the metafile 1110. Block 1214 takes the output of block 1212 and/or block 1222 and creates a record within the metafile created by block 1204 which represents the record identified or created by block 1212 or block 1222. Control then passes back to decision block 1208.

It will be appreciated by those skilled in the art that changes could be made to the embodiments described above without departing from the broad inventive concept thereof. It is understood, therefore, that this invention is not limited to the particular embodiments disclosed, but it is intended to cover modifications within the spirit and scope of the present invention as defined by the appended claims.

APPENDIX
StarNet Multimedia Scripting Language (SMSL)

Language Specification

Version 1.5

PURPOSE OF THIS DOCUMENT
WHAT IS THE SMME?
WHAT IS THE SMSL?
WHAT IS STILL TO COME IN THE SMSL?
HOW DOES THE SMME WORK?
ENGINE OPERATING MODES
    NORMAL MODE
    HUNT MODE
SMME INPUT FILES AND SOURCES
    TEXTUAL SCRIPT FILES
        STORY FILES (.STY)
        SEGMENT FILES (.SEG)
        ADVERTISEMENT FILES (.AD)
        TEMPLATE FILES (.TPL)
THE SEGMENT FILE
    File naming convention
THE STORY FILE
LANGUAGE DESCRIPTION BY TAG
    PRESENTATION OPERATORS: menu, menutpl, menuback
    STORY OPERATORS: page, txt, msg, wav, back, img, tmpl, hed, sub, tim, vid, ; (used in ALL SMSL script files), lin, box, clk, trns, blk
    SEGMENT OPERATORS: stry, advr
    TEMPLATE OPERATORS: box, lin, txt, msg, grd, hed, img, vid, sub, tim, clk, blk
DEVICES: comp, dvsn, tape, lasr
TRANSITIONS: Fnnnn, Wxnnnn, Dnnnnn
INLINE FORMATTING AND COMMANDS: snnn, \fontname\f, lnnn, ann, rnnn, gnnn, bnnn
THE ADVERTISING LIST FILE
THE MESSAGE LIST FILE
THE nnnnnnnn.LOG
THE STARNET.INI FILE
Purpose of this document
The purpose of this document is to define the StarNet Multimedia Scripting Language (SMSL). This definition includes what the StarNet Multimedia Engine (SMME) does, how it does it, the input data necessary for its operation and the output it produces as a result of that ongoing operation.
What is the SMME?
The SMME is a software command interpreter which runs in a product specific computer at a cable head end and produces output consisting of still pictures, full motion video, rendered text and graphics and sound which is in turn fed to the cable system for broadcasting.
What is the SMSL?
The SMSL is the multimedia scripting language invented by Gerard Kunkel of StarNet and refined by Michael Heydt of StarNet. The language addresses all of the capabilities of the SMME in a simple collection of file formats all stored as ASCII text files.
What is still to come in the SMSL?
The SMSL was originally conceived as an all purpose multimedia scripting language for use in passive or interactive television. As such, the language will be expanded during the fall of 1994 to address the new interactive needs of television set-top boxes (STBs). Of note will be the client-server nature of data access to feed the engine and the expanded graphics capabilities of the new dedicated graphics chips to do graphic sprites. Other operator needs will be identified as the API for the set-top box is disclosed. In the meantime, the general Windows 3.1 API will be used as a model for additional operators.
One very important change coming to the SMSL is the support of VGA for graphics output. Currently the engine talks directly to graphics hardware for its graphics display. Since the STBs appear to be supporting VGA as a standard for graphics overlay, the engine will be modified to use all of the Windows 3.1 API calls for graphics rendering and control.
How Does the SMME Work?
The SMME takes as input the SMSL files, which consist of presentation sequencing information, and data files in the form of graphic images, digital video sequences and digitized sound. Some of the control information that the SMME receives directs it to access several different types of playback devices in order to produce the necessary output. These playback devices include, and are currently limited to: digital video subsystems (DVSS from Scientific Atlanta), digital video adapters (Optibase PC Motion), video tape recorders (VTRs) and laser disk players.
It is not within the scope of this document to describe the method by which information is received by the SMME. We will assume for the sake of simplicity that any necessary information is available for the engine's use before it is needed. For a detailed description of the communications methods and interactions between StarNet equipment in a cable head end, see the StarNet Communications Server Specification and StarNet Multimedia Hardware Specification.

Engine operating modes
SMMEFRNT.EXE is a software application that interfaces between data files and the engine itself. SMMEFRNT is a software front end that allows a user to define one of two operating modes for multimedia playback. There are other uses for the SMMEFRNT application that are not relevant here. For more detail see the document StarNet Multimedia Engine Front End Application.
Normal mode
If a SEGMENT file plays, ends and is not replaced by a newer segment file, the current segment file will stop playing and the program will stop.
Hunt mode
If a SEGMENT file plays, ends and is not replaced by a newer segment file, the current segment file will continue looping until a newer SEGMENT file is found.
SMME Input Files and Sources
The SMME takes as input 4 fundamentally different types of information: textual script files, bit mapped graphic images, digitized sound, and digital or analog full-motion video. This information comes from four different storage devices: hard disk, digital video subsystem (which may actually be a hard disk attached to another device), video tape and laser disk. This document is responsible for describing the contents of only the textual script files.
Textual Script Files
There are 4 types of script files that the SMME reads and interprets: Segment files, Story files, Control files and Advertisement files. The Segment file (.SEG) contains the basic sequence of events for the presentation of a multimedia sequence. It references two other types of files: Stories (.STYs) and the Advertisements file (.AD). The CONTROL files (.CTL) are used only in the News Channel product and have no bearing on the operation of the SMME.
Story Files (.STY)
The term Story comes from the News Channel project (see other documentation) and is defined as a logical sequence of multimedia events with a specific application. The application may be a news story, a movie promotion, an event promotion, an advertisement, etc. The Story file contains all of the details for executing the individual audio and visual transitions and events which represent a Story.
Segment Files (.SEG)
The SEGMENT file is a collection of Stories and Advertisements.
Advertisement Files (.AD)

There is only one Advertisements file which lists all of the Advertisements which are scheduled for a certain amount of time. The SMME keeps track of which ads have been played and simply runs sequentially through the file.
Template Files (.TPL)
The template file contains the geometry for rendering a page in the multimedia presentation. Each of the tags contained in the TPL file corresponds to a matching tag in the STORY file.
DESCRIPTION TO COME:

; story template
<hed1>40|78|432|61|0|55|55|uvcoo|0|0|0|3|0|0|0
<img1>40|160|432|126
<txt1>40|290|432|130|0|22|22|hv|128|0|0|3|0|0|0
<vid1>0|0|0|512|486
File Formats
Each of the SMME textual control files is written using the language specification called the StarNet Multimedia Scripting Language (SMSL). This language borrows the constructs of a number of existing languages: C, SGML, and the Windows INI file format.
The following line shows the general format of a line within either a Story or Segment file:

<TAG>StartTime|Duration|Device|Other (Filename etc.)|...
Where:
<TAG>       one of the tags described in the SMEPDL section of this document
StartTime   HH:MM:SS
Duration    MM:SS
Device      described in a later section, defaults to COMP
Other       tag specific parameters
General standards for initialization and data files include the following:
• a semicolon (;) at the beginning of a line signals a comment
• all time magnitudes will be expressed in milliseconds where granularity of less than one second may be necessary and in MM:SS form where larger spans of time are needed.
The Segment File
The SEGMENT file is a list of events that the multimedia engine will play. The SEGMENT file points to STORY files that contain the actual data for display. The SEGMENT file is a higher level file that simply sequences STORY files. This relationship is illustrated in the following diagram.
[Diagram: the Segment file references a sequence of Story files, an Advertising file and comment lines; each Story file may in turn reference a Template file, a Background file, a Wave file and a Video device.]

File naming convention
The name of the SEGMENT file is very important to the correct operation of the SMME. The DOS eight character filename is used to represent the exact minute of the day/month/year that the SEGMENT is to begin operation. Below is an example of the SEGMENT file naming convention:
Month   Day    Hour    Minute   Extension
1-12    1-31   00-23   00-59    .SEG

or

01080155.SEG

Below is a sample NewsChannel SEGMENT file.
<3i-.ry>00 :' ύ .00100: oi : 000001S. STY
<3-r>' •" 1 .40100: 15: : : . oβo:.-..
<s-.ry>on 55100: 19
<5-.r.->oo 14100: 12 "TOCCT .
<3try> 0 o: 2SI00: IS: .,0010003.
<stry>00 oi 42100: 15: ,.COOOOGA.
<3*:ry 00 01 :7i00: 19i 10000002.
<3<:ry>00 02 15100: 12: 7000006. ST
<3try>00 02 22100: 3000000.
<3try>Q0 03 10100: 0000003. STY
<s<:ry>00 03 22! 00: 11. OOOOOOB.STY
<a^ry>00 03 33100: 15: 'OOOOOOC.STY
<3Hry>00 03 48100: 24110000013.STY
<stry>00 04 13100: 121 : 0000008. STY
<3ϊrγ><)0 04 2SI00: 11! : 0000005. STY
<scry>00 04 37100: 151 ICOOOOC .STY
<3try>00 04 =2100: 17: I0000Q1S. STY
<stry>00 OS 40100: 121 '0000030.STY
<scty>00 05 52100: 12U000003C.STY
<str/00 06 04100: 121 i 0000030.STY
<stry>oo 06 16100:12110000030.STY
<stryx)θ 06 23100: 12! 10000030.STY
<3try>oo 06 40100: 1 110000030.STY
<stry>00 06 52100: 12110000030.STY
<snryx>0 07 04100: 12110000030.STY
<3try 00 07 16100: 12110000030.STY
<strγ>O0 07 23100: Hi 10000030. sτr
<advr>00 07 40100: 30iaa.ini
<scr> 0 03 10100:21: I000002A. STY
<stry>00 03 31100:211 I000002A.STY
<atry>00 C3 53100: 000C2Λ. TY
<stry>00 05 15100:211 :000002A.STY
<3tr> 0 05 100:211 '000002A.STY
<βtry>00 03 53100: Oil .000002A.STY
Below is a sample Barker SEGMENT file representing 5 minutes of presentation:

<stry>00:00:00|01:00||highlite.sty
<stry>00:01:00|00:15||schedule.sty
<stry>00:01:15|00:15||howtoord.sty
<stry>00:01:30|00:30||upnextvi.sty
<stry>00:02:00|00:15||schedule.sty
<stry>00:02:15|00:15||howtoord.sty
<stry>00:02:30|01:00||highlite.sty
<stry>00:03:30|00:15||schedule.sty
<stry>00:03:45|00:15||howtoord.sty
<stry>00:04:00|00:30||upnextvi.sty
<stry>00:04:30|00:15||schedule.sty
<stry>00:04:45|00:15||howtoord.sty

The Story File
The STORY file is a collection of events and pointers. The events listed in the STORY file occupy one line per event. Each event reference must have a relative starting time and a relative duration (dwell). The starting time is relative to the start of the STORY sequence. The duration is relative to the start of the individual event. Each event is labeled with a tag denoted in greater and less-than symbols (<>).
A NewsChannel Story File:
<t!EDi>:0:00:00!0C > vπs2tι r . PL <haαl> ;0 : 00 : 0T "" . aαs _* e—<: .. —> •
«-i'ioi "-*:ππ:n,-..-.- Az-inis r ion d aies ϊ'lrooe.r. ;;..rτe <"".χ-l>Λ : 00 : 001 ff. :10ι I ASH.D.C. — merican traαe partners, cracinα :'rr -.-.a 21.--:a aαaiai- όtra-.isa's f rrt -ret c: "raαe sanctions, are angry ana ccr.: sea. r.
IZ'iicais it.-.in tr.e I-rooean ;aιamunιty's Execut ve 2c3nιs3ion <.ιagl>'? : 00 : 001 no : 2£ i : : JOOlll . -3a
<t:<tl>00:00:10.00: lθι . rβei the President has indicated he is roπmi eα -.c :ree -.race out ir a. o receot va to orotectisr.isai. \n
Administration orf c ais resDonαeα t.-ere LS no mconsis-.- <txtl>CO:00:20IOO:06l ι escy in the emerging U.S. ooncy ana tr.at traae aioicsscv ran no -eager _= neativ iaoeled.
A Barker Story File:

; schedule screen
<page>1
<tmpl>schedule.tpl
<vid1>00:00|00:15||schedule.vid
<tx i>0O:OOI00:0'" 110AM
<tx-2>CO:OOI00:0-': ICh 45
<tχt3>00:OOIOO:0" 'A River Runs Througr. It \s030PG\n\s0401-8C0-CINEMA-l S3. =5
<tχ 4>00:00l00:0-l i 10AM
<txt5>00:00l00:0-: :Ch 46
< x β>00 : 00100 : 0' i Sram Stoker's Dracula
34.35
<-Jttl>00:07|00:OSI '10AM
< xt2>CO:07IOO:38ι ICh 4"
<-x 3>00:07l00::S! iTsy'ϊ
Figure imgf000043_0004
S3.95
<--Xt4>GO:07|00:0B I 112?!!
<txt;>00:07|00:05! 'Ch
<tx ό>00:0 lOO:JS .. A Rivar Runs Througn Z:
Figure imgf000043_0005
34.55
Language description by tag
PRESENTATION operators

menu
SYNTAX:
<menun_rt>title|director|description
DESCRIPTION: Signifies the creation of a new menu of choices. The subsequent menu items are listed in series following the initial menu item. The engine assumes base 0 for both the menu and item numbering scheme.
SAMPLE:
<menuύ_0x:iάS3i; ed advert si.-. i men'.l rThe latest ir. rti -ias :', r saia, rea estate ana personals tba world' a firs- ι__-βraβ__.vafprβββnca,c_.on file
jmd saccop bααcaa
menutpl
SYNTAX:
<menutpl>filename
DESCRIPTION: Specifies the template file to use for building a menu list in the interactive presentation.
SAMPLE:
<menutpl>menudef.tpl

menuback

SYNTAX:
<menuback>filename
DESCRIPTION: Specifies the background bitmap file to use for building a menu list in the interactive presentation.
SAMPLE:
STORY operators

page
SYNTAX:
<page>hh:mm:ss|mm:ss|n
DESCRIPTION: Signifies a change of pages in the presentation. It is used primarily as a device for the News Channel but can be applied to any product that requires tracking the changing of pages.
SAMPLE:
txt
SYNTAX:
<txtn>hh:mm:ss|mm:ss||text
DESCRIPTION: Denotes a line of text to be rendered to the display.
SAMPLE:
<txt1>00:00:00|00:15||PHILADELPHIA - The Phils Sweep the Division

msg

SYNTAX:
<msgn>hh:mm:ss|mm:ss
DESCRIPTION: Denotes a line of text to be rendered to the display.
SAMPLE:
<msg1>00:00:00|00:15
wav
SYNTAX:
<wavn>hh:mm:ss|mm:ss|||filename
DESCRIPTION: Signifies the inclusion of a WAV digital audio file. The actual file is referenced by the trailing filename.
SAMPLE:
<wav1>00:00:00|00:15|||sample.wav
back
SYNTAX:
<back>hh:mm:ss|mm:ss||filename
DESCRIPTION: Filename of TGA file which will be overlayed by live video, graphics or rendered text.
SAMPLE:
<back>00:00:00|00:15||sample.tga
img
SYNTAX:
<imgn>hh:mm:ss|mm:ss||filename
DESCRIPTION: Signifies the inclusion of a TARGA raster file. The actual file is referenced by the trailing filename.
SAMPLE:
<img1>00:00:00|00:15||sample.tga
tmpl
SYNTAX:
<tmpl>hh:mm:ss|mm:ss|filename
DESCRIPTION: Specifies the template file to be used for the current page. The actual file is referenced by the trailing filename. This specification should come immediately after the <page> tag.
SAMPLE:
<tmpl>sample.tpl
hed
SYNTAX:
<hedn>hh:mm:ss|mm:ss||text
DESCRIPTION: Signifies the inclusion of a headline. The actual text is referenced by the last text entry following the transition designation.
SAMPLE:
<hed1>00:00:00|00:15||Phillies Take New York By Storm
sub
SYNTAX:
<subπ>hh:mm:ss|mm:ss||text
DESCRIPTION: Signifies the inclusion of a subhead. The actual text is referrenced by the last text entry following the transition designation.
SAMPLE:
<si-bl>00:00:OOIOO:15l I Fans riot n the streets: tim
SYNTAX:
<timn>hh:mm:ss|mm:ss||time format|date format
DESCRIPTION: Signifies the inclusion of a time stamp on the video display. The location, color, font and size are determined by the corresponding reference in the template file. There are different types of display format for time. The following list describes the available formats.

0   none
1   hh:mm am/pm
2   hh:mm:ss am/pm
3   hh:mm military
4   hh:mm:ss military
5   mm:ss

There are different types of display format for the date. The following list describes the available formats.

0   none
1   mm/dd
2   mm/yy
3   mm/dd/yy
4   mm/dd/yyyy
5   Month dd
6   Month dd, yyyy
7   Month yyyy
8   mm - dd
9   mm - yy
10  mm - dd - yy
11  mm - dd - yyyy

SAMPLE:
<tim1>00:00:00|01:00||1|0
<tim2>00:00:00|01:00||0|5
vid
SYNTAX:
<vidn>hh:mm:ss|mm:ss|device|startaddress|endaddress|filename
DESCRIPTION: Signifies the inclusion of a digital video clip file. The actual file is referenced by the trailing filename or by the starting and ending address on the device.
SAMPLE:
<vid1>00:00:00|00:15|DVS1|||sample.vid
<vid2>00:00:10|00:05|LASR|12345|34567|
<vid3>00:00:15|00:15|TAPE|12345|34567|
; (used in ALL SMSL script files)

SYNTAX:
; any ascii text
DESCRIPTION: Comment line used to annotate the files, ignored by the engine.
SAMPLE:
; J:\PROG\SRC\BARKER\SAMPLE.STY
; Last User: John Doe
; September 6, 1994 11:20:23

lin
SYNTAX:
<linn>hh:mm:ss|mm:ss
DESCRIPTION: Signifies the drawing of a line.
SAMPLE:
<lin1>00:00:00|00:15
box
SYNTAX:
<boxn>hh:mm:ss|mm:ss
DESCRIPTION: Signifies the drawing of a box.
SAMPLE:
<box1>00:00:00|00:15
clk
SYNTAX:
<clkn>hh:mm:ss|mm:ss|start time|interval|direction|time format
DESCRIPTION: Signifies the inclusion of a clock on the video display which will be updated at the specified interval. The location, color, font and size are determined by the corresponding reference in the template file. There are different types of display format for time. The following list describes the available formats.

0   none
1   hh:mm am/pm
2   hh:mm:ss am/pm
3   hh:mm military
4   hh:mm:ss military
5   mm:ss

The direction item specifies whether the clock is to count upwards or downwards. A value of zero will cause the clock time to be incremented by the value specified by the interval parameter. Likewise, a value of 1 will cause the clock time to be decremented (to allow countdown timers).

Note: This operator will only work properly when used on top of a full screen video with text overlay.

SAMPLE:
<clk1>00:00:00|00:15|02:15:00|00:00:01|1|0

trns
SYNTAX:
<trns>hh:mm:ss|mm:ss|type
DESCRIPTION: Signifies the setting of the global variable for transition type. The SMME will read this tag and set its internal transition global variable. Therefore, all page transitions that follow this tag will transition according to the type specified. Please see the section on transitions for a complete description of transition types. The default transition is F1000 which equals a fade transition lasting for 1 second. The following example shows a transition of 1/2 second being directed to the system.
SAMPLE:
; story file
<trns>00:00:00|00:00|F0500
<sub1>00:00:00|00:15||Fans riot in the streets
blk
SYNTAX:
<blkn>hh:mm:ss|mm:ss|on time|off time|text
DESCRIPTION: Signifies the inclusion of a text item which will be flashed on and off on the screen at the defined intervals. On time specifies the duration that the text will be displayed on the screen, and off time specifies the time interval that the item will be removed from the screen before it is displayed again.
Note: This operator will only work properly when used on top of a full screen video with text overlay.
SAMPLE:
<clk1>00:00:00|00:15|02:15:00|00:00:01|1|0
SEGMENT operators

stry
SYNTAX:
<stry>hh:mm:ss|mm:ss||filename
DESCRIPTION: Signifies the inclusion of a story within a segment file. The filename points to a complete story file that will in turn describe the individual multimedia sequences.
SAMPLE:
<stry>00:00:00|00:30||sample.sty

advr
SYNTAX:
<advr>hh:mm:ss|mm:ss|filename
DESCRIPTION: Signifies the inclusion of an advertisement in the sequence of events in the segment file. The filename points to the advertisement playlist supplied to the system. That playlist contains all of the information on where to find the advertisement and what device to get the ad from.
SAMPLE:
<advr>00:00:00|00:15|sample.ad

TEMPLATE operators

box
SYNTAX:
<boxn>llx|lly|w|h|frametype|filltype|lineweight|line R|G|B|fill R|G|B
DESCRIPTION: Signifies the inclusion of a box on the video page. Below is a list of available box frame types:

0   outline frame
1   curved bevel
2   flat bevel
3   shaded bevel
4   hard drop shadow
5   outline w/hard drop shadow
6   soft drop shadow
7   outline w/soft drop shadow
Visual samples:
The last entry in this command line is the frame weight. It is measured in pixels. The color of the frame is determined by the current foreground color as set by the <colr> tag. If the box is filled the fill color is determined by the current background color.
A fill entry of 1 signifies that the box is to be filled. An entry of 0 signifies the box is not filled.
SAMPLE:
<box1>100|240|122|100|5|1|2|128|128|128|64|64|64
lin
SYNTAX:
<linn>x1|y1|x2|y2|line weight|R|G|B
DESCRIPTION: Signifies the inclusion of a line.
SAMPLE:
<lin1>100|200|55|25|3|128|46|46

txt

SYNTAX:
<txtn>x|y|x|y|character spacing|size|leading|fontname|font color R|G|B|shadow size|shadow color R|G|B
DESCRIPTION: Signifies the inclusion of a line of text.
SAMPLE:
; story template
<hed1>40|78|432|61|0|55|55|uvcoo|0|0|0|3|0|0|0
<img1>40|160|432|126
<txt1>40|290|432|130|0|22|22|hv|128|0|0|3|0|0|0
<vid1>0|0|0|512|486
msg
SYNTAX:
<msgn>x|y|x|y|character spacing|size|leading|fontname|font color R|G|B|shadow size|shadow color R|G|B
DESCRIPTION: Signifies the inclusion of a message line. The text for this message is obtained from the message file on disk.
SAMPLE:
; story template
<hed1>40|78|432|61|0|55|55|uvcoo|0|0|0|3|0|0|0
<img1>40|160|432|126
<msg1>40|290|432|120|0|22|22|hv|128|0|0|3|0|0|0
<vid1>0|0|0|512|486
grd
SYNTAX:
<grdn>gradient type|starting R|G|B|ending R|G|B
DESCRIPTION: Specifies a gradient fill. This is a global reference that can be applied, but must also be restored when necessary. The following types may be applied to any filled box.
0 no gradient
1 top-down
2 left-right
Visual samples:
SAMPLE:
<box1>100|240|122|100|5|1|2|128|128|128|64|64|64

hed
SYNTAX:
<hedn>x|y|x|y|character spacing|size|leading|fontname|font color R|G|B|shadow size|shadow color R|G|B
DESCRIPTION: Signifies the inclusion of a line of text. This operator is identical to the txt operator.
SAMPLE:
; story template
<hed1>40|78|432|61|0|55|55|uvcoo|0|0|0|3|0|0|0
<img1>40|160|432|126
<txt1>40|290|432|130|0|22|22|hv|128|0|0|3|0|0|0
<vid1>0|0|0|512|486

img
SYNTAX:
<imgn>x|y|x|y|transparent|red|green|blue
DESCRIPTION: Signifies the inclusion of an image. A particular color in the image may be made transparent by assigning transparent=1 and specifying the RGB value of the color in the following three parameters. If transparent=0 then no colors in the image will be made transparent.
SAMPLE:
; story template
<hed1>40|78|432|61|0|55|55|uvcoo|0|0|0|3|0|0|0
<img1>40|160|432|126|1|0|0|0
<txt1>40|290|432|130|0|22|22|hv|128|0|0|3|0|0|0
<vid1>0|0|0|512|486
vid
SYNTAX:
<vidn>type|x1|y1|x2|y2
DESCRIPTION: Signifies the inclusion of a video clip. If the type field is 0, the video clip will be played full screen and all other items will be rendered on top of the video. If the type field is 1, the video will be played in the window specified by (x1,y1)-(x2,y2).
SAMPLE:
; story template
<hed1>40|78|432|61|0|55|55|uvcoo|0|0|0|3|0|0|0
<img1>40|160|432|126
<txt1>40|290|432|130|0|22|22|hv|128|0|0|3|0|0|0
<vid1>0|0|0|512|486
sub
SYNTAX:
<subn>x|y|x|y|character spacing|size|leading|fontname|font color R|G|B|shadow size|shadow color R|G|B
DESCRIPTION: Signifies the inclusion of a subhead. This operator is identical to the txt operator.
SAMPLE:
; story template
<sub1>40|78|432|61|0|55|55|uvcoo|0|0|0|3|0|0|0
<img1>40|160|432|126
<txt1>40|290|432|130|0|22|22|hv|128|0|0|3|0|0|0
<vid1>0|0|0|512|486
tim
SYNTAX:
<timn>x|y|x|y|character spacing|size|leading|fontname|font color R|G|B|shadow size|shadow color R|G|B
DESCRIPTION: Signifies the inclusion of a time stamp. This operator is identical to the txt operator.
SAMPLE:
; story template
<tim1>40|78|432|61|0|55|55|uvcoo|0|0|0|3|0|0|0
<img1>40|160|432|126
<txt1>40|290|432|130|0|22|22|hv|128|0|0|3|0|0|0
clk
SYNTAX:
<clckn>x|y|x|y|character spacing|size|leading|fontname|font color R|G|B|shadow size|shadow color R|G|B
DESCRIPTION: Signifies the drawing of a clock.
SAMPLE:
; story template
<clck1>40|78|432|61|0|55|55|uvcoo|0|0|0|3|0|0|0
<img1>40|160|432|126
<txt1>40|290|432|130|0|22|22|hv|128|0|0|3|0|0|0
blk
SYNTAX:
<blkn>x|y|x|y|character spacing|size|leading|fontname|font color R|G|B|shadow size|shadow color R|G|B
DESCRIPTION: Signifies the drawing of a flashing text item.
SAMPLE:
; story template
<blk1>40|78|432|61|0|55|55|uvcoo|0|0|0|3|0|0|0
<img1>40|160|432|126
<txt1>40|290|432|130|0|22|22|hv|128|0|0|3|0|0|0
Devices
The devices entry within the story and segment files refers to the physical device attached to the computer system. The four letter acronym tells the Multimedia Engine which device will be used to access data. Below is a description of all of the device types supported by the Engine.

comp
DESCRIPTION: Specifies that the data for this instruction is available on the host computer. It is assumed that the drive and directory are the same as the location of the Engine. If the application using the Engine has a separate set of directories setup by its INI file, then that drive and directory designation is used. Comp is the default device and should never actually be seen in any of these files. If the device parameter is null but the tag actually uses the device parameter, the engine will assume comp.

dvsn
DESCRIPTION: Specifies that the data for this instruction is available on the digital video subsystem. In this case the Scientific Atlanta Playback board is the source. The actual board designation is determined by the numeric value contained in the tag. For instance, if the tag specified dvs1 then the first SA Playback board will be used.

tape
DESCRIPTION: Specifies that the data for this instruction is available on an external tape deck.

lasr
DESCRIPTION: Specifies that the data for this instruction is available on an attached laser disc player.
Transitions
The transitions entry within the story and segment files refers to the type of video transition that is to be used between video pages. This transition is called by the Engine using the Targa transitional effects that are resident on the Targa board. A limited set of transitions exist for this display board, and a smaller number of transitions have been engineered into the Engine.
NOTE: Transition information has moved from an item within all tags to its own tag. The SMME will read this transition and set a global variable for use in all of the following transitions.
The duration of effects in the engine is determined in milliseconds, not frames.
Fnnnn
DESCRIPTION: Specifies that the transition will be a fade. The numeric value contained in this tag instructs the Engine as to the duration of the fade from one page to the next. The value is measured in milliseconds. Therefore a value of 1000 would request a fade of one second.
SAMPLE:
<txt1>00:00:00|00:06||This is a headline

Wxnnnn
DESCRIPTION: Specifies that the transition will be a wipe. The numeric value contained in this tag instructs the Engine as to the type of wipe from one page to the next. The available wipe types are requested from the following table of wipe types.
CODE     WIPE
WRnnnn   wipe to the right across nnnn milliseconds
WLnnnn   wipe to the left across nnnn milliseconds
WUnnnn   wipe up across nnnn milliseconds
WDnnnn   wipe down across nnnn milliseconds
Dnnnnn
DESCRIPTION: Specifies that the transition will be a diagonal wipe. The numeric value contained in this tag instructs the Engine as to the type of wipe from one page to the next and its duration. The first n denotes which wipe type from the following table will be used.
CODE     WIPE
D1nnnn   wipe diagonally from top left to lower right across nnnn milliseconds
D2nnnn   wipe diagonally from lower left to upper right across nnnn milliseconds
D3nnnn   wipe diagonally from upper right to lower left across nnnn milliseconds
D4nnnn   wipe diagonally from lower right to upper left across nnnn milliseconds

Inline Formatting and Commands
Inline formatting allows the individual lines of the STORY file to address multiple character rendering formats. For instance, the News Channel may require the use of italic or underlined text in a single line of text. This is prevalent in headlines when emphasis is required or when the name of a publication is used within the sentence. All of the inline formatting or commands are identified by the backslash. This was used to help keep the Lenfest Multimedia Language in line with some of the other common languages, such as C.

The following list of format and command operators is in no way complete. Others will be added as needed. The current list of operators is in lowercase and all are a single character.

snnn
DESCRIPTION: Specifies that the current point size should be changed to nnn points. The numeric value contained in this tag instructs the Engine as to the point size. Therefore a value of 030 would request a change in point size to 30 points. The value nnn is fixed in length to 3 characters.
RANGE:
6 to 999
SAMPLE:
<txt1>00:00:00|00:06||This is a
\fontname\f

DESCRIPTION: Specifies that the current font should be changed to fontname. Since the fontname is of undetermined length there must be an accompanying \f to mark the end of the fontname.
RANGE:
Any legal and available font name. The default font is the current font.
SAMPLE:
<txt1>00:00:00|00:06||This is a
lnnn

DESCRIPTION: Specifies that the current leading should be changed to nnn points. The numeric value contained in this tag instructs the Engine as to the leading size. Therefore a value of 030 would request a change in leading to 30 points. The value nnn is fixed in length to 3 characters.
RANGE:
6 to 999
SAMPLE:
<txt1>00:00:00|00:06||This is a
ann
DESCRIPTION: Specifies that a font attribute is being requested. The numeric value contained in this tag instructs the Engine as to the type of attribute change. The value nn is fixed in length to 2 characters. The attribute values can be added together to create composite attribute types such as 06, bold-italic; or 10, bold-underline.

ACCEPTABLE ATTRIBUTE TYPES:
normal = 01
bold = 02
italic = 04
underline = 08
SAMPLE:
<txt1>00:00:00|00:06||This is a
rnnn
DESCRIPTION: Specifies that the red component in RGB of the current font color is being requested to change. The numeric value contained in this tag instructs the Engine as to the new red value, ranging from 0 to 255 with 0 being the darkest and 255 the brightest. The value nnn is fixed in length to 3 characters.
SAMPLE:
<txt1>00:00:00|00:06||This is a \s030\l032\a04\r255 headline which has full red

gnnn
DESCRIPTION: Specifies that the green component in RGB of the current font color is being requested to change. The numeric value contained in this tag instructs the Engine as to the new green value, ranging from 0 to 255 with 0 being the darkest and 255 the brightest. The value nnn is fixed in length to 3 characters.

SAMPLE:
<txt1>00:00:00|00:06||This is a \s030\l032\a04\g255 headline which has full green
bnnn
DESCRIPTION: Specifies that the blue component in RGB of the current font color is being requested to change. The numeric value contained in this tag instructs the Engine as to the new blue value, ranging from 0 to 255 with 0 being the darkest and 255 the brightest. The value nnn is fixed in length to 3 characters.

SAMPLE:
<txt1>00:00:00|00:06||This is a \s030\l032\a04\b255 headline which has full blue
The Advertising List File
The advertising list file is used to insert advertising from a separate device, or as a story file into the multimedia presentation. The reason for having a separate list is that the insertion of advertising can be more, or less, dynamic than the multimedia presentation. In the case of the News Channel, it requires news programming to be fluid throughout the day. It cannot be burdened by managing the trafficking of advertising.
To handle advertising, the advertising list file is a sequential list of advertising insertion times and other IDs. This file should be read during the multimedia presentation as needed to properly insert the next ad in the list whenever the <advr> tag is placed in the SEGMENT file for that presentation.
It is the sole responsibility of the multimedia engine to keep track of the advertising play list and to record the log information describing the success or failure of the ad insertion.
The creation of advertising playlists is handled by the ADLIST.EXE application. The application functions very much like the SEGMENT.EXE application which combines stories to create segments. ADLIST combines individual advertisements into the advertisement list file and manages the files across multiple days.
The advertising play list must be delivered to the Playback Chassis for each day of service. The filename format of this file is as follows:
MMDD.AD
Sample:
0101.AD
0102.AD
0103.AD
1231.AD

which represents January 1 through December 31 of the current year. This file must reside in the following directory:
C:\MMDATA\AD
The format for the actual list file is as follows:
HH:MM|barcode|barcode|barcode|... barcode
The barcode can be replaced by a Story filename if the user wants to playback a multimedia advertisement rather than a digital video file. It would appear as follows:
HH:MM|barcode|SAMPLE.STY|barcode|... barcode
The following is a sample advertising playlist:

; filename: 1229.AD
; advertising play list
; 09/06/94
; creationdate: 08/09/93
; createdfor: NEWSCHANNEL
; systemid: 2442, Coatesville
00:00|12345|23456|34567|45678|56789
00:10|12345|23456|000B123.STY|45678|56789
00:20|12345|23456|34567|45678|56789
>". ιnnnB12 ..rT"'2345όl 4567,
4ul 12345123456, 34567 ' n00B123.SV ' 5 = 7 = 5 5nll2345i<100B123.STY,n00B123.S7Y' 5»7 , 67=9
The local creation system must follow a treed directory structure that is referenced in the ADLIST.INI file. The tree should be as follows:
\NTV
    \AD
        \DST00313
        \DST00314
        \DST00315
        \DST00456

Within each directory will reside the ADLIST files for each day for that site. The ADLIST.INI file is as follows:
ADLIST.INI
; Created on 01/29/94
[sites]
[mpegfiles]
1=23455
2=23456
3=23457
[storyfiles]
1=SAMPLE.STY
2=JUNK.STY
The ADLIST.INI file will be maintained by the ADLIST.EXE software. This application will read and write the .AD files for each site. It will also be the list manager for the available MPEG spots and relative .STY files composed as advertisements.
The Message List File
The message list file is used to insert a dynamic message from a separate text file into the multimedia presentation.
To handle a message, the message list file is a sequential list of message insertion times and other IDs. This file should be read during the multimedia presentation as needed to properly insert the next message in the list whenever the <msgn> tag is placed in the STORY file for that presentation.
It is the sole responsibility of the multimedia engine to keep track of the message play list and to record the log information describing the success or failure of the message insertion.
The creation of message playlists is handled by the MSGLIST.EXE application. The application functions very much like the ADLIST.EXE application. MSGLIST combines individual messages into the message list file and manages the files across multiple days.
The message play list must be delivered to the Playback Chassis for each day of service. The filename format of this file is as follows:
MMDD.MSG
Sample:
0101.MSG
0102.MSG
0103.MSG
1231.MSG

which represents January 1 through December 31 of the current year. This file must reside in the following directory:

C:\MMDATA\MESSAGE

The format for the actual list file is as follows:

HH:MM:SS|HH:MM:SS|message|message|... message

The following is a sample message playlist:
; filename: 1229.MSG
; advertising play list
; 09/06/94
; createdfor: NEWSCHANNEL
; systemid: 2442, Coatesville
00:00:00|00:10:20|The president has been shot|The president has died|Funeral on Friday
23:00:00|23:10:20|The president has been shot|The president has died|Funeral on Friday

The nnnnnnnn.LOG
The creation of an activities log file is extremely important to the overall operations of the system. The name of the log file is determined by the system ID. This way the person viewing the log file that comes back from the field will know which machine it has come from.
The information needed in the log file is as follows:
'"122455. OG
Product - Barker
Location - Coatesville
Initialization successful
1.55:"' IS ~,->nr)2rr,C; .3Z
1. 3..- ?5 JOGAF-14. TV
2.U-: ;C ?S rJθOAF31S ..?.'
1.01:30 ?S 00 A3314.JTY •2:02:00 ?S 0002231 .STY 'ERROR! DVS1 '.2.02:30 ?S 0001-314. STY 32:03:30 ?S 00083314. ST-/ 32:04:00 ?S 00076543.3-' adl-l'00:00:OOIOO:30ll234l2345lGUi3t
ad2-U23:S9:OOIOO:30ll23 l2345IGιllette 13:59:30 ?S 00076543. STY
The LS acronym stands for Load Segment. The PS acronym stands for Play Story.
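A minimal sketch of how such entries might be appended is shown below. The zero-padded log filename derived from the system ID and the exact column layout (time, LS/PS code, file, optional error marker) are assumptions based on the sample above, not a definitive format.

from datetime import datetime

def log_path(system_id):
    """nnnnnnnn.LOG, named from the system id (assumed zero-padded to eight digits)."""
    return "%08d.LOG" % int(system_id)

def log_event(system_id, action, filename, error=None):
    """Append one entry; action is 'LS' (Load Segment) or 'PS' (Play Story)."""
    stamp = datetime.now().strftime("%H:%M:%S")
    line = "%s %s %s" % (stamp, action, filename)
    if error:
        line += " !ERROR! %s" % error  # e.g. a digital video server fault code
    with open(log_path(system_id), "a") as f:
        f.write(line + "\n")

# log_event(123455, "PS", "00076543.STY")
# log_event(123455, "PS", "00022314.STY", error="DVS1")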
The creation of advertising entries within the log file is important from a billing perspective and from a basic system maintenance perspective. The log file should report back the basic success or failure of an advertising insertion. A successful insertion is referenced as a 1 and a failure as a 0. It is the first entry item within an entry line. The remainder of the line is exactly the same as the advertising INI file. This will allow for simple searches and retrieval of valuable diagnostic information. Advertisement logging will occur in the multimedia engine's log file and will be split out, if necessary, to a separate log using another program.

ad1-1|00:00:00|00:30|1234|2345|Gillette
ad2-1|23:59:00|00:30|1234|2345|Gillette
adn-1|23:59:30|00:30|4567|5678|Suburban Cable

The STARNET.INI File
STARNET.INI
StarNet, Inc. Multimedia Labs
1232 Enterprise Drive, Suite 200

Purpose: This file is used to describe the system profile for any one of the StarNet PC-based products.

History:
05/26/93      created
06/02/93      modified the system ID information to match what is happening in the VB FORM_SYS.FRM file.
09/24/93  HJP added [version] section with "version=". Major version changes should only be made if the format or content of an existing section changes or any section is removed. Minor version changes will reflect the addition of new sections.

[version]
version=1
[system]
; the system id is used by communications to uniquely identify all inbound communications.
; the system name is the text name for the system.
; the system phone data is the phone number for dialing into the machine.
; the system phone voice is the phone number for calling the system contact.
; the system phone contact is the name of the system representative.
; time zone is measured in number of hours offset from GMT.
parentoperator=TCI
systemoperator=Suburban Cable
contact1=John Smith
phone1=(215) 555-1212
contact2=Jane Jones
phone2=(215) 555-1212
systemaddress=Coatesville, PA
numberofsubs=80,000
timezone=0

[directories]
template=c:\template
font=c:\fonts
jukebox=c:\jukebox
image=c:
background=c:
wave=c:\wav
message=c:\message
[barker]
; the starnet product line identifies which service is on this system
; the starnet phone home number is the number that this machine will call when a problem occurs
; the mpeg line tells the system if MPEG files are to be used.
ppvnumber=(215) 555-1212
systemnumber=(215) 555-1212
phonehome=(215) 692-0000
systemid=123456.123456
stereo=yes
mpeg=yes
commtype=1
systemnotes=Modified the Barker INI to accommodate new system id.
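To illustrate how these settings might be consumed, the sketch below assembles a small profile from the parsed sections, reusing the read_simple_ini sketch shown earlier for ADLIST.INI. It is a sketch only; the selection of keys follows the sample above and the defaults are invented for illustration.

def load_profile(path="STARNET.INI"):
    cfg = read_simple_ini(path)  # tolerant INI reader sketched earlier
    system = cfg.get("system", {})
    barker = cfg.get("barker", {})
    return {
        "operator": system.get("systemoperator"),
        "timezone_offset_gmt": int(system.get("timezone", "0")),
        "phone_home": barker.get("phonehome"),
        "system_id": barker.get("systemid"),
        "mpeg_enabled": barker.get("mpeg", "no").lower() == "yes",
    }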
Channel Map
The data structure:

Days of the week are listed as integer values that correspond to a bit-mask in the reading and writing software. This is done to conserve space. The individual integer values for days of the week are:

Sunday-1
Monday-2
Tuesday-4
Wednesday-8
Thursday-16
Friday-32
Saturday-64

If all seven days of the week are turned on, then the value is 64+32+16+8+4+2+1, or 127. If Monday and Tuesday are turned on, then the value is 2+4, or 6.
The different definitions per channel are separated by piping symbols.

The time is expressed as a single integer between 0 and 47, in that it represents a half-hour time increment; for example, 25 is equal to 12:30pm and 14 would be 7:00am.
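The day bit-mask and half-hour time encoding can be made concrete with a short sketch. This is illustrative only; the value 32 for Friday is implied by the 127 total rather than stated explicitly above, and the helper names are invented here.

DAY_BITS = {"Sunday": 1, "Monday": 2, "Tuesday": 4, "Wednesday": 8,
            "Thursday": 16, "Friday": 32, "Saturday": 64}

def day_mask(days):
    """OR together the bits for the given day names; all seven days -> 127."""
    mask = 0
    for d in days:
        mask |= DAY_BITS[d]
    return mask

def half_hour_to_clock(slot):
    """Convert a half-hour slot number (0-47, counted from midnight) to HH:MM."""
    hours, half = divmod(slot, 2)
    return "%02d:%02d" % (hours, 30 * half)

# day_mask(["Monday", "Tuesday"]) -> 6
# half_hour_to_clock(25) -> "12:30", half_hour_to_clock(14) -> "07:00"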
The order of appearance follows: CHAN#-CALL LETTERS,DAYS,START-END

[channels]
2-PRISM,127,0-0
3-KYW,127,0-0
4-PREV,127,0-0
5-HBO,127,0-0
6-WPVI,127,0-0
11-WTXF,127,0-0
12-WHYY,127,0-0
13-WGBS,127,0-0
14-SHOW,127,0-0
16-DIS,127,0-0
17-MAX,127,0-0
18-MEU,127,0-0
19-QVC,127,0-0
20-LOCAL,127,0-0
21-HSN,127,0-0
22-CSPAN,127,0-0
23-REQ1,127,0-0
24-REQ2,127,0-0
25-PLAY,127,0-0
26-SPICE,127,0-0
27-USA,127,0-0
28-DSC,127,0-0
29-TNN,127,0-0
30-FAM,127,0-0
31-CNN,127,0-0
32-NICK,127,0-0
33-SC,127,0-0
34-A&E,127,0-0
35-CVS,127,0-0
36-LIFE,127,0-0
37-VC,127,0-0
38-CCM,127,0-0
39-CNBC,127,0-0
40-TBS,127,0-0
41-BET,127,0-0
42-AMC,127,0-0
43-MTV,127,0-0
44-CNN,127,0-0
45-VH1,127,0-0
46-TLC,127,0-0
47-TNT,127,0-0
48-WWOR,127,0-0
49-TWC,127,0-0
51-ENC,127,0-0
52-COURT,127,0-0

Claims

1. A multimedia presentation system capable of dynamically constructing a multimedia presentation comprising: a receiving and storing means for receiving and storing commands and data files, which commands comprise script files or function calls; and a multimedia engine comprising a renderer and an application programming interface or a script manager, which multimedia engine comprises: a translating means for translating human-readable script files or human-readable function calls into commands usable by the renderer; and an executing means for executing the translated commands to create a multimedia presentation comprising audio and video components for broadcast, cable transmission or display.
2. The multimedia presentation system of claim 1, wherein the data files comprise image, video, audio and text files.
3. The multimedia presentation system of claim 1, wherein the script files include at least one segment file.
4. The multimedia presentation system of claim 3, wherein the script and data files already within the receiving and storing means can be substituted before the scheduled broadcast, cable transmission or display time of a multimedia presentation, so long as the substitution occurs before the time and date that the segment, as defined in the segment file, in which the script and data files are to be presented is scheduled for broadcast, cable transmission or display.
5. The multimedia presentation system of claim 3, wherein the at least one segment file encodes the name of at least one story file.
6. The multimedia presentation system of claim 3 comprising an advertisement file stored in the receiving and storing means, wherein the at least one segment file encodes the name of said advertisement file.
7. The multimedia presentation system of claim 6, wherein said advertisement file encodes a list of video file names or story file names, each such video or story file having a duration of defined length.
8. The multimedia presentation system of claim 7, wherein the multimedia engine monitors which video or story files listed in an advertisement file have been broadcast, cable transmitted or displayed and stores the information in the receiving and storing means.
9. The multimedia presentation system of claim 6 comprising a video file stored in the receiving and storing means, wherein the advertisement file names said video file.
10. The multimedia presentation system of claim 5 comprising said at least one story file and at least one template file stored in the receiving and storing means, which story file encodes text and the name of said at least one template file.
11. The multimedia presentation system of claim 10, wherein every segment file encodes a story file starting time and a story file duration in association with each story file named in the segment file.
12. The multimedia presentation system of claim 10 comprising a video file stored in the receiving and storing means and wherein said at least one story file names said video file.
13. The multimedia presentation system of claim 10 comprising an audio file stored in the receiving and storing means and wherein said at least one story file names said audio file.
14. The multimedia presentation system of claim 10 comprising an image file stored in the receiving and storing means and wherein said at least one story file names said image file.
15. The multimedia presentation system of claim 4, wherein the renderer comprises a means of rendering the image stored in said image file by positioning the image in the broadcast, cable transmission or display according to the instructions in the template file.
16. The multimedia presentation system of claim 1, wherein the receiving and storing means comprises a remote receiving means for receiving the script files and data files from a remote source.
17. The multimedia presentation system of claim 16, wherein the remote receiving means comprises a satellite dish.
18. The multimedia presentation system of claim 17 further comprising a cable television system and an output for transmitting the multimedia presentation over the cable television system.
19. The multimedia presentation system of claim 1, wherein the translating means comprises the script manager.
20. The multimedia presentation system of claim 1, wherein the translating means comprises the application programming interface.
21. The multimedia presentation system of claim 1, wherein the translating means comprises the script manager and the application programming interface.
22. The multimedia presentation system of claim 21, wherein the script files encode the name of at least one segment file.
23. The multimedia presentation system of claim 22, wherein the at least one segment file encodes the name of at least one story file.
24. The multimedia presentation system of claim 23 comprising at least one story file stored in the receiving and storing means, which story file encodes text to be used in a multimedia presentation and the name of at least one template file.
25. The multimedia presentation system of claim 24, wherein the at least one segment file encodes the name of at least one advertisement file.
26. A multimedia presentation system capable of dynamically constructing a multimedia presentation for cable transmission to a plurality of cable television users comprising: a receiving and storing means for receiving and storing commands and data files, which commands comprise script files or function calls; at least one segment file, at least one story file and at least one template file stored in the receiving and storing means, wherein said at least one segment file encodes the name of said at least one story file, and said at least one story file encodes text and the name of said at least one template file; and a multimedia engine comprising a renderer, an application programming interface and a script manager, which multimedia engine comprises: a translating means of translating human-readable script files or human-readable function calls into commands usable by the renderer; and an executing means of executing the translated commands to create a multimedia presentation comprising audio and video components for cable transmission.
27. The multimedia presentation system of claim 26, wherein the receiving and storing means comprises a remote receiving means for receiving the script files and data files from a remote source.
28. The multimedia presentation system of claim 27, wherein the remote receiving means comprises a satellite dish.
PCT/US1995/013433 1994-10-11 1995-10-11 Remote platform independent dynamic multimedia engine WO1998029835A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU40035/95A AU4003595A (en) 1994-10-11 1995-10-12 Remote platform independent dynamic multi-media engine

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US32133294A 1994-10-11 1994-10-11
US08/321,332 1994-10-11

Publications (1)

Publication Number Publication Date
WO1998029835A1 true WO1998029835A1 (en) 1998-07-09

Family

ID=23250159

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US1995/013433 WO1998029835A1 (en) 1994-10-11 1995-10-11 Remote platform independent dynamic multimedia engine

Country Status (2)

Country Link
AU (1) AU4003595A (en)
WO (1) WO1998029835A1 (en)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2001018678A2 (en) * 1999-09-07 2001-03-15 Liberate Technologies, L.L.C. Methods, apparatus, and systems for storing, retrieving and playing multimedia data
WO2001026378A1 (en) * 1999-10-06 2001-04-12 Streaming21, Inc. Method and apparatus for managing streaming data
US6434560B1 (en) 1999-07-19 2002-08-13 International Business Machines Corporation Method for accelerated sorting based on data format
GB2382696A (en) * 2001-10-30 2003-06-04 Hewlett Packard Co Multimedia presentation creator
US6725421B1 (en) 1999-06-11 2004-04-20 Liberate Technologies Methods, apparatus, and systems for storing, retrieving and playing multimedia data
WO2007009180A1 (en) * 2005-07-19 2007-01-25 Direct Tv Pty Ltd Presentation content management and creation systems and methods
WO2008004237A3 (en) * 2006-07-06 2008-09-12 Sundaysky Ltd Automatic generation of video from structured content
EP2479756A3 (en) * 2005-11-10 2012-08-15 QDC IP Technologies Pty Ltd Personalised video generation

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5027400A (en) * 1988-08-19 1991-06-25 Hitachi Ltd. Multimedia bidirectional broadcast system
US5319455A (en) * 1990-09-28 1994-06-07 Ictv Inc. System for distributing customized commercials to television viewers
US5318450A (en) * 1989-11-22 1994-06-07 Gte California Incorporated Multimedia distribution system for instructional materials
US5325423A (en) * 1992-11-13 1994-06-28 Multimedia Systems Corporation Interactive multimedia communication system
US5381412A (en) * 1991-10-02 1995-01-10 Canon Kabushiki Kaisha Multimedia communication apparatus
US5434678A (en) * 1993-01-11 1995-07-18 Abecassis; Max Seamless transmission of non-sequential video segments
US5440336A (en) * 1993-07-23 1995-08-08 Electronic Data Systems Corporation System and method for storing and forwarding audio and/or visual information on demand

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5027400A (en) * 1988-08-19 1991-06-25 Hitachi Ltd. Multimedia bidirectional broadcast system
US5318450A (en) * 1989-11-22 1994-06-07 Gte California Incorporated Multimedia distribution system for instructional materials
US5319455A (en) * 1990-09-28 1994-06-07 Ictv Inc. System for distributing customized commercials to television viewers
US5381412A (en) * 1991-10-02 1995-01-10 Canon Kabushiki Kaisha Multimedia communication apparatus
US5325423A (en) * 1992-11-13 1994-06-28 Multimedia Systems Corporation Interactive multimedia communication system
US5434678A (en) * 1993-01-11 1995-07-18 Abecassis; Max Seamless transmission of non-sequential video segments
US5440336A (en) * 1993-07-23 1995-08-08 Electronic Data Systems Corporation System and method for storing and forwarding audio and/or visual information on demand

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
COMPUTER DICTIONARY, MICROSOFT PRESS, 2nd Edition, 1993, page 24. *

Cited By (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6725421B1 (en) 1999-06-11 2004-04-20 Liberate Technologies Methods, apparatus, and systems for storing, retrieving and playing multimedia data
US6434560B1 (en) 1999-07-19 2002-08-13 International Business Machines Corporation Method for accelerated sorting based on data format
WO2001018678A3 (en) * 1999-09-07 2003-12-24 Liberate Technologies L L C Methods, apparatus, and systems for storing, retrieving and playing multimedia data
WO2001018678A2 (en) * 1999-09-07 2001-03-15 Liberate Technologies, L.L.C. Methods, apparatus, and systems for storing, retrieving and playing multimedia data
WO2001026378A1 (en) * 1999-10-06 2001-04-12 Streaming21, Inc. Method and apparatus for managing streaming data
GB2382696A (en) * 2001-10-30 2003-06-04 Hewlett Packard Co Multimedia presentation creator
WO2007009180A1 (en) * 2005-07-19 2007-01-25 Direct Tv Pty Ltd Presentation content management and creation systems and methods
GB2442166A (en) * 2005-07-19 2008-03-26 Direct Tv Pty Ltd Presentation content management and creation systems and methods
EP2479756A3 (en) * 2005-11-10 2012-08-15 QDC IP Technologies Pty Ltd Personalised video generation
US8340493B2 (en) 2006-07-06 2012-12-25 Sundaysky Ltd. Automatic generation of video from structured content
WO2008004237A3 (en) * 2006-07-06 2008-09-12 Sundaysky Ltd Automatic generation of video from structured content
US8913878B2 (en) 2006-07-06 2014-12-16 Sundaysky Ltd. Automatic generation of video from structured content
US9129642B2 (en) 2006-07-06 2015-09-08 Sundaysky Ltd. Automatic generation of video from structured content
US9330719B2 (en) 2006-07-06 2016-05-03 Sundaysky Ltd. Automatic generation of video from structured content
US9508384B2 (en) 2006-07-06 2016-11-29 Sundaysky Ltd. Automatic generation of video from structured content
US9633695B2 (en) 2006-07-06 2017-04-25 Sundaysky Ltd. Automatic generation of video from structured content
US9711179B2 (en) 2006-07-06 2017-07-18 Sundaysky Ltd. Automatic generation of video from structured content
US9997198B2 (en) 2006-07-06 2018-06-12 Sundaysky Ltd. Automatic generation of video from structured content
US10236028B2 (en) 2006-07-06 2019-03-19 Sundaysky Ltd. Automatic generation of video from structured content
US10283164B2 (en) 2006-07-06 2019-05-07 Sundaysky Ltd. Automatic generation of video from structured content
US10755745B2 (en) 2006-07-06 2020-08-25 Sundaysky Ltd. Automatic generation of video from structured content

Also Published As

Publication number Publication date
AU4003595A (en) 1998-07-31

Similar Documents

Publication Publication Date Title
US6240555B1 (en) Interactive entertainment system for presenting supplemental interactive content together with continuous video programs
US6446082B1 (en) Method of receiving time-specified program contents
US6188398B1 (en) Targeting advertising using web pages with video
US6791579B2 (en) Method of enhancing streaming media content
AU770084B2 (en) Methods, apparatus, and systems for storing, retrieving and playing multimedia data
US7380206B1 (en) Data distribution method and apparatus, and data reception method and apparatus
CN1777945B (en) Method and apparatus for synchronous reproduction of main contents recorded on an interactive recording medium and additional contents therefor
US20090228921A1 (en) Content Matching Information Presentation Device and Presentation Method Thereof
US20020129156A1 (en) Plural media data synchronizing system
EP0982947A2 (en) Audio video encoding system with enhanced functionality
US20020059604A1 (en) System and method for linking media content
US20020188959A1 (en) Parallel and synchronized display of augmented multimedia information
US20040034622A1 (en) Applications software and method for authoring and communicating multimedia content in a multimedia object communication and handling platform
US20030206182A1 (en) Synchronized graphical information and time-lapse photography for weather presentations and the like
US20020135698A1 (en) Transmission system, receiver, and broadcast system
US6856331B2 (en) System and method of enriching non-linkable media representations in a network by enabling an overlying hotlink canvas
EP1019852A1 (en) Hierarchical method and system for object-based audiovisual descriptive tagging of images for information retrieval, editing, and manipulation
JP2003530726A (en) Convergence enable DVD and web system
EP0896774A4 (en) Tv with url reception for internet access
US7996451B2 (en) System, method, and multi-level object data structure thereof for browsing multimedia data
WO2011055257A1 (en) Systems and methods for selecting ad objects to insert into video content
WO1998029835A1 (en) Remote platform independent dynamic multimedia engine
US7051272B1 (en) Method for coding a presentation utilizing a sub-presentation having an independent play-out specification
AU765232B2 (en) Methods, apparatus, and systems for storing, retrieving and playing multimedia data
JP2004508605A5 (en)

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AM AU BB BG BR BY CA CN CZ EE FI GE HU IS JP KG KP KR KZ LK LR LT LV MD MG MN MX NO NZ PL RO RU SG SI SK TJ TM TT UA UZ VN

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): KE MW SD SZ UG AT BE CH DE DK ES FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN ML MR NE SN TD TG

DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
NENP Non-entry into the national phase

Ref country code: JP

Ref document number: 96517182

Format of ref document f/p: F

121 Ep: the epo has been informed by wipo that ep was designated in this application
122 Ep: pct application non-entry in european phase