WO2002077750A2 - Authoring platform for synchronized multimedia presentation - Google Patents

Authoring platform for synchronized multimedia presentation

Info

Publication number
WO2002077750A2
WO2002077750A2 (PCT/US2002/000372; US0200372W)
Authority
WO
WIPO (PCT)
Prior art keywords
multimedia file
presentation
file
authoring
window
Prior art date
Application number
PCT/US2002/000372
Other languages
French (fr)
Other versions
WO2002077750A3 (en)
Inventor
Yuewei Wang
Ganesh Jampani
Yenjen Lee
Original Assignee
3Cx, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 3Cx, Inc. filed Critical 3Cx, Inc.
Priority to AU2002311744A1
Publication of WO2002077750A2
Publication of WO2002077750A3

Classifications

    • G PHYSICS
    • G11 INFORMATION STORAGE
    • G11B INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00 Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/10 Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • G11B27/34 Indicating arrangements
    • G PHYSICS
    • G11 INFORMATION STORAGE
    • G11B INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00 Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/10 Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • G PHYSICS
    • G11 INFORMATION STORAGE
    • G11B INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00 Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/10 Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • G11B27/19 Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier
    • G11B27/28 Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording
    • G11B27/32 Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording on separate auxiliary tracks of the same or an auxiliary record carrier

Definitions

  • the present invention relates to a system and method for authoring multimedia presentations that will be displayed on a client terminal attached to a network, and more particularly relates to a system and method allowing an author to time the display of multimedia files relative to a primary video file, resulting in a synchronized multimedia presentation.
  • the quality and functionality of HTML-based applications have drastically increased. Many businesses and educational institutions have implemented HTML applications to service customers, employees and students. Examples of these types of applications include commercials, instructions, shopping, and Internet educational courses. Typically, a user will interface with an HTML-based application that will perform some function (i.e., display a product and corresponding price, give a detailed description of the installation of a certain product, display a student's course schedule).
  • An authoring platform allows a user to synchronize multimedia files together to create a multi-file presentation.
  • This platform may operate on any computer related device that provides necessary processing and storage functionalities for the platform.
  • the platform provides a user interface that allows the user to easily identify indices within a primary multimedia file and a corresponding secondary multimedia file(s) that are synchronized to these indices.
  • the user interface comprises a first display window, a second display window, a text message window, a presentation indices window, and a presentation control window. These windows display either multimedia files or synchronization/control data during the authoring process.
  • the first display window displays a primary multimedia file to which other secondary multimedia files are synchronized.
  • this primary multimedia file is a video file (i.e., mpeg, avi, real, asf, mp3, etc.) although it may be any type of multimedia file including those conforming to VoIP and H.323 standards.
  • the size, shape and color quality of the window may all be adjusted by the viewer during this authoring phase.
  • the second display window displays a secondary multimedia file.
  • the text message window displays text messages or annotations that may be indexed to the primary multimedia file.
  • synchronized text messages may appear within a web browser.
  • this text message window may be used as a chat window on which an author may communicate with other individuals during the authoring process. This embodiment provides for collaboration between multiple authors as a presentation is being created.
  • the presentation indices window allows a user to create indices that synchronize a secondary multimedia file(s) to the primary multimedia file. Specifically, a user identifies a location within the primary multimedia file to which a secondary multimedia file is indexed. This location may be identified by various means including time or frame number. This indexing information is stored within a synchronization file that controls the presentation when it is displayed. Accordingly, during the display of the presentation, a secondary multimedia file will be triggered when the index is crossed resulting in the display of the secondary multimedia file, an audio file being played, or any other multimedia function defined within the secondary multimedia file.
  • the presentation indices window may comprise multiple icons that function to define these indices.
  • the window may have play and pause buttons that allow a user to stop and start the primary multimedia file.
  • the window may also have file browse icons and URL inputs that allow a user to easily identify a secondary multimedia file that is indexed to the primary multimedia file.
  • the synchronization process occurs by creating a plurality of indices that synchronize these files together.
  • the presentation control window comprises a plurality of controls that allow a viewer to manipulate a presentation. For example, a viewer may play, pause, rewind, fast forward or select continual play from a variety of controls within this window.
  • a status bar may be implemented to show a currently displayed frame or time position relative to the entire video file.
  • a user may also control other viewing options within the presentation by accessing controls such as window size, font, font size, font color, and window arrangement.
  • the presentation control window may contain a counter that counts the number of frames that have been displayed as well as displaying the total number of frames within a certain video file.
  • FIG. 1 shows an illustration of a system used to deliver the synchronized multimedia presentation to a display device according to an embodiment of the present invention.
  • Fig. 2 is an illustration of a graphical user interface used to display a synchronized multimedia presentation according to an embodiment of the present invention.
  • FIG. 3 is a flowchart describing a method for authoring a synchronized multimedia presentation according to an embodiment of the present invention.
  • Fig. 4 is a block diagram of a device that may be used to display the synchronized multimedia presentation according to an embodiment of the present invention.
  • FIG. 5 is an illustration of a storage and control device used for a synchronized multimedia presentation according to an embodiment of the present invention.
  • Fig. 8 is a block diagram of a user interface that may be used to define display characteristics of a secondary multimedia file in the synchronized multimedia presentation according to an embodiment of the present invention.
  • FIG. 9 is a detailed block diagram of a user interface that may be used to create a synchronized multimedia file according to an embodiment of the present invention.
  • Fig. 11 is a block diagram of a user interface that controls an index within a synchronized multimedia presentation according to an embodiment of the present invention.
  • Fig. 12 is a flowchart describing a method for defining characteristics of an index within a synchronized multimedia presentation according to an embodiment of the present invention.
  • the present invention also relates to apparatus for performing the operations herein.
  • This apparatus may be specially constructed for the required purposes, or it may comprise a general-purpose computer selectively activated or reconfigured by a computer readable storage medium, such as, but is not limited to, any type of disk including floppy disks, optical disks, CD-ROMs, and magnetic-optical disks, read-only memories (ROMs), random access memories (RAMs), EPROMs, EEPROMs, magnetic or optical cards, or any type of media suitable for storing electronic instructions, and each coupled to a computer system bus.
  • FIG. 1 shows a block diagram describing an embodiment of a system that may be used to author multimedia presentations according to the present invention.
  • a client terminal 105 is connected to a network 110.
  • An author of a multimedia presentation uploads a primary multimedia file, typically a video file, to a video server 115 attached to the network 110.
  • a secondary multimedia file, typically an HTML-based file but possibly any type of file including another video file, is uploaded to a web server 120 attached to the network 110.
  • a database 125 is coupled to the web server 120 for storing indexing information of the primary and secondary multimedia files. This database 125 may be external to the web server 120, as shown, or internal to the web server 120.
  • an author of a presentation containing multimedia files that will be synchronized may first upload each multimedia file to a corresponding server before indexing the display of each of the multimedia files.
  • a primary multimedia file, in this case a video file, is uploaded to the video server 115.
  • Secondary multimedia files are uploaded to a web server 120.
  • there will be only one secondary multimedia file, typically an HTML-based file.
  • multiple secondary multimedia files may be used to implement numerous files being synchronized and concurrently displayed within the presentation.
  • the secondary multimedia files do not need to be HTML-based files but can be any type of file including video, text, and image files.
  • Fig. 2 illustrates a user interface that may be used to synchronize a secondary multimedia file(s) to a primary multimedia file.
  • software operating on the client 105 creates a user interface that operates within a web browser.
  • the web browser is one example of a display device on which the present invention may be shown.
  • Examples of web browsers include Microsoft Explorer and Netscape Navigator.
  • the browser 200 may be partitioned into different windows on which various different types of multimedia files may be shown and synchronized.
  • a first display window 205 displays a primary multimedia file.
  • this primary multimedia file is a video file (i.e., mpeg, avi, real, asf, mp3, etc.) although it may be any type of multimedia file including those conforming to VoIP and H.323 standards.
  • the size, shape and color quality of the window may all be adjusted by the viewer during this authoring phase.
  • a second display window 210 within the browser 200 displays a secondary multimedia file.
  • a text message window 215 may also be included within the web browser
  • in this window 215, text or annotations may be indexed to the primary multimedia file.
  • synchronized text messages may appear within a web browser.
  • this text message window 215 may be used as a chat window on which an author may communicate with other individuals during the authoring process. This embodiment provides for collaboration between multiple authors as a presentation is being created.
  • a presentation indices window 230 within the web browser 200 allows a user to create indices that synchronize a secondary multimedia file(s) to the primary multimedia file. Specifically, a user identifies a location within the primary multimedia file to which a secondary multimedia file is indexed. This location may be identified by various means including time or frame number. This indexing information is stored within a synchronization file that controls the presentation when it is displayed.
  • the presentation indices window 230 may comprise multiple icons that function to define these indices.
  • the window 230 may have play and pause buttons that allow a user to stop and start the primary multimedia file.
  • the window 230 may also have file browse icons and URL inputs that allow a user to easily identify a secondary multimedia file that is indexed to the primary multimedia file.
  • the synchronization process occurs by creating a plurality of indices that synchronize these files together. This process will be described in greater detail below.
  • a presentation control window 235 contains a plurality of controls that allow a viewer to manipulate a presentation. For example, a viewer may play, pause, rewind, fast forward or select continual play from a variety of controls within this window 235. Additionally, a status bar (not shown) may be implemented to show a currently displayed frame or time position relative to the entire video file. A user may also control other viewing options within the presentation by accessing controls such as window size, font, font size, font color, and window arrangement. Also, the presentation control window 235 may contain a counter that counts the number of frames that have been displayed as well as displaying the total number of frames within a certain video file.
  • FIG. 3 illustrates a flowchart describing a method used for authoring synchronized multimedia presentations according to an embodiment of the present invention.
  • a primary multimedia file is displayed 305 allowing a user to easily synchronize a secondary multimedia file(s) to the primary multimedia file.
  • the user may stop 310 the multimedia file at a specific point of time or frame in order to index a secondary multimedia file to that specific moment in the primary multimedia file.
  • a user may identify 315 a secondary multimedia file that will be displayed at that particular index.
  • This secondary multimedia file may be identified by an address path of a file stored in a storage device, a uniform resource locator identifying the file on a network, or any other identifier that allows a particular multimedia file to be identified and stored within the system.
  • this indexing information is stored 320 within a database.
  • this database may be attached to the web server 120 or it may be stored locally at the terminal.
  • the primary multimedia file continues to be displayed 325 and another index may be inserted to synchronize another secondary multimedia file to the primary multimedia file.
  • FIG. 4 shows an example of a client terminal on which the author may create or edit the presentation.
  • the client terminal comprises a control unit 400 coupled, via a bus, to a display 405, a keyboard 410, a cursor control 415, a network controller 420, and an I/O device 425.
  • the control unit 400 is typically a personal computer or computing box attached to a network. However, it may also be a personal digital assistant or any other device able to receive, process and display data. In one embodiment, the control unit 400 has an operating system (i.e., Windows, UNIX, etc.) upon which multiple applications operate.
  • the control unit 400 comprises a processor 450, main memory 435, and a data storage device all connected to a bus 430.
  • a processor 450 processes data signals and may comprise various computing architectures including a complex instruction set computer (CISC) architecture, a reduced instruction set computer (RISC) architecture, or an architecture implementing a combination of instruction sets. Although only a single processor is shown, multiple processors may be attached.
  • Main memory 435 may store instructions and/or data that may be executed by processor 450.
  • the instructions and/or data may comprise code for performing any and/or all of the techniques described herein.
  • Main memory 435 may be a dynamic random access memory (DRAM) device, a static random access memory (SRAM) device, or some other memory device known in the art.
  • the memory 435 preferably includes a web browser 440 of a conventional type that provides access to the Internet and processes HTML, XML or other markup language to generate images on the display device 405.
  • the web browser 440 could be Netscape Navigator or Microsoft Internet Explorer.
  • Data storage device 445 stores data and instructions for processor 450 and may comprise one or more devices including a hard disk drive, a floppy disk drive, a CD-ROM device, a DVD-ROM device, a DVD-RAM device, a DVD-RW device, a flash memory device, or some other mass storage device known in the art.
  • System bus 430 represents a shared bus for communicating information and data throughout control unit 400.
  • System bus 430 may represent one or more buses including an industry standard architecture (ISA) bus, a peripheral component interconnect (PCI) bus, a universal serial bus (USB), or some other bus known in the art to provide similar functionality.
  • Keyboard 410 represents an alphanumeric input device coupled to control unit 400 to communicate information and command selections to processor 450.
  • Cursor control 415 represents a user input device equipped to communicate positional data as well as command selections to processor 450. Cursor control 415 may include a mouse, a trackball, a stylus, a pen, a touch screen, cursor direction keys, or other mechanisms to cause movement of a cursor.
  • Network controller 420 links control unit 400 to a network that may include multiple processing systems.
  • the network of processing systems may comprise a local area network (LAN), a wide area network (WAN) (e.g., the Internet), and/or any other interconnected data path across which multiple devices may communicate.
  • One or more I/O devices 425 are coupled to the system bus 430.
  • the I/O device 425 may be an audio device equipped to receive audio input and transmit audio output. Audio input may be received through various devices including a microphone within audio device 425 and network controller 420. Similarly, audio output may originate from various devices including processor 450 and network controller 420.
  • audio device 425 is a general-purpose audio add-in/expansion card designed for use within a general-purpose computer system.
  • audio device 425 may contain one or more analog-to-digital or digital-to-analog converters, and/or one or more digital signal processors to facilitate audio processing.
  • control unit 400 may include more or less components than those shown in Figure 4 without departing from the spirit and scope of the present invention.
  • control unit 400 may include additional memory, such as, for example, a first or second level cache, or one or more application specific integrated circuits (ASICs).
  • additional components may be coupled to control unit 400 including, for example, image scanning devices, digital still or video cameras, or other devices that may or may not be equipped to capture and/or download electronic data to control unit 400.
  • FIG. 5 shows a more detailed drawing of the main memory 435.
  • the main memory generally comprises an operating system 500 whereon a number of applications 505 operate.
  • a bus 430 couples the operating system 500 to the applications 505.
  • a video storage module 510 is coupled to the bus 430.
  • the video storage module 510 stores a primary video file.
  • the primary video file may be stored locally or streamed in real time from a video server and buffered in the video storage module. Examples of a video storage module include a hard disk drive or a RAM module.
  • an HTML storage module 520 is coupled to the bus 430.
  • the HTML storage module 520 stores secondary multimedia files.
  • These secondary multimedia files may be either converted to an HTML format by an author prior to storage or automatically converted by the HTML storage module 520 to an HTML based file. These secondary multimedia files may be stored locally within the HTML storage module 520, pre-fetched from a web server and buffered in the module 520, or streamed in real time and buffered in the module 520. Examples of an HTML storage module 520 include a hard disk drive or a RAM module.
  • a synchronization control module 525 is coupled to the bus 430. The synchronization control module 525 stores indexing and file addressing information created during the authoring of the presentation. Generally, this information is buffered in the synchronization control module 525 and then transmitted to a web server through a connected network. The web server then stores the information in a database. Examples of a synchronization control module 525 include a hard disk drive or a RAM module.
  • a graphical user interface control module 530 is coupled to a bus 430.
  • the graphical user interface control module 530 stores graphical display options created by the author during the creation of a presentation. Examples of these options include the size of each of the display windows, the volume of the audio, and the duration a secondary multimedia file is displayed.
  • the data may be transmitted to a web server through a connected network. The web server may then either cache the display options internally or store the data within an attached database.
  • Examples of a graphical user interface control module 530 include a hard disk drive or a RAM module.
  • Fig. 6 illustrates a graphical representation of a display device showing a browser window displaying an interface for initializing or modifying the authoring process.
  • an author is given a menu from which she may add, author, view, or list presentations.
  • An "Add a Presentation” window 605 allows an author to add a new presentation to a corresponding directory on a server.
  • An "Author a Presentation” window 610 allows an author to create and edit a presentation that has been added to a directory.
  • a "Presentation View” operation 615 allows an author to view a previously authored presentation and/or review a presentation that was just created.
  • a "Presentation Listings" operation 620 allows an author to view all of the presentations within a specific directory. It is also important to note that there are other operations that an author can do from this interface, including uploading a multimedia file, sending an e-mail allowing the recipient to view a presentation, and creating a Webcast implementing multiple multimedia files.
  • Fig. 7 illustrates a graphical representation of a display device showing a browser window displaying an interface for adding a new presentation to an appropriate directory. As shown in Fig. 7, when adding a presentation to a directory, an author may give the presentation a title 705. Also, there is an input box 710 that allows the author to give a brief description of the presentation.
  • the brief description may be viewed when the author is later browsing the directory containing the presentation.
  • the author may also control the access rights of the presentation via an access control 715.
  • the author may keep the presentation private so that only she may have access to it or may give rights of access to certain individuals. Additionally, the author may unprotect the presentation and permit anyone to view it.
  • the author may define a category that describes the presentation using a category description operation 720. This operation allows an author to categorize the content of the presentation. This categorization allows an individual to easily search presentations by content stored on a server. Examples of categories include educational presentations, movies, advertisements, etc.
  • an author may protect access to the presentation by defining a certain password 725.
  • the author may input ownership information within the owner information box 825. This allows individuals to easily search presentations stored on a server by author. There is also a copyright info box 830 in which the author may input relevant copyright information regarding the multimedia file.
  • the author has control over who may access the uploaded multimedia file.
  • the author may control the access rights of the presentation via an access control 835.
  • the author may keep the presentation private so that only she may have access to it or may give certain individuals rights of access. Additionally, the author may permit anyone to view the presentation.
  • the user may define a category that describes the presentation using a category description operation 840. As previously described, this operation allows an author to categorize the content of the presentation. This categorization allows an individual to easily search presentations by content stored on a server.
  • Examples of categories include educational presentations, movies, advertisements, etc. Finally, an author may protect access to the presentation by defining a certain password 845.
  • Fig. 9 shows a graphical representation of a display device showing a browser window displaying an interface for authoring a presentation after multimedia files have either been uploaded onto a server(s) or stored locally.
  • a web browser 900 is depicted having an addressing window 990 through which a user may access authoring software over a network.
  • a first window 902 displays a primary video file.
  • a second window 905 displays a secondary multimedia file.
  • a status window 915 displays status, indexing, timing and editing information that the author may use in creating a presentation.
  • An annotation/text window 907 displays text appearing during the presentation.
  • An annotation list window 945 displays a list of secondary multimedia file indexes relative to the primary video file. It is important to note that the number and size of the windows are not fixed and may be adjusted by the author according to a specific presentation.
  • the author may first select a primary video file to be displayed in the first window 902. This selection is done by clicking on a select video icon 930 which will either allow the author to browse directories containing uploaded video files or directly input the address of an uploaded video file. Once the video file is selected, it begins to play in the first window 902.
  • the author may control the video by clicking on a play/pause icon 950, a stop icon 955, and a continual play option 960 that restarts the video after it has ended.
  • a first window re-size icon 975 toggles the size of the first window from full screen to a default size.
  • the author may also control other display options by clicking on the option icon 970. After clicking on this icon, a small screen 1000 may appear that allows an author to control what material is displayed within a presentation and how it is displayed. An example of this window is illustrated in Fig. 10. A first area 1005 within the window 1000 allows the author to control what windows are visible on the display. Controls for each of the following are within the area:
  • a textual control area 1035 where the author can control the size, color and font of the text within the display.
  • a first controller entitled “Font Type” 1040 allows the author to select a particular font for the text within the annotation window 907.
  • the status window 915 monitors and displays the current frame or position in time of the current video frame displayed in the first window.
  • a status bar 965 shows the progression of the video by sliding a position indicator as the video plays. The author can watch the position indicator to see approximately what percentage of the video has been shown and what percentage is yet to be shown. Additionally, the author can control which frame is displayed by sliding the position indicator either forward or backwards.
  • a status display 980 shows the author the overall length in time of the video file and the time position of the currently displayed frame in the first window 902. The status display 980 may also show the number of video frames within the video file and the current video frame being displayed within the first window 902.
  • each secondary multimedia file is indexed relative to the primary multimedia file so that each secondary file is displayed when a certain frame or position in time is reached or displayed by the primary file.
  • This synchronization may be achieved in a number of ways. The following describes a preferred system and method for authoring this synchronized multimedia presentation.
  • the primary multimedia file, in this case a video file, is shown in the first window 902.
  • a secondary multimedia file, in this case a Powerpoint™ presentation, is shown in the second window 905.
  • the author is able to control which slide is displayed in the second window 905 by using a slide ruler 920 or a frame forward/back controller 925.
  • the author may index the display of each of the secondary multimedia files relative to the primary multimedia file.
  • the author clicks on the "Add/Edit Index” icon 935, which causes the video in the first window 902 to stop and an index is created.
  • an "Add/Edit a video index” window appears as shown in Figure 10.
  • a time box shows the author how much time has elapsed in the video to where the index was created.
  • an author may provide an address path to a document stored on a storage device.
  • a second way is to select a web page by inputting its URL address as shown in 1030.
  • the user can view that specific secondary multimedia file (i.e., document or web page) by pressing the view button 1040 and the file is displayed in the second window 905.
  • the author may store the created index and the location of that specific file within a database, such as the network attached database 125.
  • a corresponding server may convert the secondary multimedia files automatically to a preferred format during the uploading process or selection process. Additionally, the author may want to convert specific files to different formats to optimize the performance of the presentation.
  • each index is listed in the annotation list window 945. For each index, the time, the text message, slide number and/or the document location is shown 910. As the author creates multiple indexes, the annotation list window 945 will display a list of each of the indexes with a corresponding description 910.
  • the author may then click on each displayed index and the corresponding video frame will be shown in the first window 902, the indexed secondary multimedia file will be shown in the second window 905, and the corresponding text message will be shown in the annotation window 907.
  • the author may quickly review her presentation both during and after the authoring process is completed.
  • the author may also review the presentation by using the slide ruler 920 or the frame forward/back controller 925. Using these two devices allows the author to display the different indexed secondary multimedia files in the second window 905.
  • when a secondary multimedia file is displayed within the second window 905, the corresponding video frame will be shown in the first window 902, the corresponding index will be highlighted within the annotation list window 945, and the text message will be displayed in the annotation window 907.
  • the author may also delete any or all of the indexes during the authoring process or subsequently during an editing process.
  • the author can highlight a specific index within the annotation list window 945. Then, clicking a delete index icon 940 will delete the highlighted index.
  • the author may also use the slide ruler 920 or the frame forward/back controller 925 to highlight a specific index and then click on the delete Index icon 940 to delete the highlighted index.
  • the author may delete all indices by simultaneously highlighting all of the indexes and then clicking on the delete index icon 940. As a result, the author is able to easily control each multimedia file, create indexes between the multimedia files, and review the presentation during the authoring process.
  • the author also has the ability to change an index either during authoring or editing of the presentation. This may be done by first highlighting an index within the annotation list window 945. Next, the author may input an appropriate frame of the primary multimedia/video file. Thereafter, the corresponding secondary file is re-indexed to the inputted frame. Additionally, the author may update a file by simply re-uploading a new file with the same name or changing the file pointer to the new file. There may also be an error detection function that analyzes the presentation either during or after the authoring process. This function checks whether there are any conflicts between indices and/or default settings and warns the author.
  • FIG. 12 illustrates a flowchart of an overview for indexing secondary multimedia files to a primary multimedia file.
  • an author identifies 1205 a secondary multimedia file that will be indexed to a location of the primary multimedia file.
  • the author provides a location of the secondary multimedia file so that during the presentation this secondary multimedia file may be retrieved and displayed or played. For example, as previously mentioned, this location may be an address path on a storage device or a uniform resource locator of a file on a network device.
  • an author may supply 1215 an annotation or text message corresponding to the index. For example, it may be a text message describing the secondary multimedia file or a question to a viewer of the presentation.
  • the author may also control 1220 the display of both the secondary multimedia file and the text message by changing display characteristics. For example, the author may adjust the size, font, and color of the text message. Also, the author may adjust the size of the window displaying the secondary multimedia file or adjust the volume of an audio component of the secondary multimedia file.
  • the author may preview the indexed primary and secondary multimedia files. This may be done by displaying the indexed frame of the primary multimedia file in the primary multimedia window 902 and the indexed secondary multimedia file in the secondary multimedia window 905. An illustrative sketch of this indexing flow appears after this list.
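The indexing flow of Fig. 12 (identify a secondary file and its location, optionally attach an annotation, set display characteristics, then preview) can be summarised in code. The following TypeScript sketch is purely illustrative; the type names, field names, and units are assumptions made for this illustration and are not terms defined in the disclosure.

```typescript
// Illustrative only: a possible in-memory representation of one index created
// by the Fig. 12 flow. All names below are assumptions, not disclosure terms.
interface DisplayOptions {
  textFont?: string;       // 1220: font of the text message
  textSizePt?: number;     // 1220: size of the text message
  textColor?: string;      // 1220: color of the text message
  windowWidthPx?: number;  // 1220: size of the secondary multimedia window
  windowHeightPx?: number;
  audioVolume?: number;    // 1220: volume of an audio component, 0..1
}

interface PresentationIndex {
  atSeconds: number;          // location within the primary multimedia file
  secondaryLocation: string;  // 1205: address path on a storage device or a URL
  annotation?: string;        // 1215: text message corresponding to the index
  display?: DisplayOptions;   // 1220: display characteristics for file and text
}

// Collects the data gathered in the steps above into a single index record;
// the author could then preview it in windows 902 and 905 before storing it.
function createIndex(
  atSeconds: number,
  secondaryLocation: string,
  annotation?: string,
  display?: DisplayOptions
): PresentationIndex {
  return { atSeconds, secondaryLocation, annotation, display };
}
```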

Abstract

An authoring platform and corresponding system and method are described that allow an author to create synchronized multimedia presentations. The authoring platform comprises a first display window (205) that displays a primary multimedia file, a second display window (210) that displays a secondary multimedia file, and a presentation indices window (230) that synchronizes the secondary multimedia file to the primary multimedia file by creating at least one index. An index is created by identifying a location in the primary multimedia file, such as a time location or frame number, and providing a secondary multimedia file that is displayed when the index is crossed. The index may contain other data that controls the characteristics of the display or provides text messaging (215) that is displayed when the index is crossed. Thus, a synchronized multimedia presentation may be created that operates independently of a presenter.

Description

AUTHORING PLATFORM FOR SYNCHRONIZED MULTIMEDIA PRESENTATION
INVENTORS Yuewei Wang Ganesh Jampani Yenjen Lee
CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] This application claims priority from U.S. Provisional Application Serial
No. 60/260,171, "System and Method for Authoring Synchronized Multimedia Presentations," by Yuewei Wang, Ganesh Jampani, and Yenjen Lee, filed January 4, 2001. The subject matter of the provisional application is incorporated herein by reference.
BACKGROUND
A. Technical Field
[0002] The present invention relates to a system and method for authoring multimedia presentations that will be displayed on a client terminal attached to a network, and more particularly relates to a system and method allowing an author to time the display of multimedia files relative to a primary video file, resulting in a synchronized multimedia presentation.
B. Background of the Invention
[0003] Recent advances in technology as well as physical expansions of network infrastructures have increased the available bandwidth of a large number of existing networks, including the Internet. This increase in bandwidth has greatly expanded a typical network's capacity to stream large amounts of data to client terminals on a network. Due to this increase in network capacity, service and content providers' ability to simultaneously stream multiple files to a client has vastly increased over the past year. These advancements have also increased the quality and quantity of files that a provider may deliver to a particular client. For example, in the recent past, if a user wanted to view a video clip, the video file needed to be delivered to and saved locally on a corresponding client terminal before actually viewing the clip. However, currently a provider may deliver content to a client in real-time, allowing a user to view a file as it is being streamed (i.e., Webcasts, server-side cached real-time multimedia files).
[0004] Due to the increase in network capacity, many bandwidth considerations that had often constrained software developers in producing new software applications are no longer relevant. As a result, software products requiring higher bandwidth are being produced and made available to customers. These new software products provide customers greater control and quality in a variety of network and multimedia applications. Additionally, advances in data compression and file protocol have also played a significant role in increasing network utilization. All of these advancements have allowed service providers to effectively stream large multimedia files to a client terminal. Examples of these multimedia files include RealPlayer™ audio/video files, mpeg, avi, and asf.
[0005] The quality and functionality of HTML-based applications have drastically increased. Many businesses and educational institutions have implemented HTML applications to service customers, employees and students. Examples of these types of applications include commercials, instructions, shopping, and Internet educational courses. Typically, a user will interface with an HTML-based application that will perform some function (i.e., display a product and corresponding price, give a detailed description of the installation of a certain product, display a student's course schedule).
[0006] The quality and functionality of audio/video applications have drastically increased as well, due in large part to the increases in network bandwidth. These video applications typically require large amounts of bandwidth in order to function properly on a corresponding client. Content providers, as well as businesses, provide these video applications to generate income or advertise a certain product. Examples of these high bandwidth video applications include Webcasts and streaming video. Typically, a user will only need to initiate a multimedia video/audio stream to display the file on a client terminal.
[0007] Generally, multimedia files, including both HTML-based and audio/video, are viewed independently within a single presentation. For example, when a user is viewing a presentation containing streaming video, the display window generally contains only the streaming video file. Moreover, if files are displayed concurrently then there is no synchronization between the files. Therefore, typical multimedia presentations viewed today lack a certain level of cooperation or synchronization between corresponding multimedia files within the presentation.
[0008] The use of multiple multimedia files that are concurrently displayed allows a user a higher level of interaction and gives a content provider an increase in the quality and quantity of content presentations available for delivery. Additionally, a content provider may easily monitor the use and receive feedback when multimedia files are viewed concurrently. However, the presentation of synchronized multimedia files and the authoring of the presentation can be complex and time consuming. Thus, there is a need for a simple system and method for authoring presentations containing multiple multimedia files that are concurrently displayed on a client terminal.
SUMMARY OF THE INVENTION
[0009] The present invention provides a simple system and method for authoring presentations containing multiple synchronized multimedia files that are displayed concurrently. Specifically, the present invention creates a platform on which rich content presentations are authored by synchronizing at least one secondary multimedia file to a primary multimedia file.
[0010] The system comprises a first server coupled to a network, a second server coupled to the network, a database coupled to the second server, and at least one client coupled to the network. These networked devices allow multiple files to be synchronized and stored on the network. An example of the first server is a video server that streams video across the network and an example of the second server is a web server that transmits multimedia files across the network. As a result, a presentation may be authored locally and stored either locally or remotely on these servers and/or database.
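As a non-authoritative sketch of how these networked components might be described to authoring software, the following TypeScript fragment names the three server-side elements of the summary; the interface, field names, and hostnames are invented for illustration and are not part of the disclosure.

```typescript
// Illustrative configuration of the components named in the summary above;
// all names and values are placeholders, not part of the original disclosure.
interface PresentationSystemConfig {
  videoServer: string; // first server: streams the primary video file
  webServer: string;   // second server: transmits secondary multimedia files
  database: string;    // coupled to the web server: stores indexing information
}

const exampleConfig: PresentationSystemConfig = {
  videoServer: "rtsp://video.example.com",
  webServer: "https://web.example.com",
  database: "db.web.example.com/presentation-indices",
};
```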
[0011] An authoring platform allows a user to synchronize multimedia files together to create a multi-file presentation. This platform may operate on any computer related device that provides necessary processing and storage functionalities for the platform. The platform provides a user interface that allows the user to easily identify indices within a primary multimedia file and a corresponding secondary multimedia file(s) that are synchronized to these indices.
[0012] The user interface comprises a first display window, a second display window, a text message window, a presentation indices window, and a presentation control window. These windows display either multimedia files or synchronization/control data during the authoring process. The first display window displays a primary multimedia file to which other secondary multimedia files are synchronized. Generally, this primary multimedia file is a video file (i.e., mpeg, avi, real, asf, mp3, etc.) although it may be any type of multimedia file including those conforming to VoIP and H.323 standards. The size, shape and color quality of the window may all be adjusted by the viewer during this authoring phase.
[0013] The second display window displays a secondary multimedia file. The secondary multimedia file may be an HTML based file or any type of file including a video file, a Microsoft Powerpoint™ file, an image file, or a word processing file. During the authoring process, this secondary multimedia file is shown and indexed to the primary multimedia file. This synchronization process results in a synchronization file that controls the display of the synchronized multimedia presentation.
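The disclosure does not give a concrete format for this synchronization file. As one hedged possibility, it could be modelled along the following lines in TypeScript; the structure and field names are assumptions made for this sketch, not a format defined by the patent.

```typescript
// Illustrative model of a synchronization file; not a format defined by the
// disclosure. Each index points from a location in the primary file to a
// secondary multimedia file that should be triggered there.
interface SyncIndex {
  atSeconds?: number;    // location in the primary file given as a time, or
  atFrame?: number;      // given as a frame number
  secondaryFile: string; // storage path or URL of the secondary multimedia file
  annotation?: string;   // optional text message shown when the index is crossed
}

interface SyncFile {
  primaryFile: string;   // the primary multimedia file (typically a video file)
  indices: SyncIndex[];  // indices, ordered by position in the primary file
}
```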
[0014] The text message window displays text messages or annotations that may be indexed to the primary multimedia file. Thus, during a synchronized presentation, synchronized text messages may appear within a web browser. In another embodiment, this text message window may be used as a chat window on which an author may communicate with other individuals during the authoring process. This embodiment provides for collaboration between multiple authors as a presentation is being created.
[0015] Although there are only three display windows described in these embodiments, it is important to note that according to the present invention there can be numerous windows displaying numerous multimedia files or text during the authoring process.
[0016] The presentation indices window allows a user to create indices that synchronize a secondary multimedia file(s) to the primary multimedia file. Specifically, a user identifies a location within the primary multimedia file to which a secondary multimedia file is indexed. This location may be identified by various means including time or frame number. This indexing information is stored within a synchronization file that controls the presentation when it is displayed. Accordingly, during the display of the presentation, a secondary multimedia file will be triggered when the index is crossed resulting in the display of the secondary multimedia file, an audio file being played, or any other multimedia function defined within the secondary multimedia file.
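At display time, each index acts as a trigger. A minimal sketch of that trigger check follows, assuming a playback clock measured in seconds; the function and type names are illustrative only and do not come from the disclosure.

```typescript
// Illustrative only: returns the indices whose positions were crossed between
// two playback ticks, so their secondary files can be displayed or played.
interface TimedIndex {
  atSeconds: number;     // location of the index in the primary file
  secondaryFile: string; // secondary multimedia file to trigger
}

function indicesCrossed(
  indices: TimedIndex[],
  previousSeconds: number,
  currentSeconds: number
): TimedIndex[] {
  return indices.filter(
    (idx) => idx.atSeconds > previousSeconds && idx.atSeconds <= currentSeconds
  );
}
```

Called once per playback tick, each returned entry would cause the corresponding secondary file to be shown in the second display window, played as audio, or handled by whatever multimedia function the file defines.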
[0017] The presentation indices window may comprise multiple icons that function to define these indices. For example, the window may have play and pause buttons that allow a user to stop and start the primary multimedia file. Additionally, the window may also have file browse icons and URL inputs that allow a user to easily identify a secondary multimedia file that is indexed to the primary multimedia file. The synchronization process occurs by creating a plurality of indices that synchronize these files together.
[0018] The presentation control window comprises a plurality of controls that allow a viewer to manipulate a presentation. For example, a viewer may play, pause, rewind, fast forward or select continual play from a variety of controls within this window.
Additionally, a status bar may be implemented to show a currently displayed frame or time position relative to the entire video file. A user may also control other viewing options within the presentation by accessing controls such as window size, font, font size, font color, and window arrangement. Also, the presentation control window may contain a counter that counts the number of frames that have been displayed as well as displaying the total number of frames within a certain video file.
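For instance, the status-bar position implied by this frame counter can be derived directly from the two counts it displays. The following is a trivial illustrative calculation, not code from the disclosure.

```typescript
// Illustrative: position of the status bar as a percentage of the video shown,
// from the current frame number and the total number of frames in the file.
function progressPercent(currentFrame: number, totalFrames: number): number {
  if (totalFrames <= 0) return 0;
  return Math.min(100, (currentFrame / totalFrames) * 100);
}
// e.g. frame 450 of an 1800-frame video file -> 25% of the video has been shown
```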
[0019] The features and advantages described in this summary and the following detailed description are not all-inclusive, and particularly, many additional features and advantages will be apparent to one of ordinary skill in the art in view of the drawings, specification, and claims hereof.
BRIEF DESCRIPTION OF THE DRAWINGS
[0020] Fig. 1 shows an illustration of a system used to deliver the synchronized multimedia presentation to a display device according to an embodiment of the present invention.
[0021] Fig. 2 is an illustration of a graphical user interface used to display a synchronized multimedia presentation according to an embodiment of the present invention.
[0022] Fig. 3 is a flowchart describing a method for authoring a synchronized multimedia presentation according to an embodiment of the present invention.
[0023] Fig. 4 is a block diagram of a device that may be used to display the synchronized multimedia presentation according to an embodiment of the present invention.
[0024] Fig. 5 is an illustration of a storage and control device used for a synchronized multimedia presentation according to an embodiment of the present invention.
[0025] Fig. 6 is a block diagram of a user interface that may be used to initiate the authoring of a synchronized multimedia presentation according to an embodiment of the present invention.
[0026] Fig. 7 is a block diagram of a user interface that may be used to define characteristics of a synchronized multimedia presentation according to an embodiment of the present invention.
[0027] Fig. 8 is a block diagram of a user interface that may be used to define display characteristics of a secondary multimedia file in the synchronized multimedia presentation according to an embodiment of the present invention.
[0028] Fig. 9 is a detailed block diagram of a user interface that may be used to create a synchronized multimedia file according to an embodiment of the present invention.
[0029] Fig. 10 is a block diagram of a user interface that controls the presentation of synchronized multimedia files according to an embodiment of the present invention.
[0030] Fig. 11 is a block diagram of a user interface that controls an index within a synchronized multimedia presentation according to an embodiment of the present invention.
[0031] Fig. 12 is a flowchart describing a method for defining characteristics of an index within a synchronized multimedia presentation according to an embodiment of the present invention.
[0032] The figures depict a preferred embodiment of the present invention for purposes of illustration only. One skilled in the art will recognize from the following discussion that alternative embodiments of the structures and methods illustrated herein may be employed without departing from the principles of the invention described herein.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
[0033] A system and method for authoring, modifying, and creating a presentation containing synchronized concurrently displayed files is described. In the following description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the invention. It will be apparent, however, to one skilled in the art that the invention can be practiced without these specific details. In other instances, structures and devices are shown in block diagram form in order to avoid obscuring the invention.
[0034] Reference in the specification to "one embodiment" or "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the invention. The appearances of the phrase "in one embodiment" in various places in the specification are not necessarily all referring to the same embodiment.
[0035] Some portions of the detailed description that follows are presented in terms of algorithms and symbolic representations of operations on data bits within a computer memory. These algorithmic descriptions and representations are the means used by those skilled in the data processing arts to most effectively convey the substance of their work to others skilled in the art. An algorithm is here, and generally, conceived to be a self-consistent sequence of steps leading to a desired result. The steps are those requiring physical manipulations of physical quantities. Usually, though not necessarily, these quantities take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared, and otherwise manipulated. It has proven convenient at times, principally for reasons of common usage, to refer to these signals as bits, values, elements, symbols, characters, terms, numbers, or the like.
[0036] It should be borne in mind, however, that all of these and similar terms are to be associated with the appropriate physical quantities and are merely convenient labels applied to these quantities. Unless specifically stated otherwise as apparent from the following discussion, it is appreciated that throughout the description, discussions utilizing terms such as "authoring" or "defining" or "storing" or "indexing" or "displaying" or the like, refer to the action and processes of a computer system, or similar electronic computing device, that manipulates and transforms data represented as physical (electronic) quantities within the computer system's registers and memories into other data similarly represented as physical quantities within the computer system memories or registers or other such information storage, transmission or display devices.
[0037] The present invention also relates to apparatus for performing the operations herein. This apparatus may be specially constructed for the required purposes, or it may comprise a general-purpose computer selectively activated or reconfigured by a computer readable storage medium, such as, but is not limited to, any type of disk including floppy disks, optical disks, CD-ROMs, and magnetic-optical disks, read-only memories (ROMs), random access memories (RAMs), EPROMs, EEPROMs, magnetic or optical cards, or any type of media suitable for storing electronic instructions, and each coupled to a computer system bus.
[0038] The algorithms and displays presented herein are not inherently related to any particular computer or other apparatus. Various general-purpose systems may be used with programs in accordance with the teachings herein, or it may prove convenient to construct more specialized apparatus to perform the required method steps. The required structure for a variety of these systems will appear from the description below. In addition, the present invention is not described with reference to any particular programming language. It will be appreciated that a variety of programming languages may be used to implement the teachings of the invention as described herein.
[0039] Moreover, the present invention is described below as operating or working with a plurality of servers attached to a network. As such, software implementing the present invention may be stored locally on a client terminal or in conjunction with one or a plurality of servers on a network. Additionally, the software may be stored within other storage devices (i.e., database, SAN) connected to the network.
A. Overview of Synchronized Multimedia Presentation System
[0040] Figure 1 shows a block diagram describing an embodiment of a system that may be used to author multimedia presentations according to the present invention. A client terminal 105 is connected to a network 110. An author of a multimedia presentation uploads a primary multimedia file, typically a video file, to a video server 115 attached to the network 110. A secondary multimedia file, typically an HTML-based file but may be any type of file including another video file, is uploaded to a web server 120 attached to the network 110. A database 125 is coupled to the web server 120 for storing indexing information of the primary and secondary multimedia files. This database 125 may be external to the web server
120, as shown, or may be internal within the web server 120.
[0041] According to a first embodiment of the present invention, an author of a presentation containing multimedia files that will be synchronized may first upload each multimedia file to a corresponding server before indexing the display of each of the multimedia files. A primary multimedia file, in this case a video file, is uploaded to the video server 115. Secondary multimedia files are uploaded to a web server 120. Generally, there will be only one secondary multimedia file, typically an HTML-based file. However, multiple secondary multimedia files may be used to implement numerous files being synchronized and concurrently displayed within the presentation. Additionally, the secondary multimedia files do not need to be HTML-based files but can be any type of file including video, text, and image files. Once the multimedia files have been uploaded onto corresponding servers, the author can begin to create a presentation by synchronizing or indexing each of the multimedia files. Additionally, web pages on the Internet may be included within the presentation by providing the corresponding URL address of the specific page.
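Under this first embodiment the authoring sequence is upload-then-index. The sketch below expresses that sequence in TypeScript; the three helper functions stand in for whatever transport the video server 115, web server 120 and database 125 actually expose, and are assumptions rather than a documented API.

```typescript
// Illustrative upload-then-index sequence for the first embodiment.
// uploadVideo, uploadSecondary and saveIndex are hypothetical placeholders.
async function authorPresentation(
  uploadVideo: (file: Blob) => Promise<string>,     // returns a URL on the video server 115
  uploadSecondary: (file: Blob) => Promise<string>, // returns a URL on the web server 120
  saveIndex: (primaryUrl: string, atSeconds: number, secondaryUrl: string) => Promise<void>,
  primary: Blob,
  secondary: Blob,
  atSeconds: number
): Promise<void> {
  const primaryUrl = await uploadVideo(primary);        // primary file to the video server
  const secondaryUrl = await uploadSecondary(secondary); // secondary file to the web server
  await saveIndex(primaryUrl, atSeconds, secondaryUrl);  // indexing info to the database 125
}
```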
[0042] A second embodiment of the present invention allows an author to create a presentation locally on the client terminal 105 using multimedia files stored on the terminal 105 or on an attached storage device (not shown). The client terminal 105 will need to use a software program to allow the author to index the display of each of the multimedia files within the presentation. Thus, once a primary multimedia file and secondary multimedia file(s) are properly stored in the above system, these files may be indexed to create a synchronized presentation.
[0043] Having just described an overview of a system that allows a user to author synchronized presentations, the devices and methods used within the system to author the synchronized presentations are described below. First, a user interface operating on a client or other computer device that allows multiple files to be synchronized will be described. Second, a method of authoring synchronized multimedia presentations will be described. Third, an exemplary display device capable of authoring synchronized presentations is described in detail. Fourth, various control user interfaces and corresponding methods will be described.
B. User Interface for Authoring Synchronized Multimedia Presentations
[0044] Fig. 2 illustrates a user interface that may be used to synchronize a secondary multimedia file(s) to a primary multimedia file. According to this embodiment, software operating on the client 105 creates a user interface that operates within a web browser.
[0045] The web browser is one example of a display device on which the present invention may be shown. Examples of web browsers include Microsoft Internet Explorer and Netscape Navigator. The browser 200 may be partitioned into different windows on which various types of multimedia files may be shown and synchronized. For example, within the web browser 200, a first display window 205 displays a primary multimedia file. Generally, this primary multimedia file is a video file (i.e., mpeg, avi, real, asf, mp3, etc.) although it may be any type of multimedia file including those conforming to VoIP and H.323 standards. The size, shape and color quality of the window may all be adjusted by the viewer during this authoring phase. [0046] A second display window 210 within the browser 200 displays a secondary multimedia file. The secondary multimedia file may be an HTML-based file or any type of file including a video file, a Microsoft Powerpoint™ file, an image file, or a word processing file. During the authoring process, this secondary multimedia file is shown and indexed to the primary multimedia file. This synchronization process results in a synchronization file that controls the display of the synchronized multimedia presentation. The generation of this synchronization file will be discussed in greater detail below.
[0047] A text message window 215 may also be included within the web browser
200. In this window 215, text or annotations may be indexed to the primary multimedia file. Thus, during a synchronized presentation, synchronized text messages may appear within a web browser. In another embodiment, this text message window 215 may be used as a chat window on which an author may communicate with other individuals during the authoring process. This embodiment provides for collaboration between multiple authors as a presentation is being created.
[0048] Although there are only three display windows described in these embodiments, it is important to note that according to the present invention there can be numerous windows displaying numerous multimedia files or text during the authoring process. [0049] A presentation indices window 230 within the web browser 200 allows a user to create indices that synchronize a secondary multimedia file(s) to the primary multimedia file. Specifically, a user identifies a location within the primary multimedia file to which a secondary multimedia file is indexed. This location may be identified by various means including time or frame number. This indexing information is stored within a synchronization file that controls the presentation when it is displayed. Accordingly, during the display of the presentation, a secondary multimedia file will be triggered when the index is crossed resulting in the display of the secondary multimedia file, an audio file being played, or any other multimedia function defined within the secondary multimedia file. [0050] The presentation indices window 230 may comprise multiple icons that function to define these indices. For example, the window 230 may have play and pause buttons that allow a user to stop and start the primary multimedia file. Additionally, the window 230 may also have file browse icons and URL inputs that allow a user to easily identify a secondary multimedia file that is indexed to the primary multimedia file. The synchronization process occurs by creating a plurality of indices that synchronize these files together. This process will be described in greater detail below.
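One possible shape for the indexing information described above is sketched below in TypeScript; the type and field names are assumptions of this sketch, not a format defined by the specification. The helper at the end illustrates the idea of an index being "crossed" when the primary file reaches its location.

    // Illustrative index record for the synchronization file; all names are assumed.
    interface SyncIndex {
      position: number;              // location in the primary file
      unit: "seconds" | "frames";    // time index or frame index
      secondaryFileLocator: string;  // address path or URL of the secondary multimedia file
      annotation?: string;           // optional text message for the text window
    }

    // During display, the most recently crossed index determines what is shown.
    function activeIndex(
      indices: SyncIndex[],
      current: number,
      unit: "seconds" | "frames"
    ): SyncIndex | undefined {
      const crossed = indices.filter(i => i.unit === unit && i.position <= current);
      return crossed.sort((a, b) => b.position - a.position)[0];
    }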
[0051] A presentation control window 235 contains a plurality of controls that allow a viewer to manipulate a presentation. For example, a viewer may play, pause, rewind, fast forward or select continual play from a variety of controls within this window 235. Additionally, a status bar (not shown) may be implemented to show a currently displayed frame or time position relative to the entire video file. A user may also control other viewing options within the presentation by accessing controls such as window size, font, font size, font color, and window arrangement. Also, the presentation control window 235 may contain a counter that counts the number of frames that have been displayed as well as displaying the total number of frames within a certain video file.
[0052] Fig. 3 illustrates a flowchart describing a method used for authoring synchronized multimedia presentations according to an embodiment of the present invention. A primary multimedia file is displayed 305 allowing a user to easily synchronize a secondary multimedia file(s) to the primary multimedia file. As the primary multimedia file is displayed, the user may stop 310 the multimedia file at a specific point of time or frame in order to index a secondary multimedia file to that specific moment in the primary multimedia file. [0053] Once an index point in the primary multimedia file is identified, a user may identify 315 a secondary multimedia file that will be displayed at that particular index.
This secondary multimedia file may be identified by an address path of a file stored in a storage device, a uniform resource locator identifying the file on a network, or any other identifier that allows a particular multimedia file to be identified and stored within the system.
After the secondary multimedia file is properly indexed to the primary multimedia file, this indexing information is stored 320 within a database. As previously described, this database may be attached to the web server 120 or it may be stored locally at the terminal. Finally, the primary multimedia file continues to be displayed 325 and another index may be inserted to synchronize another secondary multimedia file to the primary multimedia file. Thus, this process allows a user to synchronize a single or multiple secondary multimedia files to a primary multimedia file resulting in a synchronized multimedia presentation that may be viewed by other devices on the network.
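A minimal sketch of the steps of Fig. 3 is given below in TypeScript, assuming hypothetical Player and IndexStore interfaces that are not part of the specification.

    // Assumed interfaces standing in for the video player and the indexing database.
    interface Player {
      pause(): void;
      resume(): void;
      currentPosition(): number; // current time or frame of the primary multimedia file
    }

    interface IndexStore {
      save(index: { position: number; secondaryFileLocator: string }): void;
    }

    function addIndex(player: Player, store: IndexStore, secondaryFileLocator: string): void {
      player.pause();                                 // stop at a specific point (310)
      const position = player.currentPosition();      // the index point in the primary file
      store.save({ position, secondaryFileLocator }); // identify the file and store the index (315, 320)
      player.resume();                                // continue displaying the primary file (325)
    }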
C. Device for Authoring Synchronized Multimedia Presentations
[0054] Figure 4 shows an example of a client terminal on which the author may create or edit the presentation. The client terminal comprises a control unit 400 coupled, via a bus, to a display 405, a keyboard 410, a cursor control 415, a network controller 420, and an I/O device 425.
[0055] The control unit 400 is typically a personal computer or computing box attached to a network. However, it may also be a personal digital assistant or any other device able to receive, process and display data. In one embodiment, the control unit 400 has an operating system (i.e., Windows, UNIX, etc.) upon which multiple applications operate. The control unit 400 comprises a processor 450, main memory 435, and a data storage device, all connected to a bus 430. [0056] A processor 450 processes data signals and may comprise various computing architectures including a complex instruction set computer (CISC) architecture, a reduced instruction set computer (RISC) architecture, or an architecture implementing a combination of instruction sets. Although only a single processor is shown, multiple processors may be attached. [0057] Main memory 435 may store instructions and/or data that may be executed by processor 450. The instructions and/or data may comprise code for performing any and/or all of the techniques described herein. Main memory 435 may be a dynamic random access memory (DRAM) device, a static random access memory (SRAM) device, or some other memory device known in the art. The memory 435 preferably includes a web browser 440 of a conventional type that provides access to the Internet and processes HTML, XML or other markup language to generate images on the display device 405. For example, the web browser 440 could be Netscape Navigator or Microsoft Internet Explorer.
[0058] Data storage device 445 stores data and instructions for processor 450 and may comprise one or more devices including a hard disk drive, a floppy disk drive, a
CD-ROM device, a DVD-ROM device, a DVD-RAM device, a DVD-RW device, a flash memory device, or some other mass storage device known in the art.
[0059] System bus 430 represents a shared bus for communicating information and data throughout control unit 400. System bus 430 may represent one or more buses including an industry standard architecture (ISA) bus, a peripheral component interconnect (PCI) bus, a universal serial bus (USB), or some other bus known in the art to provide similar functionality.
[0060] Additional components coupled to control unit 400 through system bus
430 include display device 405, keyboard 410, cursor control device 415, network controller 420 and audio device 425. Display device 405 represents any device equipped to display electronic images and data as described herein. Display device 405 may be a cathode ray tube (CRT), liquid crystal display (LCD), or any other similarly equipped display device, screen, or monitor. In one embodiment, display device 405 is equipped with a touch screen in which a touch-sensitive, transparent panel covers the screen of display device 405.
[0061] Keyboard 410 represents an alphanumeric input device coupled to control unit 400 to communicate information and command selections to processor 450. Cursor control 415 represents a user input device equipped to communicate positional data as well as command selections to processor 450. Cursor control 415 may include a mouse, a trackball, a stylus, a pen, a touch screen, cursor direction keys, or other mechanisms to cause movement of a cursor. Network controller 420 links control unit 400 to a network that may include multiple processing systems. The network of processing systems may comprise a local area network (LAN), a wide area network (WAN) (e.g., the Internet), and/or any other interconnected data path across which multiple devices may communicate. [0062] One or more I/O devices 425 are coupled to the system bus 430. For example, the I/O device 425 may be an audio device equipped to receive audio input and transmit audio output. Audio input may be received through various devices including a microphone within audio device 425 and network controller 420. Similarly, audio output may originate from various devices including processor 450 and network controller 420. In one embodiment, audio device 425 is a general-purpose audio add-in/expansion card designed for use within a general-purpose computer system. Optionally, audio device 425 may contain one or more analog-to-digital or digital-to-analog converters, and/or one or more digital signal processors to facilitate audio processing. [0063] It should be apparent to one skilled in the art that control unit 400 may include more or fewer components than those shown in Figure 4 without departing from the spirit and scope of the present invention. For example, control unit 400 may include additional memory, such as, for example, a first or second level cache, or one or more application specific integrated circuits (ASICs). Similarly, additional components may be coupled to control unit 400 including, for example, image scanning devices, digital still or video cameras, or other devices that may or may not be equipped to capture and/or download electronic data to control unit 400.
[0064] Figure 5 shows a more detailed drawing of the main memory 435. The main memory generally comprises an operating system 500 whereon a number of applications 505 operate. A bus 430 couples the operating system 500 to the applications 505. In a preferred embodiment, a video storage module 510 is coupled to the bus 430. The video storage module 510 stores a primary video file. The primary video file may be stored locally or streamed in real time from a video server and buffered in the video storage module. Examples of a video storage module include a hard disk drive or a RAM module. [0065] In a preferred embodiment, an HTML storage module 520 is coupled to the bus 430. The HTML storage module 520 stores secondary multimedia files. These secondary multimedia files may be either converted to an HTML format by an author prior to storage or automatically converted by the HTML storage module 520 to an HTML-based file. These secondary multimedia files may be stored locally within the HTML storage module 520, pre-fetched from a web server and buffered in the module 520, or streamed in real time and buffered in the module 520. Examples of an HTML storage module 520 include a hard disk drive or a RAM module. [0066] In a preferred embodiment, a synchronization control module 525 is coupled to the bus 430. The synchronization control module 525 stores indexing and file addressing information created during the authoring of the presentation. Generally, this information is buffered in the synchronization control module 525 and then transmitted to a web server through a connected network. The web server then stores the information in a database. Examples of a synchronization control module 525 include a hard disk drive or a
RAM module.
[0067] In a preferred embodiment, a graphical user interface control module 530 is coupled to the bus 430. The graphical user interface control module 530 stores graphical display options created by the author during the creation of a presentation. Examples of these options include the size of each of the display windows, the volume of the audio, and the duration a secondary multimedia file is displayed. Once these display options are stored within the module 530, the data may be transmitted to a web server through a connected network. The web server may then either cache the display options internally or store the data within an attached database. Examples of a graphical user interface control module 530 include a hard disk drive or a RAM module.
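As a rough illustration of the buffering behavior described for the synchronization control module 525, the TypeScript sketch below buffers index records locally and later hands them to a transmit callback; the class and callback are assumptions, not an interface defined by the specification.

    // Assumed sketch of a module that buffers indexing information and flushes it
    // toward a web server, which would then store it in a database.
    class SynchronizationControlBuffer {
      private buffer: Array<{ position: number; secondaryFileLocator: string }> = [];

      constructor(private transmit: (records: object[]) => Promise<void>) {}

      record(position: number, secondaryFileLocator: string): void {
        this.buffer.push({ position, secondaryFileLocator }); // buffer during authoring
      }

      async flush(): Promise<void> {
        if (this.buffer.length === 0) return;
        await this.transmit(this.buffer); // e.g., send to the web server over the network
        this.buffer = [];
      }
    }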
D. Presentation Authoring Control User Interfaces
[0068] Fig. 6 illustrates a graphical representation of a display device showing a browser window displaying an interface for initializing or modifying the authoring process. As shown in Fig. 6, an author is given a menu from which she may add, author, view, or list presentations. An "Add a Presentation" window 605 allows an author to add a new presentation to a corresponding directory on a server. An "Author a Presentation" window 610 allows an author to create and edit a presentation that has been added to a directory. A "Presentation View" operation 615 allows an author to view a previously authored presentation and/or review a presentation that was just created. Finally, a "Presentation Listings" operation 620 allows an author to view all of the presentations within a specific directory. It is also important to note that there are other operations that an author can perform from this interface, including uploading a multimedia file, sending an e-mail allowing the recipient to view a presentation, and creating a Webcast implementing multiple multimedia files. [0069] Fig. 7 illustrates a graphical representation of a display device showing a browser window displaying an interface for adding a new presentation to an appropriate directory. As shown in Fig. 7, when adding a presentation to a directory, an author may give the presentation a title 705. Also, there is an input box 710 that allows the author to give a brief description of the presentation. The brief description may be viewed when the author is later browsing the directory containing the presentation. The author may also control the access rights of the presentation via an access control 715. The author may keep the presentation private so that only she may have access to it or may give rights of access to certain individuals. Additionally, the author may unprotect the presentation and permit anyone to view it. Next, the author may define a category that describes the presentation using a category description operation 720. This operation allows an author to categorize the content of the presentation. This categorization allows an individual to easily search presentations stored on a server by content. Examples of categories include educational presentations, movies, advertisements, etc. Finally, an author may protect access to the presentation by defining a certain password 725. Accordingly, specific authors may access the presentation only by inputting the correct password when prompted during login. [0070] Fig. 8 shows a graphical representation of a display device showing a browser window displaying an interface for uploading a multimedia file onto a corresponding server. As shown in Fig. 8, the available space 805 for uploading a multimedia file is shown to the author. Next, by completing a media clip name input box 810, the author may give the multimedia file a title/name under which it will be stored. Next, the author may identify a source address location in a source file input box 815. The author may browse either local or remote directories, if needed, to determine the correct address of the multimedia file. Optionally, the author may briefly describe the multimedia file in a description input box 820. This information will later be seen when the author is browsing the directory in which the file is stored. The author may input ownership information within the owner information box 825. This allows individuals to easily search presentations stored on a server by author.
There is also a copyright info box 830 in which the author may input relevant copyright information regarding the multimedia file.
[0071] As was the case when adding a presentation, the author has control over who may access the uploaded multimedia file. The author may control the access rights of the presentation via an access control 835. The author may keep the presentation private so that only she may have access to it or may give certain individuals rights of access. Additionally, the author may permit anyone to view the presentation. Next, the user may define a category that describes the presentation using a category description operation 840. As previously described, this operation allows an author to categorize the content of the presentation. This categorization allows an individual to easily search presentations stored on a server by content.
Examples of categories include educational presentations, movies, advertisements, etc. Finally, an author may protect access to the presentation by defining a certain password 845.
Accordingly, specific authors may access the presentation only by inputting the correct password when prompted during login.
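The metadata gathered by the forms of Figs. 7 and 8 might be represented as sketched below in TypeScript; the record shape and the simple access check are assumptions of this sketch and omit, for brevity, the list of individually permitted viewers.

    // Assumed record for presentation and upload metadata.
    type AccessRights = "private" | "restricted" | "public";

    interface PresentationMetadata {
      title: string;        // title 705 or media clip name 810
      description?: string; // brief description shown when browsing the directory
      access: AccessRights; // private, limited to certain individuals, or open to anyone
      category?: string;    // e.g., "educational", "movies", "advertisements"
      password?: string;    // optional password prompted for during login
      owner?: string;       // ownership information for author-based searches
      copyright?: string;   // relevant copyright information
    }

    function mayView(meta: PresentationMetadata, viewer: string, suppliedPassword?: string): boolean {
      if (meta.access === "public") return true;  // anyone may view
      if (viewer === meta.owner) return true;     // the author always has access
      // "restricted" lists of permitted individuals are not modeled in this sketch.
      return meta.password !== undefined && suppliedPassword === meta.password;
    }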
[0072] Fig. 9 shows a graphical representation of a display device showing a browser window displaying an interface for authoring a presentation after multimedia files have either been uploaded onto a server(s) or stored locally. A web browser 900 is depicted having an addressing window 990 through which a user may access authoring software over a network. A first window 902 displays a primary video file. A second window 905 displays a secondary multimedia file. A status window 915 displays status, indexing, timing and editing information that the author may use in creating a presentation. An annotation/text window 907 displays text appearing during the presentation. An annotation list window 945 displays a list of secondary multimedia file indexes relative to the primary video file. It is important to note that the number and size of the windows are not fixed and may be adjusted by the author according to a specific presentation.
[0073] The author may first select a primary video file to be displayed in the first window 902. This selection is done by clicking on a select video icon 930 which will either allow the author to browse directories containing uploaded video files or directly input the address of an uploaded video file. Once the video file is selected, it begins to play in the first window 902.
[0074] The author may control the video by clicking on a play/pause icon 950, a stop icon 955, and a continual play option 960 that restarts the video after it has ended.
Additionally, there is a first window re-size icon 975 that toggles the size of the first window from full screen to a default size.
[0075] The author may also control other display options by clicking on the option icon 970. After clicking on this icon, a small screen 1000 may appear that allows an author to control what material is displayed within a presentation and how it is displayed. An example of this window is illustrated in Fig. 10. A first area 1005 within the window 1000 allows the author to control what windows are visible on the display. Controls for each of the following are within the area:
[0076] (1) the first window showing video 1010;
[0077] (2) the second window showing a secondary multimedia file (presentation document) 1015;
[0078] (3) the window showing an annotation list 1020;
[0079] (4) the window showing an annotation text 1025;
[0080] (5) the window showing the status of the video file 1030.
[0081] Also, there is a textual control area 1035 where the author can control the size, color and font of the text within the display. A first controller entitled "Font Type" 1040 allows the author to select a particular font for the text within the annotation window 907. There is also a "Font Size" controller 1045 that allows the author to select a particular font size for the text within the annotation window 907. Additionally, there is a "Text Color" controller 1050 that allows the author to select a particular text color for the text within the annotation window 907.
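The options controlled through this window might be captured in a record such as the TypeScript sketch below; the field names and defaults are assumptions made for illustration.

    // Assumed record of the display options of Fig. 10.
    interface DisplayOptions {
      showVideoWindow: boolean;    // (1) first window showing video
      showDocumentWindow: boolean; // (2) second window showing the secondary multimedia file
      showAnnotationList: boolean; // (3) annotation list window
      showAnnotationText: boolean; // (4) annotation text window
      showStatusWindow: boolean;   // (5) video status window
      fontType: string;            // "Font Type" controller 1040
      fontSize: number;            // "Font Size" controller 1045
      textColor: string;           // "Text Color" controller 1050
    }

    const defaultOptions: DisplayOptions = {
      showVideoWindow: true,
      showDocumentWindow: true,
      showAnnotationList: true,
      showAnnotationText: true,
      showStatusWindow: true,
      fontType: "Arial",
      fontSize: 12,
      textColor: "#000000",
    };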
[0082] Returning to Fig. 9, once the video has begun playing in the first window
902, the status window 915 monitors and displays the current frame or position in time of the current video frame displayed in the first window. A status bar 965 shows the progression of the video by sliding a position indicator as the video plays. The author can watch the position indicator to see approximately what percentage of the video has been shown and what percentage is yet to be shown. Additionally, the author can control which frame is displayed by sliding the position indicator either forward or backwards. A status display 980 shows the author the overall length in time of the video file and the time position of the currently displayed frame in the first window 902. The status display 980 may also show the number of video frames within the video file and the current video frame being displayed within the first window 902.
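The arithmetic behind such a status display can be sketched as below in TypeScript, assuming a known, constant frame rate; the specification does not fix a frame rate, so the value used here is illustrative.

    // Percentage of the video already shown, given the current and total durations.
    function percentShown(currentSeconds: number, totalSeconds: number): number {
      return totalSeconds > 0 ? (currentSeconds / totalSeconds) * 100 : 0;
    }

    // Conversions between a frame number and a time position at a constant frame rate.
    function frameToSeconds(frame: number, framesPerSecond: number): number {
      return frame / framesPerSecond;
    }

    function secondsToFrame(seconds: number, framesPerSecond: number): number {
      return Math.floor(seconds * framesPerSecond);
    }

    // Example: frame 450 of a 30 fps video corresponds to the 15-second position.
    console.log(frameToSeconds(450, 30)); // 15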
[0083] Once a primary multimedia file, typically a video file, and at least one secondary multimedia file have been selected, the author synchronizes the display of these files into a presentation. According to this embodiment of the present invention, each secondary multimedia file is indexed relative to the primary multimedia file so that each secondary file is displayed when a certain frame or position in time is reached or displayed by the primary file. There are numerous ways in which this synchronization may be achieved. The following describes a preferred system and method for authoring this synchronized multimedia presentation.
[0084] The primary multimedia file, in this case a video file, is shown in the first window 902. A secondary multimedia file, in this case a Powerpoint™ presentation, is shown in the second window 905. The author is able to control which slide is displayed in the second window 905 by using a slide ruler 920 or a frame forward/back controller 925.
[0085] In authoring a presentation, the author may index the display of each of the secondary multimedia files relative to the primary multimedia file. To accomplish this, the author clicks on the "Add/Edit Index" icon 935, which causes the video in the first window 902 to stop and an index to be created. Next, an "Add/Edit a video index" window appears as shown in Figure 10. A time box shows the author how much time has elapsed in the video up to the point where the index was created.
[0086] Referring to Fig. 11, depicting a user interface 1100 that allows an author to identify a secondary multimedia file corresponding to an index, an author may easily index a secondary multimedia file to the created index. A "Frame Index" icon 1110 allows the author to see the exact frame at which the index was created. An annotation box 1115 allows the author to input text messages that will be displayed in the annotation window 907 at the created index. Next, the author can identify a secondary multimedia file that will be displayed in the second window 905 at the created index using options under a document directory. There are numerous types and ways in which these secondary multimedia files can be identified and indexed. A first way is to select a document by inputting its storage address as shown in 1025. For example, an author may provide an address path to a document stored on a storage device. A second way is to select a web page by inputting its URL address as shown in 1030. The user can view that specific secondary multimedia file (i.e., document or web page) by pressing the view button 1040 and the file is displayed in the second window 905. Once a secondary multimedia file has been located, the author may store the created index and the location of that specific file within a database, such as the network attached database 125.
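The two ways of identifying a secondary multimedia file for an index, by a storage address path or by a URL, might be modeled as in the TypeScript sketch below; the type names are assumptions of this sketch.

    // Assumed discriminated union for the two locator types described above.
    type SecondaryFileLocator =
      | { kind: "path"; addressPath: string } // document selected by its storage address
      | { kind: "url"; url: string };         // web page selected by its URL address

    interface VideoIndexEntry {
      frameIndex: number;            // exact frame at which the index was created
      annotation?: string;           // text for the annotation window 907
      locator: SecondaryFileLocator; // what to display in the second window 905
    }

    function describeLocator(loc: SecondaryFileLocator): string {
      return loc.kind === "path" ? loc.addressPath : loc.url;
    }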
[0087] It is important to note that a corresponding server may convert the secondary multimedia files automatically to a preferred format during the uploading process or selection process. Additionally, the author may want to convert specific files to different formats to optimize the performance of the presentation. [0088] Once the index and storage location of the secondary multimedia file have been stored, this information is shown in the annotation list window 945. For each index, the time, the text message, slide number and/or the document location is shown 910. As the author creates multiple indexes, the annotation list window 945 will display a list of each of the indexes with a corresponding description 910.
[0089] The author may then click on each displayed index and the corresponding video frame will be shown in the first window 902, the indexed secondary multimedia file will be shown in the second window 905, and the corresponding text message will be shown in the annotation window 907. As a result, the author may quickly review her presentation both during and after the authoring process is completed. The author may also review the presentation by using the slide ruler 920 or the frame forward/back controller 925. Using these two devices allows the author to display the different indexed secondary multimedia files in the second window 905. When a secondary multimedia file is displayed within the second window 905, the corresponding video frame will be shown in the first window 902, the corresponding index will be highlighted within the annotation list window 945, and the text message will be displayed in the annotation window 907.
[0090] The author may also delete any or all of the indexes during the authoring process or subsequently during an editing process. First, the author can highlight a specific index within the annotation list window 945. Then, clicking a delete index icon 940 will delete the highlighted index. The author may also use the slide ruler 920 or the frame forward/back controller 925 to highlight a specific index and then click on the delete index icon 940 to delete the highlighted index. Also, the author may delete all indices by simultaneously highlighting all of the indexes and then clicking on the delete index icon 940. As a result, the author is able to easily control each multimedia file, create indexes between the multimedia files, and review the presentation during the authoring process.
[0091] The author also has the ability to change an index either during authoring or editing of the presentation. This may be done by first highlighting an index within the annotation list window 945. Next, the author may input an appropriate frame of the primary multimedia/video file. Thereafter, the corresponding secondary file is re-indexed to the inputted frame. Additionally, the author may update a file by simply re-uploading a new file with the same name or changing the file pointer to the new file. [0092] There may also be an error detection process that analyzes the presentation either during or after the authoring process. The error detection process will check to see if there are any conflicts between indices and/or default settings and warn the author.
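One simple form such an error check could take is sketched below in TypeScript: flag pairs of indices that fall at (or within a tolerance of) the same position in the primary file. The record shape and the tolerance parameter are assumptions of this sketch.

    // Assumed sketch of index conflict detection.
    interface IndexLike {
      position: number;
      secondaryFileLocator: string;
    }

    function findConflicts(indices: IndexLike[], tolerance = 0): Array<[IndexLike, IndexLike]> {
      const conflicts: Array<[IndexLike, IndexLike]> = [];
      const sorted = [...indices].sort((a, b) => a.position - b.position);
      for (let i = 1; i < sorted.length; i++) {
        if (sorted[i].position - sorted[i - 1].position <= tolerance) {
          conflicts.push([sorted[i - 1], sorted[i]]); // two indices trigger at the same moment
        }
      }
      return conflicts;
    }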
[0093] Once the authoring process is complete, a list of indexes with text messages, storage locations of secondary multimedia files, and frames of primary multimedia files is stored within the database 125. The primary video file is stored within the video server 115. At least one secondary multimedia file is stored within the web server 120. As a result, any display device that is connected to the network 110 may view the presentation containing these synchronized multimedia files. [0094] Figure 12 illustrates a flowchart of an overview for indexing secondary multimedia files to a primary multimedia file. According to this flowchart, an author identifies 1205 a secondary multimedia file that will be indexed to a location of the primary multimedia file. The author provides a location of the secondary multimedia file so that during the presentation this secondary multimedia file may be retrieved and displayed or played. For example, as previously mentioned, this location may be an address path on a storage device or a uniform resource locator of a file on a network device.
[0095] Once the location of the secondary multimedia file is provided, an author may supply 1215 an annotation or text message corresponding to the index. For example, it may be a text message describing the secondary multimedia file or a question to a viewer of the presentation. The author may also control 1220 the display of both the secondary multimedia file and the text message by changing display characteristics. For example, the author may adjust the size, font, and color of the text message. Also, the author may adjust the size of the window displaying the secondary multimedia file or adjust the volume of an audio component of the secondary multimedia file. Once the index has been completed, the author may preview the indexed primary and secondary multimedia files. This may be done by displaying the primary multimedia indexed frame in the primary multimedia window 902 and the indexed secondary multimedia file in the secondary multimedia window 905. This process continues until the author has indexed all the desired secondary multimedia files to the primary multimedia file. [0096] While the present invention has been described with reference to certain embodiments, those skilled in the art will recognize that various modifications may be provided. Variations upon and modifications to the preferred embodiments are provided for by the present invention, which is limited only by the following claims.

Claims

We claim:
1. A networked authoring system for synchronizing multimedia files into a presentation, the system comprising: a computerized display device, coupled to a network, that provides an authoring platform for synchronizing multimedia files into a presentation; a first server, coupled to the network, that receives and stores a primary multimedia file; a second server, coupled to the network, that receives and stores a secondary multimedia file; a third server, coupled to the network, that receives and stores a synchronization file.
2. The networked authoring system of claim 1 wherein the second server and the third server are the same.
3. The networked authoring system of claim 1 wherein the first server receives the primary multimedia file after the presentation is synchronized at the computerized display device.
4. The networked authoring system of claim 1 wherein the second server receives the secondary multimedia file after the presentation is synchronized at the computerized display device.
5. The networked authoring system of claim 1 wherein the third server receives the synchronization file after the presentation is synchronized at the computerized display device.
6. The networked authoring system of claim 1 wherein the synchronization file contains at least one frame index used to synchronize the secondary multimedia file to the primary multimedia file.
7. The networked authoring system of claim 1 wherein the synchronization file contains at least one time index used to synchronize the secondary multimedia file to the primary multimedia file.
8. The networked authoring system of claim 1 further comprising a security access control, coupled to the network, that limits access to the authoring platform.
9. The networked authoring system of claim 8 wherein a user is required to provide a password in order to access the authoring platform.
10. The networked authoring system of claim 1 further comprising a database, coupled to the second server, that stores the secondary multimedia file.
11. The networked authoring system of claim 10 wherein the synchronization file is stored in the database.
12. The networked authoring system of claim 10 wherein a control file containing control data for the presentation is stored in the database.
13. A computer implemented authoring platform for synchronized multimedia presentations comprising: a first display window, within the platform, that displays a primary multimedia file; a second display window, within the platform, that displays at least one secondary multimedia file; a text message window, within the platform, that displays text messages and annotations; a presentation indices window, within the platform, that allows an author to synchronize the at least one secondary multimedia file to the primary multimedia file by creating at least one index; and a presentation control window, within the platform, that allows an author to define characteristics of files within a synchronized multimedia presentation.
14. The authoring platform of claim 13 wherein the first display window displays a video file.
15. The authoring platform of claim 13 wherein the second display window displays an HTML-based multimedia file.
16. The authoring platform of claim 15 further comprising an HTML conversion script that converts a non-HTML secondary multimedia file into an HTML-based multimedia file.
17. The authoring platform of claim 13 wherein the text message window facilitates text messages to be synchronized to the primary multimedia file.
18. The authoring platform of claim 13 wherein the text message window allows an author to collaborate with other remote individuals.
19. The authoring platform of claim 13 wherein the presentation indices window allows files to be synchronized using a frame index to the primary multimedia file.
20. The authoring platform of claim 13 wherein the presentation indices window allows files to be synchronized using a time index to the primary multimedia file.
21. The authoring platform of claim 13 wherein the presentation control window allows an author to control the characteristics of at least one of the display windows.
22. The authoring platform of claim 13 wherein the presentation control window allows an author to control the characteristics of text within the text message window.
23. The authoring platform of claim 13 wherein a presentation may be accessed through a network connection allowing the presentation to be modified.
24. The authoring platform of claim 23 further comprising a presentation security module that limits access to the presentation.
25. The authoring platform of claim 24 wherein an author is required to provide a password to access the presentation.
26. The authoring platform of claim 24 wherein an author may access the presentation only from a particular network address.
27. The authoring platform of claim 13 further comprising a presentation index control, the control comprising: a time index window, within the control, that displays a time value corresponding to an index to the primary multimedia file; a frame index window, within the control, that displays a frame number corresponding to an index to the primary multimedia file; and a secondary multimedia file locator, within the control, that identifies an address of a secondary multimedia file corresponding to an index.
28. The authoring platform of claim 27 further comprising an annotation box, within the control, that allows an author to provide a text message corresponding to an index.
29. The authoring platform of claim 27 wherein the secondary multimedia file locator comprises a URL insert that allows an author to provide a URL address of the secondary multimedia file.
30. The authoring platform of claim 27 wherein the secondary multimedia file locator comprises a document address input that allows an author to provide an address path of the secondary multimedia file that is stored locally in a storage device.
31. The authoring platform of claim 27 further comprising a view document window that allows an author to preview the secondary multimedia file.
32. A method for authoring a synchronized multimedia presentation, the method comprising: displaying a primary multimedia file; identifying a secondary multimedia file that is to be synchronized to the primary multimedia file; providing a location in the primary multimedia file to which the secondary multimedia file is indexed; creating a synchronization file having at least one index; and storing the synchronization file within a storage device.
33. The method of claim 32 wherein the primary multimedia file is a video file.
34. The method of claim 32 wherein the secondary multimedia file is an HTML-based multimedia file.
35. The method of claim 32 wherein the at least one index is a time index to the primary multimedia file.
36. The method of claim 32 wherein the at least one index is a frame index to the primary multimedia file.
37. The method of claim 32 wherein the synchronization file is stored locally on a computer device.
38. The method of claim 32 wherein the synchronization file is stored remotely on a network device.
39. The method of claim 32 further comprising the step of providing security access to the synchronized multimedia presentation.
40. The method of claim 39 wherein an author is required to provide a password in order to access the synchronized multimedia presentation.
41. The method of claim 32 wherein the synchronization file contains file control data corresponding to at least one file within the synchronized multimedia presentation.
42. The method of claim 41 further comprising the step of providing a message coπesponding to an index to the primary multimedia file.
43. The method of claim 41 further comprising the step of providing display characteristics of at least one secondary multimedia file to an index to the primary multimedia file.
PCT/US2002/000372 2001-01-04 2002-01-04 Authoring platform for synchronized multimedia presentation WO2002077750A2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU2002311744A AU2002311744A1 (en) 2001-01-04 2002-01-04 Authoring platform for synchronized multimedia presentation

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US26017101P 2001-01-04 2001-01-04
US60/260,171 2001-01-04

Publications (2)

Publication Number Publication Date
WO2002077750A2 true WO2002077750A2 (en) 2002-10-03
WO2002077750A3 WO2002077750A3 (en) 2003-02-06

Family

ID=22988070

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2002/000372 WO2002077750A2 (en) 2001-01-04 2002-01-04 Authoring platform for synchronized multimedia presentation

Country Status (2)

Country Link
AU (1) AU2002311744A1 (en)
WO (1) WO2002077750A2 (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5848291A (en) * 1993-09-13 1998-12-08 Object Technology Licensing Corp. Object-oriented framework for creating multimedia applications
US5892900A (en) * 1996-08-30 1999-04-06 Intertrust Technologies Corp. Systems and methods for secure transaction management and electronic rights protection
US5966121A (en) * 1995-10-12 1999-10-12 Andersen Consulting Llp Interactive hypervideo editing system and interface
US5978835A (en) * 1993-10-01 1999-11-02 Collaboration Properties, Inc. Multimedia mail, conference recording and documents in video conferencing
US6161124A (en) * 1996-08-14 2000-12-12 Nippon Telegraph & Telephone Corporation Method and system for preparing and registering homepages, interactive input apparatus for multimedia information, and recording medium including interactive input programs of the multimedia information
US6204840B1 (en) * 1997-04-08 2001-03-20 Mgi Software Corporation Non-timeline, non-linear digital multimedia composition method and system
US6317794B1 (en) * 1997-11-12 2001-11-13 Ncr Corporation Computer system and computer implemented method for synchronization of simultaneous web views

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8645991B2 (en) 2006-03-30 2014-02-04 Tout Industries, Inc. Method and apparatus for annotating media streams
WO2009120583A3 (en) * 2008-03-26 2009-12-10 Sri International Method and apparatus for selecting related content for display in conjunction with a media
US8793256B2 (en) 2008-03-26 2014-07-29 Tout Industries, Inc. Method and apparatus for selecting related content for display in conjunction with a media
US9323572B2 (en) 2011-06-02 2016-04-26 International Business Machines Corporation Autoconfiguration of a cloud instance based on contextual parameters

Also Published As

Publication number Publication date
WO2002077750A3 (en) 2003-02-06
AU2002311744A1 (en) 2002-10-08

Similar Documents

Publication Publication Date Title
US9426214B2 (en) Synchronizing presentation states between multiple applications
US7631260B1 (en) Application modification based on feed content
US11017488B2 (en) Systems, methods, and user interface for navigating media playback using scrollable text
US9800941B2 (en) Text-synchronized media utilization and manipulation for transcripts
EP1592237B1 (en) Specialized media presentation via an electronic program guide (EPG)
US7653925B2 (en) Techniques for receiving information during multimedia presentations and communicating the information
US8494907B2 (en) Systems and methods for interaction prompt initiated video advertising
US7051275B2 (en) Annotations for multiple versions of media content
US10013704B2 (en) Integrating sponsored media with user-generated content
US20070011206A1 (en) Interactive playlist generation using annotations
US10268760B2 (en) Apparatus and method for reproducing multimedia content successively in a broadcasting system based on one integrated metadata
US20140143835A1 (en) Web-Based Digital Publishing Platform
CN112329403A (en) Live broadcast document processing method and device
WO2002054192A2 (en) Synchronized multimedia presentation
US10901762B1 (en) Tutorial content creation and interaction system
US20110138282A1 (en) System and method for synchronizing static images with dynamic multimedia contents
CN101491089A (en) Embedded metadata in a media presentation
WO2002077750A2 (en) Authoring platform for synchronized multimedia presentation
US20050033753A1 (en) System and method for managing transcripts and exhibits
WO2023135449A1 (en) System and method for interactive synchronized transcription
TW201333729A (en) Display system and method

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A2

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ OM PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A2

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
AK Designated states

Kind code of ref document: A3

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ OM PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A3

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

122 Ep: pct application non-entry in european phase
NENP Non-entry into the national phase in:

Ref country code: JP