WO2001019088A1 - Client presentation page content synchronized to a streaming data signal - Google Patents


Info

Publication number
WO2001019088A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
client
data signal
signal
event
Application number
PCT/US2000/024642
Other languages
French (fr)
Inventor
John B. Swanton
Dennis Breckenridge
Original Assignee
E-Studiolive, Inc.
Application filed by E-Studiolive, Inc. filed Critical E-Studiolive, Inc.
Priority to AU73582/00A
Publication of WO2001019088A1

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment
    • H04N5/262Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
    • H04N5/268Signal distribution or switching

Definitions

  • the present invention relates generally to apparatus and methods for providing a synchronized client presentation page and more specifically to an apparatus and method for realtime synchronizing of elements of a client presentation page to a streaming data signal having a video and/or audio component.
  • existing tools, such as SMIL (Synchronized Multimedia Integration Language) as used by RealNetworks, Inc. and ePublisher by Avid Technology, Inc., allow an operator to predefine and prearrange text and graphics elements of the presentation page to be coordinated with the video portion of the presentation page. The coordination is all done prior to dissemination of the presentation page, using pre-recorded video.
  • the present invention synchronizes the content (i.e., text and/or graphics) of a client presentation page with the video and/or audio signals in a real-time environment, allowing an operator to change the synchronization up until the video and/or audio components are sent to the client for presentation.
  • the invention does not require a pre-timed presentation, nor a prerecorded video and/or audio portion, so that synchronization can be done using a live presentation.
  • the invention relates to a method and apparatus for synchronizing content of a client presentation page to a streaming data signal, where the streaming data signal includes a video component and/or an audio component.
  • the invention relates to a method for synchronizing the content of a client presentation page to a streaming data signal which comprises a video component and/or an audio component.
  • the method embeds one or more data events into the streaming data signal simultaneously with the generation of the data event, transmits the streaming data signal to a client and retrieves on the client, or from the server, data in response to each data event in the streaming data signal.
  • the method further provides in the client presentation page a video and/or an audio component portion of the data signal as the data signal is processed, and simultaneously provides with the video and/or audio component portion in the client presentation page a representation of the data in response to each data event.
  • this method generates one or more data events in response to user input.
  • this method contains the step of processing the streaming data signal on the client.
  • the streaming data signal contains an encoded digital signal.
  • the video and/or audio component portion of the digital signal is associated with a live production.
  • the method of retrieving involves requesting from a server the data in response to each data event in the data signal and receiving the requested data at the client.
  • this method of retrieving also includes a process of receiving at the client, prior to receiving the data event, data that will be used for each data event in the data signal, storing at the client the received data for subsequent use, and retrieving from the client storage data in response to each data event in the data signal being processed on the client.
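The two retrieval strategies above (requesting data from the server per event, or pre-delivering it for storage at the client) can be combined in a cache-first lookup. This is a minimal illustrative sketch; the class and function names are assumptions, not taken from the patent:

```python
class EventDataStore:
    """Client-side store for data referenced by data events (hypothetical)."""

    def __init__(self, fetch_from_server):
        # fetch_from_server: callable(key) -> bytes, standing in for an
        # HTTP request to the web-server module
        self._cache = {}
        self._fetch = fetch_from_server

    def preload(self, key, data):
        """Store data delivered before its data event arrives."""
        self._cache[key] = data

    def retrieve(self, key):
        """Return data for a data event: client storage first, then server."""
        if key not in self._cache:
            self._cache[key] = self._fetch(key)  # on-demand request
        return self._cache[key]


server_files = {"slide6.gif": b"GIF89a-slide6"}
store = EventDataStore(fetch_from_server=lambda key: server_files[key])
store.preload("slide1.gif", b"GIF89a-slide1")
print(store.retrieve("slide1.gif"))  # served from client storage
print(store.retrieve("slide6.gif"))  # fetched from the server on demand
```

Pre-delivering data trades startup bandwidth for lower latency when the data event is processed, which matters when the representation must appear substantially simultaneously with the video.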
  • the method for synchronizing involves storing the streaming data signal. In another embodiment, this method includes changing by a power user a source from which a video component and an audio component portion of the digital signal is received.
  • the invention in another aspect, involves a method for generating a data signal for synchronizing content of a client presentation page to a streaming data signal.
  • This method of generating includes providing a digital signal containing a video component and/or an audio component, generating one or more data events in response to user input, embedding each data event into the digital signal substantially simultaneously with the generation of the data event to create a data signal and streaming to a client the data signal for providing in the client presentation page a video and/or an audio component portion of the data signal as the data signal is processed.
  • the method further includes transmitting to a client data corresponding to each data event for substantially simultaneously providing with the video component and/or audio component portion in the client presentation page a representation of the data in response to each data event.
  • the invention in another aspect, relates to a method for presenting to a user a client presentation page synchronized to a streaming data signal.
  • This method includes receiving from a server a streaming data signal, the data signal comprising at least one embedded data event and a video component and/or an audio component, processing the data signal on the client and retrieving data in response to each data event in the data signal being processed on the client.
  • the method further includes providing in the client presentation page the video and/or audio component portion of the data signal as the data signal is processed, and substantially simultaneously providing with the video and/or audio component portion in the client presentation page a representation of the data in response to each data event.
  • the invention in another aspect, relates to a system for synchronizing content of a client presentation page to a streaming data signal.
  • the system includes a producer module for generating one or more data events in response to a user input, a data signal generation module in communication with the producer module for receiving each data event and in communication with a data source for receiving a source signal comprising a video component and/or an audio component, the data signal generation module embedding the data event into a digital representation of the source signal substantially simultaneously with generation of the data event to create a digital data signal and streaming the data signal to a client.
  • the system further includes a web server module in communication with the producer module, the web server module transmitting to the client data corresponding to each embedded data event, and a client in communication with the web server module for receiving corresponding data and in communication with the data signal generation module for receiving the data signal.
  • the client processes the data signal, retrieves the corresponding data in response to each data event in the processed data signal, and provides in the client presentation page the video and/or audio component portion of the data signal as the data signal is processed and a representation of the corresponding data in response to each data event, thereby substantially simultaneously providing with the video and/or audio component portion in the client presentation page the representation.
  • the digital data signal comprises an encoded digital signal.
  • the video component and audio component portion of the data signal is associated with a live production.
  • the client is configured for requesting from the web server module corresponding data in response to each data event in the processed data signal.
  • the client also includes a memory buffer for storing the received data.
  • the system includes an archive module for storing the streaming data signal.
  • the producer module is further configured to allow a power user to change receipt of the source signal from the data source to a second data source from which a second source signal is received.
  • the data source is a video switcher.
  • the invention in another aspect, relates to a server node for generating a data signal for synchronizing content of a client presentation page to a streaming data signal.
  • the server node includes a producer module for generating one or more data events in response to a user input, a data signal generation module in communication with the producer module for receiving each data event and in communication with a data source for receiving a source signal comprising a video component and/or an audio component.
  • the data signal generation module embeds the data event into a digital representation of the source signal substantially simultaneously with generation of the data event to create a digital data signal and streams the data signal to a client for display of the video and/or audio component portion of the digital data signal in the client presentation page as the digital data signal is processed by the client.
  • the server node further includes a web server module in communication with the producer module, the web server module transmitting to the client data corresponding to each embedded data event for display of a representation of the data in the client presentation page in response to each data event, so that the representation is substantially simultaneously displayed with the video and/or audio component portion.
  • the invention in another aspect, relates to a client node presenting to a user a client presentation page synchronized to a streaming data signal.
  • the client node includes a client in communication with a web server module for receiving corresponding data and in communication with a data signal generation module for receiving a digital data signal comprising at least one embedded data event and at least one video component and/or audio component.
  • the client processes the digital data signal, retrieves the corresponding data in response to each data event in the processed digital data signal, and displays in the client presentation page the video and/or audio component portion of the digital data signal as the digital data signal is processed, and a representation of the corresponding data in response to each data event, thereby substantially simultaneously displaying with the video and/or audio component portion in the client presentation page the representation.
  • FIG. 1a is a high level block diagram of an embodiment of the invention.
  • FIG. 1b is a high level block diagram of an embodiment of a display produced by the invention.
  • FIG. 2 is a high level block diagram of another embodiment of the invention.
  • FIG. 3a is a screen shot of an embodiment of a configuration of a control panel used to control a video switcher, according to the invention.
  • FIG. 3b is a screen shot of an embodiment of a configuration of a control panel used to control the embedding of data events, according to the invention.
  • FIG. 4 is a screen shot of an embodiment of a graphical interface used to create a synchronized client presentation page produced by the invention.
  • FIG. 5 is a screen shot of an embodiment of a graphical interface used to enter data to create a synchronized client presentation page produced by the invention.
  • FIG. 6 is a screen shot of an embodiment of a graphical interface used to enter data to create a synchronized client presentation page produced by the invention.
  • FIG. 7 is a screen shot of an embodiment of a graphical interface used to enter data to create a synchronized client presentation page produced by the invention.
  • FIG. 8 is a screen shot of an embodiment of a graphical interface used to enter data to create a synchronized client presentation page produced by the invention.
  • FIGS. 9a and 9b are screen shots of an embodiment of a graphical interface used to enter data to create a synchronized client presentation page produced by the invention.
  • FIG. 9c is a screen shot of an embodiment of a survey created by an operator for use in a synchronized client presentation page produced by the invention.
  • FIG. 9d is a screen shot of an embodiment of a survey result displayed in a synchronized client presentation page produced by the invention.
  • FIG. 10 is a screen shot of an embodiment of a graphical interface used to enter data to create a synchronized client presentation page produced by the invention with a customized layout.
  • FIGS. 1a and 1b depict an exemplary embodiment of a multimedia presentation system 10.
  • FIG. 1a depicts a partial page regeneration system 10 that includes a first computing system ("server node") 14 in communication with a second computing system ("client node") 18 over a network 22.
  • the network 22 can be a local-area network (LAN), such as a company intranet or a wide area network (WAN) such as the Internet or the World Wide Web.
  • the server node 14 and the client node 18 can be connected to the network 22 through a variety of connections including standard telephone lines, LAN or WAN links (e.g., T1, T3, 56kb, X.25), broadband connections (ISDN, Frame Relay, ATM), and wireless connections.
  • the connections to the network 22 and among the various modules can be established using a variety of communication protocols (e.g., TCP/IP, IPX, SPX, NetBIOS, Ethernet, RS232, and direct asynchronous connections).
  • the presentation system 10 displays to a user a synchronized client presentation page 26 (FIG. 1b) on the display 30 at the client node 18.
  • the synchronized client presentation page 26 includes several elements 34a, 34b, 34c, 34d, 34n (referred to generally as elements 34).
  • the number of elements 34 varies depending on what an operator wants to display to a user.
  • the client presentation page 26 can be what is generally referred to as a web page.
  • the client presentation page 26 can be written in any format comprehensible by the client node 18 including, for example, HTML, XML, VRML, WML, (Display) PostScript and nroff.
  • Each element 34 represents something that is displayed to the user.
  • An element 34 can be, for example, a picture, an image, a graphic, a title and/or a character string.
  • An element 34 also can be, for example, a slide for Microsoft® PowerPoint® (manufactured by Microsoft, Inc. of Redmond, WA), a spreadsheet for Microsoft® Excel (manufactured by Microsoft, Inc. of Redmond, WA), an HTML encoded page or the display of a chat session.
  • the client presentation page 26 also includes a video component 38 and/or an audio component.
  • the audio component is presented to the user through an audio player 42.
  • the audio player 42 represents any type of device and/or software that converts a digital signal to a signal audible to the user.
  • the video component 38 and/or the audio component is used to synchronize the other elements 34 of the client presentation page 26 and is referred to generally as the synchronizing component.
  • the video component 38 is separated to distinguish the element as the synchronizing component.
  • other elements 34 can be video and/or audio signals and can even be streamed signals.
  • the server node 14 (FIG. 1a) includes a producer module 62, a data signal generation module 66 and a web-server module 70.
  • the server node 14 and the three modules 62, 66, 70 are a combination of hardware and software. The implementation of the functions described for each can vary depending on the "tools" the user has available to him/her to implement the described functions.
  • the three modules 62, 66, 70 need not be physically located near one another or within hardware designated as the server.
  • the three modules 62, 66, 70 can be located anywhere throughout the computer network and in electrical communication with one another.
  • the term node indicates a logical unit rather than a physical unit.
  • the producer module 62 coordinates synchronization of text/graphic data with audio and/or video signals.
  • the producer module 62 is in electrical communication with the data signal generation module 66.
  • the communication can be implemented using any communication protocol (e.g., TCP/IP), based on the requirements imposed by the data signal generation module 66.
  • the producer module 62 transmits control signals to the data signal generation module 66 to control the conversion of the video and/or audio input signal 58 to the streaming data signal 50.
  • the producer module 62 generates data events. Each data event represents the desired state of an element 34 (FIG. 1b) at the particular point in the video/audio at which the data event is inserted.
  • the producer module 62 transmits each data event to the data signal generation module 66 as it is generated to create real-time synchronization.
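The flow from user input to transmitted data event can be sketched in a few lines. This is a minimal illustration, not the patent's implementation; the `DataEvent` fields mirror the frame identifier, event type and additional data described later in the document, and all names are assumptions:

```python
from dataclasses import dataclass

@dataclass
class DataEvent:
    frame: int        # element of the client presentation page to change
    event_type: str   # e.g. "text", "image", "url"
    payload: str      # the desired new state of the element

class Producer:
    """Hypothetical producer: turns user input into data events."""

    def __init__(self, emit):
        # emit: callable receiving each event, standing in for the link
        # to the data signal generation module
        self._emit = emit

    def on_user_input(self, frame, event_type, payload):
        event = DataEvent(frame, event_type, payload)
        self._emit(event)  # sent immediately, giving real-time synchronization
        return event

sent = []
producer = Producer(sent.append)
producer.on_user_input(0, "text", "This is some text")
print(sent)
```

Because each event is emitted the moment it is generated, no pre-timed schedule is needed; the operator's action itself fixes the synchronization point.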
  • the producer module 62 also is in communication with the web-server module 70.
  • the communication can be implemented using any communication protocol (e.g., TCP/IP) based on the requirements imposed by the web-server module 70.
  • the producer module 62 transmits control signals to the web-server module 70 to control the access of client nodes 18 to the stored data needed for the presentation of the client presentation page 26.
  • the producer module 62 also ensures that the data needed for the other elements 34 of the client presentation page 26 is stored on the web-server module 70, so that client nodes 18 can retrieve the data as it is needed to respond to data events.
  • the data signal generation module 66 of the server node 14 generates the streaming data signal 50.
  • the data signal generation module 66 receives a video and/or audio input signal 58.
  • the video and/or audio input signal 58 is the synchronizing component, i.e., the signal to which the other elements 34 of the client presentation page 26 are synchronized.
  • the video and/or audio input signal 58 can be an analog signal or a digital signal. If the video and/or audio input signal 58 is an analog signal, the data signal generation module 66 converts the analog input into a digital signal. This conversion can be done using analog-to-digital (A/D) conversion hardware.
  • the digital signal can be compressed using known compression codecs (e.g., RealVideo G2 with SVT from RealNetworks, Inc. or any of the MPEG standards developed by MPEG, an ISO working group).
  • the control signals received from the producer module 62 instruct the data signal generation module 66 in the encoding process.
  • the producer module 62 converts into control signals the user inputs in the encoder settings box 156 (FIG. 10), such as the video quality, audio format, frame rate and data transmission rate.
  • the data signal generation module 66 also receives data events from the producer module 62. In another embodiment, the data signal generation module 66 receives data events from a third-party source (not shown). For example, if the presentation is a sporting event, the database with the scores of the event can transmit new scores as they change, as a text type data event, to the data signal generation module 66. Similarly, if the presentation is an auction, the database with the prices of items for auction can transmit new prices as they change, as a text type data event, to the data signal generation module 66. Data events are embedded as they are received, whether from the producer module 62 or a third-party source.
  • the data event is embedded based on the encoding used by the data signal generation module 66. For example, if the digital version of the video and/or audio input signal 58 is being compressed using a RealVideo codec, the data event is embedded after the compression process.
  • the data signal generation module 66 embeds each data event in the digital signal as it is received from the producer module 62.
  • the data event is synchronized to the video and/or audio input signal 58 by embedding the data event at very nearly the exact location in the digital version of the video and/or audio input signal 58 with which the data event is intended to coincide.
  • the data signal generation module 66 streams the digital signal, with embedded data events, to the client node 18 using known streaming server software and/or hardware (e.g., RealNetworks G2 server).
  • the data signal generation module 66 includes an encoder server (not shown) for the conversion and compression and a streaming server to stream the streaming data signal to the client node 18.
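The interleaving of data events with encoded media can be pictured as appending each event at the stream's current write position, so the client encounters it alongside the coinciding audio/video data. A simplified sketch; the list-of-tuples stream is an assumption for illustration only:

```python
class StreamEmbedder:
    """Toy model of the data signal generation module's output stream."""

    def __init__(self):
        self.stream = []  # stands in for the outgoing streaming data signal

    def write_media(self, chunk):
        self.stream.append(("media", chunk))

    def embed_event(self, event):
        # the event lands at the current write position, i.e. substantially
        # simultaneously with the media being encoded at that moment
        self.stream.append(("event", event))

embedder = StreamEmbedder()
embedder.write_media(b"video-chunk-1")
embedder.embed_event({"frame": 0, "type": "text", "data": "hello"})
embedder.write_media(b"video-chunk-2")
print([kind for kind, _ in embedder.stream])
```

Because the event's position in the stream is its synchronization point, no separate timeline or timestamp table has to be maintained.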
  • the client node 18 includes a display 30 to display the synchronized client presentation page 26 and a data signal processing module 74. To receive the synchronized client presentation page 26, the client node 18 establishes a connection with the streaming server of the data signal generation module 66. In one embodiment, the client node 18 and the data signal generation module 66 communicate using the RealTime Streaming Protocol ("RTSP"), a protocol commonly used to transmit true streaming media to one or more viewers simultaneously. RTSP provides for viewers randomly accessing the stream and uses the RealTime Transport Protocol ("RTP") as its transport.
  • the RTP transport protocol is designed to deliver live media to one or more viewers simultaneously.
  • this protocol is a true streaming protocol (i.e., it matches the bandwidth of the media signal to the viewer's connection, so that the media is always seen in real time), which is why one of the control signals is the data transmission rate.
  • some systems instead use progressive download techniques (e.g., QuickTime's "fast start" feature).
  • Progressive download allows the user to view the file as it is being downloaded.
  • Progressive download files do not adjust to match the bandwidth of the user's connection like a "true streaming” format.
  • Progressive download is also called "HTTP Streaming" because standard HTTP servers can deliver progressive download files, and no specialized streaming server is required.
  • the data signal processing module 74 processes the data events embedded in the streaming data signal 50.
  • the data events can be extracted from the data signal in a number of ways, depending on the software the client is executing for displaying video signals. For example, RealPlayer by RealNetworks, Inc. removes the data events from the streaming data signal and transmits the data events to the data signal processing module 74.
  • the data signal processing module 74 can include a typical web browser and additional code (e.g., JAVA) that instructs the web browser to retrieve data files from the web server module 70, using the data that is included in the data event.
  • in response to each data event, if necessary, the client node 18 establishes a connection to the web-server module 70 of the server node 14 using any network communication protocol (e.g., TCP/IP).
  • the client node 18 retrieves the data 54 needed for the data event and displays the graphical representation of the data in the synchronized client presentation page 26. Once the data requested has been received, the connection is terminated. When new data is required, the connection is reestablished.
  • the data is transmitted to the client node 18 before the client node 18 processes the data event. In this embodiment, the client node 18 does not establish a connection with the web-server module 70 to retrieve data. The client node 18 instead retrieves the data from a storage location within the client node 18.
  • FIG. 2 depicts another embodiment of the presentation system 10'.
  • the server node 14' further includes a data source 88, a control panel 80 and an archive module 84.
  • the data source 88 is in communication with data signal generation module 66 for transmitting a video and/or audio input signal 58 and in communication with the producer module 62 for receiving control signals.
  • the data source 88 is a device that receives multiple video and/or audio signals as inputs and creates an output signal to be used as the video and/or audio input signal 58. In one embodiment, the data source 88 selects the input to use based on the control signals from the producer module 62.
  • the data source 88 can be for example, a microphone, a compact disk (“CD”) and/or an audio mixer.
  • the data source 88 also can be for example, a video camera, a video tape recorder ("VTR") and/or a video switcher.
  • the data source 88 can be, for example, an ECHOlab 5000 video switcher (manufactured by e-StudioLIVE, Inc. of Chelmsford, Massachusetts).
  • a data source 88 allows the operator to select from a number of video inputs.
  • the video inputs can be cameras at various angles of an item being displayed for an auction.
  • the data source 88 selects from the various inputs and transmits an output that is received by the data signal generation module 66 and streamed to the client node 18.
  • switching can be done at any time, and it is not necessary to rebuffer any video files on the client node 18 before the switchover can occur.
  • the switchover is seamless and goes unnoticed by the user at the client node 18. Since the data signal generation module 66 creates synchronization by embedding a data event as the streaming data signal 50 is generated, the other elements 34 of the client presentation page 26 are synchronized to the new video signal input selected without any interruption.
  • the control panel 80 is in electrical communication with the data source 88 and the producer module 62 for transmitting control signals in response to input by an operator.
  • the data source 88 selects the video input to use based on the control signals from the control panel 80.
  • the control panel 80 uses control signals to create special effects on the video input.
  • the control panel 80 can be a configurable input device, for example the ECHOlab Commander by e-StudioLIVE, Inc. As shown in FIG. 3a, the control panel 80 can be configured with commands to control the data source 88 when the data source 88 is a video switcher.
  • Each button 81 is configured to send a command to the video switcher (i.e., data source 88) to create a special effect.
  • the control panel 80' also can be configured to control the insertion of predefined data events into the streaming data signal 50.
  • each of the buttons 81' can represent a desired state of the other elements 34 of the client presentation page 26.
  • the operator presses a button 81' at the point of the video presentation to change the current state of an element 34 to the state represented by the button. Pressing the button causes the producer module 62 to generate a data event to effect the desired state represented by the button.
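A configurable panel of this kind can be modeled as a mapping from button identifiers to the events they represent; pressing a button asks the producer to generate the corresponding data event. This is an illustrative sketch only; the names, and the 16-slide configuration borrowed from the lecture example later in the document, are assumptions:

```python
class ControlPanel:
    """Hypothetical configurable control panel (cf. the buttons 81')."""

    def __init__(self, generate_event):
        # generate_event: callable on the producer module that creates
        # the data event for a named playlist entry
        self._buttons = {}
        self._generate = generate_event

    def configure(self, button_id, event_name):
        self._buttons[button_id] = event_name

    def press(self, button_id):
        return self._generate(self._buttons[button_id])

generated = []

def generate_event(name):
    generated.append(name)
    return name

panel = ControlPanel(generate_event)
for i in range(16):                  # one button per slide, as in the example
    panel.configure(i, f"slide-{i + 1}")
panel.press(5)                       # operator selects the sixth slide
print(generated)
```

The indirection through a named event (rather than hard-wiring the button to a file) is what lets the same panel drive either the video switcher or the playlist.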
  • the archive module 84 is in communication with the data signal generation module 66.
  • the archive module captures the streaming data signal 50 as it is transmitted to the client node 18.
  • the archive module 84 stores the captured streaming data signal 50 in a storage medium.
  • a user may subsequently retrieve the stored streaming data signal 50 and watch the synchronized client presentation page 26, as it appeared during the original transmission.
  • Users can view chat sessions and surveys of the archived presentation as they appeared in the original presentation.
  • the original chat sessions and surveys can be replaced or augmented with chat sessions and surveys of the users accessing the archived streaming data signal 50.
  • the streaming data signal 50 contains the synchronizing component (i.e., the video and/or audio input signal 58) and data events.
  • the producer module 62 generates the data events. Before the producer module 62 can generate data events to embed into streaming data signal 50, the operator must define them.
  • FIG. 4 depicts a screenshot of an exemplary embodiment of a graphical interface generated by the producer module 62 that the operator uses to predefine the desired elements 34 of the client presentation page 26.
  • the operator has access to the producer module 62 using an input device (not shown) in communication with the producer module 62 (e.g., a keyboard and mouse of a personal computer).
  • the client presentation page 94 has been predefined with five elements 100, 101, 102, 103, 104.
  • the synchronizing component is a video (with audio) signal displayed in element 101.
  • the elements 100, 101, 102, 103, 104 are defined to the client node 18 using HTML frame tags.
  • the synchronizing component is displayed in element 101.
  • the operator creates a predefined list (e.g., a playlist 90) of the desired states of the elements 100, 101, 102, 103, 104 that the operator wants to use as part of the client presentation page 94 during the video presentation.
  • the playlist 90 in FIG. 4 has two defined events labeled "First Event” and "Second Event.” These events represent the two desired states the operator uses to create his/her synchronized client presentation page 94.
  • in the first event, frame 0, which corresponds to element 100, displays the string "This is some text".
  • in the first event, frame 2, which corresponds to element 102, displays the image bottom_bar_right, stored as a GIF file.
  • in the second event, frame 0, which corresponds to element 100, displays the string "Second Event Text".
  • in the second event, frame 4, which corresponds to element 104, displays the homepage associated with the listed URL http://www.e-studiolive.com. Since frame 2 is not changed in the second event, it continues displaying the image from event 1.
  • the first and second events of the playlist 90 also include a command "pause 30." This command instructs the producer module 62 that after a 30 second pause, the producer module 62 should generate the next data event. Without the pause command, the producer module 62 waits until the operator instructs it to generate an event. Even with the pause command, the operator can override the automatic generation of a data event. An operator has ultimate control over the data events that are generated. However, absent intervention, the pause command causes the producer module 62 to generate data events automatically.
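The playlist just described can be represented as an ordered list of desired element states with an optional pause. Applying an event updates only the frames it names, which is why frame 2 keeps its image through the second event. A sketch in which the dictionary layout is an assumption for illustration:

```python
playlist = [
    {"name": "First Event",
     "states": {0: ("text", "This is some text"),
                2: ("image", "bottom_bar_right.gif")},
     "pause": 30},   # auto-generate the next event after 30 seconds
    {"name": "Second Event",
     "states": {0: ("text", "Second Event Text"),
                4: ("url", "http://www.e-studiolive.com")},
     "pause": 30},
]

def apply_event(page_state, entry):
    """Frames not named by the event keep their previous state."""
    updated = dict(page_state)
    updated.update(entry["states"])
    return updated

page = {}
for entry in playlist:
    page = apply_event(page, entry)

print(page[2])  # frame 2 still shows the image set by the first event
```

Without a "pause" key an entry would simply wait for the operator, matching the manual-advance behavior described above.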
  • an operator is a professor giving a live lecture.
  • the professor has 16 slides to use in aiding discussion, each stored on the web-server 70 as a GIF image file.
  • the live presentation of the professor is displayed in element 101.
  • the slides are presented in another element of the page.
  • a chat session for students' questions is presented in element 103. Since the professor does not know in advance how long he needs to display each slide, or whether the order may change based on student questions, the professor defines a playlist with 16 events, one for each slide, without using the pause feature. The professor also uses a configurable control panel 80, where 16 of the keys are configured to represent each event. As the professor gives his lecture, he presses the button on the control panel 80 representing the slide he wants displayed. While discussing the current slide, a student types in the chat session that he does not understand how the variable in the displayed equation was determined. Upon reading the question, the professor states that the variable in question was determined from the set of initial conditions. The professor presses the button on the control panel 80 to display slide six, on which the calculation of the variable in question is shown.
  • the control panel 80 transmits a control signal representing the button pressed to the producer module 62.
  • upon receiving that control signal, the producer module 62 generates a data event that represents event six which, according to the playlist, displays the sixth slide.
  • the producer module 62 transmits the data event to the data signal generation module 66 upon receipt of the control signal.
  • the data signal generation module 66 embeds the data event into the streaming data signal 50 being streamed to the client node 18.
  • the data signal processing module 74 obtains the streaming data signal 50 from the server node 14 and obtains the embedded data event.
  • the data signal processing module 74 retrieves the data needed for the data event, in this case slide six, from either the storage buffer of the client node 18, or if not there, from the web-server module 70.
  • the client node 18 displays the representation of the retrieved data, the sixth slide on the display 30.
  • the data signal processing module 74 is also decompressing the video signal and displaying the video in the display 30. The result is that a representation of the data associated with the data event is retrieved and displayed at substantially the same time as the video.
  • the data events inserted into the streaming data signal 50 include the data event frame identifier, the data event type and additional data.
  • the frame identifier indicates to the data signal processing module 74 the frame of the client presentation page 26 in which the representation of the data should be displayed.
  • the data event type indicates the type of data event, which indicates to the data signal processing module 74 what additional data parameters will follow the data event type.
  • Data event types can include, for example, titles, text events, survey events, links, picture and HTML files.
  • the additional data are the parameters needed to display a particular data type. The parameters will vary with each data event type.
  • the data event type can be a picture type event.
  • the associated data of the data event includes the parameters for the file name, the scale and the position.
  • the frame id is two (i.e., element 102) and the data event type is a picture.
  • the additional data includes the file name (the file is slide_six.GIF), the scale (display the picture as 95% of the frame) and the position (center the slide both horizontally and vertically in the frame).
  • the data event type can be a title type event.
  • the operator enters the text of the title 126 (FIG. 6), the font of the title 130 (FIG. 6), the colors of the text and the background of the title 134 (FIG. 6), and the position of the title in the frame 138 (FIG. 6).
  • the associated data of the data event includes the parameters for the text, the font, the text and background colors and the position.
  • no file is retrieved because the text to be displayed, along with the parameters on how to display it, are all included in the data event.
  • the file name in the additional data for all of the data event types can be replaced with the file itself.
  • all the data needed to create the desired states of the other elements 34 of the client presentation page 26 is included as a data event in the streaming data signal 50.
  • the client node 18 does not need to retrieve any data from the webserver node 70.
  • the producer module 62 generates various graphical user interfaces ("GUI").
  • FIG. 5 depicts a GUI to create an event.
  • the operator enters a description of the event in the description box 110, so that during a presentation, the operator can quickly identify which event to select.
  • the operator can optionally select a pause time by checking the define pause box 114 and entering a time in the time box 118.
  • the operator can define the desired state of each of the elements 34 for that event.
  • FIG. 6 depicts a GUI to create an element that is a text string.
  • the operator enters the string in the text box 126.
  • the operator can select font characteristics 130 of the text string, text and background colors 134, and the position 138 of the text with respect to the placement within an element 34.
  • the producer module 62 has a predetermined layout, as depicted in FIG. 4, of 5 elements 100, 101, 102, 103, and 104.
  • An operator using this GUI can select one of the 4 elements 100, 102, 103, 104 in which to display the text string by checking the location in the where box 142. Only four of the five elements can be chosen from because one element 101 is reserved for the video synchronizing component.
  • the operator can check the link box 146 and enter a URL of the link in the address box 150. If a user while viewing a presentation selects the hyperlink, the browser of the client node 18 opens a separate window with the client presentation page of the hyperlink. When the text string is complete, the operator clicks on the OK box 122 and the text string is added to the playlist under the selected event.
  • the producer module 62 generates GUIs to create elements containing an image (e.g., FIG. 7), HTML code (e.g., FIG. 8) or a survey (e.g., FIG. 9).
  • the producer module 62 uses the data entered through the GUI to generate an associated data event.
  • the operator has entered, through the GUI, the desired frame, the file or text string to be used and any additional data to aid in the display of the file.
  • Creating a survey requires slightly more data.
  • the producer module 62 creates a GUI (e.g., FIG. 9a) to allow entry of the questions and a GUI (e.g., FIG. 9b) to allow entry of the choice of answers.
  • the producer module 62 also generates a view (e.g., FIG. 9c) of the survey to show the operator how the survey is presented to a user on the client node 18.
  • the survey has two events associated with it. One presents the survey to the user. The other displays the results of the survey.
  • FIG. 9d depicts a view of the results of a survey presented to users after those users were presented with and answered the questions of the survey.
  • the layout of the client presentation page 26 (i.e., the placement of the elements 34, including the video component 38) is controlled using a frameset file written in HTML.
  • the producer module 62 has a predetermined layout (e.g., client presentation page 94, FIG. 4), the producer module 62 provides a GUI to allow the operator to customize the layout.
  • FIG. 10 depicts an example of a GUI that includes a show data tab 160. When the operator selects this tab 160, the operator has access to a custom layout button 164. When the operator selects this button 164, the producer module 62 generates a custom options box 168 for further information. The operator enters a template of his/her custom layout. If the custom layout template references any other files, they must also be entered. The list of additional files is needed so that the producer module 62 can verify that all of the files needed to transmit the synchronized client presentation page 26 to the client node 18 are located on the web server module 70.
  • the custom layout template is a frameset file written in HTML.
  • the operator can insert show elements, such as titles, pictures, links and surveys, in the same way as for a predetermined layout.
  • the operator can simply right click in the playlist at the point where a data event is to be inserted and define the data event using one of the GUIs described above (e.g., FIG. 6, FIG. 7, FIG. 8, FIG. 9).
  • a difference is that instead of selecting the frame in which to place the data event by clicking in a diagram of the frames in the where box 142, FIG. 6, a dropdown box labeled "Where" will list all frames by the name given in the frameset file template. The operator selects the frame by highlighting and clicking the name of the frame in the dropdown list.
  • the synchronizing component can be a video component 38 that is a live broadcast of a sales agent webcasting a sales pitch for the latest product, the superwidget, over the Internet.
  • Element a 34a contains the registered trademark of the company.
  • Element c 34c contains the name of the sales agent and all of his/her contact information (e.g., phone number, fax number and e-mail address). These two elements do not change during the sales pitch and are displayed at the beginning of the live broadcast and not changed.
  • Element n 34n contains a chat session that displays user input. The users use the chat session to ask questions to the sales agent during the sales pitch, so the sales agent can tailor the live broadcast to the needs of the users.
  • Element b displays many different items to aid the sales agent.
  • the element 34b displays slides for Microsoft ® Power Point ® (manufactured by Microsoft, Inc. of Redmond, WA), portraying features of the superwidget.
  • the element 34b displays still photographs of the superwidget in different colors.
  • the element 34b contains video of the superwidget being used.
  • the invention allows the sales agent to use any of these aids (e.g., slides for Microsoft ® Power Point ® (manufactured by Microsoft, Inc. of Redmond, WA), still photographs, video) during any point during the sales pitch.
  • the sales agent uses the chat session as a guide and displays the aids in response to the reaction of the users.
  • the sales agent (or the operator) synchronizes the display of the aids with his live sales pitch as he/she is talking.
  • Element d 34d displays an image of a bar graph with an entry for yes and an entry for no.
  • the sales agent surveys the users with yes or no questions and as the results are received from the users, they are displayed on the graph.
  • the sales agent has used multiple choice questions with four or five answers to choose from.
  • the display of the results is synchronized with the video presentation of the sales agent so he/she can discuss the results as they are presented to the users.
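The data-event structure running through the examples above — a frame identifier, a data event type, and type-specific additional data — can be sketched as follows. The function names and the dictionary encoding are illustrative assumptions; the patent describes the fields but does not specify a concrete format.

```python
# Hypothetical sketch of a data event: a frame identifier selecting the
# element of the client presentation page, an event type selecting the
# parameter set, and the type-specific additional data.

def make_picture_event(frame_id, file_name, scale_pct, h_align, v_align):
    """Build a picture-type data event; the client must fetch the file."""
    return {
        "frame": frame_id,            # frame of the page to update
        "type": "picture",            # event type
        "file": file_name,            # file retrieved from the web server
        "scale": scale_pct,           # percent of the frame to fill
        "position": (h_align, v_align),
    }

def make_title_event(frame_id, text, font, fg_color, bg_color, position):
    """Build a title-type data event; all display data travels inside it."""
    return {
        "frame": frame_id,
        "type": "title",
        "text": text,
        "font": font,
        "colors": (fg_color, bg_color),
        "position": position,
    }

# The slide-six example from the lecture scenario: frame id two
# (element 102), picture type, file slide_six.GIF, scaled to 95% and
# centered in the frame.
event = make_picture_event(2, "slide_six.GIF", 95, "center", "center")
```

Note how a picture event only names its file (the client retrieves it), while a title event is self-contained, matching the two delivery styles described above.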

Abstract

In one aspect, the invention relates to a method for synchronizing the content of a client presentation page to a streaming data signal which comprises a video component and/or an audio component. The method embeds one or more data events into the streaming data signal simultaneously with the generation of the data event, transmits the streaming data signal to a client and retrieves on the client, or from the server, data in response to each data event in the streaming data signal. The method further provides in the client presentation page a video and/or an audio component portion of the data signal as the data signal is processed, and simultaneously provides with the video and/or audio component portion in the client presentation page a representation of the data in response to each data event.

Description

CLIENT PRESENTATION PAGE CONTENT SYNCHRONIZED TO A STREAMING DATA SIGNAL
Cross-Reference to Related Applications
This application claims priority to U.S. provisional application Serial No. 60/153,132, filed September 09, 1999. This co-pending application is incorporated herein by reference in its entirety.
Field of the Invention
The present invention relates generally to apparatus and methods for providing a synchronized client presentation page and more specifically to an apparatus and method for real-time synchronizing of elements of a client presentation page to a streaming data signal having a video and/or audio component.
Background of the Invention
With the growth of networked computers, the Internet, intranets and other computer networks are being used to distribute and present information. Presentations over a computer network can be displayed to a user using a web page. A typical web page can contain many elements, including video and/or audio components. Operators are the creators of a web page presentation. Operators of web presentations can coordinate text and graphical elements of the web page with the video and/or audio components using known software tools. For example, software tools such as Synchronized Multimedia Integration Language ("SMIL") by RealNetworks, Inc. and ePublisher by Avid Technology, Inc., allow an operator to predefine and prearrange text and graphics elements of the presentation page to be coordinated with the video portion of the presentation page. The coordination is all done prior to dissemination of the presentation page using pre-recorded video.
Summary of the Invention
The present invention synchronizes the content (i.e., text and/or graphics) of a client presentation page with the video and/or audio signals in a real-time environment, allowing an operator to change the synchronization up until the video and/or audio components are sent to the client for presentation. The invention does not require a pre-timed presentation, nor a prerecorded video and/or audio portion, so that synchronization can be done using a live presentation. The invention relates to a method and apparatus for synchronizing content of a client presentation page to a streaming data signal, where the streaming data signal includes a video component and/or an audio component.
In one aspect, the invention relates to a method for synchronizing the content of a client presentation page to a streaming data signal which comprises a video component and/or an audio component. The method embeds one or more data events into the streaming data signal simultaneously with the generation of the data event, transmits the streaming data signal to a client and retrieves on the client, or from the server, data in response to each data event in the streaming data signal. The method further provides in the client presentation page a video and/or an audio component portion of the data signal as the data signal is processed, and simultaneously provides with the video and/or audio component portion in the client presentation page a representation of the data in response to each data event.
In another embodiment, this method generates one or more data events in response to user input. In another embodiment, this method contains the step of processing the streaming data signal on the client. In another embodiment, the streaming data signal contains an encoded digital signal. In another embodiment, the video and/or audio component portion of the digital signal is associated with a live production. In another embodiment, the method of retrieving involves requesting from a server the data in response to each data event in the data signal and receiving the requested data at the client. In another embodiment, this method of retrieving also includes a process of receiving at the client, prior to receiving the data event, data that will be used for each data event in the data signal, storing at the client the received data for subsequent use, and retrieving from the client storage data in response to each data event in the data signal being processed on the client.
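The two retrieval paths described in the embodiments above — requesting data from the server in response to an event, versus retrieving data that was transmitted to the client and stored before the event arrived — can be sketched as below. All names are illustrative assumptions, not terms from the patent.

```python
# Minimal sketch of client-side retrieval with a storage-first path:
# title-type events carry their own display data, pre-loaded files come
# from client storage, and anything else is requested from the server.

def retrieve_event_data(event, cache, fetch_from_server):
    """Return the data needed to display a data event."""
    name = event.get("file")
    if name is None:
        # e.g. a title event: the text and its display parameters are
        # carried entirely inside the data event itself
        return event
    if name in cache:
        # data was received and stored before the event arrived
        return cache[name]
    # otherwise request the file from the web-server module, and keep a
    # copy for subsequent events that reference the same file
    data = fetch_from_server(name)
    cache[name] = data
    return data
```

The storage-first ordering matters for the claimed "substantially simultaneous" display: a pre-loaded file avoids a network round trip at the moment the event is processed.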
In another embodiment, the method for synchronizing involves storing the streaming data signal. In another embodiment, this method includes changing by a power user a source from which a video component and an audio component portion of the digital signal is received.
In another aspect, the invention involves a method for generating a data signal for synchronizing content of a client presentation page to a streaming data signal. This method of generating includes providing a digital signal containing a video component and/or an audio component, generating one or more data events in response to user input, embedding each data event into the digital signal substantially simultaneously with the generation of the data event to create a data signal and streaming to a client the data signal for providing in the client presentation page a video and/or an audio component portion of the data signal as the data signal is processed. The method further includes transmitting to a client data corresponding to each data event for substantially simultaneously providing with the video component and/or audio component portion in the client presentation page a representation of the data in response to each data event.
In another aspect, the invention relates to a method for presenting to a user a client presentation page synchronized to a streaming data signal. This method includes receiving from a server a streaming data signal, the data signal comprising at least one embedded data event and a video component and/or an audio component, processing the data signal on the client and retrieving data in response to each data event in the data signal being processed on the client. The method further includes providing in the client presentation page the video and/or audio component portion of the data signal as the data signal is processed, and substantially simultaneously providing with the video and/or audio component portion in the client presentation page a representation of the data in response to each data event.
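The client-side method just described — play the media portion as it arrives and, for each embedded data event, retrieve the referenced data and render it in the named frame — could be sketched as follows. The (kind, payload) packet layout and the class names are assumptions for illustration only.

```python
# Sketch of the client-side loop: media packets go to the player; each
# embedded data event triggers retrieval and display of its data in the
# named frame, so both appear substantially simultaneously.

class PresentationPage:
    """Stand-in for the synchronized client presentation page 26."""
    def __init__(self):
        self.frames = {}   # frame id -> currently displayed data
        self.played = []   # media packets handed to the player

    def play(self, packet):
        self.played.append(packet)

    def show(self, frame_id, data):
        self.frames[frame_id] = data

def process_stream(stream, page, retrieve):
    """Consume (kind, payload) packets from the streaming data signal."""
    for kind, payload in stream:
        if kind == "media":
            page.play(payload)            # video/audio component
        elif kind == "event":
            data = retrieve(payload)      # from client storage or server
            page.show(payload["frame"], data)
```

Because the event sits inline in the stream, synchronization falls out of ordinary in-order processing; no separate timeline file is consulted.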
In another aspect, the invention relates to a system for synchronizing content of a client presentation page to a streaming data signal. The system includes a producer module for generating one or more data events in response to a user input, a data signal generation module in communication with the producer module for receiving each data event and in communication with a data source for receiving a source signal comprising a video component and/or an audio component, the data signal generation module embedding the data event into a digital representation of the source signal substantially simultaneously with generation of the data event to create a digital data signal and streaming the data signal to a client. The system further includes a web server module in communication with the producer module, the web server module transmitting to the client data corresponding to each embedded data event, and a client in communication with the web server module for receiving corresponding data and in communication with the data signal generation module for receiving the data signal. The client processes the data signal, retrieves the corresponding data in response to each data event in the processed data signal, and provides in the client presentation page the video and/or audio component portion of the data signal as the data signal is processed and a representation of the corresponding data in response to each data event, thereby substantially simultaneously providing with the video and/or audio component portion in the client presentation page the representation. In another embodiment, the digital data signal comprises an encoded digital signal. In another embodiment, the video component and audio component portion of the data signal is associated with a live production. In another embodiment, the client is configured for requesting from the web server module corresponding data in response to each data event in the processed data signal. 
In another embodiment, the client also includes a memory buffer for storing the received data. In another embodiment, the system includes an archive module for storing the streaming data signal. In another embodiment, the producer module is further configured to allow a power user to change receipt of the source signal from the data source to a second data source from which a second source signal is received. In another embodiment, the data source is a video switcher.
In another aspect, the invention relates to a server node for generating a data signal for synchronizing content of a client presentation page to a streaming data signal. The server node includes a producer module for generating one or more data events in response to a user input, a data signal generation module in communication with the producer module for receiving each data event and in communication with a data source for receiving a source signal comprising a video component and/or an audio component. The data signal generation module embeds the data event into a digital representation of the source signal substantially simultaneously with generation of the data event to create a digital data signal and streams the data signal to a client for display of the video and/or audio component portion of the digital data signal in the client presentation page as the digital data signal is processed by the client. The server node further includes a web server module in communication with the producer module, the web server module transmitting to the client data corresponding to each embedded data event for display of a representation of the data in the client presentation page in response to each data event, so that the representation is substantially simultaneously displayed with the video and/or audio component portion.
In another aspect, the invention relates to a client node presenting to a user a client presentation page synchronized to a streaming data signal. The client node includes a client in communication with a web server module for receiving corresponding data and in communication with a data signal generation module for receiving a digital data signal comprising at least one embedded data event and at least one video component and/or audio component. The client processes the digital data signal, retrieves the corresponding data in response to each data event in the processed digital data signal, and displays in the client presentation page the video and/or audio component portion of the digital data signal as the digital data signal is processed, and a representation of the corresponding data in response to each data event, thereby substantially simultaneously displaying with the video and/or audio component portion in the client presentation page the representation.
Brief Description of the Drawings
The foregoing and other objects, features and advantages of the present invention, as well as the invention itself, will be more fully understood from the following description of preferred embodiments, when read together with the accompanying drawings, in which:
FIG. la is a high level block diagram of an embodiment of the invention.
FIG. lb is a high level block diagram of an embodiment of a display produced by the invention.
FIG. 2 is a high level block diagram of another embodiment of the invention.
FIG. 3a is a screen shot of an embodiment of a configuration of a control panel used to control a video switcher, according to the invention.
FIG. 3b is a screen shot of an embodiment of a configuration of a control panel used to control the embedding of data events, according to the invention.
FIG. 4 is a screen shot of an embodiment of a graphical interface used to create a synchronized client presentation page produced by the invention.
FIG. 5 is a screen shot of an embodiment of a graphical interface used to enter data to create a synchronized client presentation page produced by the invention.
FIG. 6 is a screen shot of an embodiment of a graphical interface used to enter data to create a synchronized client presentation page produced by the invention.
FIG. 7 is a screen shot of an embodiment of a graphical interface used to enter data to create a synchronized client presentation page produced by the invention.
FIG. 8 is a screen shot of an embodiment of a graphical interface used to enter data to create a synchronized client presentation page produced by the invention.
FIGS. 9a and 9b are screen shots of an embodiment of a graphical interface used to enter data to create a synchronized client presentation page produced by the invention.
FIG. 9c is a screen shot of an embodiment of a survey created by an operator for use in a synchronized client presentation page produced by the invention.
FIG. 9d is a screen shot of an embodiment of a survey result displayed in a synchronized client presentation page produced by the invention.
FIG. 10 is a screen shot of an embodiment of a graphical interface used to enter data to create a synchronized client presentation page produced by the invention with a customized layout.
Detailed Description of Preferred Embodiments
In broad overview, FIGS. 1a and 1b depict an exemplary embodiment of a multimedia presentation system 10. FIG. 1a depicts a presentation system 10 that includes a first computing system ("server node") 14 in communication with a second computing system ("client node") 18 over a network 22. The network 22 can be a local-area network (LAN), such as a company intranet, or a wide-area network (WAN), such as the Internet or the World Wide Web. The server node 14 and the client node 18 can be connected to the network 22 through a variety of connections including standard telephone lines, LAN or WAN links (e.g., T1, T3, 56kb, X.25), broadband connections (ISDN, Frame Relay, ATM), and wireless connections. The connections to the network 22 and among the various modules can be established using a variety of communication protocols (e.g., TCP/IP, IPX, SPX, NetBIOS, Ethernet, RS232, and direct asynchronous connections).
As depicted in FIG. 1b, the presentation system 10 displays to a user a synchronized client presentation page 26 (FIG. 1b) on the display 30 at the client node 18. The synchronized client presentation page 26 includes several elements 34a, 34b, 34c, 34d, 34n (referred to generally as elements 34). The number of elements 34 varies depending on what an operator wants to display to a user. For example, the client presentation page 26 can be what is generally referred to as a web page. The client presentation page 26 can be written in any format comprehensible by the client node 18 including, for example, HTML, XML, VRML, WML, (display) postscript and nroff.
Each element 34 represents something that is displayed to the user. An element 34 can be, for example, a picture, an image, a graphic, a title and/or a character string. An element 34 also can be, for example, a slide for Microsoft ® Power Point ® (manufactured by Microsoft, Inc. of Redmond, WA), a spreadsheet for Microsoft ® Excel (manufactured by Microsoft, Inc. of Redmond, WA), an HTML encoded page or the display of a chat session. The client presentation page 26 also includes a video component 38 and/or an audio component. The audio component is presented to the user through an audio player 42. The audio player 42 represents any type of device and/or software that converts a digital signal to a signal audible to the user. The video component 38 and/or audio component (e.g., provided through the audio player 42) are/is used to synchronize the other elements 34 of the client presentation page 26 and referred to generally as the synchronizing component(s). The video component 38 is separated to distinguish the element as the synchronizing component. Note however that other elements 34 can be video and/or audio signals and can even be streamed signals.

The server node 14 (FIG. 1a) includes a producer module 62, a data signal generation module 66 and a web-server module 70. The server node 14 and the three modules 62, 66, 70 are a combination of hardware and software. The implementation of the functions described for each can vary depending on the "tools" the user has available to him/her to implement the described functions. Particular implementations discussed herein are for exemplary purposes only and do not limit the possibilities of implementation. In addition, the three modules 62, 66, 70 need not be physically located near one another or within hardware designated as the server. The three modules 62, 66, 70 can be located anywhere throughout the computer network and in electrical communication with one another. 
The term node indicates a logical unit rather than a physical unit. The producer module 62 coordinates synchronization of text/graphic data with audio and/or video signals. The producer module 62 is in electrical communication with the data signal generation module 66. The communication can be implemented using any communication protocol (e.g., TCP/IP), based on the requirements imposed by the data signal generation module 66. The producer module 62 transmits control signals to the data signal generation module 66 to control the conversion of the video and/or audio input signal 58 to the streaming data signal 50. The producer module 62 generates data events. Each data event represents the desired state of an element 34 (FIG. 1b) at the particular point in the video/audio at which the data event is inserted. The producer module 62 transmits each data event to the data signal generation module 66 as it is generated to create real-time synchronization. The producer module 62 also is in communication with the web-server module 70.
The communication can be implemented using any communication protocol (e.g., TCP/IP) based on the requirements imposed by the web-server module 70. The producer module 62 transmits control signals to the web-server module 70 to control the access of client nodes 18 to the stored data needed for the presentation of the client presentation page 26. The producer module 62 also ensures that the data needed for the other elements 34 of the client presentation page 26 is stored on to the web-server module 70, so that client nodes 18 can retrieve data as the data is needed to respond to data events.
The data signal generation module 66 of the server node 14 generates the streaming data signal 50. The data signal generation module 66 receives a video and/or audio input signal 58. The video and/or audio input signal 58 is the synchronizing component, i.e., the signal to which the other elements 34 of the client presentation page 26 are synchronized. The video and/or audio input signal 58 can be an analog signal or a digital signal. If the video and/or audio input signal 58 is an analog signal, the data signal generation module 66 converts the analog input into a digital signal. This conversion process can be done by using A/D hardware (e.g.,
Osprey ® 200 video capture card manufactured by ViewCast.com, Inc. of Dallas, Texas) or software that converts analog video and/or audio signals to digital signals.
Further, the digital signal can be compressed using known compression codecs (e.g., RealVideo G2 with SVT from RealNetworks, Inc. or any of the MPEG standards developed by MPEG, an ISO working group). The control signals received from the producer module 62 instruct the data signal generation module 66 in the encoding process. For example, the producer module 62 converts to control signals the user inputs in the encoder settings box 156, FIG. 10, such as the video quality, audio format, frame rate and data transmission rate.
The data signal generation module 66 also receives data events from the producer module 62. In another embodiment, the data signal generation module 66 receives data events from a third party source (not shown). For example, if the presentation is a sporting event, the database with the scores of the event can transmit new scores as they change, as a text type data event, to the data signal generation module 66. Similarly, as another example, if the presentation is an auction, the database with the prices of items for auction can transmit new prices as they change, as a text type data event, to the data signal generation module 66. As the data events are received from the producer module 62 or a third party source
(not shown), they are embedded into the digital version of the video and/or audio input signal 58. The data event is embedded based on the encoding used by the data signal generation module 66. For example, if the digital version of the video and/or audio input signal 58 is being compressed using a RealVideo codec, the data event is embedded after the compression process. The data signal generation module 66 embeds each data event in the digital signal as it is received from the producer module 62. Thus, the data event is synchronized to the video and/or audio input signal 58 by embedding the data event at the near-exact location in the digital version of the video and/or audio input signal 58 with which the data event is intended to coincide. The data signal generation module 66 streams the digital signal, with embedded data events, to the client node 18 using known streaming server software and/or hardware (e.g., RealNetworks G2 server). In one embodiment, the data signal generation module 66 includes an encoder server (not shown) for the conversion and compression and a streaming server to stream the streaming data signal to the client node 18.
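The embedding step described in this paragraph can be sketched as a simple multiplexer: each data event is inserted into the outgoing digital signal at the moment it is generated, so its position in the stream marks the point in the video/audio with which it is meant to coincide. The (kind, payload) packet encoding and the function name are assumptions for illustration.

```python
# Illustrative sketch of embedding data events into the compressed
# media stream as they are generated: events queued since the previous
# media frame are flushed immediately before the next frame, so each
# event lands at the stream position with which it should coincide.

def mux_stream(media_frames, event_queue):
    """Interleave queued data events with compressed media frames."""
    out = []
    for frame in media_frames:
        # flush events generated since the previous media frame
        while event_queue:
            out.append(("event", event_queue.pop(0)))
        out.append(("media", frame))
    return out
```

Because no timestamps are stored, synchronization is purely positional: whatever the client decodes next after an event is the media the event was generated against.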
The client node 18 includes a display 30 to display the synchronized client presentation page 26 and a data signal processing module 74. To receive the synchronized client presentation page 26, the client node 18 establishes a connection with the streaming server of the data signal generation module 66. In one embodiment, the client node 18 and the data signal generation module 66 communicate using the Real Time Streaming Protocol ("RTSP"), a protocol commonly used to transmit true streaming media to one or more viewers simultaneously. RTSP provides for viewers randomly accessing the stream and uses the Real-time Transport Protocol ("RTP") as the transport protocol. RTP is designed to deliver live media to one or more viewers simultaneously. In this embodiment, this protocol is a true streaming protocol (i.e., the protocol matches the bandwidth of the media signal to the viewer's connection so that the media is always seen in real time), which is why one of the control signals is the data transmission rate.
In another embodiment, progressive download techniques (e.g., QuickTime's "fast start" feature) can be used. Progressive download allows the user to view the file as it is being downloaded. Progressive download files do not adjust to match the bandwidth of the user's connection as a "true streaming" format does. Progressive download is also called "HTTP streaming" because standard HTTP servers can deliver progressive download files, and no special protocols are needed.
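Progressive download can be sketched as a plain sequential read of an ordinary HTTP response body. The function below is an illustrative model only; the name `progressive_play` and the in-memory stand-in for the response body are assumptions, not part of any actual player:

```python
import io

def progressive_play(stream, chunk_size=4):
    """Sketch: progressive download ("HTTP streaming") simply reads the
    file front-to-back and hands each chunk to the player as it arrives.
    There is no bandwidth adaptation, so a standard HTTP server suffices.
    `stream` stands in for the body of an ordinary HTTP GET response."""
    played = []
    while True:
        chunk = stream.read(chunk_size)
        if not chunk:
            break
        played.append(chunk)  # in a real player: decode and render
    return b"".join(played)

# Any file-like object works; a real client would pass the HTTP response body.
media = progressive_play(io.BytesIO(b"fake-media-bytes"))
```

The contrast with true streaming is visible here: the read loop consumes whatever arrives at whatever rate, rather than matching the media bit rate to the connection.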
When the client node 18 starts receiving the streaming data signal 50, the data signal processing module 74 processes the data events embedded in the streaming data signal 50. The data events can be extracted in a number of ways, depending on the software the client is executing for displaying video signals. For example, RealPlayer by RealNetworks, Inc. removes the data events from the streaming data signal and transmits the data events to the data signal processing module 74. The data signal processing module 74 can include a typical web browser and additional code (e.g., Java) that instructs the web browser to retrieve data files from the web server module 70, using the data that is included in the data event. In response to each data event, if necessary, the client node 18 establishes a connection to the web server module 70 of the server node 14 using any network communication protocol (e.g., TCP/IP). The client node 18 retrieves the data 54 needed for the data event and displays the graphical representation of the data in the synchronized client presentation page 26. Once the requested data has been received, the connection is terminated. When new data is required, the connection is reestablished. In another embodiment, the data is transmitted to the client node 18 before the client node 18 processes the data event. In this embodiment, the client node 18 does not establish a connection with the web server module 70 to retrieve data. The client node 18 instead retrieves the data from a storage location within the client node 18.

FIG. 2 depicts another embodiment of the presentation system 10'. The server node
14' further includes a data source 88, a control panel 80 and an archive module 84. The data source 88 is in communication with the data signal generation module 66 for transmitting a video and/or audio input signal 58 and in communication with the producer module 62 for receiving control signals. The data source 88 is a device that receives multiple video and/or audio signals as inputs and creates an output signal to be used as the video and/or audio input signal 58. In one embodiment, the data source 88 selects the input to use based on the control signals from the producer module 62.
The data source 88 can be, for example, a microphone, a compact disk ("CD") player and/or an audio mixer. The data source 88 also can be, for example, a video camera, a video tape recorder ("VTR") and/or a video switcher. For example, an ECHOlab 5000 video switcher (manufactured by e-StudioLIVE, Inc. of Chelmsford, Massachusetts) can be used to allow the operator to add special effects to the video signal. A data source 88 allows the operator to select from a number of video inputs.
For example, the video inputs can be cameras at various angles of an item being displayed for an auction. The data source 88 selects from the various inputs and transmits an output that is received by the data signal generation module 66 and streamed to the client node 18. Thus, switching can be done at any time, and it is not necessary to rebuffer any video files on the client node 18 before the switchover can occur. The switchover is seamless and goes unnoticed by the user at the client node 18. Since the data signal generation module 66 creates synchronization by embedding a data event as the streaming data signal 50 is generated, the other elements 34 of the client presentation page 26 are synchronized to the new video signal input selected without any interruption.
The control panel 80 is in electrical communication with the data source 88 and the producer module 62 for transmitting control signals in response to input by an operator. In one embodiment, the data source 88 selects the video input to use based on the control signals from the control panel 80. In another embodiment, the control panel 80 uses control signals to create special effects on the video input. In one embodiment, the control panel 80 can be a configurable input device, for example the ECHOlab Commander by e-StudioLIVE, Inc. As shown in FIG. 3a, the control panel 80 can be configured with commands to control the data source 88 when the data source 88 is a video switcher. Each button 81 is configured to send a command to the video switcher (i.e., data source 88) to create a special effect. As shown in FIG. 3b, the control panel 80' also can be configured to control the insertion of predefined data events into the streaming data signal 50. In this configuration, each of the buttons 81' can represent a desired state of the other elements 34 of the client presentation page 26. The operator presses a button 81' at the point of the video presentation to change the current state of an element 34 to the state represented by the button. Pressing the button causes the producer module 62 to generate a data event to effect the desired state represented by the button.
The archive module 84 is in communication with the data signal generation module 66. The archive module 84 captures the streaming data signal 50 as it is transmitted to the client node 18. The archive module 84 stores the captured streaming data signal 50 in a storage medium. A user may subsequently retrieve the stored streaming data signal 50 and watch the synchronized client presentation page 26 as it appeared during the original transmission. Users can view chat sessions and surveys of the archived presentation as they appeared in the original presentation. In other embodiments, the original chat sessions and surveys can be replaced or augmented with chat sessions and surveys of the users accessing the archived streaming data signal 50.
The streaming data signal 50 contains the synchronizing component (i.e., the video and/or audio input signal 58) and data events. The producer module 62 generates the data events. Before the producer module 62 can generate data events to embed into streaming data signal 50, the operator must define them.
FIG. 4 depicts a screenshot of an exemplary embodiment of a graphical interface generated by the producer module 62 that the operator uses to predefine the desired elements 34 of the client presentation page 26. The operator has access to the producer module 62 using an input device (not shown) in communication with the producer module 62 (e.g., a keyboard and mouse of a personal computer). In this graphical interface, the client presentation page 94 has been predefined with five elements 100, 101, 102, 103, 104. The synchronizing component is a video (with audio) signal displayed in element 101. In one embodiment, the elements 100, 101, 102, 103, 104 are defined to the client node 18 using HTML framing tags. The operator creates a predefined list (e.g., a playlist 90) of the desired states of the elements 100, 101, 102, 103, 104 that the operator wants to use as part of the client presentation page 94 during the video presentation.
The playlist 90 in FIG. 4 has two defined events labeled "First Event" and "Second Event." These events represent the two desired states the operator uses to create his/her synchronized client presentation page 94. In the first event, frame 0, which corresponds to element 100, displays the string "This is some text". In frame 2, which corresponds to element 102, the image bottom_bar_right, stored as a GIF file, is displayed. In the second event, frame 0, which corresponds to element 100, displays the string "Second Event Text". In frame 4, which corresponds to element 104, the homepage associated with the listed URL http://www.e-studiolive.com is displayed. Since frame 2 is not changed in the second event, it continues displaying the image from the first event. The first and second events of the playlist 90 also include a command "pause 30." This command instructs the producer module 62 that after a 30-second pause, the producer module 62 should generate the next data event. Without the pause command, the producer module 62 waits until the operator instructs it to generate an event. Even with the pause command, the operator can override the automatic generation of a data event. An operator has ultimate control over the data events that are generated. However, absent intervention, the pause command causes the producer module 62 to generate data events automatically.
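The playlist behavior above can be sketched as a simple data structure. This is an illustrative model only; the class and method names are assumptions, and a real producer module would also run a timer for the pause command:

```python
class Playlist:
    """Sketch of the playlist: each entry is a desired state of the page
    elements (frame id -> content), optionally followed by a pause that
    auto-advances to the next event unless the operator intervenes."""

    def __init__(self, events):
        # Each event: (description, state, pause_seconds or None).
        self.events = events
        self.index = -1

    def next_event(self):
        # Advance manually (operator click) or automatically (pause expiry).
        self.index += 1
        return self.events[self.index]

    def auto_advances(self):
        # True when the current event carries a pause command.
        return self.events[self.index][2] is not None


playlist = Playlist([
    ("First Event",
     {0: "This is some text", 2: "bottom_bar_right.GIF"}, 30),
    ("Second Event",
     {0: "Second Event Text", 4: "http://www.e-studiolive.com"}, 30),
])
first = playlist.next_event()
```

Note that each event records only the frames it changes; frames absent from an event's state (frame 2 in the second event above) simply keep their previous content.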
As an example, an operator is a professor giving a live lecture. The professor has 16 slides to use in aiding discussion, each stored on the web server module 70 as a GIF image file. The live presentation of the professor is displayed in element 101. The slides are presented in element 102. A chat session for students' questions is presented in element 103. Since the professor does not know in advance the length of time he needs to display the slides, or whether the order may change based on student questions, the professor defines a playlist with 16 events, one for each slide, without using the pause feature. The professor also uses a configurable control panel 80, where 16 of the keys are configured to represent each event. As the professor is giving his lecture, he presses the button on the control panel 80 representing the slide he wants displayed. While discussing the current slide, a student types in the chat session that he does not understand how the variable in the displayed equation was determined. Upon reading the question, the professor states that the variable in question was determined from the set of initial conditions. The professor presses the button on the control panel 80 to display slide six, on which the calculation of the variable in question is shown.
When the professor presses the button on the control panel 80, the control panel 80 transmits a control signal representing the button pressed to the producer module 62. Upon receiving that control signal, the producer module 62 generates a data event that represents event six, which, according to the playlist, displays the sixth slide. The producer module 62 transmits the data event to the data signal generation module 66 upon receipt of the control signal. The data signal generation module 66 embeds the data event into the streaming data signal 50 being streamed to the client node 18. Upon receipt of the streaming data signal 50 by the client node 18, the data signal processing module 74 extracts the embedded data event. In response to that data event, the data signal processing module 74 retrieves the data needed for the data event, in this case slide six, from either the storage buffer of the client node 18 or, if not there, from the web server module 70. The client node 18 displays the representation of the retrieved data, the sixth slide, on the display 30. At the same time the data signal processing module 74 is processing the data event and retrieving the associated data, the data signal processing module 74 is also decompressing the video signal and displaying the video in the display 30. The result is that, at substantially the same time the video is being displayed, a representation of the data associated with the data event is retrieved and displayed. By placing the data event in the streaming data signal 50 at the server node 14 where the operator wants it to occur, the data event, and thus the display of the representation of the associated data, is synchronized to the video signal.
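The client-side retrieval behavior in this example — check local storage first, fall back to a short-lived server connection — can be sketched as follows. All names are illustrative assumptions, and `fake_fetch` stands in for a connect-retrieve-disconnect round trip to the web server module:

```python
class EventProcessor:
    """Sketch of the client-side behavior described above: on each data
    event, look for the referenced data in local storage first and fall
    back to fetching it from the web server. `fetch` stands in for a
    short-lived HTTP request that is closed once the data arrives."""

    def __init__(self, fetch):
        self.cache = {}       # data pre-transmitted to the client
        self.fetch = fetch
        self.displayed = {}   # frame id -> currently displayed data

    def preload(self, name, data):
        # Data transmitted to the client before the event arrives.
        self.cache[name] = data

    def on_event(self, frame_id, name):
        data = self.cache.get(name)
        if data is None:
            data = self.fetch(name)   # connect, retrieve, disconnect
        self.displayed[frame_id] = data


fetched = []
def fake_fetch(name):
    fetched.append(name)
    return f"<{name}>"

client = EventProcessor(fake_fetch)
client.preload("logo.gif", "<logo.gif>")
client.on_event(0, "logo.gif")        # served from local storage
client.on_event(2, "slide_six.GIF")   # requires a server round trip
```

In the professor example, only the slide not already buffered triggers a connection, which keeps the display of each slide close in time to the video frames the event was embedded alongside.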
The data events inserted into the streaming data signal 50 include the data event frame identifier, the data event type and additional data. The frame identifier indicates to the data signal processing module 74 the frame of the client presentation page 26 in which the representation of the data should be displayed. The data event type indicates the type of data event, which indicates to the data signal processing module 74 what additional data parameters will follow the data event type. Data event types can include, for example, titles, text events, survey events, links, pictures and HTML files. The additional data are the parameters needed to display a particular data type. The parameters will vary with each data event type. For example, the data event type can be a picture type event. When creating a picture event for the playlist 90, the operator enters the file name 170 (FIG. 7) that holds the data of the image, the scale of the picture when displayed in the frame 174 (FIG. 7), and the position of the picture in the frame 178 (FIG. 7). In a picture data event, the associated data of the data event includes the parameters for the file name, the scale and the position. In the previous professor example, the frame id is two (i.e., element 102) and the data event type is a picture. The additional data includes the file name, slide_six.GIF; the scale, which specifies displaying the picture at 95% of the frame; and the position, which centers the slide both horizontally and vertically in the frame.
Some data event types (e.g., title, link) are self-contained (e.g., do not include a file name but instead include all the necessary data). For example, the data event type can be a title type event. When creating a title event for the playlist 90, the operator enters the text of the title 126 (FIG. 6), the font of the title 130 (FIG. 6), the colors of the text and the background of the title 134 (FIG. 6), and the position of the title in the frame 138 (FIG. 6). In a title data event, the associated data of the data event includes the parameters for the text, the font, the text and background colors and the position. For this data type no file is retrieved because the text to be displayed, along with the parameters on how to display it, are all included in the data event. In another embodiment, as hardware and/or software capacity of the multimedia presentation system 10 allows, the file name in the additional data for all of the data event types can be replaced with the file itself. In this embodiment, all the data needed to create the desired states of the other elements 34 of the client presentation page 26 is included as a data event in the streaming data signal 50. The client node 18 does not need to retrieve any data from the web server module 70.
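The distinction between file-referencing and self-contained data events can be sketched as follows. The dictionary layout and field names are illustrative assumptions, since the actual wire format depends on the codec used by the data signal generation module:

```python
def make_event(frame_id, event_type, **params):
    """Sketch of the data event structure described above: a frame
    identifier, an event type, and type-specific parameters."""
    return {"frame": frame_id, "type": event_type, "params": params}

# A picture event references a file the client must retrieve
# (the frame identifier and parameters follow the professor example).
picture = make_event(2, "picture",
                     file="slide_six.GIF", scale=95,
                     halign="center", valign="center")

# A title event is self-contained: no file retrieval is needed,
# because the text and its display parameters ride in the event itself.
title = make_event(0, "title",
                   text="Second Event Text", font="Arial",
                   color="black", background="white", position="top")
```

The event type thus tells the client-side processor which parameters to expect and whether a retrieval step is required at all.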
To assist an operator in creating a playlist, the producer module 62 generates various graphical user interfaces ("GUI"). FIG. 5 depicts a GUI to create an event. The operator enters a description of the event in the description box 110, so that during a presentation, the operator can quickly identify which event to select. The operator can optionally select a pause time by checking the define pause box 114 and entering a time in the time box 118. When the event is complete, the user clicks on the OK box 122 and the event is added to the playlist. Once an event is defined, the operator can define the desired state of each of the elements 34 for that event.
FIG. 6 depicts a GUI to create an element that is a text string. The operator enters the string in the text box 126. The operator can select font characteristics 130 of the text string, text and background colors 134, and the position 138 of the text with respect to the placement within an element 34. In one embodiment, the producer module 62 has a predetermined layout, as depicted in FIG. 4, of 5 elements 100, 101, 102, 103, and 104. An operator using this GUI can select one of the four elements 100, 102, 103, 104 in which to display the text string by checking the location in the where box 142. Only four of the five elements can be chosen because one element 101 is reserved for the video synchronizing component. Finally, if the operator wants to add a hyperlink to the text string, the operator can check the link box 146 and enter a URL of the link in the address box 150. If a user, while viewing a presentation, selects the hyperlink, the browser of the client node 18 opens a separate window with the client presentation page of the hyperlink. When the text string is complete, the operator clicks on the OK box 122 and the text string is added to the playlist under the selected event.
Similarly, the producer module 62 generates GUIs to create elements containing an image (e.g., FIG. 7), HTML code (e.g., FIG. 8) or a survey (e.g., FIG. 9). The producer module 62 uses the data entered through the GUI to generate an associated data event. The operator has entered, through the GUI, the desired frame, the file or text string to be used and any additional data to aid in the display of the file.
Creating a survey requires slightly more data. In creating a survey, the producer module 62 creates a GUI (e.g., FIG. 9a) to allow entry of the questions and a GUI (e.g., FIG. 9b) to allow entry of the choice of answers. The producer module 62 also generates a view (e.g., FIG. 9c) of the survey to show the operator how the survey is presented to a user on the client node 18. The survey has two events associated with it. One presents the survey to the user. The other displays the results of the survey. FIG. 9d depicts a view of the results of a survey presented to users after those users were presented with and answered the questions of the survey.
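The two-event survey flow can be sketched as follows. This is an illustrative model only; the class and field names are assumptions, and a real system would collect answers from many client nodes over the network:

```python
class Survey:
    """Sketch of the two-event survey flow: one event presents the
    question and choices to viewers; a later event presents the tally."""

    def __init__(self, question, choices):
        self.question = question
        self.counts = {choice: 0 for choice in choices}

    def present_event(self):
        # First event: show the survey to the users.
        return {"type": "survey", "question": self.question,
                "choices": list(self.counts)}

    def record(self, choice):
        # An answer received from a user at a client node.
        self.counts[choice] += 1

    def results_event(self):
        # Second event: show the tallied results to the users.
        return {"type": "survey_results", "counts": dict(self.counts)}


survey = Survey("Would you buy the super widget?", ["yes", "no"])
survey.record("yes")
survey.record("yes")
survey.record("no")
results = survey.results_event()
```

Because the results are delivered as an ordinary data event, they are synchronized to the video presentation just like any other element state.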
The layout of the client presentation page 26 (i.e., the placement of the elements 34, including the video component 38) is controlled using a frameset file written in HTML. Even though in one embodiment the producer module 62 has a predetermined layout (e.g., client presentation page 94, FIG. 4), the producer module 62 provides a GUI to allow the operator to customize the layout. FIG. 10 depicts an example of a GUI that includes a show data tab 160. When the operator selects this tab 160, the operator has access to a custom layout button 164. When the operator selects this button 164, the producer module 62 generates a custom options box 168 for further information. The operator enters a template of his/her custom layout. If the custom layout template references any other files, they must also be entered. The list of additional files is needed so that the producer module 62 can verify that all of the files needed to transmit the synchronized client presentation page 26 to the client node 18 are located on the web server module 70. The custom layout template is a frameset file written in HTML.
Once the operator has designed a frameset, the operator can insert show elements, such as titles, pictures, links and surveys, in the same way as for a predetermined layout. The operator can simply right click in the playlist at the point where a data event is to be inserted and define the data event using one of the GUIs described above (e.g., FIG. 6, FIG. 7, FIG. 8, FIG. 9). A difference is that instead of selecting the frame in which to place the data event by clicking in a diagram of the frames in the where box 142, FIG. 6, a dropdown box labeled "Where" will list all frames by the name given in the frameset file template. The operator selects the frame by highlighting and clicking the name of the frame in the dropdown list.
For another illustrative example of a presentation, the synchronizing component can be a video component 38 that is a live broadcast of a sales agent webcasting a sales pitch for the latest product, the super widget, over the Internet. Element a 34a contains the registered trademark of the company. Element c 34c contains the name of the sales agent and all of his/her contact information (e.g., phone number, fax number and e-mail address). These two elements are displayed at the beginning of the live broadcast and do not change during the sales pitch. Element n 34n contains a chat session that displays user input. The users use the chat session to ask the sales agent questions during the sales pitch, so the sales agent can tailor the live broadcast to the needs of the users.
Element b 34b displays many different items to aid the sales agent. During some points of the sales pitch, the element 34b displays Microsoft® PowerPoint® slides (PowerPoint is manufactured by Microsoft Corporation of Redmond, WA) portraying features of the super widget. At other points, the element 34b displays still photographs of the super widget in different colors. Still at other points, the element 34b contains video of the super widget being used. The invention allows the sales agent to use any of these aids (e.g., PowerPoint® slides, still photographs, video) at any point during the sales pitch. To be most effective, and since the invention allows real-time synchronization, the sales agent uses the chat session as a guide and displays the aids in response to the reactions of the users. The sales agent (or the operator) synchronizes the display of the aids with his/her live sales pitch as he/she is talking. Element d 34d displays an image of a bar graph with an entry for yes and an entry for no. During the sales pitch, the sales agent surveys the users with yes or no questions and, as the results are received from the users, they are displayed on the graph. In other sales pitches, the sales agent has used multiple choice questions with four or five answers to choose from. Again, the displaying of the results is synchronized with the video presentation of the sales agent so he/she can discuss the results as they are presented to the users.
Equivalents
The invention may be embodied in other specific forms without departing from the spirit or essential characteristics thereof. The foregoing embodiments are therefore to be considered in all respects illustrative rather than limiting on the invention described herein. Scope of the invention is thus indicated by the appended claims rather than by the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein.

Claims

What is claimed is: 1. A method for synchronizing content of a client presentation page to a streaming data signal, the streaming data signal comprising at least one of a video component and an audio component, the method comprising: embedding one or more data events into the streaming data signal substantially simultaneously with the generation of the data event; transmitting to a client the streaming data signal; retrieving on a client data in response to each data event in the streaming data signal; providing in the client presentation page the at least one of a video component and an audio component portion of the data signal as the data signal is processed; and substantially simultaneously providing with the at least one of a video component and an audio component portion in the client presentation page a representation of the data in response to each data event.
2. The method of claim 1 further comprising generating the one or more data events in response to user input.
3. The method of claim 1 further comprising receiving at least one of the one or more data events from an external source.
4. The method of claim 1 further comprising the step of processing the streaming data signal on the client.
5. The method of claim 1 wherein the streaming data signal comprises an encoded digital signal.
6. The method of claim 1 wherein the at least one of a video component and an audio component portion of the digital signal is associated with a live production.
7. The method of claim 1 wherein the step of retrieving further comprises: requesting from a server the data in response to each data event in the data signal; and receiving the requested data at the client.
8. The method of claim 1 wherein the step of retrieving further comprises: receiving at the client, prior to receiving the data event, data that will be used for each data event in the data signal; storing at the client the received data for subsequent use; and retrieving from client storage data in response to each data event in the data signal being processed on the client.
9. The method of claim 1 further comprising the step of storing the streaming data signal.
10. The method of claim 1 further comprising the step of changing by a power user a source from which the at least one of a video component and an audio component portion of the digital signal is received.
11. A method for synchronizing content of a client presentation page to a streaming data signal, the method comprising: providing a digital signal comprising at least one of a video component and an audio component; generating one or more data events in response to user input; embedding each data event into the digital signal substantially simultaneously with the generation of the data event to create a data signal; streaming to a client the data signal; processing the streaming data signal on the client; retrieving data in response to each data event in the data signal being processed on the client; providing in the client presentation page the at least one of a video component and an audio component portion of the data signal as the data signal is processed; and substantially simultaneously providing with the at least one of a video component and an audio component portion in the client presentation page a representation of the data in response to each data event.
12. The method of claim 11 further comprising receiving at least one data event from an external source.
13. The method of claim 11 wherein the digital signal comprises an encoded digital signal.
14. The method of claim 11 wherein the at least one of a video component and an audio component portion of the digital signal is associated with a live production.
15. The method of claim 11 wherein the step of retrieving further comprises: requesting from a server the data in response to each data event in the data signal; and receiving the requested data at the client.
16. The method of claim 11 wherein the step of retrieving further comprises: receiving at the client, prior to receiving the data event, data that will be used for each data event in the data signal; storing at the client the received data for subsequent use; and retrieving from client storage data in response to each data event in the data signal being processed on the client.
17. The method of claim 11 further comprising the step of storing the streaming data signal.
18. The method of claim 11 further comprising the step of changing by a power user a source from which the at least one of a video component and an audio component portion of the digital
signal is received.
19. A method for generating a data signal for synchronizing content of a client presentation page to a streaming data signal, the method comprising: providing a digital signal comprising at least one of a video component and an audio component; generating one or more data events in response to user input; embedding each data event into the digital signal substantially simultaneously with the generation of the data event to create a data signal; streaming to a client the data signal for providing in the client presentation page the at least one of a video component and an audio component portion of the data signal as the data signal is processed; and transmitting to a client data corresponding to each data event for substantially simultaneously providing with the at least one of a video component and an audio component portion in the client presentation page a representation of the data in response to each data event.
20. A method for presenting to a user a client presentation page synchronized to a streaming data signal, the method comprising: receiving from a server a streaming data signal, the data signal comprising at least one embedded data event and at least one of a video component and an audio component; processing the data signal on the client; retrieving data in response to each data event in the data signal being processed on the client;
providing in the client presentation page the at least one of a video component and an audio component portion of the data signal as the data signal is processed; and substantially simultaneously providing with the at least one of a video component and an audio component portion in the client presentation page a representation of the data in response to each data event.
21. A system for synchronizing content of a client presentation page to a streaming data signal, the system comprising: a producer module for generating one or more data events in response to a user input; a data signal generation module in communication with the producer module for receiving each data event and in communication with a data source for receiving a source signal comprising at least one of a video component and an audio component, the data signal generation module embedding the data event into a digital representation of the source signal substantially simultaneously with generation of the data event to create a digital data signal and streaming the data signal to a client; a web server module in communication with the producer module, the web server module transmitting to the client data corresponding to each embedded data event; a client in communication with the web server module for receiving corresponding data and in communication with the data signal generation module for receiving the data signal, the client processing the data signal, retrieving the corresponding data in response to each data event in the processed data signal, and providing in the client presentation page the at least one of a video component and an audio component portion of the data signal as the data signal is processed and a representation of the corresponding data in response to each data event, thereby substantially simultaneously providing with the at least one of a video component and an audio component portion in the client presentation page the representation.
22. The system of claim 21 wherein the digital data signal comprises an encoded digital signal.
23. The system of claim 21 wherein the at least one of a video component and an audio component portion of the data signal is associated with a live production.
24. The system of claim 21 wherein the client is further configured for requesting from the web server module corresponding data in response to each data event in the processed data signal.
25. The system of claim 21 wherein the client further comprises a memory buffer for storing the received data.
26. The system of claim 21 wherein the data signal generation module is in communication with an external source for receiving a data event.
27. The system of claim 21 further comprising an archive module for storing the streaming data signal.
28. The system of claim 21 wherein the producer module is further configured to allow a power user to change receipt of the source signal from the data source to a second data source from which a second source signal is received.
29. The system of claim 21 wherein the data source is a video switcher.
30. A server node for generating a data signal for synchronizing content of a client presentation page to a streaming data signal, the server node comprising:
a producer module for generating one or more data events in response to a user input;
a data signal generation module in communication with the producer module for receiving each data event and in communication with a data source for receiving a source signal comprising at least one of a video component and an audio component, the data signal generation module embedding the data event into a digital representation of the source signal substantially simultaneously with generation of the data event to create a digital data signal and streaming the data signal to a client for display of the at least one of a video component and an audio component portion of the digital data signal in the client presentation page as the digital data signal is processed by the client; and
a web server module in communication with the producer module, the web server module transmitting to the client data corresponding to each embedded data event for display of a representation of the data in the client presentation page in response to each data event, so that the representation is substantially simultaneously displayed with the at least one of a video component and an audio component portion.
31. A client node presenting to a user a client presentation page synchronized to a streaming data signal, the client node comprising:
a client in communication with a web server module for receiving corresponding data and in communication with a data signal generation module for receiving a digital data signal comprising at least one embedded data event and at least one of a video component and an audio component, the client processing the digital data signal, retrieving the corresponding data in response to each data event in the processed digital data signal, and displaying in the client presentation page the at least one of a video component and an audio component portion of the digital data signal as the digital data signal is processed and a representation of the corresponding data in response to each data event, thereby displaying the representation substantially simultaneously with the at least one of a video component and an audio component portion in the client presentation page.
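Claims 21, 30 and 31 together describe one mechanism: a producer generates data events, the signal generator embeds each event in-band into the digital stream as it is created, a web server holds the page content corresponding to each event, and the client renders the stream while resolving each embedded event into retrieved content, so the two appear substantially simultaneously. The following is a minimal illustrative sketch of that flow, not the patented implementation; every class and method name (ProducerModule, DataSignalGenerator, WebServerModule, Client, embed_event, fetch) is hypothetical, and a Python list stands in for the encoded audio/video stream.

```python
class ProducerModule:
    """Generates data events in response to a user input (e.g. 'show next slide')."""
    def __init__(self):
        self._next_id = 0

    def generate_event(self, content_key):
        self._next_id += 1
        return {"event_id": self._next_id, "key": content_key}


class DataSignalGenerator:
    """Embeds each data event into the digital data signal as it is produced."""
    def __init__(self):
        self.stream = []  # stands in for the encoded A/V data signal

    def append_media(self, frame):
        self.stream.append({"type": "media", "frame": frame})

    def embed_event(self, event):
        # Embedded substantially simultaneously with generation, so the
        # event travels in-band with the surrounding media frames.
        self.stream.append({"type": "event", "event": event})


class WebServerModule:
    """Serves the data corresponding to each embedded event."""
    def __init__(self, content):
        self._content = content  # key -> presentation-page fragment

    def fetch(self, key):
        return self._content[key]


class Client:
    """Processes the signal; on each in-band event, retrieves and presents content."""
    def __init__(self, web_server):
        self._web = web_server
        self.page = []  # rendered client presentation page

    def process(self, stream):
        for item in stream:
            if item["type"] == "media":
                self.page.append(("media", item["frame"]))
            else:
                # Retrieve the corresponding data in response to the event,
                # so it is shown alongside the media being processed.
                data = self._web.fetch(item["event"]["key"])
                self.page.append(("content", data))


# Wire the modules together for one short "broadcast".
producer = ProducerModule()
signal = DataSignalGenerator()
web = WebServerModule({"slide1": "<img src='slide1.png'>"})

signal.append_media("frame-0")
signal.embed_event(producer.generate_event("slide1"))  # producer's user input
signal.append_media("frame-1")

client = Client(web)
client.process(signal.stream)
print(client.page)
```

Because the event marker sits between frame-0 and frame-1 inside the stream itself, the retrieved fragment lands in the page at exactly that point in playback, which is the synchronization property the claims recite; a buffered or archived stream (claims 25 and 27) replays with the same alignment.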
PCT/US2000/024642 1999-09-09 2000-09-08 Client presentation page content synchronized to a streaming data signal WO2001019088A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU73582/00A AU7358200A (en) 1999-09-09 2000-09-08 Client presentation page content synchronized to a streaming data signal

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US15313299P 1999-09-09 1999-09-09
US60/153,132 1999-09-09

Publications (1)

Publication Number Publication Date
WO2001019088A1 true WO2001019088A1 (en) 2001-03-15

Family

ID=22545912

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2000/024642 WO2001019088A1 (en) 1999-09-09 2000-09-08 Client presentation page content synchronized to a streaming data signal

Country Status (2)

Country Link
AU (1) AU7358200A (en)
WO (1) WO2001019088A1 (en)



Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5162904A (en) * 1991-03-28 1992-11-10 Abekas Video Systems, Inc. Video processing system having improved internal switching capability
US5818441A (en) * 1995-06-15 1998-10-06 Intel Corporation System and method for simulating two-way connectivity for one way data streams
WO1997022201A2 (en) * 1995-12-12 1997-06-19 The Board Of Trustees Of The University Of Illinois Method and system for transmitting real-time video
WO1998043437A1 (en) * 1997-03-21 1998-10-01 Canal+ Societe Anonyme Method of and apparatus for transmitting data for interactive tv applications
WO1998043432A1 (en) * 1997-03-21 1998-10-01 Canal+ Societe Anonyme Transmission and reception of television programmes and other data

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
FURHT B ET AL: "IP SIMULCAST: A NEW TECHNIQUE FOR MULTIMEDIA BROADCASTING OVER THE INTERNET", CIT. JOURNAL OF COMPUTING AND INFORMATION TECHNOLOGY,HR,ZAGREB, vol. 6, no. 3, September 1998 (1998-09-01), pages 245 - 254, XP000870379, ISSN: 1330-1136 *

Cited By (57)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7039643B2 (en) * 2001-04-10 2006-05-02 Adobe Systems Incorporated System, method and apparatus for converting and integrating media files
WO2002084638A1 (en) * 2001-04-10 2002-10-24 Presedia, Inc. System, method and apparatus for converting and integrating media files
US9100132B2 (en) 2002-07-26 2015-08-04 The Nielsen Company (Us), Llc Systems and methods for gathering audience measurement data
US8959016B2 (en) 2002-09-27 2015-02-17 The Nielsen Company (Us), Llc Activating functions in processing devices using start codes embedded in audio
US9711153B2 (en) 2002-09-27 2017-07-18 The Nielsen Company (Us), Llc Activating functions in processing devices using encoded audio and detecting audio signatures
US9900652B2 (en) 2002-12-27 2018-02-20 The Nielsen Company (Us), Llc Methods and apparatus for transcoding metadata
US9609034B2 (en) 2002-12-27 2017-03-28 The Nielsen Company (Us), Llc Methods and apparatus for transcoding metadata
JP2006081159A (en) * 2004-07-29 2006-03-23 Microsoft Corp Strategy for transmitting in-band control information
EP1622386A3 (en) * 2004-07-29 2007-07-18 Microsoft Corporation Strategies for transmitting in-band control information
US7685616B2 (en) 2004-07-29 2010-03-23 Microsoft Corporation Strategies for coalescing control processing
US8266311B2 (en) 2004-07-29 2012-09-11 Microsoft Corporation Strategies for transmitting in-band control information
US9209917B2 (en) 2005-09-26 2015-12-08 The Nielsen Company (Us), Llc Methods and apparatus for metering computer-based media presentation
US8799356B2 (en) 2007-12-21 2014-08-05 Streamezzo Method for synchronizing a Rich Media action with an audiovisual change, corresponding device and computer software, method for generating a Rich Media presentation and corresponding computer software
WO2009083459A2 (en) * 2007-12-21 2009-07-09 Streamezzo Method for synchronising a rich media® action with an audiovisual change, corresponding device and computer software, method for generating a rich media® presentation, and corresponding computer software
FR2925800A1 (en) * 2007-12-21 2009-06-26 Streamezzo Sa METHOD FOR SYNCHRONIZING RICH MEDIA ACTION WITH AUDIOVISUAL CHANGE, CORRESPONDING COMPUTER DEVICE AND COMPUTER PROGRAM, METHOD OF CREATING RICH MEDIA PRESENTATION, AND CORRESPONDING COMPUTER PROGRAM
WO2009083459A3 (en) * 2007-12-21 2009-10-29 Streamezzo Method for synchronising a rich media® action with an audiovisual change, corresponding device and computer software, method for generating a rich media® presentation, and corresponding computer software
CN101953135A * 2007-12-21 2011-01-19 Streamezzo Method for synchronising a Rich Media® action with an audiovisual change, corresponding device and computer software, method for generating a Rich Media® presentation, and corresponding computer software
US11386908B2 (en) 2008-10-24 2022-07-12 The Nielsen Company (Us), Llc Methods and apparatus to perform audio watermarking and watermark detection and extraction
US11256740B2 (en) 2008-10-24 2022-02-22 The Nielsen Company (Us), Llc Methods and apparatus to perform audio watermarking and watermark detection and extraction
US10467286B2 (en) 2008-10-24 2019-11-05 The Nielsen Company (Us), Llc Methods and apparatus to perform audio watermarking and watermark detection and extraction
US10134408B2 (en) 2008-10-24 2018-11-20 The Nielsen Company (Us), Llc Methods and apparatus to perform audio watermarking and watermark detection and extraction
US11809489B2 (en) 2008-10-24 2023-11-07 The Nielsen Company (Us), Llc Methods and apparatus to perform audio watermarking and watermark detection and extraction
US9667365B2 (en) 2008-10-24 2017-05-30 The Nielsen Company (Us), Llc Methods and apparatus to perform audio watermarking and watermark detection and extraction
US11778268B2 (en) 2008-10-31 2023-10-03 The Nielsen Company (Us), Llc Methods and apparatus to verify presentation of media content
US11070874B2 (en) 2008-10-31 2021-07-20 The Nielsen Company (Us), Llc Methods and apparatus to verify presentation of media content
US10469901B2 (en) 2008-10-31 2019-11-05 The Nielsen Company (Us), Llc Methods and apparatus to verify presentation of media content
US9124769B2 (en) 2008-10-31 2015-09-01 The Nielsen Company (Us), Llc Methods and apparatus to verify presentation of media content
US10555048B2 (en) 2009-05-01 2020-02-04 The Nielsen Company (Us), Llc Methods, apparatus and articles of manufacture to provide secondary content in association with primary broadcast media content
US11004456B2 (en) 2009-05-01 2021-05-11 The Nielsen Company (Us), Llc Methods, apparatus and articles of manufacture to provide secondary content in association with primary broadcast media content
US10003846B2 (en) 2009-05-01 2018-06-19 The Nielsen Company (Us), Llc Methods, apparatus and articles of manufacture to provide secondary content in association with primary broadcast media content
US11948588B2 (en) 2009-05-01 2024-04-02 The Nielsen Company (Us), Llc Methods, apparatus and articles of manufacture to provide secondary content in association with primary broadcast media content
US9681204B2 (en) 2011-04-12 2017-06-13 The Nielsen Company (Us), Llc Methods and apparatus to validate a tag for media
US9380356B2 (en) 2011-04-12 2016-06-28 The Nielsen Company (Us), Llc Methods and apparatus to generate a tag for media content
US10791042B2 (en) 2011-06-21 2020-09-29 The Nielsen Company (Us), Llc Monitoring streaming media content
US9515904B2 (en) 2011-06-21 2016-12-06 The Nielsen Company (Us), Llc Monitoring streaming media content
US11252062B2 (en) 2011-06-21 2022-02-15 The Nielsen Company (Us), Llc Monitoring streaming media content
US9210208B2 (en) 2011-06-21 2015-12-08 The Nielsen Company (Us), Llc Monitoring streaming media content
US11784898B2 (en) 2011-06-21 2023-10-10 The Nielsen Company (Us), Llc Monitoring streaming media content
US11296962B2 (en) 2011-06-21 2022-04-05 The Nielsen Company (Us), Llc Monitoring streaming media content
US9838281B2 (en) 2011-06-21 2017-12-05 The Nielsen Company (Us), Llc Monitoring streaming media content
US9209978B2 (en) 2012-05-15 2015-12-08 The Nielsen Company (Us), Llc Methods and apparatus to measure exposure to streaming media
US9197421B2 (en) 2012-05-15 2015-11-24 The Nielsen Company (Us), Llc Methods and apparatus to measure exposure to streaming media
US9282366B2 (en) 2012-08-13 2016-03-08 The Nielsen Company (Us), Llc Methods and apparatus to communicate audience measurement information
US9313544B2 (en) 2013-02-14 2016-04-12 The Nielsen Company (Us), Llc Methods and apparatus to measure exposure to streaming media
US9357261B2 (en) 2013-02-14 2016-05-31 The Nielsen Company (Us), Llc Methods and apparatus to measure exposure to streaming media
US9711152B2 (en) 2013-07-31 2017-07-18 The Nielsen Company (Us), Llc Systems apparatus and methods for encoding/decoding persistent universal media codes to encoded audio
US9336784B2 (en) 2013-07-31 2016-05-10 The Nielsen Company (Us), Llc Apparatus, system and method for merging code layers for audio encoding and decoding and error correction thereof
US9699499B2 (en) 2014-04-30 2017-07-04 The Nielsen Company (Us), Llc Methods and apparatus to measure exposure to streaming media
US11277662B2 (en) 2014-04-30 2022-03-15 The Nielsen Company (Us), Llc Methods and apparatus to measure exposure to streaming media
US10721524B2 (en) 2014-04-30 2020-07-21 The Nielsen Company (Us), Llc Methods and apparatus to measure exposure to streaming media
US11831950B2 (en) 2014-04-30 2023-11-28 The Nielsen Company (Us), Llc Methods and apparatus to measure exposure to streaming media
US10231013B2 (en) 2014-04-30 2019-03-12 The Nielsen Company (Us), Llc Methods and apparatus to measure exposure to streaming media
US11057680B2 (en) 2015-05-29 2021-07-06 The Nielsen Company (Us), Llc Methods and apparatus to measure exposure to streaming media
US11689769B2 (en) 2015-05-29 2023-06-27 The Nielsen Company (Us), Llc Methods and apparatus to measure exposure to streaming media
US10694254B2 (en) 2015-05-29 2020-06-23 The Nielsen Company (Us), Llc Methods and apparatus to measure exposure to streaming media
US9762965B2 (en) 2015-05-29 2017-09-12 The Nielsen Company (Us), Llc Methods and apparatus to measure exposure to streaming media
US10299002B2 (en) 2015-05-29 2019-05-21 The Nielsen Company (Us), Llc Methods and apparatus to measure exposure to streaming media

Also Published As

Publication number Publication date
AU7358200A (en) 2001-04-10

Similar Documents

Publication Publication Date Title
WO2001019088A1 (en) Client presentation page content synchronized to a streaming data signal
US9584571B2 (en) System and method for capturing, editing, searching, and delivering multi-media content with local and global time
US6173317B1 (en) Streaming and displaying a video stream with synchronized annotations over a computer network
US6449653B2 (en) Interleaved multiple multimedia stream for synchronized transmission over a computer network
US7617272B2 (en) Systems and methods for enhancing streaming media
US6697569B1 (en) Automated conversion of a visual presentation into digital data format
US6230172B1 (en) Production of a video stream with synchronized annotations over a computer network
AU2014350067B2 (en) A video broadcast system and a method of disseminating video content
US20070124788A1 (en) Appliance and method for client-sided synchronization of audio/video content and external data
US20050154679A1 (en) System for inserting interactive media within a presentation
US6249914B1 (en) Simulating two way connectivity for one way data streams for multiple parties including the use of proxy
US20020120939A1 (en) Webcasting system and method
US10757365B2 (en) System and method for providing and interacting with coordinated presentations
EP0984584A1 (en) Internet multimedia broadcast system
US20110072466A1 (en) Browsing and Retrieval of Full Broadcast-Quality Video
US20050144305A1 (en) Systems and methods for identifying, segmenting, collecting, annotating, and publishing multimedia materials
WO2005048035A2 (en) Rich media event production system and method including the capturing, indexing, and synchronizing of rgb-based graphic content
England et al. Rave: Real-time services for the web
EP1971145A2 (en) Method and apparatus for displaying interactive data in real time
US20020019978A1 (en) Video enhanced electronic commerce systems and methods
US20020154244A1 (en) Moving images synchronization system
US20080276289A1 (en) System for video presentations with adjustable display elements
JP4565232B2 (en) Lecture video creation system
JP2004040502A (en) Information-reproducing apparatus, information-reproducing method, and information reproducing system
KR100359514B1 (en) System and method for internet data broadcast and media storing program source thereof

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CR CU CZ DE DK DM DZ EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

122 Ep: pct application non-entry in european phase
NENP Non-entry into the national phase

Ref country code: JP