WO2006045061A2 - Method for synchronizing events with streaming data - Google Patents

Method for synchronizing events with streaming data

Info

Publication number
WO2006045061A2
Authority
WO
WIPO (PCT)
Prior art keywords
stream
client
data
metadata
received
Prior art date
Application number
PCT/US2005/037951
Other languages
French (fr)
Other versions
WO2006045061A3 (en)
Inventor
Samuel Sergio Tenembaum
Ivan A. Ivanoff
Jorge A. Estevez
Original Assignee
Porto Ranelli, Sa
Pi Trust
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Porto Ranelli, SA and PI Trust
Publication of WO2006045061A2 publication Critical patent/WO2006045061A2/en
Publication of WO2006045061A3 publication Critical patent/WO2006045061A3/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00 Commerce
    • G06Q30/02 Marketing; Price estimation or determination; Fundraising
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/4302 Content synchronisation processes, e.g. decoder synchronisation
    • H04N21/4307 Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen
    • H04N21/43074 Synchronising the rendering of multiple content streams or additional data on devices, of additional data with content streams on the same device, e.g. of EPG data or interactive icon with a TV program
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/40 Information retrieval; Database structures therefor; File system structures therefor of multimedia data, e.g. slideshows comprising image and additional audio data
    • G06F16/43 Querying
    • G06F16/438 Presentation of query results
    • G06F16/4387 Presentation of query results by the use of playlists
    • G06F16/4393 Multimedia presentations, e.g. slide shows, multimedia albums
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00 Administration; Management
    • G06Q10/10 Office automation; Time management
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23 Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/235 Processing of additional data, e.g. scrambling of additional data or processing content descriptors
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/435 Processing of additional data, e.g. decrypting of additional data, reconstructing software from modules extracted from the transport stream
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/45 Management operations performed by the client for facilitating the reception of or the interaction with the content or administrating data related to the end-user or to the client device itself, e.g. learning user preferences for recommending movies, resolving scheduling conflicts
    • H04N21/462 Content or additional data management, e.g. creating a master electronic program guide from data received from the Internet and a Head-end, controlling the complexity of a video stream by scaling the resolution or bit-rate based on the client capabilities
    • H04N21/4622 Retrieving content or additional data from different sources, e.g. from a broadcast channel and the Internet
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80 Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/83 Generation or processing of protective or descriptive data associated with content; Content structuring
    • H04N21/84 Generation or processing of descriptive data, e.g. content descriptors
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80 Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/83 Generation or processing of protective or descriptive data associated with content; Content structuring
    • H04N21/845 Structuring of content, e.g. decomposing content into time segments
    • H04N21/8455 Structuring of content, e.g. decomposing content into time segments involving pointers to the content, e.g. pointers to the I-frames of the video stream
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80 Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/85 Assembly of content; Generation of multimedia applications
    • H04N21/858 Linking data to content, e.g. by linking an URL to a video object, by creating a hotspot


Abstract

The invention solves the problem of synchronizing events in a computer (A) with data being streamed into the same computer from a server (B) in real time. It allows the coordination, organization and presentation of information, graphics, video, e-commerce and any other computer event with data entering the computer as a streamed file (1), as it arrives (2). The commercial possibilities are varied, ranging from products and services being offered in accordance with what is being viewed by the user (C) to the enhancement of the multimedia experience by other means (C). The invention allows for the synchronization of various timelines (3, 4, 5, 6, 7).

Description

METHOD FOR SYNCHRONIZING EVENTS WITH STREAMING DATA
Field of the invention
The present invention relates generally to a method for synchronizing events on a computer with a stream of data being received by such computer from another computer through a network.
The resulting synchronization of streamed data and pre-programmed events can be used to trigger actions on the local computer based on the amount of data being received.
The most obvious use for the current invention is the interaction of streamed media (audio and video) on a webpage (or other internet enabled application) with client-based procedures. The value of this will be particularly appreciated by anyone familiar with Internet advertising or interactive narrative.
The invention enables the presentation of offers and interactivity in streamed content, matching it to specific points within the content.
Background of the invention
The transmission of files from one computer to another via a network has become commonplace in modern life. Anyone who uses a computer has, knowingly or unknowingly, requested a file from another computer or server. The most common method uses hyperlinks on a webpage.
By clicking on a hyperlink ("link") with a mouse, users effectively request the transmission of one or more data files to their local computers. Until a few years ago, before streaming of files was available, the client computer could not start accessing data in the downloaded file until the entire file had been transferred. This made large files, such as video or audio, impractical for most users, unless they were willing to wait 10 minutes or more after clicking on something on a screen before seeing the results.
Streaming files changed that by organizing data into a format that can be interpreted by the receiving machine as it arrives, in real time. Among other things, this permits video signals to be broadcast via a network, with the client machine rendering the media in real time as the data arrives, without having to wait for the entire file to arrive. Many streaming formats have been developed by various consortia and private entities, the most famous being part of the QuickTime (Apple), RealMedia (Real Networks) and Windows Media (Microsoft) multimedia platforms.
The advent of streaming formats solved the problem of accessing large linear files via slower connections, making the distribution of video and music through the Internet a viable enterprise. Nevertheless, the nature of streaming files prevents the client computer from verifying the integrity of the data received, since it has to process each packet and move on to the next incoming one. If data is lost during transmission, it is lost, and any synchronicity between elements is lost with it. With streaming files, information can be expected to be lost.
Summary of the Invention
The present invention solves this problem by using two parallel connections: the data stream, and an additional connection that relays metadata about the stream, such as how much has been transmitted by the streaming server.
In essence, the current invention functions by calculating the sync points between the stream and the programmed events based on the data sent by the server, not the data received by the client. The invention utilizes two independent timelines:
the data stream; and an event-sync connection.
The data stream, which is decoded and rendered as it arrives. This "media timeline" is completely linear: information is displayed as data arrives; data is used to generate the media (audio and video, for example).
A separate event-sync connection is established for sync purposes. The information coming from this alternate connection is used on the client side to skip along the "events timeline". This second timeline is independent from the stream and non-linear, meaning that the system can access any event at any time. In other words, in addition to the stream of data, a parallel connection to the streaming server is used to report metadata (in the case of video: the amount of time, the frames per second, etc.). This metadata is used by the client-side application to trigger events based on what the server indicates it has streamed, not based on what the client has received in the stream. In essence, a client side component performs two parallel tasks: it receives the media stream and renders it, while at the same time triggering events based on metadata received from the streaming server via a separate link.
The currently preferred embodiment uses Macromedia Flash for the client side component and Macromedia Flash Server for the server side.
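For illustration, a minimal ActionScript sketch of this two-connection arrangement follows. It is not the patented component itself: the connection URL, the getStreamProgress server method and the triggerEventsUpTo dispatcher are hypothetical names, and vid_mc stands for a Video instance on the stage.

  // Connection 1: the media stream, decoded and rendered as it arrives.
  var mediaConn:NetConnection = new NetConnection();
  mediaConn.onStatus = function (info:Object):Void {
      if (info.code == "NetConnection.Connect.Success") {
          var stream:NetStream = new NetStream(mediaConn);
          vid_mc.attachVideo(stream); // the linear "media timeline"
          stream.play("movie");       // illustrative stream name
      }
  };
  mediaConn.connect("rtmp://example.com/app"); // hypothetical URL

  // Connection 2: the event-sync link, polled for server-side metadata.
  var syncConn:NetConnection = new NetConnection();
  syncConn.connect("rtmp://example.com/app");

  var poller:Number = setInterval(function ():Void {
      var responder:Object = new Object();
      responder.onResult = function (secondsSent:Number):Void {
          // Drive the events timeline from what the server says it has
          // sent, not from what the client has received.
          triggerEventsUpTo(secondsSent); // hypothetical dispatcher
      };
      syncConn.call("getStreamProgress", responder); // hypothetical server method
  }, 500);

The essential design choice is visible in the poller: the events timeline advances on the server's say-so, so data dropped on the way to the client cannot pull the events out of step.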
A simple way to understand the invention is:
-at the exact moment when the stream starts arriving,
-the client computer sets a mark in time, effectively starting a stopwatch,
-with that point in time as a starting line, the client component uses the other connection to figure out how much data the server has pushed, and
-this data is used to time the events.
Thus, for example, if the stream starts arriving at T1, and the metadata indicates that 3 seconds of video have been delivered, then the client computer can accurately trigger events scheduled for T1 + 3 seconds.
This solves any inaccuracies that may be caused by dropped data, since the events are coordinated with the metadata and not the stream, and the metadata can be checked for errors.
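A sketch of that stopwatch arithmetic follows; the function names are illustrative, not taken from the patent.

  var t1:Number; // wall-clock mark set when the stream starts arriving

  function onStreamStart():Void {
      t1 = getTimer(); // start the "stopwatch", in milliseconds
  }

  // Called with metadata from the parallel connection, e.g. "3 seconds of
  // video delivered". Events scheduled for T1 + 3 seconds can now fire
  // even if part of those 3 seconds never reached the client.
  function onServerProgress(deliveredSeconds:Number):Void {
      fireEventsScheduledAt(t1 + deliveredSeconds * 1000); // hypothetical dispatcher
  }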
Brief description of the drawings
The foregoing brief description, as well as further objects, features, and advantages of the present invention will be understood more completely from the following detailed description of a presently preferred, but nonetheless illustrative, embodiment with reference being had to the accompanying drawings, in which the only figure is a block diagram describing the exchange of data between the computers involved, where A is the computer client running a web browser, B is a web server and C is a Flash Content Server.
Detailed Description of the Preferred Embodiment
The present application describes the currently preferred embodiment of the invention; it is presented for illustration purposes and is in no way the only manner in which to achieve the described results.
In order to further understand the invention, it is important to understand the concept of the timeline. Time is perceived by humans to move in a linear, sequential manner. T0 comes before T1, which comes before T2, which comes before T3, etc. Video is presented to observers in the same way: the first frame precedes the second frame, which in turn precedes the third frame, etc. The sequence of frames, presented to a user in order, is defined as a timeline: it is basically the linear arrangement of frames in order to represent passing time.
Table A shows the way in which REAL TIME and the VIDEO TIMELINE relate. Table A shows a perfect match between elapsed time and presented frames: for every passing time unit, the video timeline renders a unique and matching video frame. Since real time and the video timeline match, it would be possible to synchronize any event to the video timeline by using real time as a reference. Should one want to match an event to the image presented on frame 4, all that need be done is to instruct the program to trigger the event on second 4 (T4). If this were a real-world scenario, all that would be needed to synchronize events would be to trigger them based on elapsed real time.
Table A
REAL TIME:       T1    T2    T3    T4
VIDEO TIMELINE:  F1    F2    F3    F4
EVENTS:          A     B     C     D
(T = elapsed seconds; F = video frame; A-D = programmed events)
But Table B shows a case in which time and the video timeline lose their correspondence. If frame 3 is delayed during transmission and arrives at the client computer a second late, any event synchronized to it will trigger early. The table shows no frame being rendered at T3, which causes a misstep and leaves the video timeline lagging behind elapsed time, placing event D on frame 3 instead of frame 4. T4 now matches frame 3, so events synched to T4 will not take place on frame 4, but on frame 3. Table B
REAL TIME:       T1    T2    T3    T4
VIDEO TIMELINE:  F1    F2    --    F3 (delayed)
EVENTS:          A     B     C     D
This is the case when events are synched to real passing time, i.e. to a clock. In the current scenario, the objective is to synchronize events to frames, not to ticking seconds. The problem is solved in Table C, where it can be seen that a skipped frame results in a skipped event, preserving the relation between frames and events: event C still matches frame 3. The fact that this takes place at T4 is irrelevant, since the objective is to match programmed events to video frames (or streamed data).
Table C
REAL TIME:       T1    T2    T3    T4
VIDEO TIMELINE:  F1    F2    --    F3 (delayed)
EVENTS:          A     B     --    C
The way to match frames and events, allowing for data loss, is to match the events to the transmitted data, and not to elapsed time. This is achieved using a sync signal, a parallel connection between the client and the server that serves as a control stream, albeit an intermittent one.
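In code, this amounts to keying events to frame numbers instead of clock ticks. The following sketch is illustrative only (the frame numbers and the showOffer and showCredits handlers are invented): each report from the control stream fires every event between the previous reported frame and the current one, so a delayed or dropped frame skips nothing on the events timeline.

  // Events keyed to frames of transmitted data, not to elapsed seconds.
  var eventsByFrame:Object = {3: showOffer, 96: showCredits}; // illustrative
  var lastFrameHandled:Number = 0;

  // frameSent comes from the sync signal (the parallel control stream).
  function onFrameReported(frameSent:Number):Void {
      for (var key:String in eventsByFrame) {
          var f:Number = Number(key);
          if (f > lastFrameHandled && f <= frameSent) {
              eventsByFrame[key](); // trigger the event matched to frame f
          }
      }
      lastFrameHandled = frameSent;
  }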
The current embodiment of the invention used to synchronize streamed video and client-based events is built using technologies available from Macromedia and Adobe, among others.
Macromedia Flash is used to program a client side module that requests and receives the stream of video while simultaneously connecting to the Flash Content Server and triggering events based on the data from this intermittent connection. In order for this to work, several steps need to take place.
Before the "video stream timeline" and "the event timeline" can be synchronized as they play, they need to be matched in authoring. This is done by using the video as a guide for building the events timeline. Since the events will be programmed using Flash in the current embodiment, we need to use a video format that is compatible with Flash. The FLV streaming format is used in this example, since it is the same format that will be streamed. To turn video into an FLV file, we use Adobe After Effects. The video is imported into After Effects, where adjustments can be made to its size, frame rate, duration, quality, compression, etc. Once the video is of the desired size and quality, the FLV file is generated.
The FLV file is imported into Flash, where a key step is matching the video properties of the Flash file to those of the FLV file. The frame rate must be the same in both; otherwise the procedure will not work.
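As an illustration, a small check along the following lines could confirm the match at run time. This is a sketch, not part of the patent: it assumes the standard NetStream.onMetaData callback (FLV metadata carries a framerate field), an assumed MOVIE_FPS constant holding the Flash file's authoring frame rate, and an already-established NetConnection named myCon.

  var MOVIE_FPS:Number = 6; // assumed: the Flash file's authoring frame rate

  var ns:NetStream = new NetStream(myCon); // myCon: an established NetConnection
  ns.onMetaData = function (info:Object):Void {
      // "framerate" is a standard FLV metadata field.
      if (info.framerate != MOVIE_FPS) {
          trace("Frame rate mismatch: FLV=" + info.framerate
              + " SWF=" + MOVIE_FPS + " - events will drift off their frames.");
      }
  };
  ns.play("an1300"); // illustrative file name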
Using the embedded video as a place-holder, we now develop and program any events and match them to the image. Synchronicity is not an issue, since all elements are on a single computer and a common program, so any event timed for frame 3 will always happen in frame 3. Once the programming is done and the events all take place in synchronism with their target frames, we delete the video and replace it with a remote call to a streaming file.
The resulting code will preferably look something like this (although alternative coding will be apparent to those skilled in the art):
_root.filename = "an1";                              // without the ".flv" extension
_root.direccion = "rtmp://200.47.135.86/killbill/";  // Flash Communication Server internet address
_root.iniciaMovie = 2;
_root.bufferTime = 2;                                // buffering time
_root.MovieFps = 6;                                  // frame rate
_root.movieRunTime = 29;
We now need to place the .FLV streaming file in the correct directory. The final step is to establish a connection between the events played inside the Flash file and the video streamed from the server. This can be better understood by referring to the figure. Figure 1 is a block diagram showing the computers involved and the data transmitted between them, where block A is the client computer, running a web browser displaying an HTML document which holds a Flash file (SWF). Block B is a web server and block C is a Flash Content Server.
The first thing that takes place, as represented by data flow 1, is that the HTML document requests the SWF file from the web server. In data flow 2 the web server returns the SWF, which is executed and requests a connection to the Flash Content Server (FCS) in data flow 3. The connection is established via data flow 4, and through it the SWF requests the video stream as seen in data flow 5. Data flow 6 shows the data stream traveling from the FCS to the SWF, from the server to the client. The instant the stream starts arriving at the client, the timeline is reset to T=0. The SWF and the FCS communicate intermittently via data flow 7. Using T0 as a reference, along with the information in data flow 7 regarding the amount of data transmitted, the SWF file triggers events that are matched to their corresponding frames. In other words, the events timeline does not run linearly; it jumps from one frame to another based on the question "how much data has the FCS sent?" instead of "how much time has elapsed?"
An exemplary embodiment of the present invention can be implemented using the following computer code:
Shoshmosis complete code:

dynamic class shoshmosis extends MovieClip {

    function shoshmosis() {
        atach1 = false;
        atach2 = false;
        timer = setInterval(this, "syncronizo", 100); // check sync every 100 ms
        scope = this;
        detectarBW();
    }

    [Inspectable(type="String", defaultValue="rtmp://.../")]
    public function set Sdireccion(v:String) { direccion = v; invalidate(); }
    public function get Sdireccion():String { return direccion; }

    [Inspectable(type="String", defaultValue="videos/becool/becool")]
    public function set SfileName(v:String) { fileName = v; invalidate(); }
    public function get SfileName():String { return fileName; }

    [Inspectable(type="Number", defaultValue="4")]
    public function set SiniciaMovie(v:Number) { iniciaMovie = v; invalidate(); }
    public function get SiniciaMovie():Number { return iniciaMovie; }

    [Inspectable(type="Number", defaultValue="5")]
    public function set SbufferTime(v:Number) { bufferTime = v; invalidate(); }
    public function get SbufferTime():Number { return bufferTime; }

    [Inspectable(type="String", defaultValue="12")]
    public function set SMovieFps(v:String) { MovieFps = v; invalidate(); }
    public function get SMovieFps():String { return MovieFps; }

    [Inspectable(type="String", defaultValue="650")]
    public function set SpausePlayX(v:String) { pausePlayX = v; invalidate(); }
    public function get SpausePlayX():String { return pausePlayX; }

    [Inspectable(type="String", defaultValue="335")]
    public function set SpausePlayY(v:String) { pausePlayY = v; invalidate(); }
    public function get SpausePlayY():String { return pausePlayY; }

    [Inspectable(type="String", defaultValue="700")]
    public function set SrestopX(v:String) { restopX = v; invalidate(); }
    public function get SrestopX():String { return restopX; }

    [Inspectable(type="String", defaultValue="335")]
    public function set SrestopY(v:String) { restopY = v; invalidate(); }
    public function get SrestopY():String { return restopY; }

    [Inspectable(type="String", defaultValue="51.8")]
    public function set SmovieRunTime(v:String) { movieRunTime = v; invalidate(); }
    public function get SmovieRunTime():String { return movieRunTime; }

    [Inspectable(type="Boolean", defaultValue="true")]
    public function set SpausePlay(v:Boolean) { pausePlay = v; invalidate(); }
    public function get SpausePlay():Boolean { return pausePlay; }

    [Inspectable(type="Boolean", defaultValue="true")]
    public function set Sireplay(v:Boolean) { sReplay = v; invalidate(); }
    public function get Sireplay():Boolean { return sReplay; }

    [Inspectable(type="Boolean", defaultValue="true")]
    public function set SocultaVideo(v:Boolean) { ocultaVideo = v; invalidate(); }
    public function get SocultaVideo():Boolean { return ocultaVideo; }

    // Detect the available bandwidth and pick a matching encoding quality.
    function detectarBW() {
        bnc = new NetConnection();
        bnc.onStatus = function (info) {
            trace("Level: " + info.level + " Code: " + info.code);
            if (info.code == "NetConnection.Connect.Success") {
                trace("--- connected to: " + this.uri);
            }
        };
        NetConnection.prototype.onBWDone = function (p_bw) {
            trace("onBWDone: " + p_bw);
            bnc.close();
            NetConnection.prototype.onBWDone = null;
            if (p_bw <= 320) {
                _root.calidad = "048";
            }
            if (p_bw >= 321 && p_bw <= 504) {
                _root.calidad = "300";
            }
            if (p_bw >= 505) {
                _root.calidad = "512";
            }
            scope.SfileName = scope.SfileName.concat(_root.calidad);
            trace("++++" + scope.SfileName);
            scope.conectar();
        };
        NetConnection.prototype.onBWCheck = function () {
            // Server side ignores any return value; return the call count.
            return ++counter;
        };
        bnc.connect("rtmp://cp13971.edgefcs.net/ondemand", true);
    }

    // Open the connection to the Flash Content Server.
    function conectar() {
        myCon = new NetConnection();
        myCon.onStatus = function (obj) {
            if (obj.code == "NetConnection.Connect.Success") {
                scope.stream();
            }
        };
        trace(scope.Sdireccion);
        myCon.connect(scope.Sdireccion);
    }

    // Request the video stream and attach it to the video object.
    function stream() {
        scope.myNetStream = new NetStream(myCon);
        scope.vid_mc.attachVideo(scope.myNetStream);
        scope.myNetStream.setBufferTime(scope.SbufferTime);
        scope.myNetStream.onStatus = function (obj) {
            if (obj.code == "NetStream.Play.Start") {
            }
        };
        trace(scope.SfileName);
        scope.myNetStream.play(scope.SfileName);
        _root.SH_JS_START = "OK";
    }

    // Jump the events timeline to the frame matching the streamed data.
    function syncronizo() {
        frameFoward = int(scope.myNetStream.time * scope.SMovieFps);
        scope._parent.gotoAndStop(scope.SiniciaMovie + frameFoward);
        if (scope.SpausePlay) {
            if (scope.myNetStream.time > 0 && !atach1) {
                thisBug = scope._parent.attachMovie("pause_play", "pause_play_mc",
                    scope._parent.getNextHighestDepth(),
                    {_x:SpausePlayX, _y:SpausePlayY});
                thisBug3 = scope._parent.attachMovie("restop", "restop_mc",
                    scope._parent.getNextHighestDepth(),
                    {_x:SrestopX, _y:SrestopY});
                atach1 = true;
            }
        }
        if (scope.myNetStream.time > scope.SmovieRunTime) {
            _root.SH_JS_END = "true";
            _root.SH_JS_FINISH = "OK";
            scope.myNetStream.close();
            clearInterval(timer);
            thisBug.removeMovieClip();
            if (!atach2) {
                if (scope.Sireplay) {
                    thisBug2 = scope._parent.attachMovie("replay", "replay_mc",
                        scope._parent.getNextHighestDepth(),
                        {_x:SpausePlayX, _y:SpausePlayY});
                } else if (scope.SocultaVideo) {
                    scope._parent.gotoAndStop(SiniciaMovie - 1);
                }
                atach2 = true;
            }
        }
    }

    function pause_play() {
        scope.myNetStream.pause();
    }

    function replay() {
        thisBug3.removeMovieClip();
        thisBug2.removeMovieClip();
        scope._parent.gotoAndPlay(SiniciaMovie - 1);
    }

    function restop() {
        thisBug3.removeMovieClip();
        thisBug.removeMovieClip();
        thisBug2.removeMovieClip();
        scope._parent.gotoAndStop(SiniciaMovie - 1);
        // The path that follows applies only to this case; modify as appropriate.
        _root.sonido.snd.setVolume(0);
        _root.SH_JS_CLOSE = "OK";
    }
}
The syncronizo() function enables the synchronization process; it is repeated below with its steps annotated:

// This function enables the synchronization process.
function syncronizo() {
    // Get the elapsed video streaming time and the Flash movie's frames
    // per second, and calculate the exact next position of the playhead
    // in the current timeline.
    frameFoward = int(scope.myNetStream.time * scope.SMovieFps);
    // Move the playhead (in the user's timeline) to the accurate frame,
    // "synchronized" with the playhead of the video stream, server side.
    scope._parent.gotoAndStop(scope.SiniciaMovie + frameFoward);
    // Handle the pause and stop process.
    if (scope.SpausePlay) {
        if (scope.myNetStream.time > 0 && !atach1) {
            thisBug = scope._parent.attachMovie("pause_play", "pause_play_mc",
                scope._parent.getNextHighestDepth(),
                {_x:SpausePlayX, _y:SpausePlayY});
            thisBug3 = scope._parent.attachMovie("restop", "restop_mc",
                scope._parent.getNextHighestDepth(),
                {_x:SrestopX, _y:SrestopY});
            atach1 = true;
        }
    }
    // Control the video's "end of file".
    if (scope.myNetStream.time > scope.SmovieRunTime) {
        _root.SH_JS_END = "true";
        _root.SH_JS_FINISH = "OK";
        scope.myNetStream.close();
        clearInterval(timer);
        thisBug.removeMovieClip();
        if (!atach2) {
            // Control the replay process.
            if (scope.Sireplay) {
                thisBug2 = scope._parent.attachMovie("replay", "replay_mc",
                    scope._parent.getNextHighestDepth(),
                    {_x:SpausePlayX, _y:SpausePlayY});
            // Control the hide-video process.
            } else if (scope.SocultaVideo) {
                scope._parent.gotoAndStop(SiniciaMovie - 1);
            }
            atach2 = true;
        }
    }
}
While the invention has been particularly shown and described with reference to preferred embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention.

Claims

WHAT IS CLAIMED:
1. A method for synchronizing events on a computer with a stream of data being received by such computer from another computer through a network.
2. A method of claim 1 in which advertising is synced to a video or sound stream.
3. A method of claim 1 in which two concurrent connections are used, a data stream and a separate connection, which relays metadata about the stream.
4. A method of claim 3 in which the data stream uses the QuickTime (Apple), RealMedia (Real Networks) or Windows Media (Microsoft) multimedia platform; and the metadata is sent via XML.
5. A method of claim 1 in which sync points between the stream and programmed events are calculated based on the data sent by the server, not on the data received by the client, using metadata on the client-side application to trigger events based on what the server indicates it has streamed, not based on what the client has received in the stream.
6. A method of claim 1 in which a client side component performs two parallel tasks: it receives the media stream and renders it, while at the same time triggering events based on metadata received from the streaming server via a separate link.
7. A method of claim 6 in which the client side component uses Macromedia Flash.
8. A method for synchronizing events in a computer with data being streamed onto the same computer from a server in real-time.
9. A method of claim 8 in which advertising is synced to a video or sound stream.
10. A method of claim 8 in which two concurrent connections are used, a data stream and a separate connection, which relays metadata about the stream.
11. A method of claim 10 in which the data stream uses the QuickTime (Apple), RealMedia (Real Networks) or Windows Media (Microsoft) multimedia platform; and the metadata is sent via XML.
12. A method of claim 8 in which sync points between the stream and programmed events are calculated based on the data sent by the server, not on the data received by the client, using metadata on the client-side application to trigger events based on what the server indicates it has streamed, not based on what the client has received in the stream.
13. A method of claim 8 in which a client side component performs two parallel tasks: it receives the media stream and renders it, while at the same time triggering events based on metadata received from the streaming server via a separate link.
14. A method of claim 13 in which the client side component uses Macromedia Flash.
15. A method for coordinating information, graphics, video, e-commerce and any other computer event with data entering the computer as a streamed file, as it arrives.
16. A method of claim 15 in which advertising is synced to a video or sound stream.
17. A method of claim 15 in which two concurrent connections are used, a data stream and a separate connection, which relays metadata about the stream.
18. A method of claim 17 in which the data stream uses the QuickTime (Apple), RealMedia (Real Networks) or Windows Media (Microsoft) multimedia platform; and the metadata is sent via XML.
19. A method of claim 15 in which sync points between the stream and programmed events are calculated based on the data sent by the server, not on the data received by the client, using metadata on the client-side application to trigger events based on what the server indicates it has streamed, not based on what the client has received in the stream.
20. A method of claim 15 in which a client side component performs two parallel tasks: it receives the media stream and renders it, while at the same time triggering events based on metadata received from the streaming server via a separate link.
21. A method of claim 20 in which the client side component uses Macromedia Flash.
22. A method for synchronization of streamed data and pre-programmed events based on the amount of data being received.
23. A method of claim 22 in which advertising is synced to a video or sound stream.
24. A method of claim 22 in which two concurrent connections are used, a data stream and a separate connection, which relays metadata about the stream.
25. A method of claim 24 in which the data stream uses the QuickTime (Apple), RealMedia (Real Networks) or Windows Media (Microsoft) multimedia platform; and the metadata is sent via XML.
26. A method of claim 22 in which sync points between the stream and programmed events are calculated based on the data sent by the server, not on the data received by the client, using metadata on the client-side application to trigger events based on what the server indicates it has streamed, not based on what the client has received in the stream.
27. A method of claim 22 in which a client side component performs two parallel tasks: it receives the media stream and renders it, while at the same time triggering events based on metadata received from the streaming server via a separate link.
28. A method of claim 27 in which the client side component uses Macromedia Flash.
29. A method for enabling the interaction of streamed media (audio and video) on a webpage (or other internet enabled application) with client-based procedures.
30. A method of claim 29 in which advertising is synced to a video or sound stream.
31. A method of claim 29 in which two concurrent connections are used, a data stream and a separate connection, which relays metadata about the stream.
32. A method of claim 31 in which the data stream uses the QuickTime (Apple), RealMedia (Real Networks) or Windows Media (Microsoft) multimedia platform; and the metadata is sent via XML.
33. A method of claim 29 in which sync points between the stream and programmed events are calculated based on the data sent by the server, not on the data received by the client, using metadata on the client-side application to trigger events based on what the server indicates it has streamed, not based on what the client has received in the stream.
34. A method of claim 29 in which a client side component performs two parallel tasks: it receives the media stream and renders it, while at the same time triggering events based on metadata received from the streaming server via a separate link.
35. A method of claim 34 in which the client side component uses Macromedia Flash.
36. A method for coordinating information, graphics, video, e-commerce and any other computer event with data entering the computer as a streamed file, as it arrives.
37. A method of claim 36 in which advertising is synced to a video or sound stream.
38. A method of claim 36 in which two concurrent connections are used, a data stream and a separate connection, which relays metadata about the stream.
39. A method of claim 38 in which the data stream uses the QuickTime (Apple), RealMedia (Real Networks) or Windows Media (Microsoft) multimedia platform; and the metadata is sent via XML.
40. A method of claim 36 in which sync points between the stream and programmed events are calculated based on the data sent by the server, not on the data received by the client, using metadata on the client-side application to trigger events based on what the server indicates it has streamed, not based on what the client has received in the stream.
41. A method of claim 36 in which a client side component performs two parallel tasks: it receives the media stream and renders it, while at the same time triggering events based on metadata received from the streaming server via a separate link.
42. A method of claim 41 in which the client side component uses Macromedia Flash.
PCT/US2005/037951 2004-10-19 2005-10-19 Method for synchronizing events with streaming data WO2006045061A2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US62020704P 2004-10-19 2004-10-19
US60/620,207 2004-10-19

Publications (2)

Publication Number Publication Date
WO2006045061A2 true WO2006045061A2 (en) 2006-04-27
WO2006045061A3 WO2006045061A3 (en) 2006-06-22

Family

ID=36203718

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2005/037951 WO2006045061A2 (en) 2004-10-19 2005-10-19 Method for synchronizing events with streaming data

Country Status (1)

Country Link
WO (1) WO2006045061A2 (en)


Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6701383B1 (en) * 1999-06-22 2004-03-02 Interactive Video Technologies, Inc. Cross-platform framework-independent synchronization abstraction layer
US6788333B1 (en) * 2000-07-07 2004-09-07 Microsoft Corporation Panoramic video
US20030088511A1 (en) * 2001-07-05 2003-05-08 Karboulonis Peter Panagiotis Method and system for access and usage management of a server/client application by a wireless communications appliance
US20030229899A1 (en) * 2002-05-03 2003-12-11 Matthew Thompson System and method for providing synchronized events to a television application

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090158147A1 (en) * 2007-12-14 2009-06-18 Amacker Matthew W System and method of presenting media data
US9275056B2 (en) * 2007-12-14 2016-03-01 Amazon Technologies, Inc. System and method of presenting media data
US10248631B2 (en) 2007-12-14 2019-04-02 Amazon Technologies, Inc. System and method of presenting media data

Also Published As

Publication number Publication date
WO2006045061A3 (en) 2006-06-22


Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A2

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KM KP KR KZ LC LK LR LS LT LU LV LY MA MD MG MK MN MW MX MZ NA NG NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SM SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A2

Designated state(s): GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LT LU LV MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

NENP Non-entry into the national phase

Ref country code: DE

121 Ep: the epo has been informed by wipo that ep was designated in this application
32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: COMMUNICATION UNDER RULE 69 EPC (EPO FORM 1205A DATED 14/09/07)

122 Ep: pct application non-entry in european phase

Ref document number: 05811829

Country of ref document: EP

Kind code of ref document: A2