WO2014078391A1 - Systems and methods for synchronizing content playback across media devices - Google Patents

Systems and methods for synchronizing content playback across media devices

Info

Publication number
WO2014078391A1
Authority
WO
WIPO (PCT)
Prior art keywords
story
stream
media
user
media device
Application number
PCT/US2013/069859
Other languages
French (fr)
Inventor
Brian Elan Lee
Michael Sean STEWART
James Stewartson
Original Assignee
Nant Holdings IP, LLC
Application filed by Nant Holdings IP, LLC
Publication of WO2014078391A1

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04L - TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L65/00 - Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L65/1066 - Session management
    • H04L65/1083 - In-session procedures
    • H04L65/1089 - In-session procedures by adding media; by removing media
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04L - TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L65/00 - Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L65/40 - Support for services or applications
    • H04L65/401 - Support for services or applications wherein the services involve a main real-time session and one or more additional parallel real-time or time sensitive sessions, e.g. white board sharing or spawning of a subconference
    • H04L65/4015 - Support for services or applications wherein the services involve a main real-time session and one or more additional parallel real-time or time sensitive sessions, where at least one of the additional parallel sessions is real time or time sensitive, e.g. white board sharing, collaboration or spawning of a subconference


Abstract

Multimodal story management systems and methods are described having a story server capable of delivering synchronized content streams of a story to multiple devices of a user, or even to multiple users. The story server can be coupled to a story management engine that is configured to present a first story stream to a first media device, and automatically present a second story stream to a second media device as a function of a triggering event contained within the first story stream.

Description

SYSTEMS AND METHODS FOR SYNCHRONIZING
CONTENT PLAYBACK ACROSS MEDIA DEVICES
[0001] This application claims the benefit of priority to U.S. provisional application having serial no. 61/725,799 filed on November 13, 2012. This and all other referenced extrinsic materials are incorporated herein by reference in their entirety. Where a definition or use of a term in a reference that is incorporated by reference is inconsistent or contrary to the definition of that term provided herein, the definition of that term provided herein is deemed to be controlling.
Field of the Invention
[0002] The field of the invention is interactive digital technologies.
Background
[0003] The following description includes information that may be useful in understanding the present invention. It is not an admission that any of the information provided herein is prior art or relevant to the presently claimed invention, or that any publication specifically or implicitly referenced is prior art.
[0004] Consumers seek out ever more immersive media experiences. With the advent of mobile computing, opportunities exist for integrating real-world experiences with immersive narratives bridging across a full spectrum of device capabilities. Rather than a consumer passively watching a television show or listening to an audio stream, the consumer can directly and actively engage with a narrative or story according to their own preferences.
[0005] Interestingly, previous efforts to provide immersive narratives seek to maintain a distinction between the "real world" and fictional worlds. For example, U.S. pat. no. 7,810,021 to Paxson describes attempts at preserving a reader's immersive experience when reading literary works on electronic devices. Paxson therefore seeks to maintain discrete boundaries between the real world and the fictional world. Unfortunately, narratives presented according to such approaches remain static, locked on a single device, or outside the influence of the consumer.
[0006] Paxson and all other extrinsic materials discussed herein are incorporated by reference in their entirety. Where a definition or use of a term in an incorporated reference is inconsistent or contrary to the definition of that term provided herein, the definition of that term provided herein applies and the definition of that term in the reference does not apply.
[0007] Although U.S. pat. no. 8190683 to Sloo and WIPO publ. no. 2012/009136 to Hulu, LLC discuss synchronizing playback of content across multiple devices, these references fail to contemplate that different content streams can be synchronized to play at specific time intervals with respect to one another.
[0008] U.S. pat. publ. no. 2012/0030366 to Collart (publ. Feb. 2012) and U.S. pat. publ. no. 2012/0233646 to Coniglio (publ. Sept. 2012) go a step further by allowing for additional content to be played on a secondary device. However, the Collart system sends content to the second device based on a device command and the Coniglio system is dependent on a user specifically requesting viewing of the secondary content.
[0009] Thus, there is still a need for rich transmedia user experiences, in which multiple content streams are presented across multiple devices in a preferably synchronized manner.
Summary of the Invention
[0010] The inventive subject matter provides apparatus, systems, and methods in which one can provide a rich, synchronized multi-modal experience to users via a user device, and preferably via multiple user devices. One aspect of the inventive subject matter includes a story management engine capable of delivering content streams to one or more devices of a user, or even to multiple users. In some embodiments, the story management engine comprises a story server or database communicatively coupled with the user's devices. When the user requests an experience, herein referred to as a "story", the story management engine can obtain the story from the story server. Stories can comprise one or more story media streams or other content that can be synchronously or asynchronously presented on one or more user media devices. The story management engine can configure the user's one or more media devices to present the story according to the story streams.
[0011] In other embodiments, the streams can be synchronized as a user logs in to the system. For example, the user may have a common login that is used across devices, or could have separate logins that are all associated with the user. Once a user logs in on a device, the system, preferably the story management engine, can recognize the device and send a story stream to the new device in conjunction with presentation of a separate story stream on a first device. In this manner, the system can automatically synchronize presentation of content across the devices to provide a seamless and integrated experience for the user. The specific story stream presented on the new device can depend on the device's capabilities and/or user preferences. Thus, the user may have different interactions with a story depending on which of the user's devices are being used.
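By way of illustration only, the following minimal Python sketch shows one way the login-driven synchronization described above might be organized. Every name here (StoryManagementEngine, stream_for, present, elapsed, and so on) is a hypothetical placeholder rather than an API taken from this disclosure.

```python
# Hypothetical sketch: associating a newly logged-in device with a user's
# session and dispatching a device-appropriate story stream.

class StoryManagementEngine:
    def __init__(self, story_server):
        self.story_server = story_server
        self.sessions = {}  # user_id -> {"story": Story or None, "devices": [...]}

    def on_login(self, user_id, device):
        """Recognize a device at login and join it to the user's session."""
        session = self.sessions.setdefault(user_id, {"story": None, "devices": []})
        session["devices"].append(device)
        story = session["story"]
        if story is not None:
            # A story is already playing elsewhere: choose a stream suited to
            # the new device and start it in sync with the ongoing presentation.
            stream = story.stream_for(device.capabilities)   # assumed helper
            if stream is not None:
                device.present(stream, offset=story.elapsed())
```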
[0012] Various objects, features, aspects and advantages of the inventive subject matter will become more apparent from the following detailed description of preferred embodiments, along with the accompanying drawing figures in which like numerals represent like components.
Brief Description of the Drawings
[0013] Fig. 1 is a diagram of one embodiment of a multimodal story management system.
[0014] Fig. 2 is a flowchart of one embodiment of a method for managing multiple story streams or other related content across devices.
Detailed Description
[0015] Throughout the following discussion, numerous references will be made regarding servers, services, interfaces, portals, platforms, or other systems formed from computing devices. It should be appreciated that the use of such terms is deemed to represent one or more computing devices having at least one processor configured to execute software instructions stored on a tangible, non-transitory computer-readable medium (e.g., hard drive, solid state drive, RAM, flash, ROM, etc.). For example, a server can include one or more computers operating as a web server, database server, or other type of computer server in a manner to fulfill described roles, responsibilities, or functions. The software instructions preferably configure the computing device to provide the roles, responsibilities, or other functionality as discussed below with respect to the disclosed apparatus. In especially preferred embodiments, the various servers, systems, databases, or interfaces exchange data using standardized protocols or algorithms, possibly based on HTTP, HTTPS, AES, public-private key exchanges, web service APIs, known financial transaction protocols, or other electronic information exchanging methods. Data exchanges preferably are conducted over a packet-switched network: the Internet, a LAN, WAN, VPN, or other type of packet-switched network.
[0016] One should appreciate that the disclosed techniques provide many advantageous technical effects, including synchronizing multiple distinct media devices to present a rich media entertainment experience to one or more users. For example, the systems and methods described herein facilitate immersion of a user within a story by presenting multiple story streams or other related content to a user across multiple devices, and preferably across different modalities. Thus, a user is able to more deeply immerse within a story by interacting with different types of related content across multiple devices.
[0017] The following discussion provides many example embodiments of the inventive subject matter. Although each embodiment represents a single combination of inventive elements, the inventive subject matter is considered to include all possible combinations of the disclosed elements. Thus if one embodiment comprises elements A, B, and C, and a second embodiment comprises elements B and D, then the inventive subject matter is also considered to include other remaining combinations of A, B, C, or D, even if not explicitly disclosed.
[0018] The following discussion describes presenting a multimodal experience to a user as a story. A story is considered to comprise one or more data streams, herein referred to as "story streams", carrying experience-related content and device commands. The device commands configure a user's media device to present the content of a story stream according to an overarching story, which is preferably defined in the background by a series of commands. The story can include narrative (e.g., fiction, video, audio, etc.), interactive components (e.g., puzzles, games, etc.), promotions (e.g., advertisements, contests, etc.), or other types of user-engaging features. Users can interact with the content in the story streams according to the programmed story. A story server or database can store one or more stories, each of which has a set of story streams. Preferably, each stream can target a specific media device or type of media device. Thus, in such embodiments, it is contemplated that not all of the streams of a set will necessarily be used, depending on the user devices available.
[0019] A story stream is considered to include a sequenced presentation of data, preferably according to a time-based schedule. Of course, such a sequence could be chronological or based on some other definition. For example, a second story stream can comprise a sequenced presentation of data in that multiple content streams can be delivered sequentially, with gaps in between, and according to trigger points in a first story stream. One should also note that a stream can be presented according to other triggering criteria or based on user input. Triggering criteria can be based on biometrics, location, movement, or other acquired data, for example.
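A simple data model consistent with this paragraph, in which each stream entry is keyed either to a time-based schedule or to a trigger point in another stream, might be sketched as follows (the field names are illustrative assumptions, not terms of the disclosure):

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class StreamEntry:
    content: str                        # e.g., a URI for video, audio, or a message
    at_seconds: Optional[float] = None  # offset into the story's schedule, or
    trigger_id: Optional[str] = None    # the id of a trigger point in another stream

@dataclass
class StoryStream:
    target_modality: str                # e.g., "video", "phone_call", "sms"
    entries: list = field(default_factory=list)   # sequenced StreamEntry objects

@dataclass
class Story:
    title: str
    streams: list = field(default_factory=list)   # the story's set of StoryStream objects
```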
[0020] Figure 1 illustrates one embodiment of a multimodal story management system 100. Contemplated systems include a story management engine 104 coupled to one or more story servers 102 (databases) and one or more media devices 110A-N, thus operating as a multi-media delivery channel where the server(s) 102, via the story management engine 104 or a content engine, deliver content related to a multimodal experience to one or more target media devices 110A-N. For example, a story server 102 can be configured to deliver one or more story (media) streams to the target media devices 110A-N and cause the media devices 110A-N to present content of the story streams in a synchronized manner according to a desired modality. Preferably, the story streams are related in content.
[0021] Exemplary types of data that can be used to configure the media devices 110A-N to present different modal experiences include visual data (e.g., images, video, etc.), audible data, haptic or kinesthetic data, metadata, web-based data, or even augmented or virtual reality data. It is contemplated that each media device 110A-N can receive a story stream according to a modality selected for that media device. Thus, for example, the specific modality could automatically be selected based upon the capabilities of a specific media device, and different media devices can thereby receive story streams having different modalities. For example, a laptop or other personal computer may receive audio and video data, while a mobile phone may receive telephone calls and/or text or multimedia messages. In this manner, different pieces of a story can be delivered to different, sometimes unconnected, platforms. In other embodiments, a user can set preferences to state the permitted and disallowed modalities for use with that user.
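One plausible, purely illustrative way to select a stream's modality from a device's capabilities and the user's stated preferences is sketched below; the preference order and function names are assumptions made for the example only.

```python
# Hypothetical modality selection: prefer the richest modality the device
# supports and the user permits, per the preferences discussed above.
MODALITY_PREFERENCE = ["augmented_reality", "video", "audio", "sms"]

def select_stream(story, device_capabilities, user_allowed=None):
    """Return the first stream whose modality the device can render and the
    user has not disallowed; return None if no stream fits."""
    for modality in MODALITY_PREFERENCE:
        if modality not in device_capabilities:
            continue
        if user_allowed is not None and modality not in user_allowed:
            continue
        for stream in story.streams:
            if stream.target_modality == modality:
                return stream
    return None
```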
[0022] One embodiment of a platform for providing a transmedia environment is described in co-pending U.S. utility appl. having serial no. 13/414,192 filed on March 7, 2012.
[0023] Contemplated media devices capable of interacting with the story streams include portable computing devices (e.g., laptops, netbooks, tablet PCs and other portable computers, smart phones, MP3 players, personal digital assistants, vehicles, watches, computerized glasses, etc.), desktop computers, televisions, game consoles or other platforms, electronic picture frames, appliances, kiosks, radios, telephones, vehicles, sensor devices, or other types of devices. Media devices 110A-N preferably comprise different types of media devices, and it is preferred that the media devices 110A-N are distinct devices associated with a single user. Thus, for example, in one embodiment, a first media device could comprise a laptop computer, while a second media device could comprise a smart phone. By incorporating multiple devices, a user can receive multiple story streams across the devices, allowing a more immersive interaction with a story.
[0024] In other contemplated embodiments, the media devices 110A-N can be associated with multiple users, where a first user may control a first media device 110A, and a second user may control a second user device 110B, and the first and second media devices 110A-B receive first and second story media streams, respectively. For example, a projector or television could comprise a first media device that receives a first story stream, and a smart phone or tablet PC could comprise a second user device that receives a second story stream related to the first story stream. In such embodiments, the devices could both reside in a single location, but could have different owners. The projector could be in a public setting where spectators could utilize their media devices to allow for more immersive interaction with the first story stream being presented on the first device.
[0025] Advantageously, it is preferred that one or more of the media devices 110A-N can include at least one sensor configured to collect ambient information about a user's environment. Such sensors could include, for example, GPS, cellular triangulation, or other location discovery systems, cameras, video recorders, accelerometers, magnetometers, speedometers, odometers, altitude detectors, thermometers, optical sensors, motion sensors, heart rate monitors, proximity sensors, microphones, and so forth. This ambient information could be used, for example, to select which story is presented to a user. It could also be used to determine which story streams of a story to present to a user, or when and/or how to present them. For example, by understanding a user's location, a story related to that location could be presented to the user. Thus, for example, if the user was at a sports venue, the user may be presented with an interactive game across the user's multiple devices. Similarly, stories could be selected based on a user's previous locations. Using the example above, a user could be presented with a story related to the sports venue at a later time, based on location data indicating that the user was at the venue earlier in the day.
[0026] Although shown distal to the user media devices 110A-N, one or more of the various servers composing the story management engine 104 can be local or remote relative to the user's media devices 110A-N. For example, the story server 102 could be local to a user on a common network, or even on one of the user's media devices 110A-N. Such an approach allows content or streams to be downloaded to a computing device local to the user, or even to one or more of the user's media devices 110A-N. In this manner, should the user lose connectivity with a network, or should the user's connectivity temporarily slow, each of the devices 110A-N can still present its story stream seamlessly according to the stream's schedule or triggering criteria. It is also contemplated that portions of story streams can be downloaded in advance to act as a buffer in case of a poor network connection. It is further contemplated that the story server 102 could be remote from the user's media devices 110A-N, such as located on a distal server accessible via the Internet 120. Exemplary remote servers can include single-purpose server farms, distal services, distributed computing platforms (e.g., cloud-based services, etc.), or even augmented or mixed reality computing platforms.
[0027] Preferably, the story management engine 104 provides at least two story streams to at least two of the media devices 110A-110B in a synchronized manner according to a defined sequence. It is especially preferred that the story management engine 104 is configured to present a first story stream to the first media device 110A, and automatically present a second story stream to the second media device 110B as a function of a triggering event contained within the first story stream or an interaction of a user with the first story stream.
[0028] In some contemplated embodiments, the story management engine 104 could automatically present the second story stream to the second media device 110B upon occurrence of the triggering event, which could be, for example, reaching or passing a predefined point in the first story stream. Other triggering events include, for example, a user pausing the first or second story stream on a user device, a loss of communication between a user device and the story management engine, an active selection by a user, and a change in geographical location of a user. Thus, a first media device could present a first story stream to a user that includes one or more bookmarks or other triggering events. Upon reaching a preset bookmark during presentation of the first story stream, the story management engine could cause a second story stream or a portion thereof (e.g., at least some content) to be communicated to the first user device or a second user media device of the user, depending on the level of immersion selected or permitted. For example, a user could be watching a scene and reach a defined point where a character's mobile phone is ringing. A triggering event can be included at that point such that the story management engine causes a stream to be sent to the user's mobile phone or other media device, such that the device rings during this portion of the scene.
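The ringing-phone example above can be pictured with the following hedged sketch of bookmark-driven triggering; the session and bookmark attributes are assumed for illustration and are not part of the disclosure.

```python
# Hypothetical sketch: fire a bookmark's linked stream when playback of the
# first story stream reaches the bookmark's position.

def on_playback_tick(session, position_seconds):
    """Called periodically as the first story stream plays on the first device."""
    for bookmark in session.first_stream_bookmarks:
        if not bookmark.fired and position_seconds >= bookmark.at_seconds:
            bookmark.fired = True
            # Depending on the immersion level, target the second device if
            # present, else fall back to the first device itself.
            target = session.second_device or session.first_device
            target.present(bookmark.linked_stream)  # e.g., make the phone ring
```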
[0029] In other contemplated embodiments, the story management engine 104 can be configured to receive a time-shifting command from the second media device 110B based on an interaction of a user with the second story stream, which can cause playback of the first story stream to be altered on the first media device 110A as a function of the time-shifting command. Exemplary time-shifting commands include, for example, fast-forwarding the first story stream, rewinding the first story stream, playing the first story stream, pausing the first story stream, unlocking the first story stream, triggering an event, and skipping the first story stream.
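A minimal sketch of how such a time-shifting command, received from the second device, might alter playback on the first device follows; the command names mirror the list above, while the device methods (pause, play, seek, position) are assumptions for the example.

```python
# Hypothetical dispatch of a time-shifting command from the second media
# device onto playback of the first story stream on the first device.

def apply_time_shift(session, command, seconds=0.0):
    first = session.first_device
    if command == "pause":
        first.pause()
    elif command == "play":
        first.play()
    elif command == "fast_forward":
        first.seek(first.position() + seconds)
    elif command == "rewind":
        first.seek(max(0.0, first.position() - seconds))
    elif command == "skip":
        first.seek(session.next_chapter_offset())  # assumed helper
```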
[0030] It is especially preferred that the story management engine 104, or another component of the system, automatically causes playback of the first story stream to pause during playback of the second story stream on the second media device 110B. In such embodiments, it is further contemplated that the story management engine 104 can cause playback of the first story stream to resume upon completion of playback of the second story stream on the second media device 110B. Thus, for example, based on a user's interaction with the second story stream, playback of the first story stream could be resumed, if paused, or could be skipped to a different point in the stream to align with the user's interactions. Continuing the example, if the second story stream comprised a phone call to a user's cell phone, ending the call could send a command that causes the first story stream to continue playback on another device of the user. Such a command could also be sent at a predetermined position of the second story stream, such that playback of the first story stream continues seamlessly when playback of the second story stream, or that portion of it, ends.
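The pause-and-resume behavior preferred here might look like the sketch below, assuming the devices expose pause/seek/play methods and the second device reports completion via a callback; all of this is illustrative rather than prescribed by the disclosure.

```python
# Hypothetical pause/resume choreography: pause the first stream while the
# second plays, then resume (or jump ahead) when the second completes.

def present_second_stream(session, second_stream):
    session.resume_offset = None
    session.first_device.pause()
    session.second_device.present(second_stream,
                                  on_complete=lambda: resume_first(session))

def resume_first(session):
    # Resume where playback left off, or skip to a different point if the
    # user's interaction with the second stream moved the story along.
    if session.resume_offset is not None:
        session.first_device.seek(session.resume_offset)
    session.first_device.play()
```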
[0031] Such interaction of the first and second devices could occur via commands sent directly to a device or via the story engine 104 or other component of the system 100. Alternatively, such interaction could occur via a signal or other information being sent from one device that causes an action to occur on another device. Thus, for example, as a user interacts with content on a second user device, information could be sent to the first user device either directly or via some component of system 100, such that playback of the media stream (here, first story stream) is altered.
[0032] The story streams can be presented to the first and second media devices 110A and 110B, preferably as synchronized streams, although in some embodiments as asynchronous streams. The synchronization of the story streams, or lack thereof, will likely depend on the specific story parameters. One should appreciate that presenting synchronized story streams does not require that the streams always be presented simultaneously. Rather, presenting synchronized story streams is contemplated to include presenting data from two or more story streams at proper and defined times relative to one another. Thus, presentation of the story streams could overlap or be non-overlapping. In some scenarios the story streams can be presented according to a programmed schedule, where the schedule can include absolute times or relative times.
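Resolving such a programmed schedule into concrete presentation times could be done as sketched here; the attribute names (absolute_epoch, relative_seconds) are assumptions made for the example.

```python
# Hypothetical schedule resolution: each entry carries either an absolute
# wall-clock time or a time relative to the story's start.

def resolve_schedule(entries, story_start_epoch):
    """Return (epoch_seconds, entry) pairs sorted by presentation time."""
    resolved = []
    for entry in entries:
        if getattr(entry, "absolute_epoch", None) is not None:
            resolved.append((entry.absolute_epoch, entry))
        else:
            resolved.append((story_start_epoch + entry.relative_seconds, entry))
    return sorted(resolved, key=lambda pair: pair[0])
```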
[0033] It is further contemplated that the story management engine 104 can be configured to automatically discover the second media device 110B. In this manner, the story management engine 104 or other component can automatically synchronize playback of the second story stream on the second media device 110B with playback of the first story stream upon discovery of the second media device 110B.
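Discovery-driven synchronization of this kind might reduce to the small sketch below, reusing the hypothetical select_stream helper from the earlier sketch: when a second device appears mid-story, its stream starts at the offset implied by the first stream's current position.

```python
# Hypothetical sketch: synchronize a newly discovered second device with the
# stream already playing on the first device.

def on_device_discovered(session, new_device):
    stream = select_stream(session.story, new_device.capabilities)
    if stream is None:
        return                                   # no suitable modality
    elapsed = session.first_device.position()    # seconds into the first stream
    new_device.present(stream, offset=elapsed)   # keep both streams aligned
```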
[0034] In one aspect, a media device can include, and preferably execute, a software application that has a user media interface configured to allow a user to interact with first and second story streams. Exemplary applications include, for example, a web-based application and an application program interface (API) through which commands or data can be exchanged to interact with the story management engine's servers. As a tangible example, this could include an application for a smart phone such as those available through online stores offered by Apple™ and Google™. It is especially preferred that at least one user media device includes a software application that allows for user interaction with the story stream. In such embodiments, the story management engine 104 could be configured to present the first story stream to the user media device, and automatically present the second story stream to the same user media device as a function of a triggering event contained within the first story stream. Of course, the second story stream could alternatively be presented to a second user media device as discussed above.
[0035] In embodiments where the first and second story streams are presented to a single user media device, it is contemplated that the application could be used to emulate multiple modalities, such as a telephone, messaging, or other functions of the device. In this manner, for example, the application can be used to view a first story stream, and a second story stream (e.g., a phone call) can be received and displayed through the application to the user. This advantageously eliminates the need to call the user using landlines or other communication pathways, especially where such pathways are unavailable. It is preferred that the story management engine 104 automatically presents the second story stream to the media device upon occurrence of the triggering event. Optionally, playback of the first story stream can be paused during playback of the second story stream. The application is preferably configured to simulate different modalities and present the first and second story streams in the different modalities on the media device to provide a user with a more immersive experience. For example, the application could simulate at least one of a telephone call, a messaging platform, and an email service or reader. This could be useful where such devices are unavailable or use of such devices is undesired.
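In-application simulation of a second modality on a single device could be organized along the following lines; the class and UI hooks are purely illustrative assumptions.

```python
# Hypothetical sketch: an application overlays a simulated incoming phone
# call (the second story stream) on the first stream it is playing.

class StoryApp:
    def __init__(self, player):
        self.player = player   # renders the first story stream

    def simulate_call(self, call_stream):
        """Pause the main stream and show a simulated incoming-call screen."""
        self.player.pause()
        self.show_overlay(kind="incoming_call", stream=call_stream,
                          on_end=self.player.play)

    def show_overlay(self, kind, stream, on_end):
        # Placeholder for UI code: ring, show caller id, play the call audio,
        # then invoke on_end when the user hangs up.
        ...
```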
[0036] Figure 2 illustrates one embodiment of a method 200 for managing multiple streams of content on disparate media devices. In step 210, access is provided to a story database configured to store at least one story that has first and second story streams.
[0037] In step 220, access is provided to a story management engine that is coupled to the story database and first and second media devices. At least one of the first and second media devices comprises a portable computing device in step 222. More specifically, at least one of the first and second media devices comprises a television, a mobile telephone, a tablet PC, a laptop computer, a desktop computer, a telephone, a radio, an appliance, an electronic picture frame, a vehicle, a game platform, and a sensor.
[0038] In step 224, the first and second media devices are preferably controlled by a single user.
[0039] The first story stream is presented in step 230 on the first media device using the story management engine. Optionally, the first story stream can be presented on the first media device in step 232 as a function of an interaction of a user with the second story stream.
[0040] In step 240, the second story stream is presented on the second media device upon triggering of an event contained within the first story stream. In some contemplated embodiments, shown in step 242, the event is triggered upon occurrence of the event. In other embodiments, shown in step 244, the event is triggered upon reaching the event.
[0041] It is further contemplated in step 246 that presentation or playback of the second story stream on the second media device can be synchronized with presentation of the first story stream as a function of presentation of the first story stream on the first media device.
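Taken together, steps 210 through 246 might be orchestrated as in this final hedged sketch, which strings the earlier hypothetical helpers into one flow; none of these names come from the disclosure itself.

```python
# Hypothetical end-to-end flow for method 200 (steps 210-246).

def run_story(story_db, engine, first_device, second_device, story_id):
    story = story_db.load(story_id)                        # step 210: story database
    session = engine.open_session(story,                   # step 220: engine + devices
                                  devices=[first_device, second_device])
    engine.present(session.first_stream, first_device)     # step 230: first stream
    # Steps 240-246: when a trigger in the first stream fires, present the
    # second stream on the second device, synchronized to the first.
    session.on_trigger(lambda trigger:
        engine.present(session.second_stream, second_device,
                       offset=first_device.position()))
```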
[0042] In some embodiments, the numbers expressing quantities of ingredients, properties such as concentration, reaction conditions, and so forth, used to describe and claim certain embodiments of the invention are to be understood as being modified in some instances by the term "about." Accordingly, in some embodiments, the numerical parameters set forth in the written description and attached claims are approximations that can vary depending upon the desired properties sought to be obtained by a particular embodiment. In some embodiments, the numerical parameters should be construed in light of the number of reported significant digits and by applying ordinary rounding techniques. Notwithstanding that the numerical ranges and parameters setting forth the broad scope of some embodiments of the invention are approximations, the numerical values set forth in the specific examples are reported as precisely as practicable. The numerical values presented in some embodiments of the invention may contain certain errors necessarily resulting from the standard deviation found in their respective testing measurements.
[0043] Unless the context dictates the contrary, all ranges set forth herein should be interpreted as being inclusive of their endpoints, and open-ended ranges should be interpreted to include only commercially practical values. Similarly, all lists of values should be considered as inclusive of intermediate values unless the context indicates the contrary.
[0044] As used in the description herein and throughout the claims that follow, the meaning of "a," "an," and "the" includes plural reference unless the context clearly dictates otherwise. Also, as used in the description herein, the meaning of "in" includes "in" and "on" unless the context clearly dictates otherwise.
[0045] The recitation of ranges of values herein is merely intended to serve as a shorthand method of referring individually to each separate value falling within the range. Unless otherwise indicated herein, each individual value within a range is incorporated into the specification as if it were individually recited herein. All methods described herein can be performed in any suitable order unless otherwise indicated herein or otherwise clearly contradicted by context. The use of any and all examples, or exemplary language (e.g., "such as"), provided with respect to certain embodiments herein is intended merely to better illuminate the invention and does not pose a limitation on the scope of the invention otherwise claimed. No language in the specification should be construed as indicating any non-claimed element essential to the practice of the invention.
[0046] Groupings of alternative elements or embodiments of the invention disclosed herein are not to be construed as limitations. Each group member can be referred to and claimed individually or in any combination with other members of the group or other elements found herein. One or more members of a group can be included in, or deleted from, a group for reasons of convenience and/or patentability. When any such inclusion or deletion occurs, the specification is herein deemed to contain the group as modified, thus fulfilling the written description of all Markush groups used in the appended claims.
[0047] As used herein, and unless the context dictates otherwise, the term "coupled to" is intended to include both direct coupling (in which two elements that are coupled to each other contact each other) and indirect coupling (in which at least one additional element is located between the two elements). Therefore, the terms "coupled to" and "coupled with" are used synonymously.
[0048] It should be apparent to those skilled in the art that many more modifications besides those already described are possible without departing from the inventive concepts herein. The inventive subject matter, therefore, is not to be restricted except in the spirit of the appended claims. Moreover, in interpreting both the specification and the claims, all terms should be interpreted in the broadest possible manner consistent with the context. In particular, the terms "comprises" and "comprising" should be interpreted as referring to elements, components, or steps in a non-exclusive manner, indicating that the referenced elements, components, or steps may be present, or utilized, or combined with other elements, components, or steps that are not expressly referenced. Where the specification or claims refer to at least one of something selected from the group consisting of A, B, C ... and N, the text should be interpreted as requiring only one element from the group, not A plus N, or B plus N, etc.

Claims

CLAIMS
What is claimed is:
1. A multimodal story management system, comprising:
a story database configured to store at least one story that has first and second story streams;
a story management engine coupled to the story database and first and second media devices; and
wherein the story management engine is configured to present the first story stream to the first media device, and automatically present the second story stream to the second media device as a function of a triggering event contained within the first story stream.
2. The system of claim 1, wherein the story management engine is further configured to automatically present the second story stream to the second media device upon occurrence of the triggering event.
3. The system of any of claims 1-2, wherein the story management engine is configured to automatically cause playback of the first story stream to pause while the second story stream is presented on the second media device.
4. The system of any of claims 1-3, wherein the story management engine is further configured to cause playback of the first story stream to resume after playback of the second story stream on the second media device is completed.
5. The system of any of claims 1-4, wherein the first media device is distinct from the second media device.
6. The system of any of claims 1-5, wherein at least one of the first and second media devices comprises a portable computing device.
7. The system of any of claims 1-6, wherein at least one of the first and second media devices comprises a television, a mobile telephone, a tablet PC, a laptop computer, a desktop computer, a telephone, a radio, an appliance, an electronic picture frame, a vehicle, a game platform, and a sensor.
8. The system of any of claims 1-7, wherein the story management engine is further configured to receive a time- shifting command from the second media device based on an interaction of a user with the second story stream, and wherein the time- shifting command alters playback of the first story stream on the first media device.
9. The system of claim 8, wherein the time-shifting command includes at least one of the following commands: fast-forwarding the first story stream, rewinding the first story stream, playing the first story stream, pausing the first story stream, unlocking the first story stream, triggering an event, and skipping the first story stream.
10. The system of any of claims 1-9, wherein the second media device comprises an application having a user media interface configured to allow a user to interact with the second story stream.
11. The system of any of claims 1-10, wherein the story management engine is further configured to present the first story stream to the first media device as a function of an interaction of a user with the second story stream.
12. The system of any of claims 1-11, wherein the story management engine is further configured to synchronize presentation of the second story stream on the second media device as a function of presentation of the first story stream on the first media device.
13. The system of claim 12, wherein the story management engine is further configured to synchronize presentation of the second story stream upon discovery of the second media device.
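A hypothetical sketch of the synchronization recited in claims 12 and 13: when the second device is discovered mid-playback, the engine maps the first stream's current position to an offset in the companion stream. The sync_map structure and all names are assumptions, not drawn from the specification:

def companion_offset(first_stream_position, sync_map):
    """Map a position in the first stream to the matching second-stream offset."""
    # sync_map: sorted (first_position, second_offset) pairs; take the latest
    # entry at or before the current playback position.
    offset = 0
    for first_pos, second_off in sync_map:
        if first_pos <= first_stream_position:
            offset = second_off
        else:
            break
    return offset

def on_device_discovered(first_stream_position, second_device_seek, sync_map):
    """Claim 13: synchronize the second stream upon discovering the device."""
    second_device_seek(companion_offset(first_stream_position, sync_map))

sync_map = [(0, 0), (120, 30), (300, 95)]          # seconds into each stream
on_device_discovered(150, lambda off: print(f"seek companion to {off}s"), sync_map)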
14. The system of any of claims 1-13, wherein the story management engine is further configured to present the first and second story streams as asynchronous streams on the first and second media devices, respectively.
15. The system of any of claims 1-14, wherein the first and second media devices are associated with a single user.
16. The system of any of claims 1-15, wherein the story management engine is further configured to discover the second media device.
17. The system of any of claims 1-16, wherein the first and second story streams comprise different modalities.
18. The system of any of claims 1-17, wherein the modalities include at least one of the following data types: visual data, audible data, haptic data, metadata, web-based data, and augmented reality data.
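Purely as an illustration of claims 17 and 18, the enumerated data types could be represented as an enumeration (names hypothetical):

from enum import Enum

class Modality(Enum):                  # data types enumerated in claim 18
    VISUAL = "visual"
    AUDIBLE = "audible"
    HAPTIC = "haptic"
    METADATA = "metadata"
    WEB_BASED = "web-based"
    AUGMENTED_REALITY = "augmented reality"

def different_modalities(first_stream_modality, second_stream_modality):
    """Claim 17: the two streams carry different modalities."""
    return first_stream_modality is not second_stream_modality

print(different_modalities(Modality.VISUAL, Modality.HAPTIC))  # True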
19. A multimodal story management system, comprising:
a story database configured to store at least one story having first and second story streams;
a story management engine coupled to the story database and a first media device; and
wherein the story management engine is configured to present the first story stream to the first media device, and automatically present the second story stream to the first media device as a function of a triggering event contained within the first story stream.
20. The system of claim 19, wherein the story management engine is configured to automatically present the second story stream to the first media device when the triggering event is reached.
21. The system of claim 19, wherein the story management engine is further configured to automatically present the second story stream to the first media device upon occurrence of the triggering event.
22. The system of any of claims 19-21, wherein the second story stream comprises content related to content of the first story stream.
23. The system of any of claims 19-22, wherein the first media device includes a software application configured to simulate different modalities and present the first and second story streams in the different modalities on the first media device.
24. The system of claim 23, wherein the application is further configured to simulate at least one of a telephone call, a messaging platform, and an email service or reader.
25. The system of claim 23, wherein the modalities include at least one of the following data types: visual data, audible data, haptic data, metadata, web-based data, and augmented reality data.
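A non-limiting sketch of the single-device simulation of claims 23-25, in which one application renders story segments in simulated modalities (all names hypothetical):

SIMULATED_MODALITIES = {
    "telephone-call": lambda payload: print(f"(simulated call) {payload}"),
    "message":        lambda payload: print(f"(simulated message) {payload}"),
    "email":          lambda payload: print(f"(simulated email) {payload}"),
}

def present_on_single_device(segment):
    """Render a story segment in its simulated modality on one device."""
    simulate = SIMULATED_MODALITIES.get(segment["modality"])
    if simulate is None:
        print(segment["payload"])          # fall back to plain presentation
    else:
        simulate(segment["payload"])

present_on_single_device({"modality": "telephone-call",
                          "payload": "a character phones the viewer mid-scene"})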
26. The system of any of claims 19-25, wherein the story management engine is configured to cause playback of the first story stream to pause during presentation of the second story stream.
27. The system of any of claims 19-25, wherein the first media device comprises a portable computing device.
28. The system of any of claims 19-25, wherein the first media device comprises a television, a mobile telephone, a tablet PC, a laptop computer, a desktop computer, a telephone, a radio, an appliance, an electronic picture frame, a vehicle, a game platform, or a sensor.
29. The system of any of claims 19-28, further comprising a second media device, and wherein the story management engine is further configured to present the first and second story streams to the second media device as a function of a second triggering event.
30. The system of claim 29, wherein the second triggering event comprises at least one of a user pausing the first or second story stream on the first media device, a loss of communication between the first media device and the story management engine, an active selection by a user, and a change in geographical location of a user.
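By way of hypothetical illustration only, the second-triggering-event handoff of claims 29 and 30 might be sketched as:

HANDOFF_TRIGGERS = {
    "user-pause",          # user pauses a stream on the first device
    "connection-lost",     # loss of communication with the engine
    "user-selection",      # active selection by the user
    "location-change",     # change in the user's geographical location
}

def maybe_hand_off(event, streams, second_device_play):
    """Claim 29: on a second triggering event, present both streams on device 2."""
    if event in HANDOFF_TRIGGERS:
        for stream_name in streams:
            second_device_play(stream_name)

maybe_hand_off("location-change", ["first stream", "second stream"],
               lambda s: print(f"[second device] now presenting {s}"))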
31. A multimodal story management system, comprising:
a database configured to store at least one story that has a media stream and a set of content related to the media stream;
a content engine coupled to the database and first and second computing devices; and
wherein the content engine is configured to present the media stream to the first computing device, and automatically present at least some of the content related to the media stream on the second computing device as a function of a triggering event contained within the media stream.
32. The system of claim 31, wherein the second computing device comprises a smart phone.
33. The system of claims 31 or 32, wherein the first computing device comprises a television, a mobile telephone, a tablet PC, a laptop computer, a desktop computer, a telephone, a radio, an appliance, an electronic picture frame, a vehicle, a game platform, or a sensor.
34. The system of any of claims 31-33, wherein the content engine is further configured to automatically present at least some of the content related to the media stream to the second computing device upon occurrence of the triggering event.
35. The system of any of claims 31-33, wherein the content engine is further configured to automatically present at least some of the content related to the media stream to the second computing device upon reaching the triggering event.
36. The system of any of claims 31-35, wherein the content engine is configured to automatically cause playback of the media stream to pause while at least some of the content is presented on the second computing device.
37. The system of any of claims 31-36, wherein the content engine is further configured to receive a time-shifting command from the second computing device based on an interaction of a user with the content, and wherein the time-shifting command alters playback of the media stream on the first computing device.
38. The system of claim 37, wherein the time-shifting command includes at least one of the following commands: fast-forwarding the media stream, rewinding the media stream, playing the media stream, pausing the media stream, unlocking the media stream, triggering an event, and skipping the media stream.
39. The system of any of claims 31-36, wherein the content engine is further configured to receive information from the second computing device based on an interaction of a user with the content, and wherein the information causes playback of the media stream on the first computing device to be altered.
40. The system of any of claims 31-36, wherein the first computing device is further configured to receive information from the second computing device based on an interaction of a user with the content, and wherein the information causes playback of the media stream on the first computing device to be altered.
41. The system of any of claims 31-40, wherein a user controls the first and second computing devices.
42. The system of any of claims 31-41, wherein the second computing device comprises a software application with a user interface configured to allow a user to interact with the content.
43. The system of any of claims 31-42, wherein the content engine is further configured to synchronize presentation of the content on the second computing device as a function of presentation of the media stream on the first computing device.
44. The system of any of claims 31-43, wherein the media stream and the content comprise different modalities.
45. The system of any of claims 31-44, wherein the content engine is further configured to present the media stream to the first computing device as a function of an interaction of a user with the content on the second computing device.
46. A method for managing multiple streams of content on disparate media devices, comprising:
providing access to a story database configured to store at least one story that has first and second story streams;
providing access to a story management engine that is coupled to the story database and first and second media devices;
presenting the first story stream on the first media device using the story management engine; and
presenting the second story stream on the second media device upon triggering of an event contained within the first story stream.
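As a final hypothetical sketch, the steps of the method of claim 46 map loosely onto the following procedure; the data layout and names are assumptions for illustration only:

def manage_streams(story_database, story_id, first_device_play, second_device_play):
    """Access the database, present stream 1, and present stream 2 on the event."""
    first, second = story_database[story_id]       # access to the story database
    for segment in first["segments"]:              # present the first stream
        first_device_play(segment)
        if segment == first["trigger_segment"]:    # event within the first stream
            for companion in second["segments"]:   # present the second stream
                second_device_play(companion)

story_database = {"story-1": (
    {"segments": ["scene 1", "scene 2", "scene 3"], "trigger_segment": "scene 2"},
    {"segments": ["companion clip"]},
)}
manage_streams(story_database, "story-1",
               lambda s: print("device 1:", s),
               lambda s: print("device 2:", s))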
47. The method of claim 46, wherein the second story stream is presented upon occurrence of the event.
48. The method of claim 46, wherein the second story stream is presented upon reaching the event.
49. The method of any of claims 46-48, wherein at least one of the first and second media devices comprises a portable computing device.
50. The method of any of claims 46-48, wherein at least one of the first and second media devices comprises a television, a mobile telephone, a tablet PC, a laptop computer, a desktop computer, a telephone, a radio, an appliance, an electronic picture frame, a vehicle, a game platform, or a sensor.
51. The method of any of claims 46-50, further comprising the story management engine discovering the second media device.
52. The method of any of claims 46-51, wherein the first and second media devices are controlled by a single user.
53. The method of any of claims 46-52, wherein the step of presenting the first story stream further comprises presenting the first story stream on the first media device as a function of an interaction of a user with the second story stream.
54. The method of any of claims 46-53, wherein the step of presenting the second story stream further comprises the story management engine synchronizing playback of the second story stream on the second media device as a function of presentation of the first story stream on the first media device.
55. The method of any of claims 46-54, further comprising the story management engine pausing presentation of the first story stream while the second story stream is presented on the second media device.
56. The method of any of claims 46-55, wherein the first and second story streams comprise different modalities.
PCT/US2013/069859 2012-11-13 2013-11-13 Systems and methods for synchronizing content playback across media devices WO2014078391A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201261725799P 2012-11-13 2012-11-13
US61/725,799 2012-11-13

Publications (1)

Publication Number Publication Date
WO2014078391A1 (en) 2014-05-22

Family

ID=50731647

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2013/069859 WO2014078391A1 (en) 2012-11-13 2013-11-13 Systems and methods for synchronizing content playback across media devices

Country Status (1)

Country Link
WO (1) WO2014078391A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2224323A1 (en) * 2009-02-27 2010-09-01 Research In Motion Limited A method and handheld electronic device for triggering advertising on a display screen
US20110149160A1 (en) * 2009-12-21 2011-06-23 Sony Corporation System and method for actively managing play back of demo content by a display device based on customer actions
US20110149159A1 (en) * 2009-12-21 2011-06-23 Sony Corporation System and method for actively managing playback of demo content by display device
US20110314132A1 (en) * 2000-12-12 2011-12-22 Landmark Digital Services Llc Method and system for interacting with a user in an experiential environment
US20120020651A1 (en) * 2010-07-22 2012-01-26 Comcast Cable Communications, Llc Apparatus and method for recording content

Similar Documents

Publication Publication Date Title
US11231841B2 (en) Continuation of playback of media content by different output devices
US11475062B2 (en) Play control of content on a display device
US10194189B1 (en) Playback of content using multiple devices
US9832516B2 (en) Systems and methods for multiple device interaction with selectably presentable media streams
CN107029429B (en) System, method, and readable medium for implementing time-shifting tutoring for cloud gaming systems
US9380343B2 (en) Watch next service
US9473548B1 (en) Latency reduction in streamed content consumption
US9271015B2 (en) Systems and methods for loading more than one video content at a time
CN107113468B (en) Mobile computing equipment, implementation method and computer storage medium
US20120089923A1 (en) Dynamic companion device user interface
US20120233347A1 (en) Transmedia User Experience Engines
WO2014066257A2 (en) Hybrid advertising supported and user-owned content presentation
JP2016184774A (en) Information processing device, information processing method, information processing program, and distribution device
EP3335430A1 (en) Methods, systems, and media for presenting a content item while buffering a video
WO2016169439A1 (en) Multimedia sharing method and related device and system
US9283477B2 (en) Systems and methods for providing social games for computing devices
WO2014078391A1 (en) Systems and methods for synchronizing content playback across media devices
US11089352B1 (en) Techniques for synchronizing content playback across devices
JP2017017738A (en) Information processing apparatus, information processing method, information processing program, and distribution apparatus
WO2016161307A1 (en) Localized day parting
US8982175B2 (en) Integrating a video with an interactive activity
WO2014078416A1 (en) Systems and methods for identifying narratives related to a media stream
US20160294944A1 (en) Method and system for reach and frequency control when presenting a second video with one or more first videos
EP2879395A1 (en) Dynamic enhancement of media experience
WO2016179467A1 (en) Video reach and frequency control

Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application

Ref document number: 13855436

Country of ref document: EP

Kind code of ref document: A1

DPE1 Request for preliminary examination filed after expiration of 19th month from priority date (PCT application filed from 20040101)
NENP Non-entry into the national phase

Ref country code: DE

122 EP: PCT application non-entry in European phase

Ref document number: 13855436

Country of ref document: EP

Kind code of ref document: A1