WO2002037943A2 - Synchronous control of media in a peer-to-peer network - Google Patents

Synchronous control of media in a peer-to-peer network

Info

Publication number
WO2002037943A2
Authority
WO
WIPO (PCT)
Prior art keywords
user
display
broadcast
workstation
user workstation
Application number
PCT/US2001/051410
Other languages
French (fr)
Other versions
WO2002037943A3 (en)
Inventor
Michael Sprague
Eric Beckman
Original Assignee
Wavexpress, Inc.
Application filed by Wavexpress, Inc.
Priority to AU2002236689A (AU2002236689A1)
Priority to KR10-2003-7005563A (KR20030094214A)
Priority to EP01986238A (EP1337989A2)
Priority to CA002426917A (CA2426917A1)
Publication of WO2002037943A2
Publication of WO2002037943A3

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/16Analogue secrecy systems; Analogue subscription systems
    • H04N7/173Analogue secrecy systems; Analogue subscription systems with two-way working, e.g. subscriber sending a programme selection signal
    • H04N7/17309Transmission or handling of upstream communications
    • H04N7/17318Direct or substantially direct transmission and handling of requests
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Systems or methods specially adapted for specific business sectors, e.g. utilities or tourism
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/27Server based end-user applications
    • H04N21/274Storing end-user multimedia data in response to end-user request, e.g. network recorder
    • H04N21/2743Video hosting of uploaded data from client
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N21/4312Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N21/4312Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H04N21/4314Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations for fitting data in a restricted space on the screen, e.g. EPG data in a rectangular grid
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N21/4318Generation of visual interfaces for content selection or interaction; Content or additional data rendering by altering the content in the rendering process, e.g. blanking, blurring or masking an image region
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/439Processing of audio elementary streams
    • H04N21/4396Processing of audio elementary streams by muting the audio signal
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/45Management operations performed by the client for facilitating the reception of or the interaction with the content or administrating data related to the end-user or to the client device itself, e.g. learning user preferences for recommending movies, resolving scheduling conflicts
    • H04N21/462Content or additional data management, e.g. creating a master electronic program guide from data received from the Internet and a Head-end, controlling the complexity of a video stream by scaling the resolution or bit-rate based on the client capabilities
    • H04N21/4622Retrieving content or additional data from different sources, e.g. from a broadcast channel and the Internet
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/472End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
    • H04N21/47205End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content for manipulating displayed content, e.g. interacting with MPEG-4 objects, editing locally
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/478Supplemental services, e.g. displaying phone caller identification, shopping application
    • H04N21/4788Supplemental services, e.g. displaying phone caller identification, shopping application communicating with other users, e.g. chatting
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00Network arrangements or protocols for supporting network services or applications
    • H04L67/01Protocols
    • H04L67/02Protocols based on web technology, e.g. hypertext transfer protocol [HTTP]
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L69/00Network arrangements, protocols or services independent of the application payload and not provided for in the other groups of this subclass
    • H04L69/30Definitions, standards or architectural aspects of layered protocol stacks
    • H04L69/32Architecture of open systems interconnection [OSI] 7-layer type protocol stacks, e.g. the interfaces between the data link level and the physical level
    • H04L69/322Intralayer communication protocols among peer entities or protocol data unit [PDU] definitions
    • H04L69/329Intralayer communication protocols among peer entities or protocol data unit [PDU] definitions in the application layer [OSI layer 7]

Abstract

In a peer-to-peer multi-media communication network (10), a system is provided for controlling a broadcast viewing experience of one user by another user. Each user has access to a user workstation (12) including at least an input device (20) and a display device (16). The second user's workstation (12) includes a storage device (30) for storing at least user input for controlling a display (60) on a display device (16) coupled to the second user workstation (12). The first user workstation (12) includes a dynamic display controller (34), responsive to an input device (20) on the first user workstation (12), for receiving input (40) from the first user workstation (12) and for transmitting (44) the user input to at least the second user workstation (12). The user input (44, 45) received by the second user workstation (12) controls the display (60) on the second user workstation display device (16).

Description

SYNCHRONOUS CONTROL OF MEDIA IN A PEER-TO-PEER NETWORK
FIELD OF THE INVENTION The present invention relates generally to a system and method of creating and sharing enhancements to and in connection with a broadcast program to enhance the viewing experience of a number of viewers of the broadcast program. More particularly, the present invention concerns a method of synchronously controlling another party's media (computer, television, etc.) in a peer-to-peer network configuration.
BACKGROUND OF THE INVENTION Prior art systems are known which integrate television broadcasts with other video or audio content such as a stream of data broadcast over the internet. Additionally, instant messaging and/or chat room interfacing over the internet, World-Wide-Web or other network is also known. Such prior art, however, does not allow one party to synchronously and dynamically control another party's media in a peer-to-peer network to create a truly interactive display for a user.
BRIEF DESCRIPTION OF THE DRAWINGS The present invention will be better understood by reading the following detailed description, taken together with the drawings wherein:
FIG. 1 is a schematic diagram of one exemplary system embodying the principles of the present invention, wherein multiple users view a broadcast program and simultaneously share information over a wide area network;
FIG. 2 is a more detailed schematic diagram of each viewer display and manipulation system according to the present invention;
FIG. 3 is a more detailed schematic diagram illustrating the inputs to a dynamic display controller of the present invention and an exemplary dynamically changed output; FIG. 4 is a diagram showing the multiple layers that are displayed on a viewer display device;
FIG. 5 shows a converged display including the multiple layers of FIG. 4, including a background layer for displaying a broadcast program and a user-prepared enhancement overlay layer;
FIG. 6 is a schematic diagram of another exemplary system embodying the principles of the present invention, wherein multiple system users enhance a broadcast program via a set of multi-media tools provided by a Web server over the Internet;
FIG. 7 is another diagram showing the multiple layers that are displayed on a viewer display in the embodiment of FIG. 6; FIG. 8 shows a converged display including the multiple layers of FIG. 7, including a broadcast program (background) layer, a user-prepared enhancement overlay layer and a multi-media tool overlay layer; and
FIG. 9 is a flow chart of one exemplary method of generating, providing and displaying user-prepared enhancements to a plurality of viewers of a broadcast program.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT A system 10, FIG. 1, on which the present invention can be utilized and which embodies the present invention, includes a plurality of multi-media presentation systems (workstations) 12 maintained by a plurality of system users or viewers, typically at least two. (The terms user and viewer will be used interchangeably in the remainder of this description and should be construed to mean a person who perceives a broadcast program using his or her senses, including but not limited to sight and hearing.) The term multi-media presentation system is used herein to indicate a system capable of presenting audio and video information to a user. However, the presentation of more than one medium should not be construed as a limitation of the present invention. Examples of such multi-media presentation systems 12 include personal computer (PC) systems, PC televisions (PCTVs) and the like. Each multi-media presentation system 12 typically includes a viewer computer 14, at least one display device 16, such as a monitor or television set, and at least one audio output 18, such as one or more speakers that may be internal components of a television set display device or provided as a separate speaker or multiple speakers. Each user multi-media presentation system 12 also includes at least one input device 20, such as a keyboard, mouse, digitizer pad, writing pad, microphone, camera or other pointing or input-generating device, which allows the user to provide input to the workstation 12.
As will be described more fully below, each multi-media presentation system 12 is typically adapted to receive at least one broadcast program signal 22, which may be provided in the form of broadcast television programming (including cable and satellite television), closed circuit television, Internet web-TV or the like, received by means of a standard television broadcast signal over the air waves, cable television or satellite television, utilizing a tuner in each user computer 14. In addition, each multi-media presentation system interfaces with a computer network 24, which may be provided in the form of a local area network (LAN), a wide area network (WAN), telephone network or a global computer network, such as the Internet (World-Wide-Web). The components of one example of a multi-media presentation system/workstation 12 are shown in FIG. 2. The heart of each such system is the user computer 14. Each user computer includes a central processing unit (CPU) 26, which controls the functions of the presentation system. The CPU interfaces a broadcast receiver 28, which itself receives, as its input, the broadcast program signal 22. In one embodiment, the broadcast receiver 28 is a broadcast channel tuner that receives broadcast signals from a source such as a television broadcasting station or other programming provider or source. Each user computer 14 also includes one or more internal storage devices 30, such as a disk drive, memory or CD-ROM, where data, including user input from other users or from within the same workstation, overlays, or other data related to the display on the user workstation may be stored. A communications controller 32 is also provided in each user computer 14, to control inputs received from and outputs transmitted to the other viewers via computer network 24. The communications controller 32 may act as a second receiver for receiving a second data stream provided to the user computer over the computer network. In the preferred embodiment, the communications controller 32 may include a device such as a modem (for example, a telephone, RF, wireless or cable modem) and/or a network interface card that receives information from a local or wide area network.
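To make the numbered components easier to keep straight, the following TypeScript sketch maps each part of the workstation of FIG. 2 to a role. The interface and member names are assumptions made for this illustration only; they are not terminology from the patent.

```typescript
// Illustrative component map of one workstation 12 (names are assumptions, not from the patent).
interface UserWorkstation {
  cpu: { run(): void };                                                             // CPU 26
  broadcastReceiver: { tune(channel: number): void };                               // broadcast channel tuner 28 for signal 22
  storage: { save(key: string, data: unknown): void; load(key: string): unknown };  // internal storage 30
  communicationsController: { send(peerId: string, data: unknown): void };          // modem / network interface card 32, network 24
  dynamicDisplayController: { render(): void };                                     // broadcast browser 34, drives display device 16
  audioProcessor?: { play(samples: Float32Array): void };                           // optional audio processor 36 feeding audio output 18
}
```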
A dynamic display controller 34 (also referred to herein as a broadcast browser) is also provided with each user computer 14. The dynamic display controller interfaces the CPU 26, broadcast receiver 28 and communications controller 32 and receives, as input, the multiple data streams provided to the user computer by one or more of the broadcast program signal 22, the computer network 24 (via the communications controller 32) and the internal storage device 30. The dynamic display controller 34 merges the multiple input signals and outputs a merged data signal to the display device 16. An audio processor 36 may also be provided, as necessary, to receive audio data from the multiple data sources and to provide the same to the audio output device(s) 18.
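A minimal sketch of the merge step follows, assuming the controller can poll each source for its latest data. Class and field names are invented here for illustration and are not taken from the patent.

```typescript
// Sketch only: merge the broadcast frame with any overlay enhancements available from
// the network or from local storage into one signal for the display device.
interface OverlayElement { kind: "speech" | "thought" | "text" | "graphic"; x: number; y: number; payload: string; }
interface EnhancementFrame { layerId: string; elements: OverlayElement[]; }
interface BroadcastFrame { channel: number; pixels: Uint8Array; }
interface MergedFrame { background: BroadcastFrame; overlays: EnhancementFrame[]; }

class DynamicDisplayController {
  constructor(
    private readBroadcast: () => BroadcastFrame,      // from broadcast receiver 28 (signal 22)
    private readNetwork: () => EnhancementFrame[],    // from communications controller 32 (network 24)
    private readStorage: () => EnhancementFrame[],    // from internal storage 30
  ) {}

  // Produce the merged data signal 44 sent to display device 16.
  nextMergedFrame(): MergedFrame {
    return {
      background: this.readBroadcast(),
      overlays: [...this.readStorage(), ...this.readNetwork()],
    };
  }
}
```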
In the preferred embodiment of the present invention, which is disclosed for illustrative purposes only and not considered a limitation of the present invention, the dynamic display controller 34 is implemented as computer software in the form of a browser user interface operating on the user computer 14, which is typically a personal computer or other similar individual computer workstation. Other embodiments contemplated include a client-server configuration whereby a user computer 14 is connected to a server (not shown) that contains all or at least part of such computer software forming the dynamic display controller 34. Each multi-media presentation system 12 also includes at least one input device 20, which allows a first user to direct input to the dynamic display controller 34 to control what is displayed on the display device 16, thereby allowing the user to control (i.e., generate) their viewing experience and, in addition, to control the saving and/or displaying of the experience to the remaining users of the system 10, as will be explained in greater detail below.
As can be seen more clearly from FIG. 3, each user computer CPU 26 receives, as a first input, a first data stream, such as a multi-media broadcast program signal 22 via broadcast receiver 28. It may also receive, as a second input, a data stream 40 including one or more third party, user-prepared, enhancements or additions to the broadcast signal input by a system user using one or more input devices 20. Typically the user would interject images (video, hand-drawn images, pictures, clip art, or the like), objects, audio (voice or other sound(s)) and/or text (instant message (IM) or chat), which will be displayed on his or her display device 16. In this manner, a user can dynamically create a user experience in accordance with his or her personal preferences. As will become more fully apparent below, this user can also share his or her dynamically created user-prepared enhancements with other system users, to enhance their viewing experience or allow others to further modify and share their experience as well. The user can also create a data stream which can control another user's viewing experience, such as by controlling the broadcast station that another user's display device is tuned to, or store data to another user's storage device for later recall and display.
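One way to picture this remote control is as a small set of peer-to-peer messages. The message and field names below are assumptions for illustration only; the patent does not specify a wire format.

```typescript
// Hypothetical peer messages a first workstation might send to a second one.
type PeerControlMessage =
  | { type: "changeChannel"; channel: number; atEpochMs?: number }   // tune the remote display, now or later
  | { type: "storeForLater"; label: string; payload: unknown }       // write to the remote storage device 30
  | { type: "shareEnhancement"; payload: unknown };                  // display on the remote overlay now

interface RemoteWorkstation {
  tuneTo(channel: number): void;
  store(label: string, payload: unknown): void;
  showOverlay(payload: unknown): void;
}

function applyPeerMessage(msg: PeerControlMessage, ws: RemoteWorkstation): void {
  switch (msg.type) {
    case "changeChannel": {
      const delay = msg.atEpochMs ? Math.max(0, msg.atEpochMs - Date.now()) : 0;
      setTimeout(() => ws.tuneTo(msg.channel), delay); // immediate when delay is 0
      break;
    }
    case "storeForLater":
      ws.store(msg.label, msg.payload);
      break;
    case "shareEnhancement":
      ws.showOverlay(msg.payload);
      break;
  }
}
```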
As a third optional input, each user computer CPU may receive, via communications controller 32, a third data stream 42, which is made up of shared enhancements to the broadcast program signal which were created by other user(s) of the system and transmitted to the user's computer over the computer network 24.
The user computer CPU 26 merges the two or more data streams and provides a merged signal 44 to the display device 16. The CPU also provides, to communications controller 32 and under control of the dynamic display controller, a data stream made up of the user-prepared enhancements, which the communications controller 32, in turn, transmits as a shared enhancement data stream 42' to the other users of the system. The user enhanced data stream 42' can include information to be displayed on a display as well as trigger or alignment indications 47 which can be used to synchronize the user enhanced data stream 42' with a broadcast presentation on another user's display device. In this embodiment, the system may include, on one or more user workstations 12, pattern recognition software or other means to align the user enhanced data stream 42' with an image pattern on a broadcast signal using one or more well-known pattern recognition or "signature" type algorithms. The enhanced data stream 42' may also be stored on the creating user's or receiving user's internal storage device 30 for later replay or later transmission to others.
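A sketch of what such a shared-enhancement packet with alignment indications 47 might look like is shown below. The field names and the signature-matching check are assumptions, since the patent leaves the exact alignment mechanism open (program clock, frame "signature", and so on).

```typescript
// Illustrative shared-enhancement packet (stream 42') with alignment triggers (47).
interface AlignmentTrigger {
  channel: number;          // channel the enhancement was authored against
  programTimeSec?: number;  // offset into the program, when a program clock is available
  frameSignature?: string;  // pattern-recognition "signature" of the authored-against frame
}

interface SharedEnhancementPacket {
  author: string;
  trigger: AlignmentTrigger;
  overlay: { x: number; y: number; kind: string; payload: string }[];
}

// Receiving side: only render once the local broadcast matches the trigger.
function isAligned(pkt: SharedEnhancementPacket, localChannel: number, localSignature?: string): boolean {
  if (pkt.trigger.channel !== localChannel) return false;
  if (pkt.trigger.frameSignature && localSignature) return pkt.trigger.frameSignature === localSignature;
  return true; // fall back to immediate display when no signature is available
}
```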
As can be appreciated, using such a system, a user can enhance not only his or her viewing experience by preparing user-prepared enhancements, but he or she can also enhance the viewing experience of any or all users of the system by sharing his or her user-prepared enhancements with the other users of the system or by forcing the display device of another user to be switched to another display (i.e. television channel) with or without enhancement, thereby creating a "community" viewing experience for any or all connected/subscribed users.
FIGS. 4 and 5 show how a layering or "overlay" strategy is utilized by the dynamic display controller 34 to control the display of the data provided by a broadcast signal and data representing user-prepared enhancements so that all of the data may be displayed in a single window or screen on each display device 16. The dynamic display controller displays, in a "background" layer 50, the broadcast signal. Then, an overlay is displayed in the same window in at least one additional layer 54 on top of the background layer 50. (It is understood that the order of layers can be reversed, if desired.) In order to allow the broadcast signal in the background layer 50 to be visible through the second or overlay layer 54, the second layer utilizes a substantially transparent background 56 or, as is disclosed herein, a background from a tool set called or named "broadcast" to signify the source of the background information. The system also provides a plurality of user-selectable multimedia tools 56, which are provided in the form of a toolbar 58, typically although not necessarily displayed on the overlay layer 54. The toolbar 58 may be positioned at any portion of the screen as the user desires, as is well known in the art. The user-selectable tools 56 allow a user to manipulate the overlay to modify the layers displayed on his or her display device. Examples of user-selectable tools include drawing tools that allow a user to reference or comment on one or more objects appearing in the underlying broadcast signal on the background layer of the display. Such drawing tools may include lines, arrows, text boxes, thought bubbles, speech bubbles and the like. The user-selectable tools may also include one or more graphic insertion tools, which are responsive to a user input, to insert a graphic (image, picture, drawing, video clip, etc.) obtained from a graphic library into the overlay being displayed in the additional layer 54. Such graphics libraries may be stored in internal storage 30 provided by the user computer or may be stored in remote databases, which are accessible via the computer network.
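For a browser-based implementation such as the one described above, the layering could be realized with ordinary DOM stacking. The sketch below is one possible rendering in a browser environment, not the patent's implementation, and the styling values are arbitrary.

```typescript
// Background layer 50 carries the broadcast; each overlay layer 54 is transparent and
// stacked above it so the broadcast "bleeds through".
function buildLayeredWindow(root: HTMLElement): { background: HTMLElement; addOverlay: () => HTMLElement } {
  root.style.position = "relative";

  const background = document.createElement("div");
  background.style.cssText = "position:absolute; inset:0; z-index:0;"; // broadcast video rendered here
  root.appendChild(background);

  let nextZ = 1;
  const addOverlay = (): HTMLElement => {
    const overlay = document.createElement("div");
    overlay.style.cssText = `position:absolute; inset:0; background:transparent; z-index:${nextZ++};`;
    root.appendChild(overlay); // toolbar 58, speech bubbles, text boxes, etc. live on an overlay
    return overlay;
  };

  return { background, addOverlay };
}
```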
The user-selectable multi-media tools may also include an audio device to receive, store, edit and/or otherwise provide user-prepared auditory enhancements to the broadcast program. Of course, like the video signals transmitted to the other users, user-prepared auditory enhancements can also be transmitted to the additional system users over the computer network, where they would be output on audio output devices included at each user's multi-media presentation system. In addition to the text, graphic and audio tools, the toolbar may also include a user-selectable delivery icon, which can be used by the user to trigger the delivery of any user-prepared enhancements to those of the plurality of additional system users who are included on a delivery list maintained by the user of the system that has created the user-prepared enhancements. Of course, only those additional system users that are logged onto their system and viewing the same underlying broadcast program as the user creating the enhancements will be able to display or otherwise output the shared enhancements on their display or audio output devices; however, the user-created enhanced broadcast may be stored on a storage device of another user for viewing at a later time by that user.
When the multiple data streams are merged, the resulting display appearing on the display device will appear in a single window 60, where the user-prepared enhancements will directly coincide with the portions of the underlying broadcast data stream to which they are directed if the user creating the enhancement creates and sends/stores them as they coincide with the broadcast signal. For example, speech bubbles 62 or thought bubbles 64 can be positioned adjacent a character 66 to which the speech or thought is to be attributed, text or speech inserted, and then transmitted (such as by hitting the return key or clicking the "mouse" button) or stored such that the respective alignment of the enhancements with the broadcast signal is maintained. Text boxes 68 may be positioned where they will minimize interference with important objects appearing in the underlying broadcast. Text boxes 68 may include an "instant message" or a chat window, both of which can also be used to change or affect the display of another user. An additional tool may also include a tool to change the display of another user to a channel of the first user's choice either immediately or later. FIGS. 6-8 show an alternative embodiment of a system 10 for communicating between a plurality of multi-media presentation participants. In this embodiment, each user multi-media presentation system 12 interfaces with a Web server 70 via the Internet 72. The Web server 70 provides a multi-media tool overlay 74 as well as a user-prepared enhancement overlay 76. Each user multi-media presentation system 12 is similar to those described above with respect to the embodiment of FIGS. 1 and 2. However, instead of storing a multi-media tool overlay in local system memory and having the dynamic display controller retrieve the overlay from the system memory, in this embodiment, each user computer accesses the web server 70, where the overlay information is maintained. Nonetheless, each user computer would still include a dynamic display controller 34 for merging the overlay information accessed and manipulated via the web server with the broadcast presentation 22 received directly by each user system.
In this embodiment, since multiple users will access a common multi-media tool overlay 74, a display strategy utilizing three or more layers may be utilized. In this manner, each system user can access the same tool overlay and use the tool overlay to create and store user-prepared enhancements to the broadcast signal that are stored on a third display layer 52. Each user will have a unique third display layer 52, which may also be referred to as a user-prepared enhancement overlay. While there will be a common multi-media tools overlay, each user will create his or her own user-prepared enhancement overlay. The user-prepared enhancement overlay will then be transmitted to the other users of the system in a manner similar to that described above with respect to the self-contained, peer-to-peer system of FIGS. 1 and 2. Once the layers are merged by the dynamic display controller, the use of transparent backgrounds on each overlay layer will allow the display to appear as if the user-prepared enhancements were simply inserted into the underlying broadcast, as is shown in FIG. 8. In order to emphasize user-prepared enhancements, a special tool may be provided with the plurality of multi-media tools. This tool will be referred to as a "broadcast mute" tool. The purpose of the broadcast mute tool is to dampen or otherwise minimize the interference of the underlying broadcast signal so that the user-prepared enhancement overlay appears more prominently in the merged display. One means by which the broadcast mute feature may emphasize the user-prepared enhancement overlay is to provide a video mute feature. The video mute feature may be implemented as a control for the brightness and/or contrast signal of the underlying broadcast signal sent to the display device. By lowering either or both of the brightness or contrast signal to the display device, the appearance of the broadcast data in the merged display will be dampened so that the user-prepared enhancements will be more prominent. Since the purpose of the broadcast mute tool is to provide emphasis to the user-prepared enhancements, when such enhancements are provided to the remainder of the users as shared enhancements, selection of the broadcast mute tool will affect the underlying broadcast signal of all users with whom the enhancement is shared.
In a manner similar to the broadcast mute tool, the tool set 58 may also include an audio mute tool. The audio mute tool will operate generally in a similar manner to the video mute tool. However, instead of affecting the underlying broadcast's video signal, it would allow audio enhancements to be highlighted by reducing the volume of the underlying broadcast signal. Of course both the video mute and audio mute features could be used together.
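Both mute tools reduce to attenuating the background layer. Below is a minimal sketch, assuming the broadcast video and audio are rendered through standard browser elements (an assumption, not the patent's implementation), with arbitrary attenuation values.

```typescript
// "Broadcast mute": dim brightness/contrast of the background (broadcast) layer so
// user-prepared overlays stand out; "audio mute": lower the broadcast volume likewise.
function setBroadcastMute(backgroundLayer: HTMLElement, muted: boolean): void {
  backgroundLayer.style.filter = muted ? "brightness(0.4) contrast(0.6)" : "none";
}

function setAudioMute(broadcastAudio: HTMLAudioElement, muted: boolean): void {
  broadcastAudio.volume = muted ? 0.1 : 1.0;
}
```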
A method of generating and providing user-prepared enhancements to a plurality of viewers of a broadcast program 100 is shown in FIG. 9. To utilize the method, a plurality of viewers of the broadcast program will utilize a display device for viewing the broadcast program. Each viewer will also have a computer for controlling the display device and for interfacing each user to the other viewers over a computer network.
The method 100 begins by displaying a broadcast program in a background layer on at least one viewer display device, act 110. Next, at least one overlay layer is provided on each viewer display device, act 120. Each overlay layer includes a transparent background to allow the broadcast program being displayed on the background layer to "bleed through". At least one of the overlay layers includes a plurality of user-selectable multi-media tools, which are responsive to user input, for manipulating at least one overlay layer by including user-prepared enhancements thereupon.
Then, user interaction with the provided multi-media tools is monitored and any user-prepared enhancements input by a viewer using the tools are stored, act 130. The user-prepared enhancements are then transmitted to any additional users of the system who are viewing the underlying broadcast presentation, act 140. Preferably, the user-prepared enhancements are transmitted in response to a user-selectable delivery icon so that the user can complete the user-prepared enhancement and then deliver the enhancement when he or she so desires and to whom he or she desires.
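The delivery step (act 140) can be pictured as filtering the creator's delivery list by who is currently tuned to the same broadcast, with the remaining peers handled by remote storage, as described next for act 150. The peer API below is assumed for illustration.

```typescript
// Sketch of act 140: send the completed enhancement to listed peers viewing the same
// broadcast; store it remotely for the rest (picked up again in act 150).
interface DeliveryPeer {
  id: string;
  currentChannel: number;
  send(pkt: unknown): void;   // display on the peer's overlay layer
  store(pkt: unknown): void;  // keep on the peer's storage device for later viewing
}

function deliverEnhancement(
  enhancement: { trigger: { channel: number } },
  deliveryList: DeliveryPeer[],
): void {
  for (const peer of deliveryList) {
    if (peer.currentChannel === enhancement.trigger.channel) {
      peer.send(enhancement);
    } else {
      peer.store(enhancement);
    }
  }
}
```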
In act 150, the user-prepared enhancement that has been transmitted to the additional system users is either displayed on at least one overlay layer on top of the broadcast layer being displayed on a display device at a receiving user's system or stored on a storage device which is part of the receiving user's system. Next, the user-prepared enhancement that has been received can be used to control the display of the receiving user, including changing a broadcast channel of the user either immediately or at a predetermined time or date in the future. Accordingly, the system and method described above, which embody the present invention, allow viewers of a broadcast presentation to enhance their own viewing experience and enhance the viewing experience of others by dynamically and synchronously preparing, changing and sharing multi-media enhancements to the underlying broadcast presentation. Modifications and substitutions by one of ordinary skill in the art are considered to be within the scope of the present invention, which is not to be limited except by the claims which follow.

Claims

CLAIMS What is claimed is:
1. In a peer-to-peer multi-media communication network, a system for controlling a broadcast viewing experience of one user by another user, the system comprising: a first user workstation including at least an input device and a display device; a second user workstation, coupled to said first user workstation, and including a storage device for storing at least user input for controlling a display on a display device coupled to said second user workstation; and said first user workstation further including a dynamic display controller, responsive to said first user workstation input device, for receiving input from said first user workstation and for transmitting said user input to at least said second user workstation, said user input for controlling said display on said second user workstation display device.
2. The system of claim 1 wherein said display includes a broadcast presentation.
3. The system of claim 2 wherein said broadcast presentation includes a television broadcast presentation.
4. The system of claim 1 wherein said system synchronously and dynamically controls said broadcast viewing experience of one user by another user.
5. The system of claim 1 wherein said communication network is selected from the group consisting of a computer network, telephone network, a wide area network, a local area network, and the World-Wide-Web.
6. The system of claim 1 wherein said user input from said first user workstation is stored on said storage device of said second user workstation for later display on said second user workstation.
7. The system of claim 6 wherein said user input controls when said display will occur on said second user workstation.
8. The system of claim 6 wherein said stored user input controls what will be displayed on said second user workstation.
9. The system of claim 1 wherein each of said first and second user workstations include a multi-media display device displaying a broadcast presentation including a single window layered display and a computer controlling said multimedia display device and interfacing each of said first and second workstations over a computer network, said single-window layered display including: a broadcast layer, for displaying said broadcast presentation in a background layer of said layered display; and at least one overlay displayed in at least a second layer of said layered display on top of said broadcast layer on said single-window, layered display, said at least one overlay having a substantially transparent background and allowing said broadcast presentation in said broadcast layer to be viewed through said at least one overlay.
10. The system of claim 9 wherein said at least one user workstation includes a plurality of user-selectable multi-media tools, for allowing a user at said first user workstation to manipulate said at least one overlay to add user-prepared enhancements to said broadcast presentation, and wherein said dynamic display controller transmits said user-prepared enhancements to at least said second user workstation.
11. The system of claim 9 wherein said user input includes an instant message to be displayed on said at least one overlay.
12. The system of claim 9 wherein said user input includes a chat message to be displayed on said second user workstation.
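As a non-authoritative sketch of the single-window layered display recited in claims 9 through 11, the following TypeScript (browser DOM) snippet stacks a transparent overlay element above a broadcast video element so that peer-supplied instant or chat messages can be shown without obscuring the broadcast. The element styling, function names, and message source are assumptions, not the claimed implementation.

```typescript
// Illustrative browser sketch of a single-window layered display: a broadcast
// layer in the background and a transparent overlay layer on top, through
// which the broadcast remains visible. All identifiers are assumptions.

function buildLayeredDisplay(container: HTMLElement, broadcastUrl: string): HTMLDivElement {
  container.style.position = "relative";

  // Broadcast layer: the underlying presentation fills the single window.
  const broadcast = document.createElement("video");
  broadcast.src = broadcastUrl;
  broadcast.autoplay = true;
  broadcast.style.width = "100%";
  container.appendChild(broadcast);

  // Overlay layer: transparent background so the broadcast shows through;
  // user-prepared enhancements (e.g. instant or chat messages) are drawn here.
  const overlay = document.createElement("div");
  Object.assign(overlay.style, {
    position: "absolute",
    inset: "0",
    background: "transparent",
    pointerEvents: "none",
    color: "white",
  });
  container.appendChild(overlay);
  return overlay;
}

// A peer's instant message arrives and is shown on the overlay without
// disturbing the broadcast layer underneath.
function showPeerMessage(overlay: HTMLDivElement, text: string): void {
  const msg = document.createElement("p");
  msg.textContent = text;
  overlay.appendChild(msg);
}
```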
PCT/US2001/051410 2000-10-20 2001-10-19 Synchronous control of media in a peer-to-peer network WO2002037943A2 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
AU2002236689A AU2002236689A1 (en) 2000-10-20 2001-10-19 Synchronous control of media in a peer-to-peer network
KR10-2003-7005563A KR20030094214A (en) 2000-10-20 2001-10-19 Synchronous control of media in a peer to peer network
EP01986238A EP1337989A2 (en) 2000-10-20 2001-10-19 Synchronous control of media in a peer-to-peer network
CA002426917A CA2426917A1 (en) 2000-10-20 2001-10-19 Synchronous control of media in a peer-to-peer network

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US24191000P 2000-10-20 2000-10-20
US60/241,910 2000-10-20

Publications (2)

Publication Number Publication Date
WO2002037943A2 true WO2002037943A2 (en) 2002-05-16
WO2002037943A3 WO2002037943A3 (en) 2002-08-22

Family

ID=22912669

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2001/051410 WO2002037943A2 (en) 2000-10-20 2001-10-19 Synchronous control of media in a peer-to-peer network

Country Status (5)

Country Link
EP (1) EP1337989A2 (en)
KR (1) KR20030094214A (en)
AU (1) AU2002236689A1 (en)
CA (1) CA2426917A1 (en)
WO (1) WO2002037943A2 (en)

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2004032516A3 (en) * 2002-10-02 2004-05-21 Hrl Lab Llc Dynamic video annotation
EP1443765A2 (en) * 2003-01-30 2004-08-04 Broadcom Corporation Media channel setup in a media exchange network
EP1463324A1 (en) * 2003-03-25 2004-09-29 Broadcom Corporation Automated routing and consumption of media through a media exchange network
GB2400200A (en) * 2003-04-05 2004-10-06 Hewlett Packard Development Co Use of nodes to monitor or manage peer to peer network
US7127577B2 (en) 2003-01-21 2006-10-24 Equallogic Inc. Distributed snapshot process
US7461146B2 (en) 2003-01-20 2008-12-02 Equallogic, Inc. Adaptive storage block data distribution
US7571206B2 (en) 2002-08-12 2009-08-04 Equallogic, Inc. Transparent request routing for a partitioned application service
US7627650B2 (en) 2003-01-20 2009-12-01 Equallogic, Inc. Short-cut response for distributed services
US7881315B2 (en) 2006-06-27 2011-02-01 Microsoft Corporation Local peer-to-peer digital content distribution
US7937551B2 (en) 2003-01-21 2011-05-03 Dell Products L.P. Storage systems having differentiated storage pools
US8037264B2 (en) 2003-01-21 2011-10-11 Dell Products, L.P. Distributed snapshot process
EP2437512A1 (en) * 2010-09-29 2012-04-04 TeliaSonera AB Social television service
US8499086B2 (en) 2003-01-21 2013-07-30 Dell Products L.P. Client load distribution
US8621102B2 (en) 2002-12-11 2013-12-31 Broadcom Corporation Automated routing of media through a media exchange network

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5318450A (en) * 1989-11-22 1994-06-07 Gte California Incorporated Multimedia distribution system for instructional materials
US5491743A (en) * 1994-05-24 1996-02-13 International Business Machines Corporation Virtual conference system and terminal apparatus therefor
US5926179A (en) * 1996-09-30 1999-07-20 Sony Corporation Three-dimensional virtual reality space display processing apparatus, a three-dimensional virtual reality space display processing method, and an information providing medium
US5905508A (en) * 1997-04-01 1999-05-18 Microsoft Corporation Method and system for dynamically plotting an element on an image using a table

Cited By (26)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7571206B2 (en) 2002-08-12 2009-08-04 Equallogic, Inc. Transparent request routing for a partitioned application service
US8055706B2 (en) 2002-08-12 2011-11-08 Dell Products, L.P. Transparent request routing for a partitioned application service
US7925696B2 (en) 2002-08-12 2011-04-12 Dell Products L.P. Transparent request routing for a partitioned application service
JP2006518117A (en) * 2002-10-02 2006-08-03 レイセオン・カンパニー Dynamic video annotation
AU2003275435B2 (en) * 2002-10-02 2009-08-06 Raytheon Company Dynamic video annotation
WO2004032516A3 (en) * 2002-10-02 2004-05-21 Hrl Lab Llc Dynamic video annotation
US8621102B2 (en) 2002-12-11 2013-12-31 Broadcom Corporation Automated routing of media through a media exchange network
US7627650B2 (en) 2003-01-20 2009-12-01 Equallogic, Inc. Short-cut response for distributed services
US7461146B2 (en) 2003-01-20 2008-12-02 Equallogic, Inc. Adaptive storage block data distribution
US7962609B2 (en) 2003-01-20 2011-06-14 Dell Products, L.P. Adaptive storage block data distribution
US7127577B2 (en) 2003-01-21 2006-10-24 Equallogic Inc. Distributed snapshot process
US8966197B2 (en) 2003-01-21 2015-02-24 Dell Products L.P. Distributed snapshot process
US8209515B2 (en) 2003-01-21 2012-06-26 Dell Products Lp Storage systems having differentiated storage pools
US8612616B2 (en) 2003-01-21 2013-12-17 Dell Products, L.P. Client load distribution
US7937551B2 (en) 2003-01-21 2011-05-03 Dell Products L.P. Storage systems having differentiated storage pools
US8037264B2 (en) 2003-01-21 2011-10-11 Dell Products, L.P. Distributed snapshot process
US8499086B2 (en) 2003-01-21 2013-07-30 Dell Products L.P. Client load distribution
EP1443765A3 (en) * 2003-01-30 2004-08-25 Broadcom Corporation Media channel setup in a media exchange network
EP1443765A2 (en) * 2003-01-30 2004-08-04 Broadcom Corporation Media channel setup in a media exchange network
EP1463324A1 (en) * 2003-03-25 2004-09-29 Broadcom Corporation Automated routing and consumption of media through a media exchange network
US7536471B2 (en) 2003-04-05 2009-05-19 Hewlett-Packard Development Company, L.P. Use of nodes to monitor or manage peer to peer networks
GB2400200A (en) * 2003-04-05 2004-10-06 Hewlett Packard Development Co Use of nodes to monitor or manage peer to peer network
GB2400771B (en) * 2003-04-05 2006-06-28 Hewlett Packard Development Co Use of nodes to monitor or manage peer to peer networks
US7881315B2 (en) 2006-06-27 2011-02-01 Microsoft Corporation Local peer-to-peer digital content distribution
EP2437512A1 (en) * 2010-09-29 2012-04-04 TeliaSonera AB Social television service
US9538140B2 (en) 2010-09-29 2017-01-03 Teliasonera Ab Social television service

Also Published As

Publication number Publication date
KR20030094214A (en) 2003-12-11
AU2002236689A1 (en) 2002-05-21
EP1337989A2 (en) 2003-08-27
WO2002037943A3 (en) 2002-08-22
CA2426917A1 (en) 2002-05-16

Similar Documents

Publication Publication Date Title
US20030078969A1 (en) Synchronous control of media in a peer-to-peer network
US20040012717A1 (en) Broadcast browser including multi-media tool overlay and method of providing a converged multi-media display including user-enhanced data
AU2004248274C1 (en) Intelligent collaborative media
US6064420A (en) Simulating two way connectivity for one way data streams for multiple parties
JP4346688B2 (en) Audio visual system, headend and receiver unit
US6519771B1 (en) System for interactive chat without a keyboard
JP4433441B2 (en) System for dual display interaction with integrated television and internet content
US6732373B2 (en) Host apparatus for simulating two way connectivity for one way data streams
EP1316209B1 (en) Video interaction
US20020087974A1 (en) System and method of providing relevant interactive content to a broadcast display
US6249914B1 (en) Simulating two way connectivity for one way data streams for multiple parties including the use of proxy
JP4229997B2 (en) Partial or person-linked multiple linkage display system
EP1161053B1 (en) Communication system for network advertising
US20040078814A1 (en) Module-based interactive television ticker
EP1337989A2 (en) Synchronous control of media in a peer-to-peer network
JPH11196345A (en) Display system
JP2001177579A (en) High-video programming system and method for supplying distributed community network
JP2006101561A (en) Master-slave joint type display system
WO2001060071A2 (en) Interactive multimedia user interface using affinity based categorization
Steinmetz et al. Multimedia applications
WO2019056001A1 (en) System and method for interactive video conferencing
JP2008118665A (en) Slave-screen relative type multi-set joint type display system
JP2000181421A (en) Master and slave interlocking type display system
JP2008104210A (en) Multi-channel display system connected with a plurality of interlocking display apparatuses
JP2008118664A (en) Time-base relative type multi-set joint type display system

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A2

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CR CU CZ DE DK DM DZ EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A2

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

AK Designated states

Kind code of ref document: A3

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CR CU CZ DE DK DM DZ EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A3

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
WWE Wipo information: entry into national phase

Ref document number: 1020037005563

Country of ref document: KR

WWE Wipo information: entry into national phase

Ref document number: 2426917

Country of ref document: CA

WWE Wipo information: entry into national phase

Ref document number: 2001986238

Country of ref document: EP

WWP Wipo information: published in national office

Ref document number: 2001986238

Country of ref document: EP

REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

WWP Wipo information: published in national office

Ref document number: 1020037005563

Country of ref document: KR

WWW Wipo information: withdrawn in national office

Ref document number: 2001986238

Country of ref document: EP

NENP Non-entry into the national phase

Ref country code: JP