CA2140850C - Networked system for display of multimedia presentations - Google Patents

Networked system for display of multimedia presentations

Info

Publication number
CA2140850C
CA2140850C
Authority
CA
Canada
Prior art keywords
audio
video
network
data
reducing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CA002140850A
Other languages
French (fr)
Other versions
CA2140850A1 (en)
Inventor
Howard Paul Katseff
Bethany Scott Robinson
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
AT&T Corp
Original Assignee
American Telephone and Telegraph Co Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by American Telephone and Telegraph Co Inc filed Critical American Telephone and Telegraph Co Inc
Publication of CA2140850A1 publication Critical patent/CA2140850A1/en
Application granted granted Critical
Publication of CA2140850C publication Critical patent/CA2140850C/en
Anticipated expiration legal-status Critical
Expired - Fee Related legal-status Critical Current

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L65/00Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L65/80Responding to QoS
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/40Information retrieval; Database structures therefor; File system structures therefor of multimedia data, e.g. slideshows comprising image and additional audio data
    • H04L29/06
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L47/00Traffic control in data switching networks
    • H04L47/10Flow control; Congestion control
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L47/00Traffic control in data switching networks
    • H04L47/10Flow control; Congestion control
    • H04L47/26Flow control; Congestion control using explicit feedback to the source, e.g. choke packets
    • H04L47/263Rate modification at the source after receiving feedback
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L47/00Traffic control in data switching networks
    • H04L47/10Flow control; Congestion control
    • H04L47/30Flow control; Congestion control in combination with information about buffer occupancy at either end or at transit nodes
    • H04L65/4084
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L65/00Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L65/60Network streaming of media packets
    • H04L65/61Network streaming of media packets for supporting one-way streaming services, e.g. Internet radio
    • H04L65/612Network streaming of media packets for supporting one-way streaming services, e.g. Internet radio for unicast
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L65/00Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L65/60Network streaming of media packets
    • H04L65/75Media network packet handling
    • H04L65/752Media network packet handling adapting media to network capabilities
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/134Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or criterion affecting or controlling the adaptive coding
    • H04N19/146Data rate or code amount at the encoder output
    • H04N19/152Data rate or code amount at the encoder output by measuring the fullness of the transmission buffer
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/236Assembling of a multiplex stream, e.g. transport stream, by combining a video stream with other content or additional data, e.g. inserting a URL [Uniform Resource Locator] into a video stream, multiplexing software data into a video stream; Remultiplexing of multiplex streams; Insertion of stuffing bits into the multiplex stream, e.g. to obtain a constant bit-rate; Assembling of a packetised elementary stream
    • H04N21/23614Multiplexing of additional data and video streams
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/236Assembling of a multiplex stream, e.g. transport stream, by combining a video stream with other content or additional data, e.g. inserting a URL [Uniform Resource Locator] into a video stream, multiplexing software data into a video stream; Remultiplexing of multiplex streams; Insertion of stuffing bits into the multiplex stream, e.g. to obtain a constant bit-rate; Assembling of a packetised elementary stream
    • H04N21/2368Multiplexing of audio and video streams
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/24Monitoring of processes or resources, e.g. monitoring of server load, available bandwidth, upstream requests
    • H04N21/2401Monitoring of the client buffer
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/25Management operations performed by the server for facilitating the content distribution or administrating data related to end-users or client devices, e.g. end-user or client device authentication, learning user preferences for recommending movies
    • H04N21/266Channel or content management, e.g. generation and management of keys and entitlement messages in a conditional access system, merging a VOD unicast channel into a multicast channel
    • H04N21/2662Controlling the complexity of the video stream, e.g. by scaling the resolution or bitrate of the video stream based on the client capabilities
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/4302Content synchronisation processes, e.g. decoder synchronisation
    • H04N21/4307Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen
    • H04N21/43072Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen of multiple content streams on the same device
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N21/4312Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H04N21/4316Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations for displaying supplemental content in a region of the screen, e.g. an advertisement in a separate window
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/434Disassembling of a multiplex stream, e.g. demultiplexing audio and video streams, extraction of additional data from a video stream; Remultiplexing of multiplex streams; Extraction or processing of SI; Disassembling of packetised elementary stream
    • H04N21/4341Demultiplexing of audio and video streams
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/434Disassembling of a multiplex stream, e.g. demultiplexing audio and video streams, extraction of additional data from a video stream; Remultiplexing of multiplex streams; Extraction or processing of SI; Disassembling of packetised elementary stream
    • H04N21/4348Demultiplexing of additional data and video streams
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/472End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
    • H04N21/47217End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content for controlling playback functions for recorded or on-demand content, e.g. using progress bars, mode or play-point indicators or bookmarks
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/60Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client 
    • H04N21/65Transmission of management data between client and server
    • H04N21/654Transmission by server directed to the client
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/60Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client 
    • H04N21/65Transmission of management data between client and server
    • H04N21/658Transmission by the client directed to the server
    • H04N21/6587Control parameters, e.g. trick play commands, viewpoint selection
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/14Systems for two-way working
    • H04N7/15Conference systems
    • H04L29/06027
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/50Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02DCLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D30/00Reducing energy consumption in communication networks
    • Y02D30/50Reducing energy consumption in communication networks in wire-line communication networks, e.g. low power modes or reduced link rate

Abstract

Disclosed is a networked multimedia information system which may be utilized to record, store and distribute multimedia presentations together with any supplemental materials that may be referenced during the presentation. The recorded presentation, together with the associated supplemental materials, may be simultaneously presented on a display containing two separate viewing windows. The effects of network congestion are minimized by prefetching audio and video data for storage in audio and video buffers. An adaptive control algorithm compensates for network congestion by dynamically varying the rate at which video frames are retrieved over the network, in response to network traffic conditions. The audio playback speed is reduced if the audio data does not arrive fast enough over the network to maintain the desired size of the audio buffer after the amount of video data transmitted across the network has been reduced to a minimum value.

Description


NETWORKED SYSTEM FOR DISPLAY OF MULTIMEDIA PRESENTATIONS
FIELD OF THE INVENTION
The present invention relates to a networked multimedia information system, and more particularly, to a system for the storage and distribution of a recorded multimedia presentation, such as a seminar or conference, together with any supplemental materials, such as viewgraphs, slides or handouts, that are referenced during the presentation.
BACKGROUND OF THE INVENTION
Businesses today require access to ever-increasing amounts of information, such as reports, correspondence, contracts, engineering drawings and marketing materials.
Making this information available to a large number of distributed corporate employees is often a difficult problem.
There are a number of commercially available document management systems, such as NCR's Document Management System™ and Bellcore's SuperBook Document Browser™, which allow still images of documents, photographs and diagrams to be electronically stored for subsequent access over a network.
For example, NCR's Document Management System controls the flow of image documents over a corporate network. A user employing a microcomputer or workstation can access documents that have been stored in an electronic format on a central file server using imaging technology.
However, there are few commercially available document management systems that also allow continuous media, such as recordings of audio and video presentations, to be stored and distributed over a network. Although multimedia applications make enormous demands on the resources of a computer system, recent advances in storage, processing, compression and network technologies have facilitated the development of networked multimedia information systems. These technologies have advanced to where it is now possible to combine numerous media, including documents, audio and video, into a single system.
Since the number of employees who may participate in a corporate presentation, such as a conference or a seminar, is often limited by scheduling and location constraints, it is desirable to provide a system that can record and store corporate presentations for subsequent viewing by additional corporate employees at convenient times and locations.
A major difficulty with subsequent viewing of recorded presentations, however, is the poor reproduction of supplemental materials associated with the recorded presentation, such as viewgraphs, slides and handouts.
Accordingly, a need exists for an improved networked multimedia system capable of storing and indexing multimedia presentations, such as seminars and conferences, for subsequent access over a communications network. A further need exists for a networked multimedia system that separately processes the supplemental materials, such as viewgraphs, slides and handouts, associated with a recorded presentation in order to improve the clarity of the reproduced supplemental materials. In addition, a need exists for a networked multimedia system that is capable of synchronizing the audio and video components of a recorded presentation with the display of any supplemental materials referenced during the presentation.
SUMMARY OF THE INVENTION
Generally, according to the invention, a networked multimedia information system is utilized to store and distribute multimedia objects, such as recorded presentations, over a network to a plurality of workstations. The networked multimedia information system includes an information retrieval system, a still image storage and retrieval system and a continuous media storage and retrieval system.
In accordance with one aspect of the present invention there is provided a method for use by a computing system receiving audio and video data over a network for presentation to a user, said method compensating for congestion on said network which causes delayed arrival of said audio and video data, said video data being transmitted over said network to said computing system at a requested video transmittal rate, said audio data being presented to said user at an audio playback rate, said network congestion compensation method comprising the steps of: maintaining an audio buffer for storing a predefined amount of said audio data received over said network; maintaining a video buffer for storing a predefined amount of said video data received over said network; monitoring said audio and video buffers to determine when said amount of audio or video data in said buffer falls below said predefined amounts; reducing said requested video transmittal rate if said monitoring step determines that said amount of audio or video data in said buffers has fallen below said predefined amounts; and reducing said audio playback rate if said amount of audio data in said audio buffer is below said predefined amount of audio data after said step of reducing said requested video transmittal rate.

In accordance with another aspect of the present invention there is provided a computer-readable storage medium comprising encoded computer-readable program instructions for use in conjunction with a programmable computer receiving audio and video data over a network for presentation to a user, which instructions cause the computer to compensate for congestion on the network that results in delayed arrival of the audio and video data, the video data being transmitted over the network at a requested video transmittal rate to a video buffer for storing a predefined amount of the video data and the audio data being presented to the user at an audio playback rate, wherein a predefined amount of the audio data is stored in an audio buffer, the program instructions defining steps to be performed by the programmable computer, the steps comprising: monitoring the audio and video buffers to determine when the amount of audio or video data in the buffers falls below the predefined amounts; reducing the requested video transmittal rate if the monitoring step determines that the amount of audio or video data in the buffers has fallen below the predefined amounts; and reducing the audio playback rate if the amount of audio data in the audio buffer is below the predefined amount of audio data after the step of reducing the requested video transmittal rate.
According to a feature of the invention, the networked multimedia system is utilized to record, store and distribute multimedia presentations, such as seminars, meetings or conferences, and any supplemental materials that may be referenced during the presentation, such as viewgraphs, slides, blackboard annotations or handouts.

According to a feature of the invention, the recorded presentation, and the associated supplemental materials may be simultaneously presented on a display containing two separate viewing windows. Preferably, the presentation of the audio and video components of the recorded presentation is synchronized with the presentation of the supplemental materials. In one embodiment, the synchronization of the recorded presentation and the associated supplemental materials is accomplished by means of hyperlink files, which identify the supplemental materials and the corresponding frame number in which the supplemental materials are referenced.
According to a further feature of the invention, a user can manipulate the supplemental materials displayed in one viewing window to control the video playback in the second viewing window, thereby allowing the supplemental materials to serve as an index to the recorded presentation. Once the user has selected the desired page or portion of the supplemental materials, the recorded presentation can be restarted from the video frame where the selected page or portion of the supplemental materials is referenced in the recorded presentation.
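The hyperlink-file indexing described above can be illustrated with a minimal sketch. This is not the patented implementation; the dictionary format, page numbers and frame values below are all hypothetical, and serve only to show how a selected page of supplemental materials could be mapped to the video frame from which playback resumes.

```python
# Hypothetical hyperlink file: each entry pairs a page of the
# supplemental materials with the video frame at which that page
# is first referenced in the recorded presentation.
hyperlink_file = {
    1: 0,      # page 1 is referenced starting at frame 0
    2: 1450,   # page 2 is referenced starting at frame 1450
    3: 3975,   # page 3 is referenced starting at frame 3975
}

def restart_frame(selected_page):
    """Return the frame from which playback resumes for the chosen page."""
    return hyperlink_file[selected_page]
```

With this mapping, selecting page 2 in the still image window would restart the video window at frame 1450.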
Another feature of the invention minimizes the effects of network congestion by prefetching audio and video data for storage in audio and video buffers. In one embodiment, the networked multimedia system compensates for network congestion by using an adaptive control algorithm, which dynamically varies the rate at which video frames are retrieved from the respective file server over the network, in response to network traffic conditions. In order to maximize the playback quality of the recorded presentation, the audio component of the recorded presentation is given preference over the video component.
Thus, when network congestion conditions are extreme, the network multimedia system will transmit only audio data, without any video data, to the respective workstation. If the audio data does not arrive fast enough over the network to maintain the desired size of the audio buffer when there is no video data being transmitted, the network multimedia system will reduce the speed at which the audio data is played by the workstation until the amount of audio data in the audio buffer has returned to the desired size.
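The adaptive control described in the two preceding paragraphs can be sketched as a simple decision rule, evaluated each time the buffers are checked. This is an illustrative reading of the prose, not the claimed implementation: the function name, thresholds and rate units are all assumptions, and the patent does not specify how much the video rate or audio speed is reduced per step.

```python
# Illustrative sketch of the adaptive control algorithm: when either
# buffer runs low, video is shed first (audio is given preference);
# audio playback is slowed only after video has been cut to its minimum,
# and restored once the audio buffer recovers. All values hypothetical.

MIN_VIDEO_RATE = 0          # frames/sec; 0 means audio-only transmission
NORMAL_AUDIO_SPEED = 1.0
REDUCED_AUDIO_SPEED = 0.9   # slower playback lets the audio buffer refill

def adapt(audio_level, video_level, audio_target, video_target,
          video_rate, audio_speed):
    """Return the (video_rate, audio_speed) to use for the next interval."""
    if audio_level < audio_target or video_level < video_target:
        if video_rate > MIN_VIDEO_RATE:
            # Preference to audio: reduce the requested video rate first.
            video_rate = max(MIN_VIDEO_RATE, video_rate - 1)
        elif audio_level < audio_target:
            # Video already at minimum; slow the audio playback instead.
            audio_speed = REDUCED_AUDIO_SPEED
    elif audio_level >= audio_target:
        # Audio buffer has returned to the desired size; restore speed.
        audio_speed = NORMAL_AUDIO_SPEED
    return video_rate, audio_speed
```

Calling the rule with a depleted audio buffer and a nonzero video rate sheds a video frame; calling it again once video has reached zero slows the audio until the buffer is replenished.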
A more complete understanding of the present invention may be had by reference to the following Detailed Description with the accompanying drawings.
BRIEF DESCRIPTION OF DRAWINGS
FIG. 1 is a schematic block diagram illustrating a networked multimedia information system according to the present invention;
FIG. 2 is a schematic block diagram illustrating a still image information capture subsystem;
FIG. 3 is a schematic block diagram illustrating a continuous media information capture subsystem;
FIG. 4 illustrates an object profile that maintains bibliographic information for a multimedia object stored in the networked multimedia system of FIG. 1;
FIG. 5 illustrates a workstation display having two windows for simultaneous display of a recorded presentation, together with its associated supplemental materials;
FIG. 6 illustrates a still image hyperlink file that maintains the starting video frame number associated with each page of the still image supplemental materials;
FIG. 7 illustrates a flow chart describing an exemplary still image synchronization subroutine according to the present invention as utilized by a known video process to synchronize the display of the recorded presentation with its still image supplemental materials on the workstation display of FIG. 5;
FIG. 8 illustrates a continuous media hyperlink file that maintains an event marker and the associated starting video frame number for each blackboard event, or stroke, that is placed on an electronic blackboard;
FIG. 9 illustrates a flow chart describing an exemplary continuous media synchronization subroutine according to the present invention as utilized by a known video process to synchronize the display of the recorded presentation with its continuous media supplemental materials on the workstation display of FIG. 5; and
FIG. 10 illustrates a flow chart describing an exemplary data buffer monitoring subroutine according to the present invention as utilized to maintain a pre-defined amount of data in audio and video buffers.
DETAILED DESCRIPTION
The networked multimedia system 10 according to the present invention is illustrated in FIG. 1. The multimedia system 10 stores and indexes multimedia objects, such as documents, diagrams, photographs, audio, video and animation, for access by a user over a network 20.
A user employs a workstation, such as workstation 15, which may be a Sun SPARC™ workstation, model 10SX, 10M or Classic M, produced by Sun Microsystems, Inc., of Mountain View, CA., or a similar workstation offering high-resolution imaging and support for real-time audio. Each workstation, such as the workstation 15, may be comprised of a display 510, a processing unit 16, a memory unit 18, buses 22, audio hardware 23, an audio buffer 110, a video buffer 115 and a user interface 24, such as a mouse and/or a keyboard. A preferred embodiment of the display 510 is discussed below relative to FIG. 5.
The network 20 may be a local area network (LAN) or a wide area network (WAN), such as a nationwide corporate Internet consisting of Ethernet networks interconnected by leased data circuits. Alternatively, the network 20 may be embodied as, e.g., an asynchronous transfer mode (ATM) network or an Integrated Services Digital Network (ISDN).
As discussed further below, to accommodate the low bandwidth which is typical of these networks, data files are preferably compressed prior to transmission over the network 20.
As illustrated in FIG. 1 and discussed further below, the networked multimedia system 10 utilizes an information retrieval system 50 to coordinate the accessing of multimedia objects stored by a plurality of file servers, such as file servers 35, 40. The information retrieval system 50 will interact with a still image storage and retrieval system 60 to retrieve the still image objects stored by an associated file server 35. In addition, the information retrieval system 50 will interact with a continuous media storage and retrieval system 70 to retrieve the continuous media objects stored by an associated file server 40.
The information retrieval system 50 is preferably embodied as the Linus™ Database System, developed by AT&T, which is a "front end" system that allows a workstation to access information over a network that is stored in a number of distributed databases. For a description of the Linus Database System, see Lorinda L. Cherry and Robert K. Waldstein, "Electronic Access to Full Document Text and Images through Linus," AT&T Technical Journal, Vol. 68, No. 4, pp. 72-90 (July/August 1989).
An information retrieval system 50, such as the Linus Database System, typically provides the ability to store, index, search, and retrieve information from a plurality of databases. In addition, such information retrieval systems 50 typically provide an authentication mechanism that ensures that stored information is viewed only by authorized users.
The still image storage and retrieval system 60 is preferably embodied as the Ferret Document Browsing System, developed by AT&T, which together with file server 35, provides storage and retrieval functions for a collection of databases, such as databases 80, 85, 90, containing still images, such as images of documents, drawings and photographs. For a description of the Ferret Document Browsing System, see Howard P. Katseff and Thomas B. London, "The Ferret Document Browser," Proc. USENIX Summer 1993 Technical Conference, Cincinnati, OH (June 1993).
Alternatively, the information retrieval system 50 and the still image storage and retrieval system 60 may be embodied as a single integrated system, such as the Document Management System, commercially available from NCR Corp., or the RightPages Image-Based Electronic Library System, developed by AT&T Bell Laboratories. For a description of the RightPages Image-Based Electronic Library System, see Guy A. Story et al., "The RightPages Image-Based Electronic Library for Alerting and Browsing," Computer, 17-26 (September 1992); Lawrence O'Gorman, "Image and Document Processing Techniques for the RightPages Electronic Library System," Proc. 11th International Conference on Pattern Recognition, The Netherlands, 260-63 (August 30-September 3, 1992).
The still image storage and retrieval system 60 preferably includes a still image information capture subsystem 210, discussed further below relative to FIG. 2, which is utilized to obtain an electronic representation of each still image object and to store the electronic representation on a file server, such as file server 35.
The still image objects are preferably stored by a still image file server, such as file server 35, on a plurality of disk arrays or other secondary storage devices. The still image file server 35 may be embodied as a Sun SPARC workstation, commercially available from Sun Microsystems, Inc., of Mountain View, CA.
Alternatively, the still image objects may be stored on other mass storage devices that allow stored data to be accessed and shared by a plurality of workstations 15, 25, 30 over a network 20. In a preferred embodiment, file server 35 can be embodied as a plurality of distributed file servers which may be positioned at a plurality of remote locations throughout network 20.
The continuous media storage and retrieval system 70, together with file server 40, provides storage and retrieval functions for a collection of databases, such as databases 95, 100, containing continuous media, such as video, audio and animated graphics. For a discussion of an illustrative continuous media storage and retrieval system 70, see Lawrence A. Rowe and Brian C. Smith, "A Continuous Media Player," Proc. 3rd Int'l Workshop on Network and Operating System Support for Digital Audio and Video, San Diego, CA. (Nov. 1992). The Continuous Media Player system is available from the University of California, Berkeley.
The continuous media storage and retrieval system 70, such as the Continuous Media Player system referenced above, preferably provides the mechanism for digitizing and compressing audio, video and other continuous media for storage by a number of distributed file servers, in a manner described further below. In addition, the continuous media storage and retrieval system 70 provides the mechanism for accessing stored continuous media data files over network 20 for presentation on a workstation, such as workstation 15.
Alternatively, the continuous media storage and retrieval system 70 may be embodied as the multimedia network system commercially available from Fluent, Inc., now called Novell Multimedia.
The continuous media storage and retrieval system 70 preferably includes a continuous media information capture subsystem 310, discussed further below relative to FIG. 3, which is utilized to obtain an electronic representation of each continuous media object and to store the electronic representation on a file server, such as file server 40.
The continuous media objects are preferably stored by a continuous media file server, such as file server 40, on a plurality of disk arrays or other secondary storage devices. For a discussion of illustrative continuous media file servers, see Roger L. Haskin, "The Shark Continuous-Media File Server," Proc. IEEE COMPCON '93, pp. 12-15, San Francisco, CA. (Feb. 1993); Fouad A. Tobagi & Joseph Pang, "StarWorks(TM) -- A Video Applications Server," Proc. IEEE COMPCON '93, pp. 4-11, San Francisco, CA. (Feb. 1993). In a preferred embodiment, file server 40 can be embodied as a plurality of distributed file servers which may be positioned at a plurality of remote locations throughout network 20.
In a preferred embodiment, each multimedia object stored in the multimedia system 10 is indexed by a content profiler 75 that acquires bibliographic information on each multimedia object for storage in a profile database 105, as discussed further below.
As illustrated in FIGS. 2 and 3, multimedia objects are stored by the networked multimedia system 10 by capture processes that obtain an electronic representation of each multimedia object. Multimedia information that is received in an electronic format, such as a document received directly from a computing system in electronic form, may be entered directly in the system 10.
Multimedia information that is received in a non-electronic format, however, such as information received on paper or film, must first be converted into an electronic format using imaging technology.
An illustrative still image information capture subsystem 210 for use by the still image storage and retrieval system 60 is shown in FIG. 2. The still image information capture subsystem 210 will convert a printed document 215 or diagram 220 into an electronic format using a document scanner 225, which will convert a printed page 215, 220 into an electronic format by making a bitmap image of the page. Similarly, a photograph 230 can preferably be converted into an electronic format for entry into the system 10 using a grayscale scanner 235.
The electronic format images produced by scanner 225 and grayscale scanner 235 preferably conform to the well-known Tag Image File Format (TIFF) specification. The scanned images are preferably compressed by compression circuitry 245, 250, respectively, prior to storage by file server 35.
Multimedia information that is received in an electronic form, such as documents that are received directly from a source computer 255 in electronic form, can be stored directly by file server 35 following compression by compression circuitry 260.
The textual portions of a document 215 or a diagram 220 are preferably converted into a computer-readable format by an optical character recognition system 240 for transfer to the content profiler 75 via data link 270.
Similarly, a copy of each document received directly from a source computer 255 in electronic computer-readable form is also preferably transferred to the content profiler 75, via data link 275. As discussed further below, the computer-readable form of a multimedia object may be utilized by the content profiler 75 for the automatic generation of an object profile, which may consist of an abstract and other bibliographic information.
An illustrative continuous media information capture subsystem 310 for use by the continuous media storage and retrieval system 70 is shown in FIG. 3.
An analog video input module 315 is preferably configured to receive analog video signals via data link 320 from, e.g., a video cassette recorder 325 and/or a video camera 330, as shown in FIG. 3. Similarly, an analog audio input module 340 is preferably configured to receive analog audio signals via data link 345, from, e.g., an audio tape recorder 350 and/or a microphone 355.
Video camera 330 and microphone 355 may be positioned during a presentation, such as a meeting or seminar, to capture the visual and audio portions of the presentation, respectively.
In this manner, analog video and audio signals generated during a presentation by camera 330 and/or microphone 355, respectively, may be received by input modules 315, 340 for storage in the multimedia system 10.
Preferably, the analog video and audio signals received by input modules 315, 340 are processed by a digitizer/compressor 360, prior to storage by the file server 40. The digitizer/compressor 360 may be embodied as a Parallax XVideo System, consisting of a board having a JPEG CODEC chip and associated software, commercially available from Parallax Graphics, Inc. of Santa Clara, CA.
The digitizer/compressor 360 will receive analog video and audio inputs from input modules 315, 340. The digitizer/compressor 360 will preferably digitize and compress the received video input and digitize the received audio input.
The data stream generated by the digitizer/compressor 360 is preferably in the JPEG Movie File format, which is described in "JPEG Movie File Specification, Release 1.0,"
Parallax Graphics, Inc., Santa Clara, CA. (Nov. 5, 1992), incorporated herein by reference. The JPEG Movie File format interleaves one frame's worth of audio with a frame of video. For example, if video signals are being stored for a display rate of 10 frames per second (fps), the resulting interleaved data file will include a repeating pattern of one frame of video, followed by a tenth of a second's worth of audio.
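The interleaved layout described above can be sketched as follows. This is a minimal illustration of the repeating video frame/audio slice pattern, not the actual JPEG Movie File byte format; the 10 fps rate and 8 kHz audio rate are illustrative assumptions.

```python
# Sketch of the interleaved audio/video pattern described above.
# Assumes 10 fps video and 8 kHz mono audio, so each video frame is
# paired with one tenth of a second's worth (800 samples) of audio.
# These parameters are illustrative, not taken from the specification.

FPS = 10
AUDIO_RATE = 8000
SAMPLES_PER_FRAME = AUDIO_RATE // FPS  # 800 audio samples per video frame

def interleave(video_frames, audio_samples):
    """Pair each compressed video frame with its slice of audio."""
    stream = []
    for i, frame in enumerate(video_frames):
        start = i * SAMPLES_PER_FRAME
        stream.append(("video", frame))
        stream.append(("audio", audio_samples[start:start + SAMPLES_PER_FRAME]))
    return stream

stream = interleave(["F0", "F1", "F2"], list(range(3 * SAMPLES_PER_FRAME)))
# The stream alternates video and audio records, so each frame travels
# with exactly the audio that must play while it is displayed.
```

Because each video frame carries its own audio slice, dropping a frame under congestion discards a self-contained pair rather than desynchronizing the streams.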
The JPEG Movie File format is preferred because of the inherent synchronization of the audio and video streams that results from the paired storage of a video frame with its associated audio, as discussed further below. In addition, since the JPEG technique compresses each video frame independently, there is a resulting frame independence that allows a video frame to be easily dropped in the event of network congestion, as discussed below. Other video compression schemes may be utilized, such as the MPEG compression standard. However, the MPEG compression technique, which takes into account the similarity between successive images, is less preferred because the resulting frame dependence makes it more difficult to drop video frames in the event of network congestion.
It is noted that typical workstations, such as the Sun workstations discussed above, include audio hardware 23 for receiving analog audio inputs and storing the audio signals in a digitized format. Thus, where analog audio signals are being processed, without any associated video, a workstation, such as a Sun workstation, can receive the analog audio signal via its audio input port, and then digitize the audio signal for storage as a multimedia object by a file server, such as the file server 40.
An electronic blackboard 358 is preferably provided, so that annotations made by a speaker on the blackboard 358 during a presentation may be electronically captured for storage in the multimedia information system 10. The digital signals generated by the electronic blackboard 358 can be stored directly by file server 40 in an electronic file following compression by compression circuitry 372.
A digital audio/video input module 370 is preferably configured to receive digital signals from a plurality of digital audio or video sources, such as a source 365, which may be a digital videodisk player. The digital signals received by input module 370 can be stored directly by file server 40 following compression by compression circuitry 375.
Other continuous multimedia information that is received in an electronic form directly from a source computer 380, such as animated graphics, can be stored by file server 40 following compression by compression circuitry 385.
Once the multimedia information has been stored in an electronic format by the information capture subsystems 210, 310 in the manner described above, the resulting multimedia object may be selected by a user and then transmitted over the network 20 to a workstation 15. A
workstation, such as the workstation 15, will receive the transmitted multimedia object for presentation to the user.
In a preferred embodiment, the content profiler 75 will create an object profile 400, shown illustratively in FIG. 4, for each multimedia object. Each object profile 400 includes bibliographic information that may be used for indexing and searching each multimedia object. In a preferred embodiment, each object profile 400 is stored centrally in a profile database 105 associated with the content profiler 75, as shown in FIG. 1.
The object profile 400 can include a plurality of entries for listing bibliographic information that describes relevant aspects of each multimedia object, such as a title entry 410, an author or speaker entry 420, an abstract entry 430, a date entry 440 and a keyword entry 450 for listing the keywords associated with the multimedia object.
In addition, each object profile 400 preferably includes an entry 460 containing a pointer to the memory location where the multimedia object is stored, i.e., an indication of the particular file server, such as file server 35 or 40, which controls the storage of the multimedia object. In one embodiment of the invention, discussed further below, entry 460 can be a multi-valued entry containing pointers to a number of related multimedia objects and/or data files.
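The object profile of FIG. 4 can be sketched as a simple record; the field values and storage-location strings below are illustrative assumptions, chosen only to show the entries 410-460 described above, including the multi-valued pointer entry 460.

```python
# Illustrative sketch of an object profile (FIG. 4). Field names follow
# the entries described in the text; all values are hypothetical.
object_profile = {
    "title": "Recorded Seminar on Networked Multimedia",   # entry 410
    "author": "H. P. Katseff",                             # entry 420
    "abstract": "Seminar with viewgraphs and blackboard.", # entry 430
    "date": "1993-06-01",                                  # entry 440
    "keywords": ["multimedia", "synchronization"],         # entry 450
    # Entry 460: a multi-valued entry may point to the recorded
    # presentation, its supplemental materials, and a hyperlink file.
    "locations": [
        "server40:/presentations/seminar.jmf",
        "server35:/stills/viewgraphs.tiff",
        "server40:/links/seminar.hyperlink",
    ],
}
```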
As indicated above, the content profiler 75 can automatically generate object profiles, such as the object profile 400, for those multimedia objects that are stored in a computer-readable form. The content profiler 75 will receive multimedia objects in a computer-readable form via data link 270 from the optical character recognition system 240 and via data link 275 from source computer 255, as illustrated in FIG. 2. The object profiles associated with multimedia objects that are not stored in a computer-readable form, however, such as the object profiles for photographs, audio and video, can be generated manually for entry in the content profiler 75.
The information retrieval system 50 allows a user to access the multimedia information that has been stored in the plurality of databases, such as databases 80, 85, 90, 95, 100, associated with file servers 35, 40. As discussed above, each multimedia object has an associated object profile, such as the object profile 400, that is stored in a profile database 105. The information retrieval system 50 preferably provides a mechanism for searching the plurality of object profiles stored in the profile database 105 to identify those multimedia objects that satisfy a user-specified query.
A user may employ a workstation, such as the workstation 15, to enter a query specifying parameters for retrieving multimedia objects from the multimedia system 10. The information retrieval system 50 will retrieve all the object profiles from the profile database 105 that correspond to multimedia objects that satisfy the user-specified query. Thereafter, the information retrieval system 50 will transmit the object profiles satisfying the entered query to the workstation 15. The user can review those object profiles that satisfy the entered query and select those multimedia objects that should be downloaded to the workstation 15.
Once a multimedia object has been selected by a user, the information retrieval system 50 will access the entry 460 in the object profile that indicates the memory locations where the selected multimedia objects are stored. The information retrieval system 50 will send a retrieval message over network 20 to the respective storage and retrieval system 60, 70 which accesses the appropriate file server 35, 40 indicated by the retrieved pointer. The compressed file containing the requested multimedia object is transmitted from the respective file server to the workstation 15, in a known manner.
If the retrieved multimedia object is a still image object with many pages, such as a document, the workstation 15 will thereafter decompress each page of the multimedia object as requested for display by the user, in a known manner.
If, however, the retrieved multimedia object is comprised of continuous media, such as audio or video, the entire multimedia object will not normally be retrieved at once, due to the memory limitations associated with many workstations. Rather, the workstation 15 will utilize a prefetch routine to retrieve several frames' worth of audio and video from the respective file server for storage in an audio buffer 110 and a video buffer 115, respectively, as described further below. Accordingly, the received video frames will be decompressed by the workstation 15 as they are received over the network 20 from the respective file server, in a known manner.
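The prefetch scheme just described can be sketched as follows. This is only a sketch under stated assumptions: the buffer depth and the per-frame fetch interface are hypothetical, standing in for the audio buffer 110, the video buffer 115, and the retrieval messages exchanged with the file server.

```python
# Minimal sketch of the prefetch routine described above: rather than
# retrieving the whole continuous media object, the workstation keeps
# a few frames' worth of audio and video buffered ahead of playback.
# The depth of 5 frames and the fetch_frame interface are assumptions.
from collections import deque

class Prefetcher:
    def __init__(self, fetch_frame, depth=5):
        self.fetch_frame = fetch_frame   # returns (audio, video) for a frame number
        self.audio_buffer = deque()      # stands in for audio buffer 110
        self.video_buffer = deque()      # stands in for video buffer 115
        self.next_frame = 0
        self.depth = depth

    def fill(self):
        # Request frames from the file server until the buffers are full.
        while len(self.video_buffer) < self.depth:
            audio, video = self.fetch_frame(self.next_frame)
            self.audio_buffer.append(audio)
            self.video_buffer.append(video)
            self.next_frame += 1

    def play_one(self):
        # Top up the buffers, then hand the oldest pair to playback.
        self.fill()
        return self.audio_buffer.popleft(), self.video_buffer.popleft()

p = Prefetcher(lambda n: (f"a{n}", f"v{n}"))
```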
For a multimedia object that contains a plurality of media, such as the audio and video portions of a recorded presentation, the workstation 15 will preferably have a mechanism for synchronizing the presentation of the various outputs. In the preferred embodiment discussed above, a multimedia object containing the audio and video portions of a recorded presentation will be stored as a data file containing interleaved audio and video information in the JPEG Movie File format.
The workstation 15 can process the interleaved JPEG
data file, having a frame of video paired with its associated audio, and direct the audio information, in real time, to the audio hardware 23 of the workstation 15, in a known manner, while the video information is decompressed utilizing a software routine for presentation on display 510, in a known manner.
Alternatively, the multimedia system 10 can implement known synchronization algorithms to control the recording and playback of continuous media. During playback, the synchronization algorithm monitors the internal clock of the workstation, such as workstation 15, to determine which audio and video frames should be played and makes adjustments, if necessary.
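The clock-driven adjustment mentioned above can be sketched as follows; this is a hedged illustration, not the patented algorithm itself. The workstation's clock determines which frame should currently be showing, and frames are dropped to catch up when playback falls behind (easy here because JPEG frames are independent). The 10 fps rate is illustrative.

```python
# Sketch of clock-based playback adjustment: compare the frame being
# shown with the frame the workstation's clock says should be shown.
# The 10 fps default is an illustrative assumption.
def target_frame(elapsed_seconds, fps=10):
    """Frame number that should be displayed at the given elapsed time."""
    return int(elapsed_seconds * fps)

def frames_to_drop(current_frame, elapsed_seconds, fps=10):
    """How many frames to skip to catch up with the clock (0 if on time)."""
    return max(0, target_frame(elapsed_seconds, fps) - current_frame)
```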
According to one aspect of the invention, the above networked multimedia system 10 is utilized to record and store multimedia presentations, such as seminars, meetings or conferences, together with any supplemental materials that may be referenced during the presentation, such as viewgraphs, slides, handouts or blackboard annotations.
The recorded presentation, together with the stored supplemental materials, may thereafter be accessed over network 20 for presentation to a user on a workstation, such as the workstation 15.
The audio and video portions of the presentation are preferably captured and converted into a multimedia object using the continuous media information capture subsystem 310, in the manner described above relative to FIG. 3.
The supplemental materials that consist of still images, i.e., viewgraphs, slides and handouts, are preferably separately processed from the recorded presentation and stored as a separate multimedia object by the still image information capture subsystem 210, in the manner described above relative to FIG. 2.
In addition, according to a further feature of the invention discussed further below, the supplemental materials that consist of continuous media, i.e., the annotations made by a speaker on an electronic blackboard 358, are preferably stored as a separate multimedia object by the continuous media information capture subsystem 310, in the manner described above relative to FIG. 3.
In a preferred embodiment, illustrated in FIG. 5, the recorded presentation, together with the stored supplemental materials, may be simultaneously presented on a display, such as display 510 of workstation 15, containing two separate viewing windows 520, 530. In this manner, the appropriate supplemental materials may be displayed in window 520 as they are referenced during the recorded presentation being displayed in window 530.
Preferably, the presentation of the audio and video components of the recorded presentation is synchronized with the presentation of the supplemental materials in window 520. As discussed further below, the supplemental materials are an additional medium to be synchronized with the presentation of the audio and video portions of the recorded presentation.
As is known, a video presentation consists of a large number of sequential video frames which may be recorded at a fixed rate, such as a frame rate of 10 frames per second. Thus, the video frame number can be utilized as an index into the video stream. In addition, as discussed further below, the supplemental materials that are associated with the recorded presentation, such as the viewgraphs, may be identified with the video frames in which they are referenced.
SYNCHRONIZATION OF THE RECORDED PRESENTATION AND
STILL IMAGE SUPPLEMENTAL MATERIALS
Where the supplemental materials associated with the recorded presentation are comprised of a multi-page still image object, such as viewgraphs and handouts, each page of the still image multimedia object may be numbered sequentially in the order in which it is referenced during the recorded presentation. The recorded presentation and the associated still image supplemental materials may thereafter be synchronized by means of a still image hyperlink file, such as the still image hyperlink file 600 illustrated in FIG. 6.
The still image hyperlink file 600 preferably consists of a plurality of video frame-still image page pairs, i.e., the page number of the still image multimedia object in column 620 together with the corresponding frame number in column 610 in which the still image page is first referenced.
The still image hyperlink file 600 will include a plurality of rows, such as rows 625, 630, 635, 640, 642, 644, 646, each corresponding to a video frame-still image page pair. Each time a page of the still image supplemental material object, such as a viewgraph, is referenced during the presentation, an entry is made in column 620 of the still image hyperlink file 600 with the appropriate page number of the supplemental material, together with the number in column 610 of the video frame in which the supplemental material page is first referenced. If the same viewgraph, or other still image supplemental material, is referenced more than once during a recorded presentation, it is preferably assigned a unique page number for each appearance of the still image page.
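A hyperlink file of this kind can be sketched as a list of frame/page pairs, with the lookup rule described later: select the row with the largest video frame number that is less than or equal to the frame currently displayed. The frame and page values below are illustrative, not taken from FIG. 6.

```python
# Sketch of the still image hyperlink file 600 as frame/page rows.
# Rows are kept sorted by video frame number; values are illustrative.
import bisect

# (video frame first referencing the page, still image page number)
hyperlink_file = [(1, 1), (120, 2), (400, 3), (800, 4)]

def page_for_frame(current_frame, rows=hyperlink_file):
    """Page whose row has the largest frame number <= current_frame."""
    frames = [f for f, _ in rows]
    i = bisect.bisect_right(frames, current_frame) - 1
    return rows[i][1] if i >= 0 else None

# At frame 450 the row (400, 3) applies, so page 3 should be displayed.
```

Keeping the rows sorted by frame number lets the lookup run in logarithmic time, and assigning a fresh page number to each repeated appearance of a viewgraph (as the text specifies) keeps the mapping single-valued.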
According to a feature of the invention, a pair of independent processes, which may interact by means of a message passing mechanism, are preferably utilized to display a recorded presentation and its associated still image supplemental materials. A video process controls the presentation of the audio and video portions of the recorded presentation. The video process preferably incorporates the features and functions of the presentation processes of the Continuous Media Player, described in the reference incorporated above, as well as the additional features and functions described below.
Where the recorded presentation has associated still image supplemental materials, the video process will interact with a still image presentation process in order to present the appropriate still image supplemental materials in the window 520 of display 510. The still image presentation process preferably incorporates the features and functions of the still image presentation processes of the Ferret Document Browser, described in the reference incorporated above, as well as the additional features and functions described below.
The video process preferably utilizes a still image synchronization subroutine, illustrated in FIG. 7, to synchronize the still image supplemental materials in window 520 with the ongoing video presentation in window 530. If the recorded presentation has associated still image supplemental materials, the video process will enter the still image synchronization subroutine at step 700 as each video frame is presented in window 530, as shown in FIG. 7.
During step 710, the still image synchronization subroutine will access the still image hyperlink file 600 to determine if the appropriate page of the still image file is being displayed in window 520. The still image synchronization subroutine will access the row of the still image hyperlink file 600 having the largest video frame number that is less than or equal to the frame number currently being displayed.
Once the appropriate row of the still image hyperlink file 600 is accessed, the still image synchronization subroutine will retrieve, during step 720, the appropriate still image page number from the corresponding entry in column 620. The still image synchronization subroutine will compare during step 730 the still image page number retrieved from the still image hyperlink file 600 in the previous step to the page number of the still image currently displayed. If it is determined during step 730 that the retrieved still image page number does equal the currently displayed still image page number, the appropriate still image is being displayed, and the still image synchronization subroutine will be exited at step 760. Process control will thereafter return to the video process.
If, however, it is determined during step 730 that the retrieved still image page number does not equal the currently displayed still image page number, the synchronization subroutine will send a message to the still image presentation process during step 750, which includes an indication of the still image page number which should be displayed in window 520, as retrieved during step 720. The still image synchronization subroutine will thereafter be exited during step 760 and process control will return to the video process.
When the still image presentation process receives the still image page number from the still image synchronization subroutine, transmitted during step 750, the still image presentation process will decompress the appropriate page of the supplemental materials multimedia object corresponding to the received still image page number for presentation in window 520, in a known manner.
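The subroutine of FIG. 7 (steps 700-760) can be summarized in a short sketch. This is a minimal illustration: the linear scan of the hyperlink rows and the `send_message` callback are stand-ins for the table lookup and the message passing mechanism described in the text.

```python
# Sketch of the still image synchronization subroutine (FIG. 7).
# hyperlink_rows is a list of (frame, page) pairs sorted by frame;
# send_message stands in for the message to the presentation process.
def synchronize_still_image(current_frame, displayed_page,
                            hyperlink_rows, send_message):
    # Steps 710/720: find the row with the largest frame number that is
    # less than or equal to the frame currently being displayed.
    retrieved_page = None
    for frame, page in hyperlink_rows:
        if frame <= current_frame:
            retrieved_page = page
        else:
            break
    # Step 730: compare with the page currently shown in window 520.
    if retrieved_page is not None and retrieved_page != displayed_page:
        # Step 750: tell the still image presentation process to switch.
        send_message(retrieved_page)
    # Step 760: exit; control returns to the video process.

messages = []
synchronize_still_image(450, 2, [(1, 1), (120, 2), (400, 3)], messages.append)
# A message requesting page 3 is sent, since page 2 is currently shown.
```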
SYNCHRONIZATION OF THE RECORDED PRESENTATION AND
CONTINUOUS MEDIA SUPPLEMENTAL MATERIALS
According to a further feature of the invention, the supplemental materials associated with a recorded presentation may consist of continuous media, such as the annotations made by a speaker on an electronic blackboard 358 during the associated presentation. Preferably, the electronic file that stores the blackboard annotations or other continuous media supplemental materials associated with a recorded presentation is marked with a time stamp or other counter which may serve as an index into the recorded blackboard annotations. In this manner, recorded blackboard events, i.e., the placement of strokes on the electronic blackboard 358, can be presented with the video frames in which the blackboard event is referenced.
The recorded presentation and associated continuous media supplemental materials are preferably synchronized by means of a continuous media hyperlink file, such as the continuous media hyperlink file 800 illustrated in FIG. 8.
The continuous media hyperlink file 800 preferably consists of a plurality of video frame-event marker pairs, i.e., a marker that identifies the position of the recorded blackboard event within the associated electronic file in column 820 together with the corresponding frame number in column 810 in which the corresponding blackboard event is first referenced.
The continuous media hyperlink file 800 will include a plurality of rows, such as rows 830, 840, 850, 860, 870, each corresponding to a video frame-event marker pair.
For each blackboard event, or stroke, that is placed on the electronic blackboard 358 during the presentation, an entry is preferably made in column 820 of the continuous media hyperlink file 800 with the event marker that identifies the position of the recorded blackboard event within the associated supplemental material electronic file. In addition, the video frame number in which the blackboard event is first referenced is placed in column 810.
When the recorded presentation has associated continuous media supplemental materials, the video process, discussed above, will preferably interact with a continuous media presentation process in order to coordinate the presentation of the appropriate continuous media supplemental materials in the window 520 of display 510. The continuous media presentation process preferably incorporates the features and functions of the presentation processes of the Continuous Media Player, in a manner similar to the video process, as well as the additional features and functions described below.
The video process preferably utilizes a continuous media synchronization subroutine, illustrated in FIG. 9, to synchronize the continuous media supplemental materials in window 520 with the ongoing video presentation in window 530. If the recorded presentation has associated continuous media supplemental materials, the video process will enter the continuous media synchronization subroutine at step 900 as each video frame is presented in window 530, as shown in FIG. 9.
During step 910, the continuous media synchronization subroutine will access the continuous media hyperlink file 800 to determine if the appropriate segment of the electronic file, i.e., the appropriate blackboard event, is being presented in window 520. The continuous media synchronization subroutine will access the row of the continuous media hyperlink file 800 having the largest video frame number that is less than or equal to the frame number currently being displayed.
Once the appropriate row of the continuous media hyperlink file 800 is accessed, the continuous media synchronization subroutine will retrieve, during step 920, the appropriate event marker from the corresponding entry in column 820. During step 930, the continuous media synchronization subroutine will compare the event marker retrieved from the continuous media hyperlink file 800 during step 920 to the event marker associated with the blackboard event currently being displayed.
If it is determined during step 930 that the retrieved event marker equals the event marker associated with the blackboard event currently being displayed, the appropriate blackboard event is being displayed, and the continuous media synchronization subroutine will be exited at step 960. Process control will thereafter return to the video process.
If, however, it is determined during step 930 that the retrieved event marker does not equal the event marker associated with the blackboard event currently being displayed, the continuous media synchronization subroutine will send a message to the continuous media presentation process during step 950, which includes the event marker as an indication of the file position from which the associated electronic file that contains the recorded blackboard annotations should be restarted. The continuous media synchronization subroutine will thereafter be exited during step 960 and process control will return to the video process.
When the continuous media presentation process receives the event marker from the continuous media synchronization subroutine, transmitted during step 950, the continuous media presentation process will make an adjustment to the presentation of the continuous media supplemental materials in window 520 by restarting the presentation from the position in the electronic file indicated by the received event marker, in a known manner.
When a recorded presentation has associated still image and/or continuous media supplemental materials, a single object profile, similar to the object profile 400 illustrated in FIG. 4, is preferably utilized to profile the multimedia objects corresponding to the recorded presentation and associated supplemental materials. The entry 460 of object profile 400 is preferably a multi-valued entry containing a plurality of pointers, e.g., a pointer to the multimedia object corresponding to the recorded presentation, a pointer to the multimedia object corresponding to the supplemental materials and, where appropriate, a pointer to the hyperlink file 600 associated with the recorded presentation.
Thus, when a user selects a recorded presentation having associated supplemental materials, in the manner described above, the information retrieval system 50 will access the entry 460 to retrieve the plurality of pointers. Thereafter, the information retrieval system 50 will send a retrieval message over network 20 to each file server indicated by the plurality of retrieved pointers.
The data files containing the recorded presentation, corresponding supplemental materials, and, where appropriate, the respective hyperlink file 600, 800, will be transmitted by the respective file servers to the workstation, such as workstation 15.
Window 530 of display 510 includes a frame number scroll bar 540 that allows a user to scroll through video frames. Once the user has selected a starting frame using the frame number scroll bar 540, the video process will restart the video, together with the corresponding audio, from the selected video frame. If the recorded presentation has associated still image supplemental materials, the video process will utilize the still image synchronization subroutine of FIG. 7 to retrieve the appropriate still image page number from the still image hyperlink file 600 corresponding to the selected starting video frame, for transmittal to the still image presentation process.
Similarly, if the recorded presentation has associated continuous media supplemental materials, the video process will utilize the continuous media synchronization subroutine of FIG. 9 to retrieve the appropriate event marker from the continuous media hyperlink file 800 corresponding to the starting video frame selected using the frame number scroll bar 540, for transmittal to the continuous media presentation process.
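The frame-to-page (or frame-to-event-marker) lookup described above might be sketched as follows. The hyperlink-file layout assumed here, a list of (start_frame, value) pairs sorted by start frame, is an illustrative assumption, not the actual format of hyperlink files 600 and 800:

```python
import bisect

def lookup_for_frame(hyperlink_entries, frame):
    """Return the page number or event marker in effect at `frame`.

    hyperlink_entries: list of (start_frame, value) pairs sorted
    by start_frame, a hypothetical stand-in for a hyperlink file.
    """
    starts = [start for start, _ in hyperlink_entries]
    # Find the rightmost entry whose start_frame <= frame.
    i = bisect.bisect_right(starts, frame) - 1
    if i < 0:
        return None  # frame precedes the first referenced page/event
    return hyperlink_entries[i][1]

entries = [(0, 1), (450, 2), (900, 3)]   # e.g., frame 450 begins page 2
print(lookup_for_frame(entries, 600))    # prints 2
```

Because the entries are sorted by starting frame, a binary search suffices; the same lookup serves both forward and reverse playback.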
It is noted that if a user has selected a frame number that has not yet been received by the workstation 15 from the respective file server, the video process must first transmit a retrieval request to the file server to retrieve the necessary frames.
In addition, window 530 includes a playback speed scroll bar 550 that allows a user to control the playback speed, in frames per second, of the recorded presentation, in a known manner. The playback speed scroll bar allows the user to adjust the playback speed from a minimum of zero frames per second, i.e., a still image, up to the maximum recorded frame rate of the video, in either forward or reverse mode. Once the user has selected a video playback speed using the playback speed scroll bar 550, the video process will adjust the rate of data being requested from the storage and retrieval system 70 to the selected playback speed, in addition to making local adjustments to the video and audio outputs of workstation 15.
It is noted that where the recorded presentation has associated still image supplemental materials, the video process will utilize the still image synchronization subroutine of FIG. 7 to retrieve the appropriate still image page number from the still image hyperlink file 600 in the same manner for both forward and reverse playback of the video, as described above. Similarly, where the recorded presentation has associated continuous media supplemental materials, the video process will utilize the continuous media synchronization subroutine of FIG. 9 to retrieve the appropriate event marker from the continuous media hyperlink file 800 in the same manner for both forward and reverse playback of the video, as described above.

In a preferred embodiment, a user can manipulate the still image or continuous media supplemental materials in window 520 to control the video playback in window 530.
In this manner, the presentation of the supplemental materials can serve as an index to the recorded presentation. A user can proceed through the supplemental materials in window 520, e.g., by clicking the left button of a mouse in window 520 to move through the supplemental materials in a forward direction or clicking the right button of the mouse to move in a reverse direction.
Once the user has selected a desired still image page or continuous media event in window 520, the recorded presentation in window 530 can be restarted from the video frame where the still image page or continuous media event is referenced in the recorded presentation. Upon selection of a desired page or event in window 520, the still image presentation process or the continuous media presentation process, as appropriate, preferably sends a message to the video process with the selected still image page number or continuous media event marker. The video process will thereafter access the respective hyperlink file 600, 800 to retrieve the video frame that corresponds to the selected still image page or continuous media event. The video process will then restart the video from the retrieved frame number.
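The reverse lookup just described, from a selected page number or event marker back to the video frame where it is first referenced, could be sketched as below. The (start_frame, value) pair format is the same illustrative assumption as above, not the actual layout of hyperlink files 600 and 800:

```python
def frame_for_selection(hyperlink_entries, selection):
    """Return the first video frame at which `selection` is referenced.

    hyperlink_entries: list of (start_frame, value) pairs, a
    hypothetical stand-in for a hyperlink file.
    """
    for start_frame, value in hyperlink_entries:
        if value == selection:
            return start_frame
    return None  # selection never referenced in the recorded presentation

entries = [(0, 1), (450, 2), (900, 3)]
print(frame_for_selection(entries, 3))  # prints 900
```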
It is noted that where the supplemental materials associated with a recorded presentation consist of a combination of still images and continuous media, a third viewing window (not shown) may be provided on display 510 for the simultaneous presentation of the recorded presentation, together with the stored still image supplemental materials and continuous media supplemental materials. Alternatively, window 520 may be configured to alternately present still images or continuous media, as they are referenced during the recorded presentation, in a known manner.
NETWORK CONGESTION
In order to compensate for congestion on network 20 that causes the delayed arrival of audio and video data, the video process will preferably prefetch audio and video frames from the respective file server 40 for storage in audio and video buffers 110, 115, respectively, on the user's workstation to minimize the effect of network congestion on the playback of the recorded presentation.
Since two-way video conferencing applications demand rapid response times, the buffering of audio and video data would not normally be tolerated in such applications. It is noted, however, that during the playback of recorded presentations the buffering of audio and video data would not be detected by a user.
According to one feature of the invention, the networked multimedia system compensates for congestion on network 20 using an adaptive control algorithm to dynamically vary the rate at which video frames are retrieved from the respective file server 40 over network 20, in response to increases or decreases in the amount of other data being transmitted across the network 20. For a discussion of an illustrative adaptive control algorithm, see section 2.3 of the Continuous Media Player reference, incorporated above.
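An adaptive rule of this general kind can be illustrated with a simple additive-increase/multiplicative-decrease sketch. This is not the algorithm of the Continuous Media Player reference; the function name, constants, and congestion signal are all illustrative assumptions:

```python
def adjust_request_rate(rate_fps, congested, max_fps=30.0,
                        increase_fps=1.0, decrease_factor=0.5):
    """One adaptation step for the requested video frame rate.

    congested: True when other network traffic is delaying arrivals.
    All constants are illustrative, not taken from the reference.
    """
    if congested:
        # Back off quickly when the network is loaded.
        return max(0.0, rate_fps * decrease_factor)
    # Probe upward slowly as capacity returns.
    return min(max_fps, rate_fps + increase_fps)

rate = adjust_request_rate(30.0, congested=True)    # 15.0
rate = adjust_request_rate(rate, congested=False)   # 16.0
```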
In order to maximize the playback quality of the recorded presentation, the audio component of the recorded presentation is preferably given preference over the video component. It has been found that a person viewing a recorded presentation will object more strongly to defects in the audio portion of the presentation than to defects in the video portion.

In a preferred embodiment, the video process utilizes a data buffer monitoring subroutine, illustrated in FIG. 10, to maintain a pre-defined amount of audio and video data in the audio and video buffers 110, 115. The data buffer monitoring subroutine will continuously monitor the audio and video buffers 110, 115 during step 1000, until the amount of audio or video data stored in the audio or video buffers 110, 115 drops below a predefined threshold value, as detected by step 1010.
If it is determined during step 1010 that the amount of audio or video data in the audio or video buffers 110, 115, respectively, has fallen below the desired threshold value, the data is not arriving at the workstation 15 from the respective file server 40 over network 20 as fast as it is being presented to a user by the workstation 15.
Thus, the data buffer monitoring subroutine will reduce the requested video playback rate during step 1015 by requesting that the file server 40 transmit fewer video frames per second to the workstation 15. It is noted, however, that the data buffer monitoring subroutine will preferably continue to request all of the audio data from the file server 40.
After the requested video playback rate is reduced during step 1015, a test is performed during step 1020 to determine if the amount of audio or video data in the audio or video buffers 110, 115 is still below the desired threshold value. If it is determined during step 1020 that the amount of audio or video data in the audio or video buffers 110, 115 has risen above the desired threshold value, program control will proceed to step 1060, discussed below.
If, however, it is determined during step 1020 that the amount of audio or video data in the audio or video buffers 110, 115 is still below the desired threshold value, a test is performed during step 1025 to determine if the requested video playback rate has been reduced to the minimum value, i.e., a playback rate of 0 frames per second (fps).
If it is determined during step 1025 that the requested video playback rate has not yet been reduced to the minimum value, program control will return to step 1015 for a further reduction in the requested video playback rate.
If it is determined during step 1025 that the requested video playback rate has been reduced to the minimum value, network congestion conditions are so extreme that even though no video data is being transmitted across the network 20, the audio data is still not arriving fast enough over the network 20 to maintain the desired size of the audio buffer 110. In a preferred embodiment, the data buffer monitoring subroutine will compensate for the delayed arrival of audio data by playing the audio data from the audio buffer 110 at slower than real-time.
Thus, the data buffer monitoring subroutine will begin playing the audio at a reduced speed during step 1030. The data buffer monitoring subroutine will continue playing the audio at the reduced speed until it is determined during step 1040 that the amount of audio data in the buffer has returned to the desired threshold value.
Once it is determined during step 1040 that the amount of audio data in the buffer is greater than or equal to the desired threshold value, the data buffer monitoring subroutine will resume playing the audio at a normal, or real-time, speed during step 1045. Thereafter, program control will return to step 1000, and continue in the manner described above.
The data buffer monitoring subroutine could play the audio at half-speed during step 1030, e.g., by dividing each frame's worth of buffered audio data into n segments and then playing each segment twice. If it is desired to play the audio at a speed between half speed and normal speed, not all of the n segments are played twice.
Similarly, if it is desired to play the audio at a speed less than half speed, some of the n segments may be played more than twice.
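The segment-repetition scheme described above can be sketched as follows, under the assumption that a frame's worth of audio is a list of n segments. Repeating every segment yields half speed; repeating only some of them yields a speed between half and normal speed:

```python
def stretch_audio(segments, repeat_every=1):
    """Repeat every `repeat_every`-th segment once.

    repeat_every=1 plays each segment twice (half speed);
    larger values give a speed closer to normal.
    """
    out = []
    for i, seg in enumerate(segments):
        out.append(seg)
        if i % repeat_every == 0:
            out.append(seg)  # play this segment a second time
    return out

print(len(stretch_audio(list(range(4)), repeat_every=1)))  # 8: half speed
print(len(stretch_audio(list(range(4)), repeat_every=2)))  # 6: 2/3 speed
```

Playing 6 segments' worth of output for 4 segments' worth of input corresponds to a playback rate of two-thirds of real time; playing some segments more than twice, per the text, would slow playback below half speed.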
In a preferred embodiment, the data buffer monitoring subroutine will gradually adjust the playback speeds of the audio during steps 1030 and 1045 in order to make the transition from one speed to another less noticeable to a listener. For example, the data buffer monitoring subroutine can reduce the audio playback rate during step 1030 according to a scale that gradually adjusts the playback rate between a defined maximum and minimum audio playback rate. In addition, by monitoring the rate at which the audio buffer 110 is emptying, the data buffer monitoring subroutine can determine how quickly the audio playback rate should be reduced during step 1030 and what the minimum audio playback rate should ultimately be.
In an alternate embodiment, the data buffer monitoring subroutine could play the audio at a reduced speed during step 1030 by utilizing a well-known pitch extraction process, which identifies where the pauses are in the audio and makes the pauses longer.
Once the data buffer monitoring subroutine has stabilized the amount of data in the audio and video buffers 110, 115, as detected during step 1020, program control will proceed to step 1060. During step 1060, the amount of video data requested from the respective file server will be gradually increased. Thereafter, program control will return to step 1000 to continue monitoring the audio and video buffers 110, 115 in the manner described above.
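One pass of the monitoring loop of FIG. 10 might be sketched as below. This is a highly simplified illustration: the buffer levels, rate increments, and threshold are illustrative assumptions, not values from the specification:

```python
def monitor_step(audio_level, video_level, threshold,
                 video_rate_fps, audio_speed):
    """One pass of the FIG. 10 loop (all constants illustrative).

    Returns (new_video_rate_fps, new_audio_speed).
    """
    if audio_level >= threshold and video_level >= threshold:
        # Step 1060: buffers healthy; gradually request more video
        # and restore real-time audio.
        return min(30.0, video_rate_fps + 1.0), 1.0
    if video_rate_fps > 0.0:
        # Steps 1015/1025: shed video load first; audio is still
        # requested in full.
        return max(0.0, video_rate_fps - 5.0), audio_speed
    # Step 1030: video already at 0 fps; slow the audio playback.
    return 0.0, 0.5

print(monitor_step(10, 10, 50, 30.0, 1.0))  # (25.0, 1.0)
print(monitor_step(10, 10, 50, 0.0, 1.0))   # (0.0, 0.5)
print(monitor_step(80, 80, 50, 0.0, 0.5))   # (1.0, 1.0)
```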
Although the data buffer monitoring subroutine will drop video frames during times of network congestion, it is preferred that the video process access, for each video frame, the still image synchronization subroutine of FIG. 7 where the supplemental materials consist of still images, or the continuous media synchronization subroutine of FIG. 9 where the supplemental materials consist of continuous media, regardless of whether or not the video frame is actually displayed in window 530. In this manner, synchronization is maintained between the presentation of the supplemental materials in window 520 and the continuing audio presentation, even in the absence of video frames.
It is to be understood that the embodiments and variations shown and described herein are illustrative of the principles of this invention only and that various modifications may be implemented by those skilled in the art without departing from the scope and spirit of the invention.

Claims (12)

CLAIMS:
1. A method for use by a computing system receiving audio and video data over a network for presentation to a user, said method compensating for congestion on said network which causes delayed arrival of said audio and video data, said video data being transmitted over said network to said computing system at a requested video transmittal rate, said audio data being presented to said user at an audio playback rate, said network congestion compensation method comprising the steps of:
maintaining an audio buffer for storing a predefined amount of said audio data received over said network;
maintaining a video buffer for storing a predefined amount of said video data received over said network;
monitoring said audio and video buffers to determine when said amount of audio or video data in said buffer falls below said predefined amounts;
reducing said requested video transmittal rate if said monitoring step determines that said amount of audio or video data in said buffers has fallen below said predefined amounts; and

reducing said audio playback rate if said amount of audio data in said audio buffer is below said predefined amount of audio data after said step of reducing said requested video transmittal rate.
2. The network congestion compensation method according to claim 1, further including the step of increasing said audio playback rate once said amount of audio data in said audio buffer is above said predefined amount of audio data.
3. The network congestion compensation method according to claim 1, wherein said step of reducing said audio playback rate is not performed unless said step of reducing said requested video transmittal rate has reduced said requested video transmittal rate to a minimum value.
4. The network congestion compensation method according to claim 1, wherein said step of reducing said audio playback rate plays said audio at a reduced speed by dividing said audio data in said audio buffer into a plurality of segments and playing one or more of said segments at least twice.
5. The network congestion compensation method according to claim 1, wherein said step of reducing said audio playback rate plays said audio at a reduced speed by utilizing a pitch extraction process.
6. The network congestion compensation method according to claim 1, wherein said step of reducing said audio playback rate will gradually adjust said audio playback rate to make the transition from one speed to another less noticeable to said user.
7. A computer-readable storage medium comprising encoded computer-readable program instructions for use in conjunction with a programmable computer receiving audio and video data over a network for presentation to a user, which instructions cause the computer to compensate for congestion on the network that results in delayed arrival of the audio and video data, the video data being transmitted over the network at a requested video transmittal rate to a video buffer for storing a predefined amount of the video data and the audio data being presented to the user at an audio playback rate, wherein a predefined amount of the audio data is stored in an audio buffer, the program instructions defining steps to be performed by the programmable computer, the steps comprising:
monitoring the audio and video buffers to determine when the amount of audio or video data in the buffers falls below the predefined amounts;
reducing the requested video transmittal rate if the monitoring step determines that the amount of audio or video data in the buffers has fallen below the predefined amounts; and

reducing the audio playback rate if the amount of audio data in the audio buffer is below the predefined amount of audio data after the step of reducing the requested video transmittal rate.
8. The computer-readable storage medium of claim 7, wherein the steps further comprise increasing the audio playback rate once the amount of audio data in the audio buffer is above the predefined amount of audio data.
9. The computer-readable storage medium of claim 7, wherein the step of reducing the audio playback rate is not performed unless the step of reducing the requested video transmittal rate has reduced the requested video transmittal rate to a minimum value.
10. The computer-readable storage medium of claim 7, wherein the step of reducing the audio playback rate plays the audio at a reduced speed by dividing the audio data in the audio buffer into a plurality of segments and playing one or more of the segments at least twice.
11. The computer-readable storage medium of claim 7, wherein the step of reducing the audio playback rate plays the audio at a reduced speed by utilizing a pitch extraction process.
12. The computer-readable storage medium of claim 7, wherein the step of reducing the audio playback rate will gradually adjust the audio playback rate to make the transition from one speed to another less noticeable to the user.
CA002140850A 1994-02-24 1995-01-23 Networked system for display of multimedia presentations Expired - Fee Related CA2140850C (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US20186394A 1994-02-24 1994-02-24
US201,863 1994-02-24

Publications (2)

Publication Number Publication Date
CA2140850A1 CA2140850A1 (en) 1995-08-25
CA2140850C true CA2140850C (en) 1999-09-21

Family

ID=22747608

Family Applications (1)

Application Number Title Priority Date Filing Date
CA002140850A Expired - Fee Related CA2140850C (en) 1994-02-24 1995-01-23 Networked system for display of multimedia presentations

Country Status (3)

Country Link
US (1) US5822537A (en)
EP (1) EP0669587A3 (en)
CA (1) CA2140850C (en)

Families Citing this family (306)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7103594B1 (en) 1994-09-02 2006-09-05 Wolfe Mark A System and method for information retrieval employing a preloading procedure
US7467137B1 (en) 1994-09-02 2008-12-16 Wolfe Mark A System and method for information retrieval employing a preloading procedure
US6549948B1 (en) * 1994-10-18 2003-04-15 Canon Kabushiki Kaisha Variable frame rate adjustment in a video system
US5793980A (en) * 1994-11-30 1998-08-11 Realnetworks, Inc. Audio-on-demand communication system
US7349976B1 (en) 1994-11-30 2008-03-25 Realnetworks, Inc. Audio-on-demand communication system
US7302638B1 (en) 1995-06-07 2007-11-27 Wolfe Mark A Efficiently displaying and researching information about the interrelationships between documents
US7246310B1 (en) 1995-06-07 2007-07-17 Wolfe Mark A Efficiently displaying and researching information about the interrelationships between documents
US5909238A (en) * 1995-07-25 1999-06-01 Canon Kabushiki Kaisha Image transmission system with billing based on the kind of MPEG frame transmitted
IL115263A (en) * 1995-09-12 1999-04-11 Vocaltec Ltd System and method for distributing multi-media presentations in a computer network
EP0867003A2 (en) * 1995-12-12 1998-09-30 The Board of Trustees for the University of Illinois Method of and system for transmitting and/or retrieving real-time video and audio information over performance-limited transmission systems
US6067075A (en) * 1995-12-21 2000-05-23 Eastman Kodak Company Controller for medical image review station
WO1997026608A1 (en) * 1996-01-18 1997-07-24 Vicom Multimedia Inc. Authoring and publishing system for interactive multimedia computer applications
AU716590B2 (en) * 1996-04-12 2000-03-02 Avid Technology, Inc. A multimedia production system
US5903727A (en) * 1996-06-18 1999-05-11 Sun Microsystems, Inc. Processing HTML to embed sound in a web page
AU3496797A (en) * 1996-06-21 1998-01-07 Integrated Computing Engines, Inc. Network based programmable media manipulator
US5874986A (en) * 1996-06-26 1999-02-23 At&T Corp Method for communicating audiovisual programs over a communications network
CA2197727A1 (en) * 1996-06-27 1997-12-27 Richard Frank Bruno Method for altering a broadcast transmission as a function of its recipient on a communications network
US5760771A (en) * 1996-07-17 1998-06-02 At & T Corp System and method for providing structured tours of hypertext files
JP3202606B2 (en) * 1996-07-23 2001-08-27 キヤノン株式会社 Imaging server and its method and medium
EP0821522B1 (en) * 1996-07-23 2008-04-09 Canon Kabushiki Kaisha Camera control apparatus and method
JP3862321B2 (en) 1996-07-23 2006-12-27 キヤノン株式会社 Server and control method thereof
WO1998020434A2 (en) * 1996-11-07 1998-05-14 Vayu Web, Inc. System and method for displaying information and monitoring communications over the internet
US5951646A (en) * 1996-11-25 1999-09-14 America Online, Inc. System and method for scheduling and processing image and sound data
JPH10228486A (en) * 1997-02-14 1998-08-25 Nec Corp Distributed document classification system and recording medium which records program and which can mechanically be read
US8626763B1 (en) 1997-05-22 2014-01-07 Google Inc. Server-side suggestion of preload operations
US7284187B1 (en) * 1997-05-30 2007-10-16 Aol Llc, A Delaware Limited Liability Company Encapsulated document and format system
US6453334B1 (en) * 1997-06-16 2002-09-17 Streamtheory, Inc. Method and apparatus to allow remotely located computer programs and/or data to be accessed on a local computer in a secure, time-limited manner, with persistent caching
JP3733218B2 (en) * 1997-09-30 2006-01-11 キヤノン株式会社 RELAY DEVICE, ITS CONTROL METHOD, AND STORAGE MEDIUM
US7257604B1 (en) 1997-11-17 2007-08-14 Wolfe Mark A System and method for communicating information relating to a network resource
ES2397501T3 (en) * 1997-11-25 2013-03-07 Motorola Mobility, Llc Methods, systems and manufactured elements of audio content reproduction
US6301258B1 (en) * 1997-12-04 2001-10-09 At&T Corp. Low-latency buffering for packet telephony
US6556560B1 (en) 1997-12-04 2003-04-29 At&T Corp. Low-latency audio interface for packet telephony
US6380950B1 (en) * 1998-01-20 2002-04-30 Globalstreams, Inc. Low bandwidth television
US6442139B1 (en) * 1998-01-29 2002-08-27 At&T Adaptive rate control based on estimation of message queuing delay
US7136870B1 (en) * 1998-03-04 2006-11-14 The Regents Of The University Of California Method and apparatus for accessing and displaying multimedia content
CN1119763C (en) * 1998-03-13 2003-08-27 西门子共同研究公司 Apparatus and method for collaborative dynamic video annotation
US6415326B1 (en) 1998-09-15 2002-07-02 Microsoft Corporation Timeline correlation between multiple timeline-altered media streams
US6622171B2 (en) 1998-09-15 2003-09-16 Microsoft Corporation Multimedia timeline modification in networked client/server systems
US6816909B1 (en) * 1998-09-16 2004-11-09 International Business Machines Corporation Streaming media player with synchronous events from multiple sources
US6307487B1 (en) 1998-09-23 2001-10-23 Digital Fountain, Inc. Information additive code generator and decoder for communication systems
AU743040B2 (en) * 1998-09-23 2002-01-17 Canon Kabushiki Kaisha Multiview multimedia generation system
US7068729B2 (en) 2001-12-21 2006-06-27 Digital Fountain, Inc. Multi-stage code generator and decoder for communication systems
AUPP612998A0 (en) * 1998-09-23 1998-10-15 Canon Kabushiki Kaisha Multiview multimedia generation system
US6349329B1 (en) 1998-09-29 2002-02-19 Radiowave.Com, Inc. Coordinating delivery of supplemental materials with radio broadcast material
WO2000019647A2 (en) * 1998-09-29 2000-04-06 Radiowave.Com, Inc. System and method for coordinating a visual display with audio programmes broadcast over a communications network
US6317784B1 (en) 1998-09-29 2001-11-13 Radiowave.Com, Inc. Presenting supplemental information for material currently and previously broadcast by a radio station
AU1704900A (en) 1998-10-13 2000-05-01 Radiowave.Com, Inc. System and method for determining the audience of digital radio programmes broadcast through the internet
US6253238B1 (en) * 1998-12-02 2001-06-26 Ictv, Inc. Interactive cable television system with frame grabber
FR2787222B1 (en) * 1998-12-14 2001-04-27 Canon Kk METHOD AND DEVICE FOR GEOMETRIC TRANSFORMATION OF AN IMAGE IN A COMPUTER COMMUNICATION NETWORK
GB2348069B (en) * 1998-12-21 2003-06-11 Ibm Representation of a slide-show as video
WO2000043999A2 (en) * 1999-01-22 2000-07-27 Sony Electronics, Inc. Method and apparatus for synchronizing playback of multiple media types over networks having different transmission characteristics
US6236395B1 (en) * 1999-02-01 2001-05-22 Sharp Laboratories Of America, Inc. Audiovisual information management system
US6731600B1 (en) * 1999-02-08 2004-05-04 Realnetworks, Inc. System and method for determining network conditions
US6646655B1 (en) * 1999-03-09 2003-11-11 Webex Communications, Inc. Extracting a time-sequence of slides from video
KR100685982B1 (en) * 1999-04-03 2007-02-23 엘지전자 주식회사 Method and apparatus for synchronization of media information
US7281199B1 (en) 1999-04-14 2007-10-09 Verizon Corporate Services Group Inc. Methods and systems for selection of multimedia presentations
US6502194B1 (en) * 1999-04-16 2002-12-31 Synetix Technologies System for playback of network audio material on demand
US6665751B1 (en) * 1999-04-17 2003-12-16 International Business Machines Corporation Streaming media player varying a play speed from an original to a maximum allowable slowdown proportionally in accordance with a buffer state
US7302396B1 (en) 1999-04-27 2007-11-27 Realnetworks, Inc. System and method for cross-fading between audio streams
US6275793B1 (en) * 1999-04-28 2001-08-14 Periphonics Corporation Speech playback with prebuffered openings
US6625655B2 (en) * 1999-05-04 2003-09-23 Enounce, Incorporated Method and apparatus for providing continuous playback or distribution of audio and audio-visual streamed multimedia reveived over networks having non-deterministic delays
US6625656B2 (en) * 1999-05-04 2003-09-23 Enounce, Incorporated Method and apparatus for continuous playback or distribution of information including audio-visual streamed multimedia
US6728760B1 (en) * 1999-05-05 2004-04-27 Kent Ridge Digital Labs Optimizing delivery of computer media
US6263503B1 (en) * 1999-05-26 2001-07-17 Neal Margulis Method for effectively implementing a wireless television system
US8266657B2 (en) 2001-03-15 2012-09-11 Sling Media Inc. Method for effectively implementing a multi-room television system
US6370688B1 (en) * 1999-05-26 2002-04-09 Enounce, Inc. Method and apparatus for server broadcast of time-converging multi-media streams
AU4855300A (en) * 1999-05-26 2000-12-18 Gte Laboratories Incorporated Synchronized spatial-temporal browsing of images for selection of indexed temporal multimedia titles
US6934759B2 (en) * 1999-05-26 2005-08-23 Enounce, Inc. Method and apparatus for user-time-alignment for broadcast works
US7313808B1 (en) 1999-07-08 2007-12-25 Microsoft Corporation Browsing continuous multimedia content
US7293280B1 (en) * 1999-07-08 2007-11-06 Microsoft Corporation Skimming continuous multimedia content
US6845398B1 (en) * 1999-08-02 2005-01-18 Lucent Technologies Inc. Wireless multimedia player
US6728776B1 (en) * 1999-08-27 2004-04-27 Gateway, Inc. System and method for communication of streaming data
US7383504B1 (en) * 1999-08-30 2008-06-03 Mitsubishi Electric Research Laboratories Method for representing and comparing multimedia content according to rank
US7178107B2 (en) 1999-09-16 2007-02-13 Sharp Laboratories Of America, Inc. Audiovisual information management system with identification prescriptions
US6701355B1 (en) 1999-09-29 2004-03-02 Susquehanna Media Co. System and method for dynamically substituting broadcast material and targeting to specific audiences
US20030216961A1 (en) * 2002-05-16 2003-11-20 Douglas Barry Personalized gaming and demographic collection method and apparatus
US6954859B1 (en) 1999-10-08 2005-10-11 Axcess, Inc. Networked digital security system and methods
US6633674B1 (en) * 1999-11-24 2003-10-14 General Electric Company Picture archiving and communication system employing improved data compression
US6928655B1 (en) * 1999-12-16 2005-08-09 Microsoft Corporation Live presentation searching
US7149359B1 (en) 1999-12-16 2006-12-12 Microsoft Corporation Searching and recording media streams
US6785234B1 (en) * 1999-12-22 2004-08-31 Cisco Technology, Inc. Method and apparatus for providing user control of audio quality
US7383350B1 (en) * 2000-02-03 2008-06-03 International Business Machines Corporation User input based allocation of bandwidth on a data link
US6868440B1 (en) 2000-02-04 2005-03-15 Microsoft Corporation Multi-level skimming of multimedia content using playlists
US6785739B1 (en) * 2000-02-23 2004-08-31 Eastman Kodak Company Data storage and retrieval playback apparatus for a still image receiver
JP4300443B2 (en) * 2000-02-24 2009-07-22 ソニー株式会社 Imaging apparatus and method, and recording medium
US6249281B1 (en) * 2000-02-28 2001-06-19 Presenter.Com On-demand presentation graphical user interface
US6985966B1 (en) 2000-03-29 2006-01-10 Microsoft Corporation Resynchronizing globally unsynchronized multimedia streams
US7237254B1 (en) 2000-03-29 2007-06-26 Microsoft Corporation Seamless switching between different playback speeds of time-scale modified data streams
US6441831B1 (en) 2000-04-04 2002-08-27 Learningaction, Inc. Choosing a multimedia presentation
US7222163B1 (en) 2000-04-07 2007-05-22 Virage, Inc. System and method for hosting of video content over a network
US7962948B1 (en) 2000-04-07 2011-06-14 Virage, Inc. Video-enabled community building
US7260564B1 (en) 2000-04-07 2007-08-21 Virage, Inc. Network video guide and spidering
US8171509B1 (en) 2000-04-07 2012-05-01 Virage, Inc. System and method for applying a database to video multimedia
US7302490B1 (en) * 2000-05-03 2007-11-27 Microsoft Corporation Media file format to support switching between multiple timeline-altered media streams
US7055168B1 (en) 2000-05-03 2006-05-30 Sharp Laboratories Of America, Inc. Method for interpreting and executing user preferences of audiovisual information
US7584291B2 (en) * 2000-05-12 2009-09-01 Mosi Media, Llc System and method for limiting dead air time in internet streaming media delivery
US8028314B1 (en) 2000-05-26 2011-09-27 Sharp Laboratories Of America, Inc. Audiovisual information management system
US7051271B1 (en) * 2000-05-31 2006-05-23 Fuji Xerox Co., Ltd. Method, system and article of manufacture for linking a video to a scanned document
JP2001346139A (en) * 2000-06-05 2001-12-14 Canon Inc Information processing method and device
WO2001097045A1 (en) * 2000-06-09 2001-12-20 Veazy Inc. Application specific live streaming multimedia mixer apparatus, systems and methods
US7739335B2 (en) 2000-06-22 2010-06-15 Sony Corporation Method and apparatus for providing a customized selection of audio content over the internet
US7647340B2 (en) 2000-06-28 2010-01-12 Sharp Laboratories Of America, Inc. Metadata in JPEG 2000 file format
US7047312B1 (en) * 2000-07-26 2006-05-16 Nortel Networks Limited TCP rate control with adaptive thresholds
US7363589B1 (en) * 2000-07-28 2008-04-22 Tandberg Telecom A/S System and method for generating invisible notes on a presenter's screen
US6606689B1 (en) * 2000-08-23 2003-08-12 Nintendo Co., Ltd. Method and apparatus for pre-caching data in audio memory
US6643744B1 (en) 2000-08-23 2003-11-04 Nintendo Co., Ltd. Method and apparatus for pre-fetching audio data
US20020026521A1 (en) * 2000-08-31 2002-02-28 Sharfman Joshua Dov Joseph System and method for managing and distributing associated assets in various formats
US6834371B1 (en) * 2000-08-31 2004-12-21 Interactive Video Technologies, Inc. System and method for controlling synchronization of a time-based presentation and its associated assets
US6839059B1 (en) 2000-08-31 2005-01-04 Interactive Video Technologies, Inc. System and method for manipulation and interaction of time-based mixed media formats
US6922702B1 (en) 2000-08-31 2005-07-26 Interactive Video Technologies, Inc. System and method for assembling discrete data files into an executable file and for processing the executable file
US8595372B2 (en) * 2000-09-12 2013-11-26 Wag Acquisition, Llc Streaming media buffering system
US7716358B2 (en) * 2000-09-12 2010-05-11 Wag Acquisition, Llc Streaming media buffering system
US6766376B2 (en) 2000-09-12 2004-07-20 SN Acquisition, L.L.C. Streaming media buffering system
FR2814027B1 (en) * 2000-09-14 2003-01-31 Cit Alcatel METHOD FOR SYNCHRONIZING A MULTIMEDIA FILE
US8020183B2 (en) 2000-09-14 2011-09-13 Sharp Laboratories Of America, Inc. Audiovisual management system
US7075671B1 (en) * 2000-09-14 2006-07-11 International Business Machines Corp. System and method for providing a printing capability for a transcription service or multimedia presentation
US20020087883A1 (en) * 2000-11-06 2002-07-04 Curt Wohlgemuth Anti-piracy system for remotely served computer applications
US8831995B2 (en) 2000-11-06 2014-09-09 Numecent Holdings, Inc. Optimized server for streamed applications
US7062567B2 (en) 2000-11-06 2006-06-13 Endeavors Technology, Inc. Intelligent network streaming and execution system for conventionally coded applications
US20020091840A1 (en) * 2000-11-28 2002-07-11 Gregory Pulier Real-time optimization of streaming media from a plurality of media sources
US7451196B1 (en) 2000-12-15 2008-11-11 Stream Theory, Inc. Method and system for executing a software application in a virtual environment
ATE464740T1 (en) 2000-12-15 2010-04-15 British Telecomm TRANSMISSION OF SOUND AND/OR IMAGE MATERIAL
CA2429827C (en) 2000-12-15 2009-08-25 British Telecommunications Public Limited Company Transmission and reception of audio and/or video material
GB0030706D0 (en) * 2000-12-15 2001-01-31 British Telecomm Delivery of audio and or video material
EP1215663A1 (en) * 2000-12-15 2002-06-19 BRITISH TELECOMMUNICATIONS public limited company Encoding audio signals
US7356605B1 (en) * 2000-12-29 2008-04-08 Cisco Technology, Inc. System and method for controlling delivery of streaming media
US7085842B2 (en) 2001-02-12 2006-08-01 Open Text Corporation Line navigation conferencing system
US20020120747A1 (en) * 2001-02-23 2002-08-29 Frerichs David J. System and method for maintaining constant buffering time in internet streaming media delivery
US7631088B2 (en) * 2001-02-27 2009-12-08 Jonathan Logan System and method for minimizing perceived dead air time in internet streaming media delivery
JP2002358329A (en) * 2001-02-28 2002-12-13 Sony Computer Entertainment Inc Information providing apparatus, information processing apparatus, information providing method, information processing method, program and recording medium
US7904814B2 (en) 2001-04-19 2011-03-08 Sharp Laboratories Of America, Inc. System for presenting audio-video content
JP2003006555A (en) * 2001-06-25 2003-01-10 Nova:Kk Content distribution method, scenario data, recording medium and scenario data generation method
US7216288B2 (en) * 2001-06-27 2007-05-08 International Business Machines Corporation Dynamic scene description emulation for playback of audio/visual streams on a scene description based playback system
US6792449B2 (en) 2001-06-28 2004-09-14 Microsoft Corporation Startup methods and apparatuses for use in streaming content
US20040205116A1 (en) * 2001-08-09 2004-10-14 Greg Pulier Computer-based multimedia creation, management, and deployment platform
US20030049591A1 (en) * 2001-09-12 2003-03-13 Aaron Fechter Method and system for multimedia production and recording
CN100348030C (en) * 2001-09-14 2007-11-07 索尼株式会社 Information creating method, information creating device, and network information processing system
JP4288879B2 (en) * 2001-09-14 2009-07-01 ソニー株式会社 Network information processing system and information processing method
US7474698B2 (en) 2001-10-19 2009-01-06 Sharp Laboratories Of America, Inc. Identification of replay segments
JP3900413B2 (en) * 2002-02-14 2007-04-04 Kddi株式会社 Video information transmission method and program
US8214741B2 (en) 2002-03-19 2012-07-03 Sharp Laboratories Of America, Inc. Synchronization of video and data
US20030185296A1 (en) * 2002-03-28 2003-10-02 Masten James W. System for the capture of evidentiary multimedia data, live/delayed off-load to secure archival storage and managed streaming distribution
JP3689063B2 (en) * 2002-04-19 2005-08-31 松下電器産業株式会社 Data receiving apparatus and data distribution system
US9240810B2 (en) 2002-06-11 2016-01-19 Digital Fountain, Inc. Systems and processes for decoding chain reaction codes through inactivation
US7725557B2 (en) * 2002-06-24 2010-05-25 Microsoft Corporation Client-side caching of streaming media content
US7949689B2 (en) * 2002-07-18 2011-05-24 Accenture Global Services Limited Media indexing beacon and capture device
US20040024900A1 (en) * 2002-07-30 2004-02-05 International Business Machines Corporation Method and system for enhancing streaming operation in a distributed communication system
JP2004135256A (en) * 2002-08-09 2004-04-30 Ricoh Co Ltd Data structure of information file, methods, apparatuses and programs for generating and reproducing information file, and storage media for storing the same programs
US7523482B2 (en) * 2002-08-13 2009-04-21 Microsoft Corporation Seamless digital channel changing
US8397269B2 (en) * 2002-08-13 2013-03-12 Microsoft Corporation Fast digital channel changing
EP1546942B1 (en) * 2002-09-24 2017-07-19 S.I.Sv.El. Societa' Italiana Per Lo Sviluppo Dell'elettronica S.P.A. System and method for associating different types of media content
US7657907B2 (en) 2002-09-30 2010-02-02 Sharp Laboratories Of America, Inc. Automatic user profiling
KR101143282B1 (en) 2002-10-05 2012-05-08 디지털 파운튼, 인크. Systematic encoding and decoding of chain reaction codes
CN100477802C (en) * 2002-10-24 2009-04-08 汤姆森许可贸易公司 A method and system for maintaining lip synchronization
DE10251654B4 (en) * 2002-10-31 2006-03-02 Siemens Ag Method for ensuring the same message order in multiple data sinks
GB0230301D0 (en) * 2002-12-30 2003-02-05 Nokia Corp Streaming media
US7593915B2 (en) * 2003-01-07 2009-09-22 Accenture Global Services Gmbh Customized multi-media services
US7006945B2 (en) * 2003-01-10 2006-02-28 Sharp Laboratories Of America, Inc. Processing of video content
US7925770B1 (en) * 2003-01-29 2011-04-12 Realnetworks, Inc. Systems and methods for selecting buffering time for media data
US7548585B2 (en) * 2003-02-10 2009-06-16 At&T Intellectual Property I, L.P. Audio stream adaptive frequency scheme
US7630612B2 (en) * 2003-02-10 2009-12-08 At&T Intellectual Property I, L.P. Video stream adaptive frame rate scheme
US7383344B2 (en) * 2003-02-14 2008-06-03 Microsoft Corporation Remote encoder system and method for capturing the live presentation of video multiplexed with images
US20110181686A1 (en) * 2003-03-03 2011-07-28 Apple Inc. Flow control
US7395346B2 (en) * 2003-04-22 2008-07-01 Scientific-Atlanta, Inc. Information frame modifier
US7603689B2 (en) * 2003-06-13 2009-10-13 Microsoft Corporation Fast start-up for digital video streams
US7734568B2 (en) * 2003-06-26 2010-06-08 Microsoft Corporation DVD metadata wizard
US20040268400A1 (en) * 2003-06-26 2004-12-30 Microsoft Corporation Quick starting video content
US7054774B2 (en) * 2003-06-27 2006-05-30 Microsoft Corporation Midstream determination of varying bandwidth availability
US7391717B2 (en) 2003-06-30 2008-06-24 Microsoft Corporation Streaming of variable bit rate multimedia content
US20050013589A1 (en) * 2003-07-14 2005-01-20 Microsoft Corporation Adding recording functionality to a media player
KR100984257B1 (en) * 2003-07-14 2010-09-30 소니 주식회사 Display device and display method
US20070118812A1 (en) * 2003-07-15 2007-05-24 Kaleidescope, Inc. Masking for presenting differing display formats for media streams
KR100941139B1 (en) * 2003-09-15 2010-02-09 엘지전자 주식회사 Method for setting media streaming parameters on universal plug and play-based network
EP1665539B1 (en) 2003-10-06 2013-04-10 Digital Fountain, Inc. Soft-Decision Decoding of Multi-Stage Chain Reaction Codes
US7444419B2 (en) * 2003-10-10 2008-10-28 Microsoft Corporation Media stream scheduling for hiccup-free fast-channel-change in the presence of network chokepoints
US7562375B2 (en) * 2003-10-10 2009-07-14 Microsoft Corporation Fast channel change
JP4289129B2 (en) * 2003-11-18 2009-07-01 ヤマハ株式会社 Audio distribution system
US7430222B2 (en) * 2004-02-27 2008-09-30 Microsoft Corporation Media stream splicer
US7594245B2 (en) 2004-03-04 2009-09-22 Sharp Laboratories Of America, Inc. Networked video devices
US8356317B2 (en) 2004-03-04 2013-01-15 Sharp Laboratories Of America, Inc. Presence based technology
US8949899B2 (en) 2005-03-04 2015-02-03 Sharp Laboratories Of America, Inc. Collaborative recommendation system
US7162533B2 (en) 2004-04-30 2007-01-09 Microsoft Corporation Session description message extensions
US7418651B2 (en) 2004-05-07 2008-08-26 Digital Fountain, Inc. File download and streaming system
JP2005333478A (en) * 2004-05-20 2005-12-02 Mitsumi Electric Co Ltd Streaming content reproduction method and internet connecting device using the same
US8346605B2 (en) * 2004-06-07 2013-01-01 Sling Media, Inc. Management of shared media content
US7975062B2 (en) 2004-06-07 2011-07-05 Sling Media, Inc. Capturing and sharing media content
US8099755B2 (en) 2004-06-07 2012-01-17 Sling Media Pvt. Ltd. Systems and methods for controlling the encoding of a media stream
US7769756B2 (en) 2004-06-07 2010-08-03 Sling Media, Inc. Selection and presentation of context-relevant supplemental content and advertising
US7917932B2 (en) 2005-06-07 2011-03-29 Sling Media, Inc. Personal video recorder functionality for placeshifting systems
US9998802B2 (en) 2004-06-07 2018-06-12 Sling Media LLC Systems and methods for creating variable length clips from a media stream
EP1769399B1 (en) 2004-06-07 2020-03-18 Sling Media L.L.C. Personal media broadcasting system
US7571246B2 (en) 2004-07-29 2009-08-04 Microsoft Corporation Media transrating over a bandwidth-limited network
US7640352B2 (en) 2004-09-24 2009-12-29 Microsoft Corporation Methods and systems for presentation of media obtained from a media stream
US7240162B2 (en) 2004-10-22 2007-07-03 Stream Theory, Inc. System and method for predictive streaming
JP2008527468A (en) 2004-11-13 2008-07-24 ストリーム セオリー,インコーポレイテッド Hybrid local / remote streaming
US7477653B2 (en) * 2004-12-10 2009-01-13 Microsoft Corporation Accelerated channel change in rate-limited environments
US20060136389A1 (en) * 2004-12-22 2006-06-22 Cover Clay H System and method for invocation of streaming application
US20060184697A1 (en) * 2005-02-11 2006-08-17 Microsoft Corporation Detecting clock drift in networked devices through monitoring client buffer fullness
US9716609B2 (en) 2005-03-23 2017-07-25 Numecent Holdings, Inc. System and method for tracking changes to files in streaming applications
US8024523B2 (en) 2007-11-07 2011-09-20 Endeavors Technologies, Inc. Opportunistic block transmission with time constraints
EP1899814B1 (en) * 2005-06-30 2017-05-03 Sling Media, Inc. Firmware update for consumer electronic device
WO2007005789A2 (en) * 2005-06-30 2007-01-11 Sling Media, Inc. Screen management system for media player
US8074248B2 (en) 2005-07-26 2011-12-06 Activevideo Networks, Inc. System and method for providing video content associated with a source image to a television in a communication network
US8135040B2 (en) * 2005-11-30 2012-03-13 Microsoft Corporation Accelerated channel change
US9294728B2 (en) 2006-01-10 2016-03-22 Imagine Communications Corp. System and method for routing content
WO2007095550A2 (en) 2006-02-13 2007-08-23 Digital Fountain, Inc. Streaming and buffering using variable fec overhead and protection periods
US9270414B2 (en) 2006-02-21 2016-02-23 Digital Fountain, Inc. Multiple-field based code generator and decoder for communications systems
US8689253B2 (en) 2006-03-03 2014-04-01 Sharp Laboratories Of America, Inc. Method and system for configuring media-playing sets
US7971129B2 (en) 2006-05-10 2011-06-28 Digital Fountain, Inc. Code generator and decoder for communications systems operating using hybrid codes to allow for multiple efficient users of the communications systems
US9209934B2 (en) 2006-06-09 2015-12-08 Qualcomm Incorporated Enhanced block-request streaming using cooperative parallel HTTP and forward error correction
US9432433B2 (en) 2006-06-09 2016-08-30 Qualcomm Incorporated Enhanced block-request streaming system using signaling or block creation
US9386064B2 (en) 2006-06-09 2016-07-05 Qualcomm Incorporated Enhanced block-request streaming using URL templates and construction rules
US9380096B2 (en) 2006-06-09 2016-06-28 Qualcomm Incorporated Enhanced block-request streaming system for handling low-latency streaming
US9178535B2 (en) 2006-06-09 2015-11-03 Digital Fountain, Inc. Dynamic stream interleaving and sub-stream based delivery
US9419749B2 (en) 2009-08-19 2016-08-16 Qualcomm Incorporated Methods and apparatus employing FEC codes with permanent inactivation of symbols for encoding and decoding processes
EP1879392A3 (en) * 2006-07-11 2011-11-02 Magix Ag System and method for dynamically creating online multimedia slideshows
US7831727B2 (en) * 2006-09-11 2010-11-09 Apple Computer, Inc. Multi-content presentation of unassociated content types
US8180920B2 (en) * 2006-10-13 2012-05-15 Rgb Networks, Inc. System and method for processing content
US8261345B2 (en) 2006-10-23 2012-09-04 Endeavors Technologies, Inc. Rule-based application access management
JP4360401B2 (en) * 2006-12-05 2009-11-11 Seiko Epson Corporation Content reproduction system, content reproduction device used for the same, content reproduction method, and computer program
US8020100B2 (en) 2006-12-22 2011-09-13 Apple Inc. Fast creation of video segments
US8943410B2 (en) * 2006-12-22 2015-01-27 Apple Inc. Modified media presentation during scrubbing
US7992097B2 (en) 2006-12-22 2011-08-02 Apple Inc. Select drag and drop operations on video thumbnails across clip boundaries
WO2008084179A1 (en) * 2007-01-08 2008-07-17 Nds Limited Buffer management
US9826197B2 (en) 2007-01-12 2017-11-21 Activevideo Networks, Inc. Providing television broadcasts over a managed network and interactive content over an unmanaged network to a client device
EP3145200A1 (en) 2007-01-12 2017-03-22 ActiveVideo Networks, Inc. Mpeg objects and systems and methods for using mpeg objects
JP2007220134A (en) * 2007-03-30 2007-08-30 Fujitsu Ltd License transfer device, storage medium and license transfer method
JP2007188530A (en) * 2007-03-30 2007-07-26 Fujitsu Ltd License transfer device, storage medium, and medium reading method
US20080256485A1 (en) * 2007-04-12 2008-10-16 Jason Gary Krikorian User Interface for Controlling Video Programs on Mobile Computing Devices
US8400913B2 (en) * 2007-05-23 2013-03-19 Microsoft Corporation Method for optimizing near field links
US8627509B2 (en) 2007-07-02 2014-01-07 Rgb Networks, Inc. System and method for monitoring content
US8230100B2 (en) * 2007-07-26 2012-07-24 Realnetworks, Inc. Variable fidelity media provision system and method
WO2009020640A2 (en) * 2007-08-08 2009-02-12 Swarmcast, Inc. Media player plug-in installation techniques
CA2697764A1 (en) 2007-09-12 2009-03-19 Steve Chen Generating and communicating source identification information to enable reliable communications
US8477793B2 (en) 2007-09-26 2013-07-02 Sling Media, Inc. Media streaming device with gateway functionality
US8635360B2 (en) * 2007-10-19 2014-01-21 Google Inc. Media playback point seeking using data range requests
US8350971B2 (en) 2007-10-23 2013-01-08 Sling Media, Inc. Systems and methods for controlling media devices
US8892738B2 (en) 2007-11-07 2014-11-18 Numecent Holdings, Inc. Deriving component statistics for a stream enabled application
US8543720B2 (en) * 2007-12-05 2013-09-24 Google Inc. Dynamic bit rate scaling
US8060609B2 (en) 2008-01-04 2011-11-15 Sling Media Inc. Systems and methods for determining attributes of media items accessed via a personal media broadcaster
US8667279B2 (en) 2008-07-01 2014-03-04 Sling Media, Inc. Systems and methods for securely place shifting media content
US20100001960A1 (en) * 2008-07-02 2010-01-07 Sling Media, Inc. Systems and methods for gestural interaction with user interface objects
US8381310B2 (en) 2009-08-13 2013-02-19 Sling Media Pvt. Ltd. Systems, methods, and program applications for selectively restricting the placeshifting of copy protected digital media content
US20100064332A1 (en) * 2008-09-08 2010-03-11 Sling Media Inc. Systems and methods for presenting media content obtained from multiple sources
US8667163B2 (en) 2008-09-08 2014-03-04 Sling Media Inc. Systems and methods for projecting images from a computer system
US9473812B2 (en) * 2008-09-10 2016-10-18 Imagine Communications Corp. System and method for delivering content
US9247276B2 (en) * 2008-10-14 2016-01-26 Imagine Communications Corp. System and method for progressive delivery of media content
EP2180708A1 (en) * 2008-10-22 2010-04-28 TeliaSonera AB Method for streaming media playback and terminal device
US9191610B2 (en) 2008-11-26 2015-11-17 Sling Media Pvt Ltd. Systems and methods for creating logical media streams for media storage and playback
US8438602B2 (en) 2009-01-26 2013-05-07 Sling Media Inc. Systems and methods for linking media content
US8775665B2 (en) * 2009-02-09 2014-07-08 Citrix Systems, Inc. Method for controlling download rate of real-time streaming as needed by media player
US9281847B2 (en) 2009-02-27 2016-03-08 Qualcomm Incorporated Mobile reception of digital video broadcasting—terrestrial services
US8171148B2 (en) 2009-04-17 2012-05-01 Sling Media, Inc. Systems and methods for establishing connections between devices communicating over a network
US8499059B2 (en) * 2009-05-04 2013-07-30 Rovi Solutions Corporation System and methods for buffering of real-time data streams
WO2010141460A1 (en) 2009-06-01 2010-12-09 Swarmcast, Inc. Data retrieval based on bandwidth cost and delay
US8406431B2 (en) 2009-07-23 2013-03-26 Sling Media Pvt. Ltd. Adaptive gain control for digital audio samples in a media stream
US9479737B2 (en) 2009-08-06 2016-10-25 Echostar Technologies L.L.C. Systems and methods for event programming via a remote media player
US9525838B2 (en) 2009-08-10 2016-12-20 Sling Media Pvt. Ltd. Systems and methods for virtual remote control of streamed media
US8532472B2 (en) 2009-08-10 2013-09-10 Sling Media Pvt Ltd Methods and apparatus for fast seeking within a media stream buffer
US9565479B2 (en) 2009-08-10 2017-02-07 Sling Media Pvt Ltd. Methods and apparatus for seeking within a media stream using scene detection
US8966101B2 (en) 2009-08-10 2015-02-24 Sling Media Pvt Ltd Systems and methods for updating firmware over a network
US8799408B2 (en) * 2009-08-10 2014-08-05 Sling Media Pvt Ltd Localization systems and methods
US9288010B2 (en) 2009-08-19 2016-03-15 Qualcomm Incorporated Universal file delivery methods for providing unequal error protection and bundled file delivery services
US9160974B2 (en) 2009-08-26 2015-10-13 Sling Media, Inc. Systems and methods for transcoding and place shifting media content
US8314893B2 (en) 2009-08-28 2012-11-20 Sling Media Pvt. Ltd. Remote control and method for automatically adjusting the volume output of an audio device
US9917874B2 (en) 2009-09-22 2018-03-13 Qualcomm Incorporated Enhanced block-request streaming using block partitioning or request controls for improved client-side handling
US9015225B2 (en) 2009-11-16 2015-04-21 Echostar Technologies L.L.C. Systems and methods for delivering messages over a network
US8799485B2 (en) 2009-12-18 2014-08-05 Sling Media, Inc. Methods and apparatus for establishing network connections using an inter-mediating device
US8626879B2 (en) 2009-12-22 2014-01-07 Sling Media, Inc. Systems and methods for establishing network connections using local mediation services
US9178923B2 (en) 2009-12-23 2015-11-03 Echostar Technologies L.L.C. Systems and methods for remotely controlling a media server via a network
US9275054B2 (en) 2009-12-28 2016-03-01 Sling Media, Inc. Systems and methods for searching media content
US20110191456A1 (en) * 2010-02-03 2011-08-04 Sling Media Pvt Ltd Systems and methods for coordinating data communication between two devices
US8856349B2 (en) * 2010-02-05 2014-10-07 Sling Media Inc. Connection priority services for data communication between two devices
US20110208506A1 (en) * 2010-02-24 2011-08-25 Sling Media Inc. Systems and methods for emulating network-enabled media components
US8401370B2 (en) * 2010-03-09 2013-03-19 Dolby Laboratories Licensing Corporation Application tracks in audio/video containers
DE102010011098A1 (en) * 2010-03-11 2011-11-17 Daimler Ag Audio and video data playback device installed in motor car, has buffer unit in which playback velocity of audio and video is reduced in relation to normal velocity, until preset quantity of audio and video is stored in buffer unit
US9485546B2 (en) 2010-06-29 2016-11-01 Qualcomm Incorporated Signaling video samples for trick mode video representations
US8918533B2 (en) 2010-07-13 2014-12-23 Qualcomm Incorporated Video switching for streaming video data
US9185439B2 (en) 2010-07-15 2015-11-10 Qualcomm Incorporated Signaling data for multiplexing video components
US9596447B2 (en) 2010-07-21 2017-03-14 Qualcomm Incorporated Providing frame packing type information for video coding
US9456015B2 (en) 2010-08-10 2016-09-27 Qualcomm Incorporated Representation groups for network streaming of coded multimedia data
KR20130138263A (en) 2010-10-14 2013-12-18 액티브비디오 네트웍스, 인코포레이티드 Streaming digital video between video devices using a cable television system
US9270299B2 (en) 2011-02-11 2016-02-23 Qualcomm Incorporated Encoding and decoding using elastic codes with flexible source block mapping
US8958375B2 (en) 2011-02-11 2015-02-17 Qualcomm Incorporated Framing for an improved radio link protocol including FEC
EP2695388B1 (en) 2011-04-07 2017-06-07 ActiveVideo Networks, Inc. Reduction of latency in video distribution networks using adaptive bit rates
US9253233B2 (en) 2011-08-31 2016-02-02 Qualcomm Incorporated Switch signaling methods providing improved switching between representations for adaptive HTTP streaming
US9843844B2 (en) 2011-10-05 2017-12-12 Qualcomm Incorporated Network streaming of media data
US10409445B2 (en) 2012-01-09 2019-09-10 Activevideo Networks, Inc. Rendering of an interactive lean-backward user interface on a television
US9489827B2 (en) 2012-03-12 2016-11-08 Cisco Technology, Inc. System and method for distributing content in a video surveillance network
US9294226B2 (en) 2012-03-26 2016-03-22 Qualcomm Incorporated Universal object delivery and template-based file delivery
US9800945B2 (en) 2012-04-03 2017-10-24 Activevideo Networks, Inc. Class-based intelligent multiplexing over unmanaged networks
US9123084B2 (en) 2012-04-12 2015-09-01 Activevideo Networks, Inc. Graphical application integration with MPEG objects
US9049349B2 (en) * 2012-05-16 2015-06-02 Cisco Technology, Inc. System and method for video recording and retention in a network
US9264288B1 (en) 2012-10-10 2016-02-16 Cisco Technology, Inc. Identifying media network flows that use dynamic codec identifications
WO2014145921A1 (en) 2013-03-15 2014-09-18 Activevideo Networks, Inc. A multiple-mode system and method for providing user selectable video content
US20140338516A1 (en) * 2013-05-19 2014-11-20 Michael J. Andri State driven media playback rate augmentation and pitch maintenance
US9326047B2 (en) 2013-06-06 2016-04-26 Activevideo Networks, Inc. Overlay rendering of user interface onto source video
US9219922B2 (en) 2013-06-06 2015-12-22 Activevideo Networks, Inc. System and method for exploiting scene graph information in construction of an encoded video sequence
US9294785B2 (en) 2013-06-06 2016-03-22 Activevideo Networks, Inc. System and method for exploiting scene graph information in construction of an encoded video sequence
US10021205B2 (en) * 2013-10-22 2018-07-10 Salesforce.Com, Inc. Rules-based multipoint routing of real-time information using client-server architecture
US9788029B2 (en) 2014-04-25 2017-10-10 Activevideo Networks, Inc. Intelligent multiplexing using class-based, multi-dimensioned decision logic for managed networks
US10536790B2 (en) * 2015-03-12 2020-01-14 StarTime Software Technology Co., Ltd. Location based services audio system
US11184665B2 (en) 2018-10-03 2021-11-23 Qualcomm Incorporated Initialization set for network streaming of media data
CN110798458B (en) * 2019-10-22 2022-05-06 潍坊歌尔微电子有限公司 Data synchronization method, device, equipment and computer readable storage medium
CN111050208A (en) * 2019-12-23 2020-04-21 深圳市豪恩汽车电子装备股份有限公司 Real-time monitoring video playing device and method for motor vehicle
US11800179B2 (en) * 2020-12-03 2023-10-24 Alcacruz Inc. Multiview video with one window based on another

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0624341B2 (en) * 1986-12-18 1994-03-30 三菱電機株式会社 Multimedia data transmission system
US5208745A (en) * 1988-07-25 1993-05-04 Electric Power Research Institute Multimedia interface and method for computer system
US5109482A (en) * 1989-01-11 1992-04-28 David Bohrman Interactive video control system for displaying user-selectable clips
JP2865782B2 (en) * 1990-03-16 1999-03-08 富士通株式会社 CODEC device for asynchronous transmission
DE69122147T2 (en) * 1990-03-16 1997-01-30 Hewlett Packard Co Method and device for clipping pixels from source and target windows in a graphic system
US5237648A (en) * 1990-06-08 1993-08-17 Apple Computer, Inc. Apparatus and method for editing a video recording by selecting and displaying video clips
US5191645A (en) * 1991-02-28 1993-03-02 Sony Corporation Of America Digital signal processing system employing icon displays
US5159447A (en) * 1991-05-23 1992-10-27 At&T Bell Laboratories Buffer control for variable bit-rate channel
CA2078714A1 (en) * 1991-12-20 1993-06-21 Robert A. Pascoe Automated audio/visual presentation
US5287182A (en) * 1992-07-02 1994-02-15 At&T Bell Laboratories Timing recovery for variable bit-rate video on asynchronous transfer mode (ATM) networks
NZ268754A (en) * 1993-06-09 1998-07-28 Intelligence At Large Inc Prioritised packetised data for multiple media digital communication

Also Published As

Publication number Publication date
CA2140850A1 (en) 1995-08-25
EP0669587A2 (en) 1995-08-30
EP0669587A3 (en) 1997-09-24
US5822537A (en) 1998-10-13

Similar Documents

Publication Publication Date Title
CA2140850C (en) Networked system for display of multimedia presentations
US7167191B2 (en) Techniques for capturing information during multimedia presentations
US6789228B1 (en) Method and system for the storage and retrieval of web-based education materials
US7653925B2 (en) Techniques for receiving information during multimedia presentations and communicating the information
US6697569B1 (en) Automated conversion of a visual presentation into digital data format
US8281230B2 (en) Techniques for storing multimedia information with source documents
US6449653B2 (en) Interleaved multiple multimedia stream for synchronized transmission over a computer network
US6877134B1 (en) Integrated data and real-time metadata capture system and method
US8918708B2 (en) Enhanced capture, management and distribution of live presentations
CA2198699C (en) System and method for recording and playing back multimedia events
US20020036694A1 (en) Method and system for the storage and retrieval of web-based educational materials
EP1024444B1 (en) Image information describing method, video retrieval method, video reproducing method, and video reproducing apparatus
US6804295B1 (en) Conversion of video and audio to a streaming slide show
US20150381685A1 (en) System and Method for Capturing, Editing, Searching, and Delivering Multi-Media Content with Local and Global Time
WO1994018776A2 (en) Multimedia distribution system
US11437072B2 (en) Recording presentations using layered keyframes
Katseff et al. Predictive prefetch in the Nemesis multimedia information service
EP0895617A1 (en) A method and system for synchronizing and navigating multiple streams of isochronous and non-isochronous data
JP2009225116A (en) Video recording device with network transmission function
Hjelsvold Video information contents and architecture
Amir et al. Automatic generation of conference video proceedings
US20020054026A1 (en) Synchronized transmission of recorded writing data with audio
Zhang et al. Implementation of video presentation in database systems
Song et al. PVCAIS: A personal videoconference archive indexing system
Megzari et al. A distributed platform for interactive multimedia

Legal Events

Date Code Title Description
EEER Examination request
MKLA Lapsed