WO2000016221A9 - Interactive playlist generation using annotations - Google Patents

Interactive playlist generation using annotations

Info

Publication number
WO2000016221A9
Authority
WO
WIPO (PCT)
Prior art keywords
annotations
media
annotation
user
computer
Prior art date
Application number
PCT/US1999/021391
Other languages
French (fr)
Other versions
WO2000016221A1 (en)
Inventor
Anoop Gupta
David M Bargeron
Original Assignee
Microsoft Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Microsoft Corp
Priority to AU59264/99A
Publication of WO2000016221A1
Publication of WO2000016221A9

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/10Office automation; Time management
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/40Information retrieval; Database structures therefor; File system structures therefor of multimedia data, e.g. slideshows comprising image and additional audio data
    • G06F16/43Querying
    • G06F16/438Presentation of query results
    • G06F16/4387Presentation of query results by the use of playlists
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/40Information retrieval; Database structures therefor; File system structures therefor of multimedia data, e.g. slideshows comprising image and additional audio data
    • G06F16/48Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/70Information retrieval; Database structures therefor; File system structures therefor of video data
    • G06F16/74Browsing; Visualisation therefor
    • G06F16/748Hypervideo
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/70Information retrieval; Database structures therefor; File system structures therefor of video data
    • G06F16/78Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F16/7867Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using information manually generated, e.g. tags, keywords, comments, title and artist information, manually generated time, location and usage information, user ratings
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F40/00Handling natural language data
    • G06F40/10Text processing
    • G06F40/166Editing, e.g. inserting or deleting
    • G06F40/169Annotation, e.g. comment data or footnotes
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y10TECHNICAL SUBJECTS COVERED BY FORMER USPC
    • Y10STECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y10S707/00Data processing: database and file management or data structures
    • Y10S707/953Organization of data
    • Y10S707/956Hierarchical
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y10TECHNICAL SUBJECTS COVERED BY FORMER USPC
    • Y10STECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y10S707/00Data processing: database and file management or data structures
    • Y10S707/99931Database or file accessing

Definitions

  • This invention relates to networked client/server systems and to methods of delivering and rendering multimedia content in such systems. More particularly, the invention relates to systems and methods of selecting and providing such content.
  • Synchronized media means multiple media objects that share a common timeline.
  • Video and audio are examples of synchronized media — each is a separate data stream with its own data structure, but the two data streams are played back in synchronization with each other.
  • Virtually any media type can have a timeline.
  • an image object can change like an animated .gif file, text can change and move, and animation and digital effects can happen over time. This concept of synchronizing multiple media types is gaining greater meaning and currency with the emergence of more sophisticated media composition frameworks implied by MPEG-4, Dynamic HTML, and other media playback environments.
  • streaming is used to indicate that the data representing the various media types is provided over a network to a client computer on a real-time, as-needed basis, rather than being pre-delivered in its entirety before playback.
  • client computer renders streaming data as it is received from a network server, rather than waiting for an entire "file" to be delivered.
  • Multimedia presentations may also include "annotations" relating to the multimedia presentation.
  • An annotation is data (e.g., audio, text, video, etc.) that corresponds to a multimedia presentation.
  • Annotations can be added by anyone with appropriate access rights to the annotation system (e.g., the lecturer/trainer or any of the students/trainees).
  • These annotations typically correspond to a particular temporal location in the multimedia presentation and can provide a replacement for much of the "in-person" interaction and "classroom discussion" that is lost when the presentation is not made "in-person" or "live".
  • a student can comment on a particular point, to which another student (or lecturer) can respond in a subsequent annotation. This process can continue, allowing a "classroom discussion" to occur via these annotations.
  • some systems allow a user to select a particular one of these annotations and begin playback of the presentation starting at approximately the point in the presentation to which the annotation corresponds.
  • the invention described below addresses this and other disadvantages of annotations, providing a way to improve multimedia presentation using annotations.
  • Annotations correspond to media segments of one or more multimedia streams.
  • a playlist generation interface is presented to the user in the form of annotation titles or summaries for a group of annotations.
  • This group of annotations corresponds to the media segments that are part of a playlist.
  • the playlist can then be altered by the user to suit his or her desires or needs by interacting with the annotation title/summary interface.
  • the media segments of the playlist can then be presented to the user in a seamless, contiguous manner.
  • the ordering of the annotation titles/summaries can be altered by the user, resulting in a corresponding change in order of presentation of the media segments.
  • the ordering of the annotation titles/summaries can be changed by moving the titles or summaries in a drag and drop manner.
  • the media segments of the playlist can themselves be stored as an additional multimedia stream.
  • This additional multimedia stream can then be annotated in the same manner as other multimedia streams.
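The interactive flow described above can be sketched as a small data model: a playlist built from user-selected annotations, where reordering the annotation titles reorders the underlying media segments. This is an illustrative sketch only; the class and function names are assumptions, not taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class Annotation:
    title: str       # the annotation title shown in the playlist interface
    stream_id: str   # which media stream the segment belongs to
    begin: float     # segment start, seconds into the stream
    end: float       # segment end, seconds into the stream

def build_playlist(annotations):
    """Each selected annotation contributes its media segment, in listing order."""
    return [(a.stream_id, a.begin, a.end) for a in annotations]

# Three annotations selected by the user.
notes = [
    Annotation("Intro question", "lecture-1", 60, 240),
    Annotation("Follow-up", "lecture-1", 120, 420),
    Annotation("Clarification", "lecture-1", 120, 180),
]

# Drag-and-drop reordering of titles is just reordering the list;
# the presentation order of the media segments changes correspondingly.
notes.insert(0, notes.pop(2))
playlist = build_playlist(notes)
```

Presenting `playlist` then amounts to streaming each `(stream_id, begin, end)` segment back to back.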
  • Fig. 1 shows a client/server network system and environment in accordance with one embodiment of the invention.
  • Fig. 2 shows a general example of a computer that can be used as a client or server in accordance with the invention.
  • Fig. 3 is a block diagram illustrating an annotation server and a client computer in more detail in accordance with one embodiment of the invention.
  • Fig. 4 is a block diagram illustrating the structure for an annotation according to one embodiment of the invention.
  • Fig. 5 is a block diagram illustrating exemplary annotation collections.
  • Fig. 6 illustrates an annotation toolbar in accordance with one embodiment of the invention.
  • Fig. 7 illustrates an "add new annotation" dialog box in accordance with one embodiment of the invention.
  • Fig. 8 illustrates a "query annotations" dialog box in accordance with one embodiment of the invention.
  • Fig. 9 illustrates a "view annotations" dialog box in accordance with one embodiment of the invention.
  • Fig. 10 is a diagrammatic illustration of a graphical user interface window displaying annotations and corresponding media segments concurrently in accordance with one embodiment of the invention.
  • Fig. 11 illustrates methodological aspects of one embodiment of the invention in retrieving and presenting annotations and media segments to a user.
  • Fig. 1 shows a client/server network system and environment in accordance with one embodiment of the invention.
  • the system includes multiple network server computers 10, 11, 12, and 13, and multiple (n) network client computers 15.
  • the computers communicate with each other over a data communications network.
  • the communications network in Fig. 1 comprises a public network 16 such as the Internet.
  • the data communications network might also include, either in addition to or in place of the Internet, local-area networks and/or private wide-area networks.
  • Streaming media server computer 11 has access to streaming media content in the form of different media streams.
  • These media streams can be individual media streams (e.g., audio, video, graphical, etc.), or alternatively can be composite media streams including two or more of such individual streams.
  • Some media streams might be stored as files in a database or other file storage system, while other media streams might be supplied to the server on a "live" basis from other data source components through dedicated communications channels or through the Internet itself.
  • Annotation server 10 controls the storage of annotations and their provision to client computers 15.
  • the annotation server 10 manages the annotation meta data store 18 and the annotation content store 17.
  • the annotation server 10 communicates with the client computers 15 via any of a wide variety of known protocols, such as the Hypertext Transfer Protocol (HTTP).
  • the annotation server 10 can receive and provide annotations via direct contact with a client computer 15, or alternatively via electronic mail (email) via email server 13.
  • the annotation server 10 similarly communicates with the email server 13 via any of a wide variety of known protocols, such as the Simple Mail Transfer Protocol (SMTP).
  • annotation server 10 corresponds to the streaming media available from media server computer 11.
  • annotations are discussed as corresponding to streaming media.
  • annotations can similarly correspond to "pre- delivered" rather than streaming media, such as media previously stored at the client computers 15 via the network 16, via removable magnetic or optical disks, etc.
  • a conventional web browser of the client computer 15 contacts the web server 12 to get the Hypertext Markup Language (HTML) page, the media server 11 to get the streaming data, and the annotation server 10 to get any annotations associated with that media.
  • the client computer 15 desires to add or retrieve annotations
  • the client computer 15 contacts the annotation server 10 to perform the desired addition/retrieval.
  • program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types.
  • program modules may be located in both local and remote memory storage devices.
  • Fig. 2 shows a general example of a computer 20 that can be used as a client or server in accordance with the invention.
  • Computer 20 is shown as an example of a computer that can perform the functions of any of server computers 10-13 or a client computer 15 of Figure 1.
  • Computer 20 includes one or more processors or processing units 21, a system memory 22, and a bus 23 that couples various system components including the system memory 22 to processors 21.
  • the bus 23 represents one or more of any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, and a processor or local bus using any of a variety of bus architectures.
  • the system memory includes read only memory (ROM) 24 and random access memory (RAM) 25.
  • Computer 20 further includes a hard disk drive 27 for reading from and writing to a hard disk, not shown, a magnetic disk drive 28 for reading from and writing to a removable magnetic disk 29, and an optical disk drive 30 for reading from or writing to a removable optical disk 31 such as a CD ROM or other optical media.
  • the hard disk drive 27, magnetic disk drive 28, and optical disk drive 30 are connected to the system bus 23 by an SCSI interface 32 or some other appropriate interface.
  • the drives and their associated computer-readable media provide nonvolatile storage of computer readable instructions, data structures, program modules and other data for computer 20.
  • While the exemplary environment described herein employs a hard disk, a removable magnetic disk 29, and a removable optical disk 31, it should be appreciated by those skilled in the art that other types of computer readable media which can store data that is accessible by a computer, such as magnetic cassettes, flash memory cards, digital video disks, random access memories (RAMs), read only memories (ROMs), and the like, may also be used in the exemplary operating environment.
  • a number of program modules may be stored on the hard disk, magnetic disk 29, optical disk 31, ROM 24, or RAM 25, including an operating system 35, one or more application programs 36, other program modules 37, and program data 38.
  • a user may enter commands and information into computer 20 through input devices such as keyboard 40 and pointing device 42.
  • Other input devices may include a microphone, joystick, game pad, satellite dish, scanner, or the like.
  • These and other input devices are connected to the processing unit 21 through an interface 46 that is coupled to the system bus.
  • a monitor 47 or other type of display device is also connected to the system bus 23 via an interface, such as a video adapter 48.
  • personal computers typically include other peripheral output devices (not shown) such as speakers and printers.
  • Computer 20 operates in a networked environment using logical connections to one or more remote computers, such as a remote computer 49.
  • the remote computer 49 may be another personal computer, a server, a router, a network PC, a peer device or other common network node, and typically includes many or all of the elements described above relative to computer 20, although only a memory storage device 50 has been illustrated in Fig. 2.
  • the logical connections depicted in Fig. 2 include a local area network (LAN) 51 and a wide area network (WAN) 52.
  • Such networking environments are commonplace in offices, enterprise- wide computer networks, intranets, and the Internet.
  • remote computer 49 executes an Internet Web browser program such as the "Internet Explorer" Web browser manufactured and distributed by Microsoft Corporation of Redmond, Washington.
  • computer 20 When used in a LAN networking environment, computer 20 is connected to the local network 51 through a network interface or adapter 53. When used in a WAN networking environment, computer 20 typically includes a modem 54 or other means for establishing communications over the wide area network 52, such as the Internet.
  • the modem 54 which may be internal or external, is connected to the system bus 23 via a serial port interface 33.
  • program modules depicted relative to the personal computer 20, or portions thereof may be stored in the remote memory storage device. It will be appreciated that the network connections shown are exemplary and other means of establishing a communications link between the computers may be used.
  • the data processors of computer 20 are programmed by means of instructions stored at different times in the various computer-readable storage media of the computer.
  • Programs and operating systems are typically distributed, for example, on floppy disks or CD-ROMs. From there, they are installed or loaded into the secondary memory of a computer. At execution, they are loaded at least partially into the computer's primary electronic memory.
  • the invention described herein includes these and other various types of computer-readable storage media when such media contain instructions or programs for implementing the steps described below in conjunction with a microprocessor or other data processor.
  • the invention also includes the computer itself when programmed according to the methods and techniques described below.
  • certain sub-components of the computer may be programmed to perform the functions and steps described below. The invention includes such sub-components when they are programmed as described.
  • the invention described herein includes data structures, described below, as embodied on various types of memory media.
  • programs and other executable program components such as the operating system are illustrated herein as discrete blocks, although it is recognized that such programs and components reside at various times in different storage components of the computer, and are executed by the data processor(s) of the computer.
  • Client/Server Relationship: Fig. 3 illustrates an annotation server and a client computer in more detail.
  • commands are formulated at client computer 15 and forwarded to annotation server 10 via HTTP requests.
  • communication between client 15 and server 10 is performed via HTTP, using commands encoded as Uniform Resource Locators (URLs) and data formatted as object linking and embedding (OLE) structured storage documents, or alternatively using Extensible Markup Language (XML).
  • Client 15 includes an HTTP services (HttpSvcs) module 152, which manages communication with server 10, and an annotation back end (ABE) module 151, which translates user actions into commands destined for server 10.
  • a user interface (MMA) module 150 provides the user interface (UI) for a user to add and select different annotations, and be presented with the annotations.
  • the user interface module 150 supports ActiveX controls that display an annotation interface for streaming video on the Web.
  • Client 15 also includes a web browser module 153, which provides a conventional web browsing interface and capabilities for the user to access various servers via network 16 of Fig. 1.
  • Web browser 153 also provides the interface for a user to be presented with media streams.
  • the user can select which one of different versions of multimedia content he or she wishes to receive from media server 11 of Fig. 1. This selection can be direct (e.g., entry of a particular URL or selection of a "low resolution" option), or indirect (e.g., entry of a particular desired playback duration or an indication of system capabilities, such as "slow system" or "fast system"). Alternatively, other media presentation interfaces could be used.
  • Annotation server 10 includes the Multimedia Annotation Web Server (MAWS) module 130, which is an Internet Services Application Programming Interface (ISAPI) plug-in for Internet Information Server (IIS) module 135. Together, these two modules provide the web server functionality of annotation server 10.
  • Annotation server 10 also includes an HTTP Services module 131 which manages communication with client 15.
  • annotation server 10 utilizes The Windows Messaging Subsystem 134 to facilitate communication with email server 13 of Fig. 1, and an email reply server 133 for processing incoming email received from email server 13.
  • Annotation server 10 further includes an annotation back end (ABE) module 132, which contains functionality for accessing annotation stores 17 and 18, for composing outgoing email based on annotation data, and for processing incoming email.
  • Incoming email is received and passed to the ABE module 132 by the Email Reply Server 133.
  • Annotation content authored at client 15, using user interface 150 is received by ABE 132 and maintained in annotation content store 17.
  • Received meta data (control information) corresponding to the annotation content is maintained in annotation meta data store 18.
  • the annotation content and meta data can be stored in any of a variety of conventional manners, such as in SQL relational databases (e.g., using Microsoft "SQL Server” version 7.0, available from Microsoft Corporation).
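As one illustration of such relational storage: the patent names Microsoft SQL Server 7.0, but the same meta data layout can be sketched with any SQL engine. The sketch below uses SQLite purely for brevity, and the table and column names are assumptions mapped onto the fields of Fig. 4, not names from the patent.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE annotation_meta (
        id          TEXT PRIMARY KEY, -- annotation identifier (field 194)
        author      TEXT,             -- author (field 182)
        begin_time  REAL,             -- time range begin (field 184)
        end_time    REAL,             -- time range end (field 184)
        time_units  TEXT,             -- time units (field 186)
        created     TEXT,             -- absolute creation time (field 188)
        title       TEXT,             -- title (field 190)
        content_ref TEXT,             -- pointer into the content store (field 192)
        related_id  TEXT,             -- related annotation identifier (field 196)
        set_ids     TEXT,             -- set identifier(s) (field 198)
        media_id    TEXT,             -- media content identifier (field 200)
        seq         INTEGER           -- sequence number (field 204)
    )
""")
conn.execute(
    "INSERT INTO annotation_meta (id, author, begin_time, end_time, title, media_id, seq) "
    "VALUES ('a1', 'student1', 60.0, 240.0, 'Intro question', 'lecture-1', 1)"
)
row = conn.execute(
    "SELECT title, media_id FROM annotation_meta WHERE id = 'a1'"
).fetchone()
```

Note that, as in the patent, the annotation content itself is not stored in this table; `content_ref` only points into the separate content store.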
  • Although annotation server 10 is illustrated in Fig. 3 as including separate meta data and content stores, the annotation data can be stored together in a single store, or content may be stored by another distinct storage system on the network 16 of Fig. 1, such as a file system, media server, email server, or other data store.
  • ABE 132 of annotation server 10 also manages the interactive generation and presentation of streaming media data from server computer 11 of Fig. 1 using "playlists".
  • a "playlist" is a listing of one or more multimedia segments to be retrieved and presented in a given order.
  • Each of the multimedia segments in the playlist is defined by a source identifier, a start time, and an end time.
  • the source identifier identifies which media stream the segment is part of, the start time identifies the temporal location within the media stream where the segment begins, and the end time identifies the temporal location within the media stream where the segment ends.
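Each playlist entry is thus a (source identifier, start time, end time) triple. A minimal sketch of that definition, with illustrative names and a total-duration calculation added for concreteness:

```python
def segment_duration(source_id, start, end):
    """Duration of one playlist entry, defined by its (source, start, end) triple."""
    if end <= start:
        raise ValueError("segment end time must follow its start time")
    return end - start

# Two segments drawn from different media streams, times in seconds.
playlist = [("lecture-1", 60, 240), ("lecture-2", 0, 120)]

# Seamless presentation plays 180 s from the first stream, then 120 s
# from the second, for 300 s of contiguous playback.
total = sum(segment_duration(*seg) for seg in playlist)
```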
  • ABE 132 allows playlists to be generated interactively based on annotations maintained in annotation stores 17 and 18.
  • ABE 132 provides a user at client 15 with multiple possible annotation identifiers (e.g., titles or summaries) from which the user can select those of interest to him or her. Based on the selected annotations, ABE 132 coordinates provision of the associated media segments to the user.
  • ABE 132 can directly communicate with video server computer 11 to identify which segments are to be provided, or alternatively can provide the appropriate information to the browser of client computer 15, which in turn can request the media segments from server computer 11.
  • Fig. 4 shows an exemplary structure for an annotation entry 180 that is maintained by annotation server 10 in annotation meta data store 18 of Fig. 3.
  • an annotation entry 180 includes an author field 182, a time range field 184, a time units field 186, a creation time field 188, a title field 190, a content field 192, an identifier field 194, a related annotation identifier field 196, a set identifier(s) field 198, a media content identifier field 200, an arbitrary number of user-defined property fields 202, and a sequence number 204.
  • Each of fields 182-204 is a collection of data which define a particular characteristic of annotation entry 180.
  • Annotation entry 180 is maintained by annotation server 10 of Fig. 3 in annotation meta data store 18.
  • Content field 192 includes a pointer to (or other identifier of) the annotation content, which in turn is stored in annotation content store 17.
  • Author field 182 contains data identifying the user who created annotation entry 180 and who is therefore the author of the annotation. The author is identified by ABE 151 of Fig. 3 based on the user logged into client 15 at the time the annotation is created.
  • Time range field 184 contains data representing "begin" and "end" times defining a segment of media timeline to which annotation entry 180 is associated.
  • Time units field 186 contains data representing the units of time represented in time range field 184. Together, time range field 184 and time units field 186 identify the relative time range of the annotation represented by annotation entry 180. This relative time range corresponds to a particular segment of the media stream to which annotation entry 180 is associated.
  • the begin and end times for the annotation are provided by the user via interface 150 of Fig. 3, or alternatively can be automatically or implicitly derived using a variety of audio and video signal processing techniques, such as sentence detection in audio streams or video object tracking.
  • a first annotation may correspond to a segment ranging between the first and fourth minutes of media content
  • a second annotation may correspond to a segment ranging between the second and seventh minutes of the media content
  • a third annotation may correspond to a segment ranging between the second and third minutes of the media content.
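Because such time ranges may overlap, a player can ask which annotations are active at a given playback instant. A minimal sketch using the three example ranges above (times in minutes; the function name is illustrative):

```python
# (begin, end) minute ranges of the three example annotations.
ranges = {
    "first":  (1, 4),
    "second": (2, 7),
    "third":  (2, 3),
}

def active_at(minute):
    """Names of annotations whose time range covers the given playback minute."""
    return sorted(name for name, (begin, end) in ranges.items()
                  if begin <= minute <= end)

# At minute 2.5 all three ranges overlap; by minute 5 only the second remains.
```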
  • annotations could be associated with (or "anchored” on) specific objects in the video content, or specific events in the audio content.
  • Creation time field 188 contains data specifying the date and time at which annotation entry 180 is created. It should be noted that the time of creation of annotation entry 180 is absolute and is not relative to the video or audio content of the media stream to which annotation entry 180 is associated. Accordingly, a user can specify that annotations which are particularly old, e.g., created more than two weeks earlier, are not to be displayed.
  • ABE 132 of Fig. 3 stores the creation time and date when the annotation is created.
  • Title field 190 contains data representing a title by which the annotation represented by annotation entry 180 is identified. The title is generally determined by the user and the user enters the data representing the title using conventional and well known user interface techniques. The data can be as simple as ASCII text or as complex as HTML code which can include text having different fonts and type styles, graphics including wallpaper, motion video images, audio, and links to other multimedia documents.
  • Content field 192 contains data representing the substantive content of the annotation as authored by the user.
  • the actual data can be stored in content field 192, or alternatively content field 192 may store a pointer to (or other indicator of) the content that is stored separately from the entry 180 itself.
  • content field 192 includes a pointer to (or other identifier of) the annotation content, which in turn is stored in annotation content store 17.
  • the user enters the data representing the content using conventional and well known user interface techniques.
  • content field 192 contains data representing the substantive content the user wishes to include with the presentation of the corresponding media stream at the relative time range represented by time range field 184 and time units field 186.
  • Annotation identifier field 194 stores data that uniquely identifies annotation entry 180, while related annotation identifier field 196 stores data that uniquely identifies a related annotation.
  • Annotation identifier field 194 can be used by other annotation entries to associate such other annotation entries with annotation entry 180.
  • threads of discussion can develop in which a second annotation responds to a first annotation, a third annotation responds to the second annotation and so on.
  • an identifier of the first annotation would be stored in related annotation identifier field 196 of the second annotation
  • an identifier of the second annotation would be stored in related annotation identifier field 196 of the third annotation, and so on.
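A discussion thread can therefore be recovered by following the related-annotation links from a reply back to the root. A sketch under assumed field names (`id`, `related_id` standing in for fields 194 and 196):

```python
def thread_of(annotations, annotation_id):
    """Walk related_id links from a reply back to the root annotation."""
    by_id = {a["id"]: a for a in annotations}
    chain = []
    current = by_id.get(annotation_id)
    while current is not None:
        chain.append(current["id"])
        current = by_id.get(current.get("related_id"))
    return list(reversed(chain))  # root of the discussion first

notes = [
    {"id": "a1", "related_id": None},  # original comment
    {"id": "a2", "related_id": "a1"},  # response to the first annotation
    {"id": "a3", "related_id": "a2"},  # response to the second annotation
]
```

Asking for the thread of the third annotation yields the whole "classroom discussion" in posting order.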
  • Set identifier(s) field 198 stores data that identifies a particular one or more sets to which annotation entry 180 belongs.
  • a media stream can have multiple sets of annotations, sets can span multiple media content, and a particular annotation can correspond to one or more of these sets. Which set(s) an annotation belongs to is identified by the author of the annotation.
  • a media stream corresponding to a lecture may include the following sets: "instructor's comments", "assistant's comments", "audio comments", "text comments", "student questions", and each student's personal comments.
  • Media content identifier field 200 contains data that uniquely identifies particular multimedia content as the content to which annotation entry 180 corresponds.
  • Media content identifier 200 can identify a single media stream (either an individual stream or a composite stream), or alternatively identify multiple different streams that are different versions of the same media content.
  • Media content identifier 200 can identify media versions in a variety of different manners.
  • the data represents a real-time transport protocol (RTP) address of the different media streams.
  • RTP address is a type of uniform resource locator (URL) by which multimedia documents can be identified in a network.
  • a unique identifier is assigned to the content rather than to the individual media streams.
  • a different unique identifier of the media streams could be created by annotation server 10 of Fig. 3 and assigned to the media streams.
  • Such a unique identifier would also be used by streaming media server 11 of Fig. 1 to identify the media streams.
  • a uniform resource name such as those described by K. Sollins and L. Masinter in "Functional Requirements for Uniform Resource Names," IETF RFC 1737 (December 1994) could be used to identify the media stream.
  • User-defined property fields 202 are one or more user-definable fields that allow users (or user interface designers) to customize the annotation system.
  • additional property fields include a "reference URL” property which contains the URL of a web page used as reference material for the content of the annotation; a "help URL” property containing the URL of a help page which can be accessed concerning the content of the annotation; a "view script” property containing JavaScript which is to be executed whenever the annotation is viewed; a "display type” property, which gives the client user interface information about how the annotation is to be displayed; etc.
  • Sequence number 204 allows a user to define (via user interface 150 of Fig. 3) a custom ordering for the display of annotation identifiers, as discussed in more detail below. Sequence number 204 stores the relative position of the annotations with respect to one another in the custom ordering, allowing the custom ordering to be saved for future use.
  • annotation entry 180 stores a single sequence number. Alternatively, multiple sequence numbers 204 may be included in annotation entry 180, each corresponding to a different custom ordering, a different annotation set, a different user, etc.
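The entry layout described above can be sketched as a small data structure. This is an illustrative sketch only: the field names and types are assumptions, not the patent's actual schema, and mapping an ordering name to a sequence number is just one way to realize multiple sequence numbers 204 per entry.

```python
from dataclasses import dataclass, field
from typing import Dict, Optional

@dataclass
class AnnotationEntry:
    """Illustrative sketch of an annotation entry 180; names are assumptions."""
    author: str
    title: str                    # short summary, title field 190
    content: str                  # annotation content (text, audio, etc.)
    media_content_id: str         # media content identifier field 200
    begin_time: float             # start of the annotated segment, in seconds
    end_time: float               # end of the annotated segment, in seconds
    related_annotation_id: Optional[int] = None   # related annotation, field 196
    # One sequence number per named custom ordering (sequence number 204);
    # a mapping lets several custom orderings coexist for the same entry.
    sequence_numbers: Dict[str, int] = field(default_factory=dict)

entry = AnnotationEntry(
    author="student1",
    title="Question on slide 3",
    content="Why does the second step hold?",
    media_content_id="rtp://media.example.com/lecture1",
    begin_time=120.0,
    end_time=135.0,
)
entry.sequence_numbers["my-ordering"] = 2   # position 2 in this custom ordering
```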
  • Fig. 5 illustrates exemplary implicit annotation collections for annotations maintained by annotation server 10 of Fig. 3.
  • a collection of annotations refers to annotation entries 180 of Fig. 4 that correspond to the same media stream(s), based on the media content identifier 200.
  • Annotation entries 180 can be viewed conceptually as part of the same annotation collection if they have the same media content identifiers 200, even though the annotation entries may not be physically stored together by annotation server 10.
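The notion of an implicit collection can be sketched as a grouping step: entries with the same media content identifier 200 form one collection, regardless of where they are physically stored. A minimal sketch (field names are assumptions):

```python
from collections import defaultdict

def implicit_collections(entries):
    """Group annotation entries into implicit collections keyed by their
    media content identifier; entries need not be stored together."""
    collections = defaultdict(list)
    for e in entries:
        collections[e["media_content_id"]].append(e)
    return dict(collections)

entries = [
    {"id": 1, "media_content_id": "rtp://a/lecture1"},
    {"id": 2, "media_content_id": "rtp://a/lecture2"},
    {"id": 3, "media_content_id": "rtp://a/lecture1"},
]
cols = implicit_collections(entries)
# Entries 1 and 3 fall into the same implicit collection.
```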
  • Annotation database 206 includes two annotation collections 208 and 210.
  • Annotation server 10 dynamically adds, deletes, and modifies annotation entries in annotation database 206 based on commands from client 15.
  • Annotation entries can be created and added to annotation database 206 at any time a user cares to comment upon the content of the stream (or another annotation) in the form of an annotation.
  • Annotation server 10 forms an annotation entry from identification data, content data, title data, and author data of an "add annotation" request received from the client's ABE 151 (Fig. 3), and adds the annotation entry to annotation database 206.
  • Annotation database 206 includes fields 212, 214, and 216 that specify common characteristics of all annotation entries of database 206 or of an annotation collection 208 or 210. Alternatively, fields 212-216 can be included redundantly in each annotation entry 180.
  • Creator field 212 contains data identifying the user who was responsible for creating annotation database 206.
  • RTP address fields 214 and 216 contain data representing an RTP address of the media stream (e.g., the RTP address of the stream identified in media content identifier 200 of Fig. 5) for the annotation collection.
  • An RTP address provides an alternative mechanism, in addition to the data in identifier field 200, for associating the media stream with annotation entries 180.
  • RTP address fields 214 and 216 need not be included, particularly in embodiments in which media content identifier 200 contains the RTP address of the media stream.
  • An annotation can be created by a user of any of the client computers 15 of Fig. 1.
  • client 15 includes an interface module 150 that presents an interface to a user (e.g., a graphical user interface), allowing a user to make requests of annotation server 10.
  • a user can access annotation server 10 via an annotation toolbar provided by interface 150.
  • FIG. 6 illustrates an annotation toolbar in accordance with one embodiment of the invention.
  • Annotation toolbar 240 includes various identifying information and user-selectable options 242-254. Selection of an exit or "X" button 242 causes interface 150 to terminate display of the toolbar 240.
  • a server identifier 244 identifies the annotation server with which client 15 is currently configured to communicate (annotation server 10 of Fig. 1 in the illustrated embodiment).
  • Selection of a connection button 246 causes ABE 151 of Fig. 3 to establish a connection with the annotation server identified by identifier 244.
  • Selection of a query button 248 causes interface module 150 to open a "query" dialog box, from which a user can search for particular annotations.
  • Selection of an add button 250 causes interface module 150 to open an "add new annotation" dialog box, from which a user can create a new annotation.
  • Selection of a show annotations button 252 causes interface module 150 to open a "view annotations" dialog box, from which a user can select particular annotations for presentation.
  • Selection of a preferences button 254 causes interface 150 of Fig. 3 to open a "preferences" dialog box, from which a user can specify various UI preferences, such as an automatic server query refresh interval, or default query criteria values to be persisted between sessions.
  • Fig. 7 shows an "add new annotation" dialog box 260 that results from user selection of add button 250 of Fig. 6 to create a new annotation.
  • Interface 150 assumes that the current media stream being presented to the user is the media stream to which this new annotation will be associated.
  • the media stream to which an annotation is associated is referred to as the "target" of the annotation.
  • An identifier of this stream is displayed in a target specification area 262 of dialog box 260.
  • a user could change the target of the annotation, such as by typing in a new identifier in target area 262, or by selection of a "browse" button (not shown) that allows the user to browse through different directories of media streams.
  • a time strip 264 is also provided as part of dialog box 260.
  • Time strip 264 represents the entire presentation time of the corresponding media stream.
  • a thumb 265 that moves within time strip 264 indicates a particular temporal position within the media stream.
  • the annotation being created via dialog box 260 has a begin time and an end time, which together define a particular segment of the media stream.
  • thumb 265 represents the begin time for the segment relative to the media stream.
  • thumb 265 represents the end time for the segment relative to the media stream.
  • two different thumbs could be displayed, one for the begin time and one for the end time.
  • the begin and end times are also displayed in an hours/minutes/seconds format in boxes 266 and 270, respectively.
  • Thumb 265 can be moved along time strip 264 in any of a variety of conventional manners. For example, a user can depress a button of a mouse (or other cursor control device) while a pointer is "on top" of thumb 265 and move the pointer along time strip 264, causing thumb 265 to move along with the pointer. The appropriate begin or end time is then set when the mouse button is released. Alternatively, the begin and end times can be set by entering (e.g., via an alphanumeric keyboard) particular times in boxes 266 and 270.
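The begin and end times shown in boxes 266 and 270 are plain hours/minutes/seconds strings; converting between that display format and a count of seconds is straightforward. A small sketch of the conversion:

```python
def parse_hms(text: str) -> int:
    """Parse an 'hours:minutes:seconds' string (as typed into boxes 266
    and 270) into a count of seconds."""
    h, m, s = (int(part) for part in text.split(":"))
    return h * 3600 + m * 60 + s

def format_hms(total: int) -> str:
    """Format a seconds count back into 'H:MM:SS' for display."""
    h, rem = divmod(total, 3600)
    m, s = divmod(rem, 60)
    return f"{h}:{m:02d}:{s:02d}"

assert parse_hms("0:02:15") == 135
assert format_hms(135) == "0:02:15"
```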
  • Dialog box 260 also includes a "play" button 274. Selection of play button 274 causes interface module 150 of Fig. 3 to forward a segment specification to web browser 153 of client 15.
  • the segment specification includes the target identifier from target display 262 and the begin and end times from boxes 266 and 270, respectively.
  • Upon receipt of the segment specification from interface module 150, the browser communicates with media server 11 and requests the identified media segment using conventional HTTP requests. In response, media server 11 streams the media segment to client 15 for presentation to the user. This presentation allows, for example, the user to verify the portion of the media stream to which his or her annotation will correspond.
  • Dialog box 260 also includes an annotation set identifier 272, an email field 275, and a summary field 276.
  • Annotation set identifier 272 allows the user to identify a named set to which the new annotation will belong. This set can be a previously defined set, or a new set being created by the user. Selection of the particular set can be made from a drop-down menu activated by selection of a button 273, or alternatively can be directly input by the user (e.g., typed in using an alphanumeric keyboard).
  • annotation server 10 of Fig. 3 supports read and write access controls, allowing the creator of the set to identify which users are able to read and/or write to the annotation set. In this embodiment, only those sets for which the user has write access can be entered as set identifier 272.
  • Email identifier 275 allows the user to input the email address of a recipient of the annotation.
  • the newly created annotation is electronically mailed to the recipient indicated in identifier 275 in addition to being added to the annotation database.
  • the recipient of the electronic mail message can reply to the message to create an additional annotation.
  • the original email message is configured with annotation server 10 as the sender. Because of this, a "reply to sender" request from the recipient will cause an email reply to be sent to annotation server 10.
  • annotation server 10 Upon receipt of such an electronic mail message reply, annotation server 10 creates a new annotation and uses the reply message content as the content of the new annotation.
  • This new annotation identifies, as a related annotation, the original annotation that was created when the original mail message was sent by annotation server 10. In the illustrated embodiment, this related annotation identifier is stored in field 196 of Fig. 4.
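The reply-handling step can be sketched as follows: the server turns the reply body into a new annotation whose related-annotation field points back at the original, so the exchange forms a thread. Field names here are assumptions for illustration:

```python
def annotation_from_reply(reply_body, original, next_id):
    """Sketch: build a new annotation from an email reply received by the
    annotation server, linking it to the original annotation (the related
    annotation identifier kept in field 196)."""
    return {
        "id": next_id,
        "content": reply_body,                       # reply text becomes content
        "related_annotation_id": original["id"],     # thread back to original
        # Inherit the target so the reply lands in the same collection and
        # corresponds to the same media segment.
        "media_content_id": original["media_content_id"],
        "begin_time": original["begin_time"],
        "end_time": original["end_time"],
    }

original = {"id": 3, "media_content_id": "rtp://a/lecture1",
            "begin_time": 10.0, "end_time": 20.0}
reply = annotation_from_reply("I agree, and slide 4 covers this.", original, next_id=7)
```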
  • Summary 276 allows the user to provide a short summary or title of the annotation content. Although the summary is illustrated as being text, it could include any of a wide variety of characters, alphanumerics, graphics, etc. In the illustrated embodiment, summary 276 is stored in the title field 190 of the annotation entry of Fig. 4.
  • Dialog box 260 further includes radio buttons 280 and 282, which allow an annotation to be created as text and/or audio. Although not shown, other types of annotations could also be accommodated, such as graphics, HTML documents, etc.
  • Input controls 278 are also provided as part of dialog box 260. The illustrated controls are enabled when the annotation includes audio data. Input controls 278 include conventional audio control buttons such as fast forward, rewind, play, pause, stop and record. Additionally, an audio display bar 279 can be included to provide visual progress feedback when the audio is playing or recording.
  • input controls 278 may simply include a box into which text can be input by the user via an alphanumeric keyboard. Additionally, a keyboard layout may also be provided to the user, allowing him or her to "point and click" using a mouse and pointer to select particular characters for entry.
Annotation and Media Segment Retrieval

  • Fig. 8 shows a "query annotations" dialog box 330 that results from a user selecting query button 248 of Fig. 6. Many of the options presented to the user in dialog box 330 are similar to those presented in the "add new annotation" dialog box 260 of Fig. 7; however, those in dialog box 330 are used as search criteria rather than data for a new annotation.
  • Dialog box 330 includes a target display 332 that contains an identifier of the target stream. This identifier can be input in any of a variety of manners, such as by typing in a new identifier in target display 332, or by selection of a "browse" button (not shown) that allows the user to browse through different directories of media streams. In the illustrated embodiment, the identifier is a URL. However, alternate embodiments can use different identifier formats. Dialog box 330 also includes target information 334, which includes a time strip, thumb, "from" button, "to" button, "play" button, and begin and end times, which are analogous to the time strip, thumb, "from" button, "to" button, "play" button, and begin and end times of dialog box 260 of Fig. 7. The begin and end times in target information 334 limit the query for annotations to only those annotations having a time range that corresponds to at least part of the media segment between the begin and end times of target information 334.
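The time-range restriction in target information 334 amounts to an interval-overlap test: an annotation qualifies when its time range intersects the queried segment. A sketch of that filter (field names are assumptions):

```python
def overlaps(a_begin, a_end, q_begin, q_end):
    """True when the annotation's time range covers at least part of the
    queried media segment (half-open interval overlap)."""
    return a_begin < q_end and a_end > q_begin

def query_annotations(entries, q_begin, q_end):
    """Keep only annotations whose range intersects [q_begin, q_end)."""
    return [e for e in entries if overlaps(e["begin"], e["end"], q_begin, q_end)]

entries = [
    {"id": 1, "begin": 0, "end": 30},
    {"id": 2, "begin": 25, "end": 60},
    {"id": 3, "begin": 100, "end": 130},
]
hits = query_annotations(entries, 20, 50)   # annotations 1 and 2 overlap
```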
  • Dialog box 330 also includes an annotation set list 336.
  • Annotation set list 336 includes a listing of the various sets that correspond to the target media stream. According to one implementation, only those sets for which an annotation has been created are displayed in set list 336.
  • annotation server 10 of Fig. 3 supports read and write security, allowing the creator of the set to identify which users are able to read and/or write to the annotation set. In this embodiment, only those sets for which the user has read access are displayed in set list 336. A user can select sets from annotation set list 336 in a variety of manners.
  • a "select all" button 338 allows a user to select all sets in set list 336.
  • a “deselect all” button 340 allows a user to de-select all sets in set list 336.
  • the sets displayed as part of annotation set list 336 contain annotations which correspond to the target identifier in target display 332.
  • the sets in selection list 338 need not necessarily contain annotations which correspond to the target identifier in target display 332.
  • Interface module 150 allows a user to select different target streams during the querying process. Thus, a user may identify a first target stream and select one or more sets to query annotations from for the first target stream, and then identify a second target stream and select one or more sets to query annotations from for the second target stream. Additional search criteria can also be input by the user. As illustrated, a particular creation date and time identifier 342 can be input, along with a relation 344 (e.g., "after" or "before").
  • A summary keyword search identifier 346 can also be input. A maximum number of annotations to retrieve in response to the query can also be included as a max identifier 348.
  • the query can be limited to only annotations that correspond to the target identifier in target display 332 by selecting check box 360.
  • a level of detail 350 to retrieve can also be selected by the user. Examples of different levels that could be retrieved include the "full level” (that is, all content of the annotation), or a "deferred download” where only an identifier of the annotations (e.g., a summary or title) is downloaded.
  • Selection of checkbox 354 selects the deferred download level, whereas if checkbox 354 is not selected then the full level of detail is implicitly selected.
  • a server identifier 356 identifies the annotation server with which client 15 is currently configured to communicate. Different annotation servers can be selected by the user by inputting the appropriate identifier as server identifier 356. This input can be provided in any of a variety of manners, such as by typing in a new identifier in server identifier 356 or by selection of a "browse" button (not shown) that allows the user to browse through different directories of annotation servers.
  • a user can request automatic display of the retrieved annotations by selecting a "display retrieved annotations" checkbox 358. Selection of "advanced" button 362 reduces the number of options available to the user, simplifying dialog box 330. For example, the simplified dialog box may not include fields 342, 344, 348, 346, 350, 332, 334, or 336. The user can then complete the query process by selecting a query button 364.
  • Upon selection of the query button 364, interface 150 closes the query dialog box 330 and forwards the search criteria to annotation server 10. Additionally, if checkbox 358 is selected then interface 150 displays a "view annotations" dialog box 400 of Fig. 9. Alternatively, a user can provide a view request, causing interface 150 to display dialog box 400, by selecting show annotations button 252 in annotation toolbar 240 of Fig. 6.
  • Fig. 9 shows a dialog box 400 that identifies annotations corresponding to a playlist of media segments.
  • the playlist is a result of the query input by the user as discussed above with reference to Fig. 8.
  • annotation identifiers in the form of user identifiers 406 and summaries 408 are displayed within an annotation listing box 402.
  • the user can scroll through annotation identifiers in a conventional manner via scroll bars 404 and 405.
  • the annotation identifiers are presented in annotation listing box 402 according to a default criterion, such as chronologically by creation time/date, by user, alphabetically by summary, etc.
  • Related annotations are displayed in an annotation listing 402 in a hierarchical, horizontally offset manner.
  • the identifier of an annotation that is related to a previous annotation is "indented" from that previous annotation's identifier and a connecting line between the two identifiers is shown.
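The hierarchical, horizontally offset listing can be sketched as a recursive tree walk: each annotation whose related-annotation field points at a parent is indented one level beneath that parent's identifier. An illustrative sketch (field names are assumptions):

```python
def render_listing(annotations, parent=None, depth=0, lines=None):
    """Render annotation identifiers as an indented tree, mirroring the
    hierarchical display in annotation listing 402: replies are offset
    beneath the annotation they relate to."""
    if lines is None:
        lines = []
    for a in annotations:
        if a["related_id"] == parent:
            lines.append("  " * depth + f'{a["user"]}: {a["summary"]}')
            render_listing(annotations, parent=a["id"], depth=depth + 1, lines=lines)
    return lines

anns = [
    {"id": 1, "related_id": None, "user": "prof", "summary": "Key point"},
    {"id": 2, "related_id": 1, "user": "student", "summary": "A question"},
    {"id": 3, "related_id": None, "user": "ta", "summary": "Correction"},
]
listing = render_listing(anns)
```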
  • Dialog box 400 can be displayed concurrently with a multimedia player that is presenting multimedia content that corresponds to the annotations in annotation listing 402 (e.g., as illustrated in Fig. 10 below).
  • Interface module 150 can have the annotations "track" the corresponding multimedia content being played back, so that the user is presented with an indication (e.g., an arrow) as to which annotation(s) correspond to the current temporal position of the multimedia content.
  • Such tracking can be enabled by selecting checkbox 422, or disabled by de- selecting checkbox 422.
  • Dialog box 400 also includes a merge annotation sets checkbox 424.
  • Selection of merge annotation sets checkbox 424 causes interface module 150 to present annotation identifiers in listing box 402 in a chronological order regardless of what set(s) the annotations in annotation listing 402 belong to. If checkbox 424 is not selected, then annotations from different sets are grouped and displayed together in annotation listing 402 (e.g., under the same tree item). Thus, when checkbox 424 is not selected, interface 150 displays one playlist for each annotation set that has been retrieved from annotation server 10.
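The effect of the merge checkbox can be sketched as a choice between two ordering strategies: one merged chronological playlist, or one playlist per annotation set. A sketch (field names are assumptions):

```python
def build_playlists(annotations, merge_sets):
    """When merge_sets is True (checkbox 424 selected), return one
    chronological playlist regardless of set membership; otherwise group
    annotations by set, yielding one playlist per annotation set."""
    if merge_sets:
        return {"merged": sorted(annotations, key=lambda a: a["begin"])}
    playlists = {}
    for a in sorted(annotations, key=lambda a: a["begin"]):
        playlists.setdefault(a["set"], []).append(a)
    return playlists

anns = [
    {"id": 1, "set": "questions", "begin": 40},
    {"id": 2, "set": "comments", "begin": 10},
    {"id": 3, "set": "questions", "begin": 5},
]
merged = build_playlists(anns, merge_sets=True)
grouped = build_playlists(anns, merge_sets=False)
```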
  • Dialog box 400 also includes a refresh button 428, a close button 430, and an advanced button 432. Selection of refresh button 428 causes interface module 150 to communicate with annotation back end 151 to access annotation server 10 and obtain any additional annotations that correspond to the query that resulted in listing box 402.
  • Selection of close button 430 causes interface 150 to terminate the display of dialog box 400.
  • Selection of advanced button 432 causes interface 150 to display a different view annotations box having additional details, such as annotation target information (analogous to target display 332 discussed below with reference to Fig. 8), user-selectable preferences for information displayed as annotation identifiers in listing box 402, etc.
  • preview information is presented in a preview section 416, and a selection box or menu 410 is provided.
  • the exact nature of the preview information is dependent on the data type and amount of information that was requested (e.g., as identified in level of detail 350 of Fig. 8).
  • Menu 410 includes the following options: play, export ASX playlist, export annotations, time order, custom order, save, and reset.
  • Selection of the "play” option causes playback of the multimedia content to begin starting with the selected annotation in annotation list 402.
  • Selection of the "export ASX playlist” option causes annotation backend 151 to output a record (e.g., create a file) that identifies the temporal segments of multimedia content that the annotations identified in list 402 correspond to, as determined by the begin and end times of the annotations.
  • Selection of the "export annotations” option causes annotation backend 151 to output a record (e.g., create a file) that includes the annotation content of each annotation identified in list 402.
  • Selection of the "time order” option causes interface module 150 to display the identifiers in list 402 in chronological order based on the begin time for each annotation.
  • Selection of the "custom order” option allows the user to identify some other criteria to be used in determining the order of the identifiers in list 402 (e.g., identifiers can be re-ordered in a conventional drag and drop manner).
  • Re-ordering annotation identifiers causes the sequence numbers 204 (of Fig. 4) of the annotations to be re-ordered accordingly.
  • Selection of the "save” option causes interface module 150 to save the current custom ordering to annotation server 10 of Fig. 3 by saving the current sequence numbers of the annotations.
  • Selection of the "reset” option causes interface module 150 to ignore any changes that have been made since the last saved custom ordering and revert to the last saved custom ordering.
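The "export ASX playlist" option described above can be sketched as emitting one entry per media segment, carrying its source and temporal range. The sketch below follows the general shape of a Windows Media ASX metafile, but the exact elements and attribute formats the patent's implementation writes are not specified, so treat this as an assumption:

```python
def export_asx(segments):
    """Sketch of exporting a playlist record: one <entry> per media segment,
    identifying its source, start time, and duration (derived from the
    annotation's begin and end times). Element names approximate the ASX
    metafile format and are an assumption."""
    lines = ['<asx version="3.0">']
    for seg in segments:
        duration = seg["end"] - seg["begin"]
        lines.append("  <entry>")
        lines.append(f'    <ref href="{seg["source"]}"/>')
        lines.append(f'    <starttime value="{seg["begin"]}"/>')
        lines.append(f'    <duration value="{duration}"/>')
        lines.append("  </entry>")
    lines.append("</asx>")
    return "\n".join(lines)

segs = [{"source": "mms://media.example.com/lecture1", "begin": 120, "end": 135}]
asx = export_asx(segs)
```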
  • Transfer of the corresponding media segments (and/or the annotations) to client 15 is initiated when the "play" option of menu 410 is selected.
  • interface 150 of Fig. 3 provides the list of annotation identifiers being displayed to web browser 153 (or other multimedia presentation application) in the order of their display, including the target identifier and temporal range information.
  • web browser 153 receives a list of multimedia segments that it is to present to the user in a particular order.
  • Web browser 153 accesses media server 11 to stream the multimedia segments to client 15 for presentation in that order.
  • a user is able to review the information regarding the annotations that satisfy his or her search criteria and then modify the annotation playlist (e.g., by deleting or reordering annotation identifiers) before the corresponding media segments (and/or the annotations) are presented to him or her.
  • transfer of the media segments may be initiated in other manners rather than by selection of the play option in menu 410.
  • a "start" button may be included as part of dialog box 400, selection of which initiates transfer of the media segments to client 15.
  • the annotations and/or corresponding media segments are presented to the user "back to back" with very little or no noticeable gap between different annotations and between different segments.
  • the presentation of the annotations and/or media segments is "seamless".
  • a user is able to reorder the media segments of the playlist and thereby alter their order of presentation.
  • media segments are reordered by changing the ordering of the annotation identifiers in annotation listing 402 in a drag and drop manner.
  • a user can select a particular annotation identifier (e.g., identifier 420) and drag it to a different location within the dialog box (e.g., between identifiers 419 and 421), thereby changing when the media segment corresponding to the annotation identified by identifier 420 is presented relative to the other annotations.
  • web browser 153 sends a message to the appropriate media server 11 of Fig. 1 to begin streaming the appropriate segment to client computer 15.
  • Web browser 153, knowing the duration of each of the segments being provided to client computer 15, forwards additional messages to media server 11 to continue with the provision of the next segment, according to the playlist, when appropriate.
  • web browser 153 can ensure that the media segments are provided to the user in a seamless manner.
  • the media segments could be streamed to annotation server 10 for temporary buffering and subsequent streaming to client computer 15.
  • Alternatively, identifying information (e.g., source, start time, and end time) could be provided to media server 11 from annotation server 10 for streaming to client computer 15.
  • the collection of media segments identified by the playlist can be stored as an additional media stream by selecting the "export ASX playlist" option in menu 410 of Fig. 9.
  • the collection can be retrieved by the user (or other users) at a later time without having to go through another querying process.
  • the collection of segments, stored as a media stream, can itself be annotated.
  • the collection of segments can be stored as a media stream in any of a variety of different locations and formats.
  • the media stream can be stored in an additional data store (not shown) managed by annotation server 10 of Fig. 3, or alternatively stored at media server 11 of Fig. 1 or another media server (not shown) of Fig. 1.
  • the media stream includes the source information, start time, and end time for each of the segments in the playlist.
  • the media stream includes pointers to each of the annotations.
  • the stored pointers can be used to retrieve each of the appropriate annotations, from which the corresponding media segments can be retrieved.
  • the media segments themselves could be copied from media server 11 of Fig. 1 and those segments stored as the media stream.
  • Fig. 10 shows one implementation of a graphical user interface window 450 that concurrently displays annotations and corresponding media segments.
  • This UI window 450 has an annotation screen 454, a media screen 456, and a toolbar 240.
  • Media screen 456 is the region of the UI within which the multimedia content is rendered. For video content, the video is displayed on screen 456. For non-visual content, screen 456 displays static or dynamic images representing the content. For audio content, for example, a dynamically changing frequency wave that represents an audio signal is displayed in media screen 456.
  • Annotation screen 454 is the region of the UI within which the annotation identifiers and/or annotation content are rendered.
  • dialog box 400 of Fig. 9 can be annotation screen 454.
  • Fig. 11 illustrates methodological aspects of one embodiment of the invention in retrieving and presenting annotations and media segments to a user.
  • a step 500 comprises displaying a query dialog box 330 of Fig. 8.
  • Interface 150 of Fig. 3 provides dialog box 330 in response to a query request from a user, allowing the user to search for annotations that satisfy various user-definable criteria.
  • a step 502 comprises receiving query input from the user.
  • Interface 150 of Fig. 3 receives the user's input(s) to the query dialog box and provides the inputs to annotation server 10 of Fig. 3.
  • a step 504 comprises generating an annotation list.
  • ABE 132 of Fig. 3 uses the user inputs to the query dialog box to select annotations from stores 17 and 18.
  • ABE 132 searches through annotation meta data store 18 for the annotations that satisfy the criteria provided by the user.
  • the annotations that satisfy that criteria then become part of the annotation list and identifying information, such as the annotation titles or summaries, are provided to client 15 by annotation server 10.
  • a step 506 comprises displaying a view annotations dialog box 400 of Fig. 9 that contains the annotation identifying information from the annotation list generated in step 504.
  • Steps 508 and 510 comprise receiving user input selecting various annotations from the identifying information displayed in step 506.
  • Steps 508 and 510 repeat until the user has finished his or her selecting.
  • a step 512 comprises retrieving the selected annotations and corresponding media segments.
  • ABE 132 in annotation server 10 of Fig. 3 is responsible for retrieving the selected annotations from stores 17 and 18.
  • a step 514 comprises presenting the selected annotations and corresponding media segments to the user in a seamless manner.
  • both the selected annotations as well as the corresponding media segments are provided to the user.
  • only the media segments corresponding to the annotations (and not the annotations themselves) are provided to the user.
  • only the annotations (and not the corresponding segments of the media stream) are provided to the user.
  • the annotations are downloaded to the client computer first, and the media segments are downloaded to the client computer later in an on-demand manner.
  • annotation data is buffered in annotation server 10 of Fig. 1 for provision to client 15 and media stream data is buffered in media server 11 for provision to client 15.
  • Sufficient buffering is provided to allow the annotation and media stream data to be provided to the client seamlessly. For example, when streaming two media segments to client 15, as the end of the first media segment draws near media server 11 is working on obtaining and streaming the beginning of the second media segment to client 15. By doing so, there is little or no noticeable gap between the first and second media segments as presented to the user.
  • additional buffering can be provided by client 15 to allow the seamless presentation of the data.

Abstract

A plurality of user-selected annotations are used to define a playlist of media segments corresponding to the annotations. The user-selected annotations and their corresponding media segments are then provided to the user in a seamless manner. A user interface allows the user to alter the playlist and the order of annotations in the playlist. The user interface identifies each annotation by a short subject line.

Description

INTERACTIVE PLAYLIST GENERATION USING ANNOTATIONS
A portion of the disclosure of this patent document contains material which is subject to copyright protection. The copyright owner has no objection to the facsimile reproduction by anyone of the patent document or the patent disclosure, as it appears in the Patent and Trademark Office patent file or records, but otherwise reserves all copyright rights whatsoever.
RELATED APPLICATIONS
This application claims priority to U.S. Provisional Application No. 60/100,452, filed September 15, 1998, entitled "Annotations for Streaming Video on the Web: System Design and Usage", to Anoop Gupta and David M. Bargeron.
TECHNICAL FIELD
This invention relates to networked client/server systems and to methods of delivering and rendering multimedia content in such systems. More particularly, the invention relates to systems and methods of selecting and providing such content.
BACKGROUND OF THE INVENTION
The advent of computers and their continued technological advancement has revolutionized the manner in which people work and live. An example of such is in the education field, wherein educational presentations (such as college lectures, workplace training sessions, etc.) can be provided to a computer user as multimedia data (e.g., video, audio, text, and/or animation data). Today, such presentations are primarily video and audio, but a richer, broader digital media era is emerging. Educational multimedia presentations provide many benefits, such as allowing the presentation data to be created at a single time yet be presented to different users at different times and in different locations throughout the world.
These multimedia presentations are provided to a user as synchronized media. Synchronized media means multiple media objects that share a common timeline. Video and audio are examples of synchronized media — each is a separate data stream with its own data structure, but the two data streams are played back in synchronization with each other. Virtually any media type can have a timeline. For example, an image object can change like an animated .gif file, text can change and move, and animation and digital effects can happen over time. This concept of synchronizing multiple media types is gaining greater meaning and currency with the emergence of more sophisticated media composition frameworks implied by MPEG-4, Dynamic HTML, and other media playback environments.
The term "streaming" is used to indicate that the data representing the various media types is provided over a network to a client computer on a real-time, as-needed basis, rather than being pre-delivered in its entirety before playback. Thus, the client computer renders streaming data as it is received from a network server, rather than waiting for an entire "file" to be delivered.
Multimedia presentations may also include "annotations" relating to the multimedia presentation. An annotation is data (e.g., audio, text, video, etc.) that corresponds to a multimedia presentation. Annotations can be added by anyone with appropriate access rights to the annotation system (e.g., the lecturer/trainer or any of the students/trainees). These annotations typically correspond to a particular temporal location in the multimedia presentation and can provide a replacement for much of the "in-person" interaction and "classroom discussion" that is lost when the presentation is not made "in-person" or "live". As part of an annotation, a student can comment on a particular point, to which another student (or lecturer) can respond in a subsequent annotation. This process can continue, allowing a "classroom discussion" to occur via these annotations. Additionally, some systems allow a user to select a particular one of these annotations and begin playback of the presentation starting at approximately the point in the presentation to which the annotation corresponds.
However, current systems typically allow a user to select multimedia playback based only on individual annotations. This limitation makes for a cumbersome process, as the user may wish to view several different portions of the presentation corresponding to several different annotations. Using current systems, the user must undergo the painstaking process of selecting a first annotation, viewing/listening to the portion of the presentation corresponding to the first annotation, selecting a second annotation, viewing/listening to the portion corresponding to the second annotation, and so on through several annotations.
The invention described below addresses this and other disadvantages of current annotation systems, providing a way to improve multimedia presentation using annotations.
SUMMARY OF THE INVENTION
Annotations correspond to media segments of one or more multimedia streams. A playlist generation interface is presented to the user in the form of annotation titles or summaries for a group of annotations. This group of annotations corresponds to the media segments that are part of a playlist. The playlist can then be altered by the user to suit his or her desires or needs by interacting with the annotation title/summary interface. The media segments of the playlist can then be presented to the user in a seamless, contiguous manner.
According to one aspect of the invention, the ordering of the annotation titles/summaries can be altered by the user, resulting in a corresponding change in order of presentation of the media segments. The ordering of the annotation titles/summaries can be changed by moving the titles or summaries in a drag and drop manner.
According to another aspect of the invention, the media segments of the playlist can themselves be stored as an additional multimedia stream. This additional multimedia stream can then be annotated in the same manner as other multimedia streams.
BRIEF DESCRIPTION OF THE DRAWINGS
Fig. 1 shows a client/server network system and environment in accordance with one embodiment of the invention.
Fig. 2 shows a general example of a computer that can be used as a client or server in accordance with the invention.
Fig. 3 is a block diagram illustrating an annotation server and a client computer in more detail in accordance with one embodiment of the invention.
Fig. 4 is a block diagram illustrating the structure for an annotation according to one embodiment of the invention.
Fig. 5 is a block diagram illustrating exemplary annotation collections.
Fig. 6 illustrates an annotation toolbar in accordance with one embodiment of the invention.
Fig. 7 illustrates an "add new annotation" dialog box in accordance with one embodiment of the invention.
Fig. 8 illustrates a "query annotations" dialog box in accordance with one embodiment of the invention.
Fig. 9 illustrates a "view annotations" dialog box in accordance with one embodiment of the invention.
Fig. 10 is a diagrammatic illustration of a graphical user interface window displaying annotations and corresponding media segments concurrently in accordance with one embodiment of the invention.
Fig. 11 illustrates methodological aspects of one embodiment of the invention in retrieving and presenting annotations and media segments to a user.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT
General Network Structure
Fig. 1 shows a client/server network system and environment in accordance with one embodiment of the invention. Generally, the system includes multiple network server computers 10, 11, 12, and 13, and multiple (n) network client computers 15. The computers communicate with each other over a data communications network. The communications network in Fig. 1 comprises a public network 16 such as the Internet. The data communications network might also include, either in addition to or in place of the Internet, local-area networks and/or private wide-area networks.
Streaming media server computer 11 has access to streaming media content in the form of different media streams. These media streams can be individual media streams (e.g., audio, video, graphical, etc.), or alternatively can be composite media streams including two or more of such individual streams. Some media streams might be stored as files in a database or other file storage system, while other media streams might be supplied to the server on a "live" basis from other data source components through dedicated communications channels or through the Internet itself.
There are various standards for streaming media content and composite media streams. The "Advanced Streaming Format" (ASF) is an example of such a standard, including both accepted versions of the standard and proposed standards for future adoption. ASF specifies the way in which multimedia content is stored, streamed, and presented by the tools, servers, and clients of various multimedia vendors. Further details about ASF are available from Microsoft Corporation of Redmond, Washington.

Annotation server 10 controls the storage of annotations and their provision to client computers 15. The annotation server 10 manages the annotation meta data store 18 and the annotation content store 17. The annotation server 10 communicates with the client computers 15 via any of a wide variety of known protocols, such as the Hypertext Transfer Protocol (HTTP). The annotation server 10 can receive and provide annotations via direct contact with a client computer 15, or alternatively via electronic mail (email) via email server 13. The annotation server 10 similarly communicates with the email server 13 via any of a wide variety of known protocols, such as the Simple Mail Transfer Protocol (SMTP).
The annotations managed by annotation server 10 correspond to the streaming media available from media server computer 11. In the discussions to follow, the annotations are discussed as corresponding to streaming media. However, it should be noted that the annotations can similarly correspond to "pre- delivered" rather than streaming media, such as media previously stored at the client computers 15 via the network 16, via removable magnetic or optical disks, etc. When a user of a client computer 15 accesses a web page containing streaming media, a conventional web browser of the client computer 15 contacts the web server 12 to get the Hypertext Markup Language (HTML) page, the media server 11 to get the streaming data, and the annotation server 10 to get any annotations associated with that media. When a user of a client computer 15 desires to add or retrieve annotations, the client computer 15 contacts the annotation server 10 to perform the desired addition/retrieval.
Exemplary Computer Environment

In the discussion below, the invention will be described in the general context of computer-executable instructions, such as program modules, being executed by one or more conventional personal computers. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. Moreover, those skilled in the art will appreciate that the invention may be practiced with other computer system configurations, including hand-held devices, multiprocessor systems, microprocessor-based or programmable consumer electronics, network PCs, minicomputers, mainframe computers, and the like. In a distributed computing environment, program modules may be located in both local and remote memory storage devices.
Fig. 2 shows a general example of a computer 20 that can be used as a client or server in accordance with the invention. Computer 20 is shown as an example of a computer that can perform the functions of any of server computers 10-13 or a client computer 15 of Figure 1. Computer 20 includes one or more processors or processing units 21, a system memory 22, and a bus 23 that couples various system components including the system memory 22 to processors 21.
The bus 23 represents one or more of any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, and a processor or local bus using any of a variety of bus architectures. The system memory includes read only memory (ROM) 24 and random access memory (RAM) 25. A basic input/output system (BIOS) 26, containing the basic routines that help to transfer information between elements within computer 20, such as during start-up, is stored in ROM 24. Computer 20 further includes a hard disk drive 27 for reading from and writing to a hard disk, not shown, a magnetic disk drive 28 for reading from and writing to a removable magnetic disk 29, and an optical disk drive 30 for reading from or writing to a removable optical disk 31 such as a CD ROM or other optical media. The hard disk drive 27, magnetic disk drive 28, and optical disk drive 30 are connected to the system bus 23 by an SCSI interface 32 or some other appropriate interface. The drives and their associated computer-readable media provide nonvolatile storage of computer readable instructions, data structures, program modules and other data for computer 20. Although the exemplary environment described herein employs a hard disk, a removable magnetic disk 29, and a removable optical disk 31, it should be appreciated by those skilled in the art that other types of computer readable media which can store data that is accessible by a computer, such as magnetic cassettes, flash memory cards, digital video disks, random access memories (RAMs), read only memories (ROMs), and the like, may also be used in the exemplary operating environment.
A number of program modules may be stored on the hard disk, magnetic disk 29, optical disk 31, ROM 24, or RAM 25, including an operating system 35, one or more application programs 36, other program modules 37, and program data 38. A user may enter commands and information into computer 20 through input devices such as keyboard 40 and pointing device 42. Other input devices (not shown) may include a microphone, joystick, game pad, satellite dish, scanner, or the like. These and other input devices are connected to the processing unit 21 through an interface 46 that is coupled to the system bus. A monitor 47 or other type of display device is also connected to the system bus 23 via an interface, such as a video adapter 48. In addition to the monitor, personal computers typically include other peripheral output devices (not shown) such as speakers and printers.
Computer 20 operates in a networked environment using logical connections to one or more remote computers, such as a remote computer 49. The remote computer 49 may be another personal computer, a server, a router, a network PC, a peer device or other common network node, and typically includes many or all of the elements described above relative to computer 20, although only a memory storage device 50 has been illustrated in Fig. 2. The logical connections depicted in Fig. 2 include a local area network (LAN) 51 and a wide area network (WAN) 52. Such networking environments are commonplace in offices, enterprise- wide computer networks, intranets, and the Internet. In the described embodiment of the invention, remote computer 49 executes an Internet Web browser program such as the "Internet Explorer" Web browser manufactured and distributed by Microsoft Corporation of Redmond, Washington.
When used in a LAN networking environment, computer 20 is connected to the local network 51 through a network interface or adapter 53. When used in a WAN networking environment, computer 20 typically includes a modem 54 or other means for establishing communications over the wide area network 52, such as the Internet. The modem 54, which may be internal or external, is connected to the system bus 23 via a serial port interface 33. In a networked environment, program modules depicted relative to the personal computer 20, or portions thereof, may be stored in the remote memory storage device. It will be appreciated that the network connections shown are exemplary and other means of establishing a communications link between the computers may be used.
Generally, the data processors of computer 20 are programmed by means of instructions stored at different times in the various computer-readable storage media of the computer. Programs and operating systems are typically distributed, for example, on floppy disks or CD-ROMs. From there, they are installed or loaded into the secondary memory of a computer. At execution, they are loaded at least partially into the computer's primary electronic memory. The invention described herein includes these and other various types of computer-readable storage media when such media contain instructions or programs for implementing the steps described below in conjunction with a microprocessor or other data processor. The invention also includes the computer itself when programmed according to the methods and techniques described below. Furthermore, certain sub-components of the computer may be programmed to perform the functions and steps described below. The invention includes such sub-components when they are programmed as described. In addition, the invention described herein includes data structures, described below, as embodied on various types of memory media.
For purposes of illustration, programs and other executable program components such as the operating system are illustrated herein as discrete blocks, although it is recognized that such programs and components reside at various times in different storage components of the computer, and are executed by the data processor(s) of the computer.
Client/Server Relationship

Fig. 3 illustrates an annotation server and a client computer in more detail.
As noted above, generally, commands are formulated at client computer 15 and forwarded to annotation server 10 via HTTP requests. In the illustrated embodiment of Fig. 3, communication between client 15 and server 10 is performed via HTTP, using commands encoded as Uniform Resource Locators (URLs) and data formatted as object linking and embedding (OLE) structured storage documents, or alternatively using Extensible Markup Language (XML).
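The patent does not specify the exact wire format of these URL-encoded commands, but the general idea of expressing a client command as an HTTP request URL can be sketched as follows. The endpoint path and parameter names ("cmd", "media", "max") are purely illustrative assumptions, not taken from the patent.

```python
from urllib.parse import urlencode

def make_query_url(server, media_id, max_results=50):
    # Hypothetical command encoding: a "query annotations" command for a
    # given media stream, expressed as URL query parameters. The names
    # used here are assumptions for illustration only.
    params = {"cmd": "query_annotations", "media": media_id, "max": max_results}
    return "http://%s/annotate?%s" % (server, urlencode(params))

url = make_query_url("annotations.example.com", "lecture-01")
```

The server would parse such a URL, execute the command against its annotation stores, and return the results as a structured document (e.g., OLE structured storage or XML).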
Client 15 includes an HTTP services (HttpSvcs) module 152, which manages communication with server 10, and an annotation back end (ABE) module 151, which translates user actions into commands destined for server 10. A user interface (MMA) module 150 provides the user interface (UI) for a user to add and select different annotations, and be presented with the annotations. According to one implementation, the user interface module 150 supports ActiveX controls that display an annotation interface for streaming video on the Web.
Client 15 also includes a web browser module 153, which provides a conventional web browsing interface and capabilities for the user to access various servers via network 16 of Fig. 1. Web browser 153 also provides the interface for a user to be presented with media streams. In addition to the use of playlists discussed below, the user can select which one of different versions of multimedia content he or she wishes to receive from media server 11 of Fig. 1. This selection can be direct (e.g., entry of a particular URL or selection of a "low resolution" option), or indirect (e.g., entry of a particular desired playback duration or an indication of system capabilities, such as "slow system" or "fast system"). Alternatively, other media presentation interfaces could be used.
Annotation server 10 includes the Multimedia Annotation Web Server (MAWS) module 130, which is an Internet Services Application Programming Interface (ISAPI) plug-in for Internet Information Server (IIS) module 135. Together, these two modules provide the web server functionality of annotation server 10. Annotation server 10 also includes an HTTP Services module 131 which manages communication with client 15. In addition, annotation server 10 utilizes the Windows Messaging Subsystem 134 to facilitate communication with email server 13 of Fig. 1, and an email reply server 133 for processing incoming email received from email server 13.
Annotation server 10 further includes an annotation back end (ABE) module 132, which contains functionality for accessing annotation stores 17 and 18, for composing outgoing email based on annotation data, and for processing incoming email. Incoming email is received and passed to the ABE module 132 by the Email Reply Server 133. Annotation content authored at client 15, using user interface 150, is received by ABE 132 and maintained in annotation content store 17. Received meta data (control information) corresponding to the annotation content is maintained in annotation meta data store 18. The annotation content and meta data can be stored in any of a variety of conventional manners, such as in SQL relational databases (e.g., using Microsoft "SQL Server" version 7.0, available from Microsoft Corporation). Annotation server 10 is illustrated in Fig. 3 as maintaining the annotation content and associated control information (meta data) separately in two different stores. Alternatively, all of the annotation data (content and meta information) can be stored together in a single store, or content may be stored by another distinct storage system on the network 16 of Fig. 1, such as a file system, media server, email server, or other data store.
ABE 132 of annotation server 10 also manages the interactive generation and presentation of streaming media data from server computer 11 of Fig. 1 using "playlists". A "playlist" is a listing of one or more multimedia segments to be retrieved and presented in a given order. Each of the multimedia segments in the playlist is defined by a source identifier, a start time, and an end time. The source identifier identifies which media stream the segment is part of, the start time identifies the temporal location within the media stream where the segment begins, and the end time identifies the temporal location within the media stream where the segment ends.
ABE 132 allows playlists to be generated interactively based on annotations maintained in annotation stores 17 and 18. ABE 132 provides a user at client 15 with multiple possible annotation identifiers (e.g., titles or summaries) from which the user can select those of interest to him or her. Based on the selected annotations, ABE 132 coordinates provision of the associated media segments to the user. ABE 132 can directly communicate with video server computer 11 to identify which segments are to be provided, or alternatively can provide the appropriate information to the browser of client computer 15, which in turn can request the media segments from server computer 11.
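The playlist structure described above — an ordered list of segments, each defined by a source identifier, a start time, and an end time, derived from the user's selected annotations — can be sketched in Python. The class and field names here are illustrative, not taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class MediaSegment:
    """One playlist entry: a slice of a named media stream."""
    source_id: str   # identifies which media stream the segment is part of
    start: float     # temporal location where the segment begins (seconds)
    end: float       # temporal location where the segment ends (seconds)

def build_playlist(selected_annotations):
    """Turn the user's selected annotations into an ordered playlist of
    media segments, one segment per annotation."""
    return [
        MediaSegment(a["media_id"], a["begin"], a["end"])
        for a in selected_annotations
    ]

# A user picks two annotations of interest; the resulting segments can
# then be streamed back-to-back as one seamless, contiguous presentation.
picks = [
    {"media_id": "lecture-01", "begin": 60.0, "end": 240.0},
    {"media_id": "lecture-01", "begin": 420.0, "end": 480.0},
]
playlist = build_playlist(picks)
```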
Fig. 4 shows an exemplary structure for an annotation entry 180 that is maintained by annotation server 10 in annotation meta data store 18 of Fig. 3. In the illustrated embodiment, an annotation entry 180 includes an author field 182, a time range field 184, a time units field 186, a creation time field 188, a title field 190, a content field 192, an identifier field 194, a related annotation identifier field 196, a set identifier(s) field 198, a media content identifier field 200, an arbitrary number of user-defined property fields 202, and a sequence number 204. Each of fields 182-204 is a collection of data which define a particular characteristic of annotation entry 180. Annotation entry 180 is maintained by annotation server 10 of Fig. 3 in annotation meta data store 18. Content field 192, as discussed in more detail below, includes a pointer to (or other identifier of) the annotation content, which in turn is stored in annotation content store 17.
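The annotation entry of Fig. 4 can be sketched as a simple record type. The Python class below is only an illustrative rendering of the fields listed above (reference numerals in comments); the concrete storage format is left open by the patent.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class AnnotationEntry:
    """Illustrative sketch of the annotation meta data record of Fig. 4."""
    author: str                       # 182: user who created the annotation
    time_range: tuple                 # 184: (begin, end) on the media timeline
    time_units: str                   # 186: units for the time range
    creation_time: str                # 188: absolute date/time of creation
    title: str                        # 190: title identifying the annotation
    content: str                      # 192: content, or pointer into content store
    identifier: str                   # 194: unique id of this annotation
    related_id: Optional[str] = None  # 196: id of a related (parent) annotation
    set_ids: list = field(default_factory=list)      # 198: annotation set(s)
    media_id: str = ""                # 200: media content this annotation targets
    properties: dict = field(default_factory=dict)   # 202: user-defined fields
    sequence_number: int = 0          # 204: position in a custom ordering
```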
Author field 182 contains data identifying the user who created annotation entry 180 and who is therefore the author of the annotation. The author is identified by ABE 151 of Fig. 3 based on the user logged into client 15 at the time the annotation is created.
Time range field 184 contains data representing "begin" and "end" times defining a segment of media timeline to which annotation entry 180 is associated. Time units field 186 contains data representing the units of time represented in time range field 184. Together, time range field 184 and time units field 186 identify the relative time range of the annotation represented by annotation entry 180. This relative time range corresponds to a particular segment of the media stream to which annotation entry 180 is associated. The begin and end times for the annotation are provided by the user via interface 150 of Fig. 3, or alternatively can be automatically or implicitly derived using a variety of audio and video signal processing techniques, such as sentence detection in audio streams or video object tracking.
It should be noted that the time ranges for different annotations can overlap. Thus, for example, a first annotation may correspond to a segment ranging between the first and fourth minutes of media content, a second annotation may correspond to a segment ranging between the second and seventh minutes of the media content, and a third annotation may correspond to a segment ranging between the second and third minutes of the media content.
Alternatively, rather than using the presentation timeline of the media content, different media characteristics can be used to associate the annotation with a particular segment(s) of the media content. For example, annotations could be associated with (or "anchored" on) specific objects in the video content, or specific events in the audio content.
Creation time field 188 contains data specifying the date and time at which annotation entry 180 is created. It should be noted that the time of creation of annotation entry 180 is absolute and is not relative to the video or audio content of the media stream to which annotation entry 180 is associated. Accordingly, a user can specify that annotations which are particularly old, e.g., created more than two weeks earlier, are not to be displayed. ABE 132 of Fig. 3 stores the creation time and date when the annotation is created.

Title field 190 contains data representing a title by which the annotation represented by annotation entry 180 is identified. The title is generally determined by the user and the user enters the data representing the title using conventional and well known user interface techniques. The data can be as simple as ASCII text or as complex as HTML code which can include text having different fonts and type styles, graphics including wallpaper, motion video images, audio, and links to other multimedia documents.
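The age-based filtering mentioned above (hiding annotations created more than, say, two weeks earlier) can be sketched as a simple comparison against the absolute creation time. This is only an illustrative sketch; the patent does not prescribe an implementation.

```python
from datetime import datetime, timedelta

def filter_recent(annotations, max_age_days=14, now=None):
    """Keep only annotations whose absolute creation time (field 188)
    falls within the user-specified age window."""
    now = now or datetime.utcnow()
    cutoff = now - timedelta(days=max_age_days)
    return [a for a in annotations if a["creation_time"] >= cutoff]

# Example: with a "two weeks" preference, an annotation from six weeks
# earlier is suppressed while a recent one is shown.
now = datetime(1999, 9, 15)
notes = [
    {"title": "fresh", "creation_time": datetime(1999, 9, 10)},
    {"title": "stale", "creation_time": datetime(1999, 8, 1)},
]
recent = filter_recent(notes, max_age_days=14, now=now)
```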
Content field 192 contains data representing the substantive content of the annotation as authored by the user. The actual data can be stored in content field 192, or alternatively content field 192 may store a pointer to (or other indicator of) the content that is stored separately from the entry 180 itself. In the illustrated example, content field 192 includes a pointer to (or other identifier of) the annotation content, which in turn is stored in annotation content store 17. The user enters the data representing the content using conventional and well known user interface techniques. The content added by the user in creating annotation entry 180 can include any one or more of text, graphics, video, audio, etc. or links thereto. In essence, content field 192 contains data representing the substantive content the user wishes to include with the presentation of the corresponding media stream at the relative time range represented by time range field 184 and time units field 186.
Annotation identifier field 194 stores data that uniquely identifies annotation entry 180, while related annotation identifier field 196 stores data that uniquely identifies a related annotation. Annotation identifier field 194 can be used by other annotation entries to associate such other annotation entries with annotation entry 180. In this way, threads of discussion can develop in which a second annotation responds to a first annotation, a third annotation responds to the second annotation and so on. By way of example, an identifier of the first annotation would be stored in related annotation identifier field 196 of the second annotation, an identifier of the second annotation would be stored in related annotation identifier field 196 of the third annotation, and so on.
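The threads of discussion described above can be reconstructed from the identifier and related-identifier fields alone: each annotation that names a parent is attached beneath it. A minimal sketch (field names are illustrative):

```python
def build_threads(entries):
    """Group annotations into reply threads using the related annotation
    identifier (field 196): each entry is nested under its parent."""
    children = {}
    roots = []
    for e in entries:
        parent = e.get("related_id")
        if parent is None:
            roots.append(e)              # a top-level annotation
        else:
            children.setdefault(parent, []).append(e)

    def thread(entry):
        return {
            "entry": entry,
            "replies": [thread(c) for c in children.get(entry["id"], [])],
        }

    return [thread(r) for r in roots]

# A three-deep classroom discussion: question, reply, follow-up.
notes = [
    {"id": "a1", "related_id": None, "title": "Question about slide 3"},
    {"id": "a2", "related_id": "a1", "title": "Student reply"},
    {"id": "a3", "related_id": "a2", "title": "Lecturer follow-up"},
]
threads = build_threads(notes)
```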
Set identifier(s) field 198 stores data that identifies a particular one or more sets to which annotation entry 180 belongs. A media stream can have multiple sets of annotations, sets can span multiple media content, and a particular annotation can correspond to one or more of these sets. Which set(s) an annotation belongs to is identified by the author of the annotation. By way of example, a media stream corresponding to a lecture may include the following sets: "instructor's comments", "assistant's comments", "audio comments", "text comments", "student questions", and each student's personal comments.

Media content identifier field 200 contains data that uniquely identifies particular multimedia content as the content to which annotation entry 180 corresponds. Media content identifier 200 can identify a single media stream (either an individual stream or a composite stream), or alternatively identify multiple different streams that are different versions of the same media content. Media content identifier 200 can identify media versions in a variety of different manners. According to one embodiment, the data represents a real-time transport protocol (RTP) address of the different media streams. An RTP address is a type of uniform resource locator (URL) by which multimedia documents can be identified in a network. According to an alternate embodiment, a unique identifier is assigned to the content rather than to the individual media streams. According to another alternate embodiment, a different unique identifier of the media streams could be created by annotation server 10 of Fig. 3 and assigned to the media streams. Such a unique identifier would also be used by streaming media server 11 of Fig. 1 to identify the media streams. According to another alternate embodiment, a uniform resource name (URN) such as those described by K. Sollins and L.
Masinter in "Functional Requirements for Uniform Resource Names," IETF RFC 1737 (December 1994) could be used to identify the media stream.
User-defined property fields 202 are one or more user-definable fields that allow users (or user interface designers) to customize the annotation system. Examples of such additional property fields include a "reference URL" property which contains the URL of a web page used as reference material for the content of the annotation; a "help URL" property containing the URL of a help page which can be accessed concerning the content of the annotation; a "view script" property containing JavaScript which is to be executed whenever the annotation is viewed; a "display type" property, which gives the client user interface information about how the annotation is to be displayed; etc.
Sequence number 204 allows a user to define (via user interface 150 of Fig. 3) a custom ordering for the display of annotation identifiers, as discussed in more detail below. Sequence number 204 stores the relative position of the annotations with respect to one another in the custom ordering, allowing the custom ordering to be saved for future use. In the illustrated example, annotation entry 180 stores a single sequence number. Alternatively, multiple sequence numbers 204 may be included in annotation entry 180, each corresponding to a different custom ordering, a different annotation set, a different user, etc.
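Ordering the display of annotation identifiers by the stored sequence numbers amounts to a simple sort, sketched below (field names are illustrative):

```python
def display_order(entries):
    """Order annotation titles for display by their stored sequence
    numbers (field 204), so a user's custom ordering persists."""
    return [e["title"] for e in sorted(entries, key=lambda e: e["sequence_number"])]

# A user has dragged the annotations into a custom order; the sequence
# numbers record that order for the next session.
entries = [
    {"title": "Closing remarks", "sequence_number": 3},
    {"title": "Introduction", "sequence_number": 1},
    {"title": "Key theorem", "sequence_number": 2},
]
```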
Fig. 5 illustrates exemplary implicit annotation collections for annotations maintained by annotation server 10 of Fig. 3. A collection of annotations refers to annotation entries 180 of Fig. 4 that correspond to the same media stream(s), based on the media content identifier 200. Annotation entries 180 can be viewed conceptually as part of the same annotation collection if they have the same media content identifiers 200, even though the annotation entries may not be physically stored together by annotation server 10.
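The "implicit" nature of these collections — membership follows from a shared media content identifier rather than from physical co-location — can be sketched as a grouping step. The function and key names below are illustrative assumptions.

```python
def group_into_collections(entries):
    """Conceptually group annotation entries into collections: entries
    sharing the same media content identifier (field 200) belong to the
    same collection, regardless of where they are physically stored."""
    collections = {}
    for e in entries:
        collections.setdefault(e["media_id"], []).append(e)
    return collections

entries = [
    {"id": "a1", "media_id": "lecture-01"},
    {"id": "a2", "media_id": "lecture-02"},
    {"id": "a3", "media_id": "lecture-01"},
]
collections = group_into_collections(entries)
```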
Annotation database 206 includes two annotation collections 208 and 210. Annotation server 10 dynamically adds, deletes, and modifies annotation entries in annotation database 206 based on commands from client 15. Annotation entries can be created and added to annotation database 206 at any time a user cares to comment upon the content of the stream (or another annotation) in the form of an annotation. Annotation server 10 forms an annotation entry from identification data, content data, title data, and author data of an "add annotation" request received from the client's ABE 151 (Fig. 3), and adds the annotation entry to annotation database 206. Annotation database 206 includes fields 212, 214, and 216 that specify common characteristics of all annotation entries of database 206 or an annotation collection 208 or 210. Alternatively, fields 212-216 can be included redundantly in each annotation entry 180. Creator field 212 contains data identifying the user who was responsible for creating annotation database 206.
RTP address fields 214 and 216 contain data representing an RTP address of the media stream (e.g., the RTP address of the stream identified in media content identifier 200 of Fig. 4) for the annotation collection. An RTP address provides an alternative mechanism, in addition to the data in identifier field 200, for associating the media stream with annotation entries 180. In alternative embodiments, RTP address fields 214 and 216 need not be included, particularly embodiments in which media content identifier 200 contains the RTP address of the media stream.
User Interface
An annotation can be created by a user of any of the client computers 15 of Fig. 1. As discussed above with reference to Fig. 3, client 15 includes an interface module 150 that presents an interface to a user (e.g., a graphical user interface), allowing a user to make requests of annotation server 10. In the illustrated embodiment, a user can access annotation server 10 via an annotation toolbar provided by interface 150.
Fig. 6 illustrates an annotation toolbar in accordance with one embodiment of the invention. Annotation toolbar 240 includes various identifying information and user-selectable options 242-254. Selection of an exit or "X" button 242 causes interface 150 to terminate display of the toolbar 240. A server identifier 244 identifies the annotation server with which client 15 is currently configured to communicate (annotation server 10 of Fig. 1 in the illustrated embodiment).
Selection of a connection button 246 causes ABE 151 of Fig. 3 to establish a connection with the annotation server identified by identifier 244. Selection of a query button 248 causes interface module 150 to open a "query" dialog box, from which a user can search for particular annotations. Selection of an add button 250 causes interface module 150 to open an "add new annotation" dialog box, from which a user can create a new annotation.
Selection of a show annotations button 252 causes interface module 150 to open a "view annotations" dialog box, from which a user can select particular annotations for presentation.
Selection of a preferences button 254 causes interface 150 of Fig. 3 to open a "preferences" dialog box, from which a user can specify various UI preferences, such as an automatic server query refresh interval, or default query criteria values to be persisted between sessions.
Annotation Creation
Fig. 7 shows an "add new annotation" dialog box 260 that results from user selection of add button 250 of Fig. 6 to create a new annotation. Interface 150 assumes that the current media stream being presented to the user is the media stream to which this new annotation will be associated. The media stream to which an annotation is associated is referred to as the "target" of the annotation. An identifier of this stream is displayed in a target specification area 262 of dialog box 260. Alternatively, a user could change the target of the annotation, such as by typing in a new identifier in target area 262, or by selection of a "browse" button (not shown) that allows the user to browse through different directories of media streams.
A time strip 264 is also provided as part of dialog box 260. Time strip 264 represents the entire presentation time of the corresponding media stream. A thumb 265 that moves within time strip 264 indicates a particular temporal position within the media stream. The annotation being created via dialog box 260 has a begin time and an end time, which together define a particular segment of the media stream. When "from" button 268 is selected, thumb 265 represents the begin time for the segment relative to the media stream. When "to" button 271 is selected, thumb 265 represents the end time for the segment relative to the media stream. Alternatively, two different thumbs could be displayed, one for the begin time and one for the end time. The begin and end times are also displayed in an hours/minutes/seconds format in boxes 266 and 270, respectively.
Thumb 265 can be moved along time strip 264 in any of a variety of conventional manners. For example, a user can depress a button of a mouse (or other cursor control device) while a pointer is "on top" of thumb 265 and move the pointer along time strip 264, causing thumb 265 to move along with the pointer. The appropriate begin or end time is then set when the mouse button is released. Alternatively, the begin and end times can be set by entering (e.g., via an alphanumeric keyboard) particular times in boxes 266 and 270.
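The mapping between the thumb's position and the times displayed in boxes 266 and 270 can be sketched as follows. This is a hypothetical Python illustration; the function names, pixel coordinates, and strip width are invented for the sketch.

```python
def thumb_to_time(thumb_pos: float, strip_width: float, duration: float) -> float:
    """Map a thumb's position along the time strip (in pixels) to a media
    time in seconds, clamped to the stream's presentation duration."""
    return max(0.0, min(duration, duration * thumb_pos / strip_width))

def format_hms(seconds: float) -> str:
    """Render a time in the hours/minutes/seconds format of boxes 266 and 270."""
    s = int(seconds)
    return f"{s // 3600:02d}:{(s % 3600) // 60:02d}:{s % 60:02d}"

# e.g. the thumb released halfway along a 400-pixel strip over a one-hour stream
begin_time = thumb_to_time(200, 400, 3600)
print(format_hms(begin_time))  # 00:30:00
```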
Dialog box 260 also includes a "play" button 274. Selection of play button 274 causes interface module 150 of Fig. 3 to forward a segment specification to web browser 153 of client 15. The segment specification includes the target identifier from target display 262 and the begin and end times from boxes 266 and 270, respectively. Upon receipt of the segment specification from interface module 150, the browser communicates with media server 11 and requests the identified media segment using conventional HTTP requests. In response, media server 11 streams the media segment to client 15 for presentation to the user. This presentation allows, for example, the user to verify the portion of the media stream to which his or her annotation will correspond.

Dialog box 260 also includes an annotation set identifier 272, an email field
275, and a summary 276. Annotation set identifier 272 allows the user to identify a named set to which the new annotation will belong. This set can be a previously defined set, or a new set being created by the user. Selection of the particular set can be made from a drop-down menu activated by selection of a button 273, or alternatively can be directly input by the user (e.g., typed in using an alphanumeric keyboard). According to one embodiment of the invention, annotation server 10 of Fig. 3 supports read and write access controls, allowing the creator of the set to identify which users are able to read and/or write to the annotation set. In this embodiment, only those sets for which the user has write access can be entered as set identifier 272.
Email identifier 275 allows the user to input the email address of a recipient of the annotation. When an email address is included, the newly created annotation is electronically mailed to the recipient indicated in identifier 275 in addition to being added to the annotation database. Furthermore, the recipient of the electronic mail message can reply to the message to create an additional annotation. To enable this, the original email message is configured with annotation server 10 as the sender. Because of this, a "reply to sender" request from the recipient will cause an email reply to be sent to annotation server 10. Upon receipt of such an electronic mail message reply, annotation server 10 creates a new annotation and uses the reply message content as the content of the new annotation. This new annotation identifies, as a related annotation, the original annotation that was created when the original mail message was sent by annotation server 10. In the illustrated embodiment, this related annotation identifier is stored in field 196 of Fig. 4.
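The reply-to-annotation flow just described can be sketched as follows. This is a hypothetical Python illustration; the message fields, the convention of encoding the original annotation identifier in the subject line, and all names are assumptions made for the sketch, not details from the embodiment.

```python
def annotation_from_reply(reply: dict, annotations: dict) -> dict:
    """Create a new annotation from an email reply received by the server,
    linking it to the original annotation via a related-annotation id."""
    # the original annotation id is assumed to be encoded in the subject line
    original_id = int(reply["subject"].rsplit("#", 1)[1])
    original = annotations[original_id]
    new_id = max(annotations) + 1
    new_ann = {
        "id": new_id,
        "target": original["target"],   # same media segment as the original
        "content": reply["body"],       # reply text becomes annotation content
        "related_id": original_id,      # stored as the related annotation (field 196)
        "author": reply["from"],
    }
    annotations[new_id] = new_ann
    return new_ann

annotations = {1: {"id": 1, "target": "mms://media.example/talk",
                   "content": "What does this slide mean?",
                   "related_id": None, "author": "alice"}}
reply = {"from": "bob@example.com", "subject": "Re: annotation #1",
         "body": "Agreed, it needs clarification."}
new = annotation_from_reply(reply, annotations)
```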
Summary 276 allows the user to provide a short summary or title of the annotation content. Although the summary is illustrated as being text, it could include any of a wide variety of characters, alphanumerics, graphics, etc. In the illustrated embodiment, summary 276 is stored in the title field 190 of the annotation entry of Fig. 4.
Dialog box 260 further includes radio buttons 280 and 282, which allow an annotation to be created as text and/or audio. Although not shown, other types of annotations could also be accommodated, such as graphics, HTML documents, etc. Input controls 278 are also provided as part of dialog box 260. The illustrated controls are enabled when the annotation includes audio data. Input controls 278 include conventional audio control buttons such as fast forward, rewind, play, pause, stop and record. Additionally, an audio display bar 279 can be included to provide visual progress feedback when the audio is playing or recording.
The exact nature of input controls 278 is dependent on the type of annotation content being provided. In the case of text content, input controls 278 may simply include a box into which text can be input by the user via an alphanumeric keyboard. Additionally, a keyboard layout may also be provided to the user, allowing him or her to "point and click" using a mouse and pointer to select particular characters for entry.
Annotation and Media Segment Retrieval
Fig. 8 shows a "query annotations" dialog box 330 that results from a user selecting query button 248 of Fig. 6. Many of the options presented to the user in dialog box 330 are similar to those presented in the "add new annotation" dialog box 260 of Fig. 7; however, those in dialog box 330 are used as search criteria rather than data for a new annotation.
Dialog box 330 includes a target display 332 that contains an identifier of the target stream. This identifier can be input in any of a variety of manners, such as by typing in a new identifier in target display 332, or by selection of a "browse" button (not shown) that allows the user to browse through different directories of media streams. In the illustrated embodiment, the identifier is a URL. However, alternate embodiments can use different identifier formats. Dialog box 330 also includes target information 334, which includes a time strip, thumb, "from" button, "to" button, "play" button, and begin and end times, which are analogous to the time strip, thumb, "from" button, "to" button, "play" button, begin and end times of dialog box 260 of Fig. 7. The begin and end times in target information 334 limit the query for annotations to only those annotations having a time range that corresponds to at least part of the media segment between the begin and end times of target information 334.
Dialog box 330 also includes an annotation set list 336. Annotation set list 336 includes a listing of the various sets that correspond to the target media stream. According to one implementation, only those sets for which an annotation has been created are displayed in set list 336. According to one embodiment of the invention, annotation server 10 of Fig. 3 supports read and write security, allowing the creator of the set to identify which users are able to read and/or write to the annotation set. In this embodiment, only those sets for which the user has read access are displayed in set list 336. A user can select sets from annotation set list 336 in a variety of manners.
For example, a user can use a mouse and pointer to "click" on a set in list 336, which highlights the set to provide feedback to the user that the set has been selected. Clicking on the selected set again de-selects the set (leaving it no longer highlighted). Additionally, a "select all" button 338 allows a user to select all sets in set list 336, while a "deselect all" button 340 allows a user to de-select all sets in set list 336.
In the illustrated embodiment, the sets displayed as part of annotation set list 336 contain annotations which correspond to the target identifier in target display 332. However, in alternate embodiments the sets in set list 336 need not necessarily contain annotations which correspond to the target identifier in target display 332. Interface module 150 allows a user to select different target streams during the querying process. Thus, a user may identify a first target stream and select one or more sets to query annotations from for the first target stream, and then identify a second target stream and select one or more sets to query annotations from for the second target stream.

Additional search criteria can also be input by the user. As illustrated, a particular creation date and time identifier 342 can be input, along with a relation 344 (e.g., "after" or "before"). Similarly, particular words, phrases, characters, graphics, etc. that must appear in the summary can be input in a summary keyword search identifier 346. A maximum number of annotations to retrieve in response to the query can also be included as a max identifier 348. Furthermore, the query can be limited to only annotations that correspond to the target identifier in target display 332 by selecting check box 360.
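The search criteria described above can be combined into a single filter. The following is a hypothetical Python sketch of such a filter; the function signature, dictionary keys, and sample data are invented for illustration, with only the parenthesized reference numerals taken from the description.

```python
def query_annotations(annotations, sets, begin, end,
                      created_after=None, keyword=None, max_results=None):
    """Filter annotations by the dialog-box criteria: set membership (336),
    temporal overlap with [begin, end] (334), creation time (342/344),
    summary keyword (346), and a maximum result count (348)."""
    hits = []
    for ann in annotations:
        if ann["set"] not in sets:                    # selected sets only
            continue
        if ann["end"] < begin or ann["begin"] > end:  # must overlap the segment
            continue
        if created_after is not None and ann["created"] <= created_after:
            continue
        if keyword is not None and keyword.lower() not in ann["summary"].lower():
            continue
        hits.append(ann)
        if max_results is not None and len(hits) >= max_results:
            break
    return hits

notes = [
    {"set": "questions", "begin": 10, "end": 40, "created": 5,
     "summary": "Clarify slide 2"},
    {"set": "questions", "begin": 200, "end": 230, "created": 6,
     "summary": "Audio issue near the end"},
    {"set": "comments", "begin": 20, "end": 50, "created": 7,
     "summary": "Great point"},
]
hits = query_annotations(notes, sets={"questions"}, begin=0, end=100)
```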
A level of detail 350 to retrieve can also be selected by the user. Examples of different levels that could be retrieved include the "full level" (that is, all content of the annotation), or a "deferred download" where only an identifier of the annotations (e.g., a summary or title) is downloaded. In the illustrated example, selection of checkbox 354 selects the deferred download level, whereas if checkbox 354 is not selected then the full level of detail is implicitly selected.
A server identifier 356 identifies the annotation server with which client 15 is currently configured to communicate. Different annotation servers can be selected by the user by inputting the appropriate identifier as server identifier 356. This input can be provided in any of a variety of manners, such as by typing in a new identifier in server identifier 356 or by selection of a "browse" button (not shown) that allows the user to browse through different directories of annotation servers. A user can request automatic display of the retrieved annotations by selecting a "display retrieved annotations" checkbox 358. Selection of "advanced" button 362 reduces the number of options available to the user, simplifying dialog box 330. For example, the simplified dialog box may not include fields 342, 344, 348, 346, 350, 332, 334, or 336. The user can then complete the query process by selecting a query button
364. Upon selection of the query button 364, interface 150 closes the query dialog box 330 and forwards the search criteria to annotation server 10. Additionally, if checkbox 358 is selected then interface 150 displays a "view annotations" dialog box 400 of Fig. 9. Alternatively, a user can provide a view request, causing interface 150 to display dialog box 400, by selecting show annotations button 252 in annotation toolbar 240 of Fig. 6.
Fig. 9 shows a dialog box 400 that identifies annotations corresponding to a playlist of media segments. The playlist is a result of the query input by the user as discussed above with reference to Fig. 8. In the illustration of Fig. 9, annotation identifiers in the form of user identifiers 406 and summaries 408 are displayed within an annotation listing box 402. The user can scroll through annotation identifiers in a conventional manner via scroll bars 404 and 405. The annotation identifiers are presented in annotation listing box 402 according to default criteria, such as chronological by creation time/date, by user, alphabetical by summaries, etc. Related annotations are displayed in annotation listing 402 in a hierarchical, horizontally offset manner. The identifier of an annotation that is related to a previous annotation is "indented" from that previous annotation's identifier and a connecting line between the two identifiers is shown.
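The hierarchical, indented presentation of related annotations can be sketched as follows. This is a hypothetical Python illustration using ASCII indentation as a stand-in for the graphical tree view; the function and key names are invented for the sketch.

```python
def render_listing(annotations):
    """Render annotation identifiers with each related annotation indented
    under its parent, mirroring the offset display of listing box 402."""
    by_parent = {}
    for ann in annotations:
        by_parent.setdefault(ann["related_id"], []).append(ann)
    lines = []
    def walk(parent_id, depth):
        for ann in by_parent.get(parent_id, []):
            lines.append("  " * depth + f'{ann["user"]}: {ann["summary"]}')
            walk(ann["id"], depth + 1)   # recurse into replies to this annotation
    walk(None, 0)
    return lines

thread = [
    {"id": 1, "related_id": None, "user": "alice", "summary": "Main point unclear"},
    {"id": 2, "related_id": 1, "user": "bob", "summary": "Agreed, see slide 4"},
    {"id": 3, "related_id": None, "user": "carol", "summary": "Nice demo"},
]
for line in render_listing(thread):
    print(line)
```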
Dialog box 400 can be displayed concurrently with a multimedia player that is presenting multimedia content that corresponds to the annotations in annotation listing 402 (e.g., as illustrated in Fig. 10 below). Interface module 150 can have the annotations "track" the corresponding multimedia content being played back, so that the user is presented with an indication (e.g., an arrow) as to which annotation(s) correspond to the current temporal position of the multimedia content. Such tracking can be enabled by selecting checkbox 422, or disabled by de-selecting checkbox 422.
Dialog box 400 also includes a merge annotation sets checkbox 424. Selection of merge annotation sets checkbox 424 causes interface module 150 to present annotation identifiers in listing box 402 in a chronological order regardless of what set(s) the annotations in annotation listing 402 belong to. If checkbox 424 is not selected, then annotations from different sets are grouped and displayed together in annotation listing 402 (e.g., under the same tree item). Thus, when checkbox 424 is not selected, interface 150 displays one playlist for each annotation set that has been retrieved from annotation server 10.
Dialog box 400 also includes a refresh button 428, a close button 430, and an advanced button 432. Selection of refresh button 428 causes interface module 150 to communicate with annotation back end 151 to access annotation server 10 and obtain any additional annotations that correspond to the query that resulted in listing box 402.
Selection of close button 430 causes interface 150 to terminate the display of dialog box 400. Selection of advanced button 432 causes interface 150 to display a different view annotations box having additional details, such as annotation target information (analogous to target display 332 discussed above with reference to Fig. 8), user-selectable preferences for information displayed as annotation identifiers in listing box 402, etc.
Upon user selection of a particular annotation identifier from listing box 402 (e.g., "single clicking" on the summary), preview information is presented in a preview section 416, and a selection box or menu 410 is provided. The exact nature of the preview information is dependent on the data type and amount of information that was requested (e.g., as identified in level of detail 350 of Fig. 8).
Menu 410 includes the following options: play, export ASX playlist, export annotations, time order, custom order, save, and reset. Selection of the "play" option causes playback of the multimedia content to begin starting with the selected annotation in annotation list 402. Selection of the "export ASX playlist" option causes annotation backend 151 to output a record (e.g., create a file) that identifies the temporal segments of multimedia content that the annotations identified in list 402 correspond to, as determined by the begin and end times of the annotations. Selection of the "export annotations" option causes annotation backend 151 to output a record (e.g., create a file) that includes the annotation content of each annotation identified in list 402.
Selection of the "time order" option causes interface module 150 to display the identifiers in list 402 in chronological order based on the begin time for each annotation. Selection of the "custom order" option allows the user to identify some other criteria to be used in determining the order of the identifiers in list 402 (e.g., identifiers can be re-ordered in a conventional drag and drop manner). Re-ordering annotation identifiers causes the sequence numbers 204 (of Fig. 4) of the annotations to be re-ordered accordingly. Selection of the "save" option causes interface module 150 to save the current custom ordering to annotation server 10 of Fig. 3 by saving the current sequence numbers of the annotations. Selection of the "reset" option causes interface module 150 to ignore any changes that have been made since the last saved custom ordering and revert to the last saved custom ordering.

Transfer of the corresponding media segments (and/or the annotations) to client 15 is initiated when the "play" option of menu 410 is selected. Upon selection of the play option, interface 150 of Fig. 3 provides the list of annotation identifiers being displayed to web browser 153 (or other multimedia presentation application) in the order of their display, including the target identifier and temporal range information. Thus, web browser 153 receives a list of multimedia segments that it is to present to the user in a particular order. Web browser 153 then accesses media server 11 to stream the multimedia segments to client 15 for presentation in that order. By use of the play option in menu 410, a user is able to review the information regarding the annotations that satisfy his or her search criteria and then modify the annotation playlist (e.g., by deleting or reordering annotation identifiers) before the corresponding media segments (and/or the annotations) are presented to him or her.
Alternatively, transfer of the media segments may be initiated in other manners rather than by selection of the play option in menu 410. For example, a "start" button may be included as part of dialog box 400, selection of which initiates transfer of the media segments to client 15. The annotations and/or corresponding media segments are presented to the user "back to back" with very little or no noticeable gap between different annotations and between different segments. Thus, the presentation of the annotations and/or media segments is "seamless". A user is able to reorder the media segments of the playlist and thereby alter their order of presentation. In the illustrated embodiment, media segments are reordered by changing the ordering of the annotation identifiers in annotation listing 402 in a drag and drop manner. For example, using a mouse and pointer a user can select a particular annotation identifier (e.g., identifier 420) and drag it to a different location within the dialog box (e.g., between identifiers 419 and 421), thereby changing when the media segment corresponding to the annotation identified by identifier 420 is presented relative to the other annotations.
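The reordering and export operations described above can be sketched together. This is a hypothetical Python illustration: the record layout is a simplified stand-in for an ASX playlist, and the function and key names are invented for the sketch.

```python
def reorder(annotations, ordered_ids):
    """Renumber sequence fields (204) after a drag-and-drop reordering;
    the 'save' option would then persist these numbers to the server."""
    position = {aid: i for i, aid in enumerate(ordered_ids)}
    for ann in annotations:
        ann["sequence"] = position[ann["id"]]

def export_playlist(annotations):
    """Emit one media-segment record per annotation in sequence order,
    a simplified stand-in for the 'export ASX playlist' option."""
    ordered = sorted(annotations, key=lambda a: a["sequence"])
    return [{"src": a["target"], "start": a["begin"], "end": a["end"]}
            for a in ordered]

playlist = [
    {"id": 1, "sequence": 0, "target": "mms://media.example/talk", "begin": 0, "end": 30},
    {"id": 2, "sequence": 1, "target": "mms://media.example/talk", "begin": 60, "end": 90},
]
reorder(playlist, [2, 1])   # the user drags the second annotation above the first
print([s["start"] for s in export_playlist(playlist)])  # [60, 0]
```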
As discussed above, information regarding the media stream as well as the particular media segment within that stream to which an annotation corresponds is maintained in each annotation. At the appropriate time, web browser 153 sends a message to the appropriate media server 11 of Fig. 1 to begin streaming the appropriate segment to client computer 15. Web browser 153, knowing the duration of each of the segments being provided to client computer 15, forwards additional messages to media server 11 to continue with the provision of the next segment, according to the playlist, when appropriate. By managing the delivery of the media segments to client computer 15 in such a manner, web browser 153 can keep the media segments being provided to the user in a seamless manner.
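Because the browser knows each segment's duration, it can compute when each follow-on streaming request is due. The following hypothetical Python sketch models that hand-off; the function name, record layout, and sample streams are invented for illustration.

```python
def playback_schedule(segments):
    """Compute the playback offset at which each segment's streaming request
    should be issued so consecutive segments abut with no gap, a simplified
    model of web browser 153's hand-off to media server 11."""
    offset = 0.0
    schedule = []
    for seg in segments:
        schedule.append({"at": offset, "src": seg["src"],
                         "start": seg["start"], "end": seg["end"]})
        offset += seg["end"] - seg["start"]   # advance by this segment's duration
    return schedule

segments = [{"src": "mms://media.example/a", "start": 10, "end": 40},
            {"src": "mms://media.example/b", "start": 0, "end": 25}]
schedule = playback_schedule(segments)
# the second request is due 30 seconds in, when the first segment ends
```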
According to an alternate embodiment, the media segments could be streamed to annotation server 10 for temporary buffering and subsequent streaming to client computer 15. According to another alternate embodiment, identifying information (e.g., source, start time, and end time) for the media segment could be provided to media server 11 from annotation server 10 for streaming to client computer 15.
Additionally, according to one embodiment the collection of media segments identified by the playlist can be stored as an additional media stream by selecting the "export ASX playlist" option in menu 410 of Fig. 9. By saving the collection of media segments as a single media stream, the collection can be retrieved by the user (or other users) at a later time without having to go through another querying process. Furthermore, the collection of segments, stored as a media stream, can itself be annotated. The collection of segments can be stored as a media stream in any of a variety of different locations and formats. The media stream can be stored in an additional data store (not shown) managed by annotation server 10 of Fig. 3, or alternatively stored at media server 11 of Fig. 1 or another media server (not shown) of Fig. 1. According to one embodiment, the media stream includes the source information, start time, and end time for each of the segments in the playlist. Thus, little storage space is required and the identifying information for each of the segments is independent of the annotations. Alternatively, the media stream includes pointers to each of the annotations. For subsequent retrieval of the media segments, the stored pointers can be used to retrieve each of the appropriate annotations, from which the corresponding media segments can be retrieved. According to another alternate embodiment, the media segments themselves could be copied from media server 11 of Fig. 1 and those segments stored as the media stream.
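Two of the storage options just described, compact per-segment records versus pointers to the annotations, can be contrasted in a short sketch. This is a hypothetical Python illustration; the function names and dictionary layout are invented, and both forms are shown resolving to the same segment list.

```python
def store_as_stream(annotations, by_reference=False):
    """Store the playlist either compactly (source, start, end per segment)
    or as pointers to the annotations themselves."""
    if by_reference:
        return {"kind": "pointers", "ids": [a["id"] for a in annotations]}
    return {"kind": "segments",
            "segments": [(a["target"], a["begin"], a["end"]) for a in annotations]}

def resolve(stream, annotation_db):
    """Recover the segment list from either storage form for later playback."""
    if stream["kind"] == "segments":
        return stream["segments"]
    return [(annotation_db[i]["target"], annotation_db[i]["begin"],
             annotation_db[i]["end"]) for i in stream["ids"]]

db = {
    1: {"id": 1, "target": "mms://media.example/talk", "begin": 0, "end": 30},
    2: {"id": 2, "target": "mms://media.example/talk", "begin": 60, "end": 90},
}
compact = store_as_stream(list(db.values()))
pointers = store_as_stream(list(db.values()), by_reference=True)
```

The pointer form stays consistent if an annotation's segment is later edited, at the cost of a database lookup at playback time; the compact form is self-contained and independent of the annotations.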
Fig. 10 shows one implementation of a graphical user interface window 450 that concurrently displays annotations and corresponding media segments. This UI window 450 has an annotation screen 454, a media screen 456, and a toolbar 240. Media screen 456 is the region of the UI within which the multimedia content is rendered. For video content, the video is displayed on screen 456. For non-visual content, screen 456 displays static or dynamic images representing the content. For audio content, for example, a dynamically changing frequency wave that represents an audio signal is displayed in media screen 456.
Annotation screen 454 is the region of the UI within which the annotation identifiers and/or annotation content are rendered. For example, dialog box 400 of Fig. 9 can be annotation screen 454.
Fig. 11 illustrates methodological aspects of one embodiment of the invention in retrieving and presenting annotations and media segments to a user.
A step 500 comprises displaying a query dialog box 330 of Fig. 8. Interface 150 of Fig. 3 provides dialog box 330 in response to a query request from a user, allowing the user to search for annotations that satisfy various user-definable criteria. A step 502 comprises receiving query input from the user. Interface 150 of
Fig. 3 receives the user's input(s) to the query dialog box and provides the inputs to annotation server 10 of Fig. 3.
A step 504 comprises generating an annotation list. ABE 132 of Fig. 3 uses the user inputs to the query dialog box to select annotations from stores 17 and 18. ABE 132 searches through annotation meta data store 18 for the annotations that satisfy the criteria provided by the user. The annotations that satisfy those criteria then become part of the annotation list, and identifying information, such as the annotation titles or summaries, is provided to client 15 by annotation server 10.
A step 506 comprises displaying a view annotations dialog box 400 of Fig. 9 that contains the annotation identifying information from the annotation list generated in step 504. Steps 508 and 510 comprise receiving user input selecting various annotations from the identifying information displayed in step 506. Steps 508 and 510 repeat until the user has finished his or her selecting.
A step 512 comprises retrieving the selected annotations and corresponding media segments. ABE 132 in annotation server 10 of Fig. 3 is responsible for retrieving the selected annotations from stores 17 and 18.
A step 514 comprises presenting the selected annotations and corresponding media segments to the user in a seamless manner.
In the illustrated embodiment, both the selected annotations as well as the corresponding media segments are provided to the user. In one alternate embodiment, only the media segments corresponding to the annotations (and not the annotations themselves) are provided to the user. In another alternate embodiment only the annotations (and not the corresponding segments of the media stream) are provided to the user. In another embodiment, the annotations are downloaded to the client computer first, and the media segments are downloaded to the client computer later in an on-demand manner.
In the illustrated embodiment, annotation data is buffered in annotation server 10 of Fig. 1 for provision to client 15 and media stream data is buffered in media server 11 for provision to client 15. Sufficient buffering is provided to allow the annotation and media stream data to be provided to the client seamlessly. For example, when streaming two media segments to client 15, as the end of the first media segment draws near, media server 11 is working on obtaining and streaming the beginning of the second media segment to client 15. By doing so, there is little or no noticeable gap between the first and second media segments as presented to the user. Alternatively, rather than providing such buffering in the servers 10 and 11, additional buffering can be provided by client 15 to allow the seamless presentation of the data.

Claims

1. One or more computer-readable media containing a computer program for annotating streaming media, wherein the program performs steps comprising: creating annotations interactively with a user, wherein the annotations correspond to identified segments of one or more media streams; graphically ordering the annotations in a desired order of presentation in response to user input; and in response to a user instruction, sequentially presenting the annotations along with their corresponding identified media stream segments in the desired order of presentation.
2. One or more computer-readable media as recited in claim 1, wherein the annotations comprise textual annotations.
3. One or more computer-readable media as recited in claim 1, wherein the media streams comprise audio/visual video streams.
4. One or more computer-readable media as recited in claim 1, wherein: the annotations are textual annotations; the media streams are audio/visual video streams; and the presenting step comprises displaying the textual annotations in one display area while displaying the corresponding segments of the audio/visual streams in another display area.
5. One or more computer-readable media as recited in claim 1, the steps further comprising storing the annotations and their desired order of presentation.
6. One or more computer-readable media as recited in claim 1, the steps further comprising: storing the annotations and their desired order of presentation; and in response to a user request, retrieving the stored annotations and their desired order of presentation, displaying the retrieved annotations in their desired order of presentation, and retrieving and presenting the media stream segments identified by the retrieved annotations, in sequential order in accordance with the desired order of presentation of the retrieved annotations.
7. A method comprising: receiving an indication of a plurality of annotations selected by a user, wherein each of the plurality of annotations corresponds to a media stream of one or more media streams; and seamlessly providing one or more of: the plurality of annotations, and at least a portion of the media stream corresponding to each of the plurality of annotations.
8. A method as recited in claim 7, wherein the seamlessly providing comprises providing the plurality of annotations and the portions of the media streams corresponding to the plurality of annotations to a client computer for seamless presentation to a user.
9. A method as recited in claim 7, wherein each of the plurality of annotations corresponds to a segment of one of the one or more media streams, each segment being less than the entire stream.
10. A method as recited in claim 7, wherein the seamlessly providing comprises: seamlessly providing the plurality of annotations concurrently with seamlessly providing at least a portion of the media stream corresponding to each of the plurality of annotations.
11. A method as recited in claim 7, further comprising: presenting a plurality of annotation identifiers to the user; and wherein the seamlessly providing comprises seamlessly providing the one or more of the plurality of annotations and the portion of the media stream corresponding to each of the plurality of annotations in an order defined by the order of the plurality of annotation identifiers.
12. A method as recited in claim 11, further comprising: allowing the ordering of the plurality of annotation identifiers to be changed by the user.
13. A method as recited in claim 12, further comprising: allowing the user to change the order of the plurality of annotation identifiers in a drag and drop manner.
14. A method as recited in claim 7, further comprising: storing the at least a portion of the media stream corresponding to each of the plurality of annotations as a new media stream of the one or more media streams.
15. A method as recited in claim 7, wherein each of the plurality of annotations comprises one or more of audio data and text data.
16. A method as recited in claim 7, wherein each of the one or more media streams comprises audio and video data.
17. A computer-readable memory containing a computer program that is executable by a computer to perform the method recited in claim 7.
18. A system comprising: an annotation database that stores one or more collections of annotations, wherein each of the annotations identifies at least a segment of a media stream; and an annotation module to control storage and retrieval of the plurality of annotations, wherein the annotation module is configured to perform steps comprising: retrieving a particular collection of annotations from the annotation database; presenting the annotations of the retrieved collection to a user; and managing sequential presentation to the user of the media stream segments corresponding to the presented annotations.
19. A system as recited in claim 18, wherein the annotation module is further configured to perform a step of communicating with a client computer to provide indications of the plurality of annotations to the client computer for display to the user.
20. A system as recited in claim 19, wherein the indications of the plurality of annotations comprise summary information for each of the plurality of annotations.
21. A system as recited in claim 19, wherein each of the plurality of annotations corresponds to an annotation set, and wherein the annotation module is further configured to perform a step of providing the annotation set information to the client computer.
22. A system as recited in claim 18, wherein each of the media stream segments comprises audio and video data.
23. A system as recited in claim 18, wherein the annotation module is further configured to perform a step of saving information regarding the media stream segments as an additional new media stream.
24. A system as recited in claim 23, wherein the information regarding each of the media stream segments comprises an identifier of a media stream of which the media segment is a part, a temporal location in the media stream identifying where the media segment begins, and a temporal location in the media stream identifying where the media segment ends.
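The per-segment information of claim 24 amounts to a triple of (stream identifier, begin temporal location, end temporal location). A minimal sketch, assuming Python and hypothetical names (`SegmentRecord`, `save_as_new_stream`); the "additional new media stream" of claim 23 is represented here simply as an ordered, immutable sequence of such records rather than actual media data:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class SegmentRecord:
    """Information saved per media segment (claim 24)."""
    stream_id: str     # identifier of the media stream the segment is part of
    begin_time: float  # temporal location where the segment begins (seconds)
    end_time: float    # temporal location where the segment ends (seconds)

def save_as_new_stream(segments):
    """Save segment records as an additional new media stream (claim 23).

    Keeping only references (stream id plus begin/end times) means the new
    stream needs no copy of the underlying audio/video data.
    """
    return tuple(segments)
```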
25. A system as recited in claim 18, further comprising: a client computer, coupled to the annotation module, configured to receive the media stream segments and present the media stream segments to the user.
26. A system as recited in claim 25, further comprising: a media server, coupled to the annotation module, having access to a plurality of media streams, the media server configured to provide at least a portion of the plurality of media streams to the client computer as the media stream segments.
27. A system as recited in claim 18, wherein each of the plurality of annotation identifiers corresponds to a single media stream of the plurality of media streams.
28. One or more computer-readable storage media containing a program having instructions that are executable by a computer to perform steps comprising: configuring a first portion of a user interface to display a plurality of identifiers corresponding to a plurality of annotations, the plurality of identifiers corresponding to a playlist of media segments to be seamlessly presented to a user; and reordering the plurality of identifiers in accordance with user input to change the order in which the media segments are to be presented.
29. One or more computer-readable storage media as claimed in claim 28, the program having instructions that are executable by the computer to further perform a step comprising: receiving the media segments from a media server in an order determined by the playlist.
30. One or more computer-readable storage media as claimed in claim 28, the program having instructions that are executable by the computer to further perform steps comprising: receiving the media segments from a media server in an order determined by the playlist; and presenting the media segments at the user interface in the order determined by the playlist.
31. One or more computer-readable storage media as claimed in claim 28, the program having instructions that are executable by the computer to further perform a step comprising: allowing the user to reorder the plurality of identifiers in a drag and drop manner.
32. One or more computer-readable storage media as claimed in claim 28, the program having instructions that are executable by the computer to further perform a step comprising: configuring a second portion of the user interface to present the plurality of annotations concurrently with the media segments.
33. One or more computer-readable storage media as claimed in claim 28, wherein each of the media segments comprises audio and video data.
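The playlist reordering of claims 28 and 31 can likewise be illustrated with a small sketch (Python, hypothetical names; the drag-and-drop gesture is reduced to a single move operation on the list of annotation identifiers):

```python
def move_identifier(playlist, from_index, to_index):
    """Reorder the identifiers to change the order in which media
    segments are presented (claims 28 and 31): remove the dragged
    identifier and re-insert it at the drop position."""
    items = list(playlist)          # leave the caller's list untouched
    item = items.pop(from_index)    # pick up the dragged identifier
    items.insert(to_index, item)    # drop it at the target position
    return items
```

A user interface would call this with the indices reported by its drag-and-drop handler, then present segments in the new order.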
PCT/US1999/021391 1998-09-15 1999-09-15 Interactive playlist generation using annotations WO2000016221A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU59264/99A AU5926499A (en) 1998-09-15 1999-09-15 Interactive playlist generation using annotations

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US10045298P 1998-09-15 1998-09-15
US60/100,452 1998-09-15

Publications (2)

Publication Number Publication Date
WO2000016221A1 WO2000016221A1 (en) 2000-03-23
WO2000016221A9 true WO2000016221A9 (en) 2000-08-17

Family

ID=22279845

Family Applications (2)

Application Number Title Priority Date Filing Date
PCT/US1999/021391 WO2000016221A1 (en) 1998-09-15 1999-09-15 Interactive playlist generation using annotations
PCT/US1999/021344 WO2000016541A1 (en) 1998-09-15 1999-09-15 Annotation creation and notification via electronic mail

Family Applications After (1)

Application Number Title Priority Date Filing Date
PCT/US1999/021344 WO2000016541A1 (en) 1998-09-15 1999-09-15 Annotation creation and notification via electronic mail

Country Status (3)

Country Link
US (4) US7051275B2 (en)
AU (2) AU5926099A (en)
WO (2) WO2000016221A1 (en)

US7559026B2 (en) * 2003-06-20 2009-07-07 Apple Inc. Video conferencing system having focus control
US8321470B2 (en) * 2003-06-20 2012-11-27 International Business Machines Corporation Heterogeneous multi-level extendable indexing for general purpose annotation systems
US7397495B2 (en) * 2003-06-20 2008-07-08 Apple Inc. Video conferencing apparatus and method
US20040260714A1 (en) * 2003-06-20 2004-12-23 Avijit Chatterjee Universal annotation management system
US20040268225A1 (en) * 2003-06-26 2004-12-30 Walsh Raymond V. Method and system for controlling navigation of a graphical user interface
US7739588B2 (en) 2003-06-27 2010-06-15 Microsoft Corporation Leveraging markup language data for semantically labeling text strings and data and for providing actions based on semantically labeled text strings and data
US7743329B2 (en) * 2003-06-27 2010-06-22 Microsoft Corporation Incorporating interactive media into a playlist
US7444598B2 (en) 2003-06-30 2008-10-28 Microsoft Corporation Exploded views for providing rich regularized geometric transformations and interaction models on content for viewing, previewing, and interacting with documents, projects, and tasks
US8645471B2 (en) 2003-07-21 2014-02-04 Synchronoss Technologies, Inc. Device message management system
US7653693B2 (en) 2003-09-05 2010-01-26 Aol Llc Method and system for capturing instant messages
US20050027664A1 (en) * 2003-07-31 2005-02-03 Johnson David E. Interactive machine learning system for automated annotation of information in text
US20050038788A1 (en) * 2003-08-14 2005-02-17 International Business Machines Corporation Annotation security to prevent the divulgence of sensitive information
CA2535407A1 (en) 2003-08-15 2005-02-24 Blackboard Inc. Content system and associated methods
US8239400B2 (en) * 2003-08-21 2012-08-07 International Business Machines Corporation Annotation of query components
US7899843B2 (en) * 2003-09-19 2011-03-01 International Business Machines Corporation Expanding the scope of an annotation to an entity level
US20050068573A1 (en) * 2003-09-25 2005-03-31 Hart Peter E. Networked printing system having embedded functionality for printing time-based media
US7573593B2 (en) * 2003-09-25 2009-08-11 Ricoh Company, Ltd. Printer with hardware and software interfaces for media devices
JP2005108230A (en) * 2003-09-25 2005-04-21 Ricoh Co Ltd Printing system with embedded audio/video content recognition and processing function
US8077341B2 (en) 2003-09-25 2011-12-13 Ricoh Co., Ltd. Printer with audio or video receiver, recorder, and real-time content-based processing logic
US7505163B2 (en) * 2003-09-25 2009-03-17 Ricoh Co., Ltd. User interface for networked printer
US7570380B2 (en) * 2003-09-25 2009-08-04 Ricoh Company, Ltd. Printer user interface
US20050071746A1 (en) * 2003-09-25 2005-03-31 Hart Peter E. Networked printer with hardware and software interfaces for peripheral devices
US20050071763A1 (en) * 2003-09-25 2005-03-31 Hart Peter E. Stand alone multimedia printer capable of sharing media processing tasks
US7864352B2 (en) * 2003-09-25 2011-01-04 Ricoh Co. Ltd. Printer with multimedia server
US7528976B2 (en) * 2003-09-25 2009-05-05 Ricoh Co., Ltd. Stand alone printer with hardware/software interfaces for sharing multimedia processing
US7528977B2 (en) * 2003-09-25 2009-05-05 Ricoh Co., Ltd. Printer with hardware and software interfaces for peripheral devices
US7508535B2 (en) * 2003-09-25 2009-03-24 Ricoh Co., Ltd. Stand alone multimedia printer with user interface for allocating processing
US7440126B2 (en) * 2003-09-25 2008-10-21 Ricoh Co., Ltd. Printer with document-triggered processing
US7418656B1 (en) 2003-10-03 2008-08-26 Adobe Systems Incorporated Dynamic annotations for electronics documents
US7421741B2 (en) * 2003-10-20 2008-09-02 Phillips Ii Eugene B Securing digital content system and method
CN100555264C (en) * 2003-10-21 2009-10-28 国际商业机器公司 Method, device and system for annotating electronic documents
US7870152B2 (en) * 2003-10-22 2011-01-11 International Business Machines Corporation Attaching and displaying annotations to changing data views
US7634509B2 (en) * 2003-11-07 2009-12-15 Fusionone, Inc. Personal information space management system and method
US7900133B2 (en) 2003-12-09 2011-03-01 International Business Machines Corporation Annotation structure type determination
JP2005190088A (en) * 2003-12-25 2005-07-14 Matsushita Electric Ind Co Ltd E-mail processor and e-mail processing system
US8156444B1 (en) 2003-12-31 2012-04-10 Google Inc. Systems and methods for determining a user interface attribute
JP2005196504A (en) * 2004-01-07 2005-07-21 Shinichiro Yamamoto System capable of communication with tag by providing tag function to client computer
US8201079B2 (en) * 2004-01-15 2012-06-12 International Business Machines Corporation Maintaining annotations for distributed and versioned files
US7689578B2 (en) * 2004-01-15 2010-03-30 International Business Machines Corporation Dealing with annotation versioning through multiple versioning policies and management thereof
US7254593B2 (en) * 2004-01-16 2007-08-07 International Business Machines Corporation System and method for tracking annotations of data sources
US7707039B2 (en) 2004-02-15 2010-04-27 Exbiblio B.V. Automatic modification of web pages
US8442331B2 (en) 2004-02-15 2013-05-14 Google Inc. Capturing text from rendered documents using supplemental information
US7343552B2 (en) * 2004-02-12 2008-03-11 Fuji Xerox Co., Ltd. Systems and methods for freeform annotations
US7962846B2 (en) 2004-02-13 2011-06-14 Microsoft Corporation Organization of annotated clipping views
US10635723B2 (en) 2004-02-15 2020-04-28 Google Llc Search engines and systems with handheld document data capture devices
US7812860B2 (en) 2004-04-01 2010-10-12 Exbiblio B.V. Handheld device for capturing text from both a document printed on paper and a document displayed on a dynamic display device
US8069194B1 (en) 2004-03-04 2011-11-29 Ophivain Applications NY L.L.C. Automated annotation of a resource on a computer network using a network address of the resource
US7126816B2 (en) * 2004-03-12 2006-10-24 Apple Computer, Inc. Camera latch
US8595146B1 (en) 2004-03-15 2013-11-26 Aol Inc. Social networking permissions
WO2005089286A2 (en) 2004-03-15 2005-09-29 America Online, Inc. Sharing social network information
US20050210394A1 (en) * 2004-03-16 2005-09-22 Crandall Evan S Method for providing concurrent audio-video and audio instant messaging sessions
US7669117B2 (en) * 2004-03-18 2010-02-23 International Business Machines Corporation Method and system for creation and retrieval of global annotations
US20050210038A1 (en) * 2004-03-18 2005-09-22 International Business Machines Corporation Method for providing workflow functionality and tracking in an annotation subsystem
US8274666B2 (en) * 2004-03-30 2012-09-25 Ricoh Co., Ltd. Projector/printer for displaying or printing of documents
JP2005293239A (en) * 2004-03-31 2005-10-20 Fujitsu Ltd Information sharing device, and information sharing method
US20050229243A1 (en) * 2004-03-31 2005-10-13 Svendsen Hugh B Method and system for providing Web browsing through a firewall in a peer to peer network
US8595214B1 (en) 2004-03-31 2013-11-26 Google Inc. Systems and methods for article location and retrieval
US8234414B2 (en) * 2004-03-31 2012-07-31 Qurio Holdings, Inc. Proxy caching in a photosharing peer-to-peer network to improve guest image viewing performance
US8146156B2 (en) 2004-04-01 2012-03-27 Google Inc. Archive of text captures from rendered documents
US8081849B2 (en) 2004-12-03 2011-12-20 Google Inc. Portable scanning and memory device
WO2008028674A2 (en) 2006-09-08 2008-03-13 Exbiblio B.V. Optical scanners, such as hand-held optical scanners
US7990556B2 (en) 2004-12-03 2011-08-02 Google Inc. Association of a portable scanner with input/output and storage devices
US20060098900A1 (en) 2004-09-27 2006-05-11 King Martin T Secure data gathering from rendered documents
US9143638B2 (en) 2004-04-01 2015-09-22 Google Inc. Data capture from rendered documents using handheld device
US7894670B2 (en) 2004-04-01 2011-02-22 Exbiblio B.V. Triggering actions in response to optically or acoustically capturing keywords from a rendered document
US20060081714A1 (en) 2004-08-23 2006-04-20 King Martin T Portable scanning device
US9116890B2 (en) 2004-04-01 2015-08-25 Google Inc. Triggering actions in response to optically or acoustically capturing keywords from a rendered document
US9008447B2 (en) 2004-04-01 2015-04-14 Google Inc. Method and system for character recognition
US8713418B2 (en) 2004-04-12 2014-04-29 Google Inc. Adding value to a rendered document
WO2005101233A1 (en) * 2004-04-13 2005-10-27 Byte Size Systems Method and system for manipulating threaded annotations
US8489624B2 (en) 2004-05-17 2013-07-16 Google, Inc. Processing techniques for text capture from a rendered document
US8874504B2 (en) 2004-12-03 2014-10-28 Google Inc. Processing techniques for visual capture data from a rendered document
US8620083B2 (en) 2004-12-03 2013-12-31 Google Inc. Method and system for character recognition
US7496847B2 (en) * 2004-04-29 2009-02-24 International Business Machines Corporation Displaying a computer resource through a preferred browser
CN100541479C (en) * 2004-05-03 2009-09-16 Lg电子株式会社 Method and apparatus for managing bookmark information for content stored in a networked media server
US9542076B1 (en) 2004-05-12 2017-01-10 Synchronoss Technologies, Inc. System for and method of updating a personal profile
CN1998224A (en) 2004-05-12 2007-07-11 富盛旺公司 Advanced contact identification system
JP2006019934A (en) * 2004-06-30 2006-01-19 Kddi Corp Method for setting call of packet switched network
JP4517750B2 (en) * 2004-06-30 2010-08-04 富士ゼロックス株式会社 Document processing system
US8346620B2 (en) 2004-07-19 2013-01-01 Google Inc. Automatic modification of web pages
KR20060007131A (en) * 2004-07-19 2006-01-24 엘지전자 주식회사 Data processing method in data broacasting
US7502806B2 (en) * 2004-08-23 2009-03-10 Qurio Holdings, Inc. Method and system for providing image rich web pages from a computer system over a network
US20060047673A1 (en) * 2004-08-27 2006-03-02 Molander Mark E System and methods for controlling navigation area content via inline and dynamic search control
JP4081056B2 (en) * 2004-08-30 2008-04-23 株式会社東芝 Information processing apparatus, information processing method, and program
US8150926B2 (en) * 2004-09-09 2012-04-03 Microsoft Corporation Organizing electronic mail messages into conversations
US7853606B1 (en) * 2004-09-14 2010-12-14 Google, Inc. Alternate methods of displaying search results
US7719971B1 (en) 2004-09-15 2010-05-18 Qurio Holdings, Inc. Peer proxy binding
WO2006041832A2 (en) * 2004-10-05 2006-04-20 Vectormax Corporation Method and system for broadcasting multimedia data
US7752539B2 (en) * 2004-10-27 2010-07-06 Nokia Corporation Receiving and sending content items associated with a multimedia file
US7698386B2 (en) * 2004-11-16 2010-04-13 Qurio Holdings, Inc. Serving content from an off-line peer server in a photosharing peer-to-peer network in response to a guest request
EP1659795A3 (en) * 2004-11-23 2009-02-25 Palo Alto Research Center Incorporated Methods, apparatus and program products for presenting supplemental content with recorded content
US20060212472A1 (en) * 2004-11-30 2006-09-21 Wavenetworx Inc. Method and system for creating a rich media content portal using third-party commercial portal application software
US7730143B1 (en) 2004-12-01 2010-06-01 Aol Inc. Prohibiting mobile forwarding
US9002949B2 (en) 2004-12-01 2015-04-07 Google Inc. Automatically enabling the forwarding of instant messages
US8060566B2 (en) 2004-12-01 2011-11-15 Aol Inc. Automatically enabling the forwarding of instant messages
US7689655B2 (en) * 2004-12-06 2010-03-30 Aol Inc. Managing and collaborating with digital content using a dynamic user interface
US8230326B2 (en) * 2004-12-17 2012-07-24 International Business Machines Corporation Method for associating annotations with document families
US9652809B1 (en) 2004-12-21 2017-05-16 Aol Inc. Using user profile information to determine an avatar and/or avatar characteristics
US9021456B2 (en) * 2004-12-22 2015-04-28 International Business Machines Corporation Using collaborative annotations to specify real-time process flows and system constraints
US7865815B2 (en) * 2004-12-28 2011-01-04 International Business Machines Corporation Integration and presentation of current and historic versions of document and annotations thereon
US20060143157A1 (en) * 2004-12-29 2006-06-29 America Online, Inc. Updating organizational information by parsing text files
US7272592B2 (en) * 2004-12-30 2007-09-18 Microsoft Corporation Updating metadata stored in a read-only media file
US20060161838A1 (en) * 2005-01-14 2006-07-20 Ronald Nydam Review of signature based content
US9275052B2 (en) * 2005-01-19 2016-03-01 Amazon Technologies, Inc. Providing annotations of a digital work
WO2006077535A1 (en) * 2005-01-20 2006-07-27 Koninklijke Philips Electronics N.V. Multimedia presentation creation
US20060167956A1 (en) * 2005-01-27 2006-07-27 Realnetworks, Inc. Media content transfer method and apparatus (aka shadow cache)
US20060268121A1 (en) * 2005-02-20 2006-11-30 Nucore Technology Inc. In-camera cinema director
US20060187331A1 (en) * 2005-02-20 2006-08-24 Nucore Technology, Inc. Digital camera having electronic visual jockey capability
US7952535B2 (en) * 2005-02-20 2011-05-31 Mediatek Singapore Pte Ltd Electronic visual jockey file
US20060200517A1 (en) * 2005-03-03 2006-09-07 Steve Nelson Method and apparatus for real time multi-party conference document copier
US20070220102A1 (en) * 2005-03-11 2007-09-20 Sam Bogoch Digital asset management collaboration system
US20060212509A1 (en) * 2005-03-21 2006-09-21 International Business Machines Corporation Profile driven method for enabling annotation of World Wide Web resources
US7546524B1 (en) 2005-03-30 2009-06-09 Amazon Technologies, Inc. Electronic input device, system, and method using human-comprehensible content to automatically correlate an annotation of a paper document with a digital version of the document
US8234679B2 (en) 2005-04-01 2012-07-31 Time Warner Cable, Inc. Technique for selecting multiple entertainment programs to be provided over a communication network
US7734631B2 (en) * 2005-04-25 2010-06-08 Microsoft Corporation Associating information with an electronic document
TWI297863B (en) * 2005-05-09 2008-06-11 Compal Electronics Inc Method for inserting a picture in a video frame
US7606580B2 (en) 2005-05-11 2009-10-20 Aol Llc Personalized location information for mobile devices
US7765265B1 (en) 2005-05-11 2010-07-27 Aol Inc. Identifying users sharing common characteristics
US8843564B2 (en) * 2005-05-13 2014-09-23 Blackberry Limited System and method of automatically determining whether or not to include message text of an original electronic message in a reply electronic message
US7636883B2 (en) 2005-05-18 2009-12-22 International Business Machines Corporation User form based automated and guided data collection
EP1889169A4 (en) * 2005-05-19 2011-12-28 Fusionone Inc Mobile device address book builder
US7809127B2 (en) * 2005-05-26 2010-10-05 Avaya Inc. Method for discovering problem agent behaviors
US7899868B1 (en) 2005-06-22 2011-03-01 Emc Corporation Method and apparatus for defining messaging identities
US8244811B1 (en) * 2005-06-22 2012-08-14 Emc Corporation Method and apparatus for searching messaging identities
JP4696721B2 (en) * 2005-06-27 2011-06-08 富士ゼロックス株式会社 Document management server, document management system
US20100017885A1 (en) * 2005-07-01 2010-01-21 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Media markup identifier for alterable promotional segments
US20080086380A1 (en) * 2005-07-01 2008-04-10 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Alteration of promotional content in media works
US9426387B2 (en) 2005-07-01 2016-08-23 Invention Science Fund I, Llc Image anonymization
US20090037278A1 (en) * 2005-07-01 2009-02-05 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Implementing visual substitution options in media works
US9583141B2 (en) 2005-07-01 2017-02-28 Invention Science Fund I, Llc Implementing audio substitution options in media works
US20080052104A1 (en) * 2005-07-01 2008-02-28 Searete Llc Group content substitution in media works
US8910033B2 (en) * 2005-07-01 2014-12-09 The Invention Science Fund I, Llc Implementing group content substitution in media works
US9230601B2 (en) 2005-07-01 2016-01-05 Invention Science Fund I, Llc Media markup system for content alteration in derivative works
US7860342B2 (en) 2005-07-01 2010-12-28 The Invention Science Fund I, Llc Modifying restricted images
US20070294720A1 (en) * 2005-07-01 2007-12-20 Searete Llc Promotional placement in media works
US8732087B2 (en) 2005-07-01 2014-05-20 The Invention Science Fund I, Llc Authorization for media content alteration
US8203609B2 (en) 2007-01-31 2012-06-19 The Invention Science Fund I, Llc Anonymization pursuant to a broadcasted policy
US20070005423A1 (en) * 2005-07-01 2007-01-04 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Providing promotional content
US20090150444A1 (en) * 2005-07-01 2009-06-11 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Media markup for audio content alteration
US20080013859A1 (en) * 2005-07-01 2008-01-17 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Implementation of media content alteration
US20090150199A1 (en) * 2005-07-01 2009-06-11 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Visual substitution options in media works
US20100154065A1 (en) * 2005-07-01 2010-06-17 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Media markup for user-activated content alteration
US9092928B2 (en) 2005-07-01 2015-07-28 The Invention Science Fund I, Llc Implementing group content substitution in media works
US20090037243A1 (en) * 2005-07-01 2009-02-05 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Audio substitution options in media works
US20090210946A1 (en) * 2005-07-01 2009-08-20 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Media markup for promotional audio content
US20070263865A1 (en) * 2005-07-01 2007-11-15 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Authorization rights for substitute media content
US20090151004A1 (en) * 2005-07-01 2009-06-11 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Media markup for visual content alteration
US20090300480A1 (en) * 2005-07-01 2009-12-03 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Media segment alteration with embedded markup identifier
US20090204475A1 (en) * 2005-07-01 2009-08-13 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Media markup for promotional visual content
US20090235364A1 (en) * 2005-07-01 2009-09-17 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Media markup for promotional content alteration
US20080028422A1 (en) * 2005-07-01 2008-01-31 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Implementation of media content alteration
US20080010083A1 (en) * 2005-07-01 2008-01-10 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Approval technique for media content alteration
US20080052161A1 (en) * 2005-07-01 2008-02-28 Searete Llc Alteration of promotional content in media works
US20070266049A1 (en) * 2005-07-01 2007-11-15 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Implementation of media content alteration
US9065979B2 (en) 2005-07-01 2015-06-23 The Invention Science Fund I, Llc Promotional placement in media works
US8688801B2 (en) * 2005-07-25 2014-04-01 Qurio Holdings, Inc. Syndication feeds for peer computer devices and peer networks
JP4687324B2 (en) * 2005-08-18 2011-05-25 富士ゼロックス株式会社 Information processing apparatus and association method
US7779347B2 (en) * 2005-09-02 2010-08-17 Fourteen40, Inc. Systems and methods for collaboratively annotating electronic documents
US8015482B2 (en) * 2005-09-14 2011-09-06 Microsoft Corporation Dynamic anchoring of annotations to editable content
US7992085B2 (en) 2005-09-26 2011-08-02 Microsoft Corporation Lightweight reference user interface
US7788590B2 (en) 2005-09-26 2010-08-31 Microsoft Corporation Lightweight reference user interface
US8005889B1 (en) 2005-11-16 2011-08-23 Qurio Holdings, Inc. Systems, methods, and computer program products for synchronizing files in a photosharing peer-to-peer network
KR100703705B1 (en) * 2005-11-18 2007-04-06 삼성전자주식회사 Apparatus and method for processing multimedia comments for movies
US20070130323A1 (en) * 2005-12-02 2007-06-07 Landsman Richard A Implied presence detection in a communication system
US9384178B2 (en) * 2005-12-09 2016-07-05 Adobe Systems Incorporated Review of signature based content
WO2007068091A1 (en) * 2005-12-12 2007-06-21 Audiokinetic Inc. Method and system for multi-version digital authoring
US8788572B1 (en) 2005-12-27 2014-07-22 Qurio Holdings, Inc. Caching proxy server for a peer-to-peer photosharing system
US20070168436A1 (en) * 2006-01-19 2007-07-19 Worldvuer, Inc. System and method for supplying electronic messages
US7836390B2 (en) * 2006-01-26 2010-11-16 Microsoft Corporation Strategies for processing annotations
US20070212507A1 (en) * 2006-03-13 2007-09-13 Arst Kevin M Document Flagging And Indexing System
WO2008026074A2 (en) * 2006-03-14 2008-03-06 Technische Universitat Darmstadt Distributed interactive augmentation of display output
US20070226606A1 (en) * 2006-03-27 2007-09-27 Peter Noyes Method of processing annotations using filter conditions to accentuate the visual representations of a subset of annotations
JP5649303B2 (en) * 2006-03-30 2015-01-07 エスアールアイ インターナショナルSRI International Method and apparatus for annotating media streams
US7954049B2 (en) 2006-05-15 2011-05-31 Microsoft Corporation Annotating multimedia files along a timeline
US20070276852A1 (en) * 2006-05-25 2007-11-29 Microsoft Corporation Downloading portions of media files
JP4946189B2 (en) * 2006-06-13 2012-06-06 富士ゼロックス株式会社 Annotation information distribution program and annotation information distribution apparatus
US20070294297A1 (en) * 2006-06-19 2007-12-20 Lawrence Kesteloot Structured playlists and user interface
JP4876734B2 (en) * 2006-06-22 2012-02-15 富士ゼロックス株式会社 Document use management system and method, document management server and program thereof
US8280014B1 (en) * 2006-06-27 2012-10-02 VoiceCaptionIt, Inc. System and method for associating audio clips with objects
US8640023B1 (en) * 2006-07-10 2014-01-28 Oracle America, Inc. Method and system for providing HTML page annotations using AJAX and JAVA enterprise edition
US7769144B2 (en) * 2006-07-21 2010-08-03 Google Inc. Method and system for generating and presenting conversation threads having email, voicemail and chat messages
US8121263B2 (en) 2006-07-21 2012-02-21 Google Inc. Method and system for integrating voicemail and electronic messaging
US20080046925A1 (en) * 2006-08-17 2008-02-21 Microsoft Corporation Temporal and spatial in-video marking, indexing, and searching
US7693906B1 (en) 2006-08-22 2010-04-06 Qurio Holdings, Inc. Methods, systems, and products for tagging files
US7739255B2 (en) * 2006-09-01 2010-06-15 Ma Capital Lllp System for and method of visual representation and review of media files
JP2008078690A (en) * 2006-09-19 2008-04-03 Fuji Xerox Co Ltd Image processing system
US8554827B2 (en) * 2006-09-29 2013-10-08 Qurio Holdings, Inc. Virtual peer for a content sharing system
US7782866B1 (en) 2006-09-29 2010-08-24 Qurio Holdings, Inc. Virtual peer in a peer-to-peer network
AU2007306939B2 (en) 2006-10-11 2012-06-07 Tagmotion Pty Limited Method and apparatus for managing multimedia files
US20080109305A1 (en) * 2006-11-08 2008-05-08 Ma Capital Lllp Using internet advertising as a test bed for radio advertisements
US20080109409A1 (en) * 2006-11-08 2008-05-08 Ma Capital Lllp Brokering keywords in radio broadcasts
US20080109845A1 (en) * 2006-11-08 2008-05-08 Ma Capital Lllp System and method for generating advertisements for use in broadcast media
US7979388B2 (en) * 2006-11-17 2011-07-12 Microsoft Corporation Deriving hierarchical organization from a set of tagged digital objects
US8176191B2 (en) * 2006-11-30 2012-05-08 Red Hat, Inc. Automated identification of high/low value content based on social feedback
US20080155627A1 (en) * 2006-12-04 2008-06-26 O'connor Daniel Systems and methods of searching for and presenting video and audio
JP5003131B2 (en) * 2006-12-04 2012-08-15 富士ゼロックス株式会社 Document providing system and information providing program
WO2008071992A2 (en) * 2006-12-15 2008-06-19 Duncan Hugh Barclay Improvements to a communications system
US8161387B1 (en) 2006-12-18 2012-04-17 At&T Intellectual Property I, L.P. Creation of a marked media module
US8082504B1 (en) 2006-12-18 2011-12-20 At&T Intellectual Property I, L.P. Creation of a reference point to mark a media presentation
US7559017B2 (en) * 2006-12-22 2009-07-07 Google Inc. Annotation framework for video
JP4305510B2 (en) * 2006-12-28 2009-07-29 富士ゼロックス株式会社 Information processing system, information processing apparatus, and program
JP5082460B2 (en) * 2007-01-19 2012-11-28 富士ゼロックス株式会社 Information processing apparatus, program, and information processing system
JP5023715B2 (en) * 2007-01-25 2012-09-12 富士ゼロックス株式会社 Information processing system, information processing apparatus, and program
US20080180539A1 (en) * 2007-01-31 2008-07-31 Searete Llc, A Limited Liability Corporation Image anonymization
US8768744B2 (en) 2007-02-02 2014-07-01 Motorola Mobility Llc Method and apparatus for automated user review of media content in a mobile communication device
US7739304B2 (en) * 2007-02-08 2010-06-15 Yahoo! Inc. Context-based community-driven suggestions for media annotation
US20080201632A1 (en) * 2007-02-16 2008-08-21 Palo Alto Research Center Incorporated System and method for annotating documents
US8438214B2 (en) 2007-02-23 2013-05-07 Nokia Corporation Method, electronic device, computer program product, system and apparatus for sharing a media object
US20080263103A1 (en) 2007-03-02 2008-10-23 Mcgregor Lucas Digital asset management system (DAMS)
US8583637B2 (en) * 2007-03-21 2013-11-12 Ricoh Co., Ltd. Coarse-to-fine navigation through paginated documents retrieved by a text search engine
US8584042B2 (en) * 2007-03-21 2013-11-12 Ricoh Co., Ltd. Methods for scanning, printing, and copying multimedia thumbnails
US8812969B2 (en) * 2007-03-21 2014-08-19 Ricoh Co., Ltd. Methods for authoring and interacting with multimedia representations of documents
US20080235564A1 (en) * 2007-03-21 2008-09-25 Ricoh Co., Ltd. Methods for converting electronic content descriptions
US20080244755A1 (en) * 2007-03-30 2008-10-02 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Authorization for media content alteration
JP2008257317A (en) * 2007-04-02 2008-10-23 Fuji Xerox Co Ltd Information processing apparatus, information processing system and program
US9501803B2 (en) * 2007-04-12 2016-11-22 Siemens Industry, Inc. Devices, systems, and methods for monitoring energy systems
US20080270161A1 (en) * 2007-04-26 2008-10-30 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Authorization rights for substitute media content
US9215512B2 (en) 2007-04-27 2015-12-15 Invention Science Fund I, Llc Implementation of media content alteration
US20080276159A1 (en) * 2007-05-01 2008-11-06 International Business Machines Corporation Creating Annotated Recordings and Transcripts of Presentations Using a Mobile Device
US8880529B2 (en) 2007-05-15 2014-11-04 Tivo Inc. Hierarchical tags with community-based ratings
AU2008254894C1 (en) 2007-05-15 2014-10-02 Tivo Solutions Inc. Multimedia content search and recording scheduling system
US7937663B2 (en) * 2007-06-29 2011-05-03 Microsoft Corporation Integrated collaborative user interface for a document editor program
US8091103B2 (en) * 2007-07-22 2012-01-03 Overlay.Tv Inc. Server providing content directories of video signals and linkage to content information sources
JP2009042856A (en) * 2007-08-07 2009-02-26 Fuji Xerox Co Ltd Document management device, document management system, and program
CN101364979B (en) * 2007-08-10 2011-12-21 鸿富锦精密工业(深圳)有限公司 Downloaded material parsing and processing system and method
US20090049374A1 (en) * 2007-08-16 2009-02-19 Andrew Echenberg Online magazine
KR100920795B1 (en) * 2007-08-28 2009-10-08 김정태 System for tracing copyright of contents and control method thereof
US20090064005A1 (en) * 2007-08-29 2009-03-05 Yahoo! Inc. In-place upload and editing application for editing media assets
JP5119840B2 (en) * 2007-10-02 2013-01-16 富士ゼロックス株式会社 Information processing apparatus, information processing system, and program
US20090113281A1 (en) * 2007-10-31 2009-04-30 Tyler Close Identifying And Displaying Tags From Identifiers In Privately Stored Messages
US20090119332A1 (en) * 2007-11-01 2009-05-07 Lection David B Method And System For Providing A Media Transition Having A Temporal Link To Presentable Media Available From A Remote Content Provider
US20090132924A1 (en) * 2007-11-15 2009-05-21 Yojak Harshad Vasa System and method to create highlight portions of media content
US20090157827A1 (en) * 2007-12-14 2009-06-18 Srinivas Bangalore System and method for generating response email templates
US8340492B2 (en) * 2007-12-17 2012-12-25 General Instrument Corporation Method and system for sharing annotations in a communication network
US20090164572A1 (en) * 2007-12-20 2009-06-25 Motorola, Inc. Apparatus and method for content item annotation
US20090172714A1 (en) * 2007-12-28 2009-07-02 Harel Gruia Method and apparatus for collecting metadata during session recording
US20100241947A1 (en) * 2007-12-31 2010-09-23 Michael Dahn Advanced features, service and displays of legal and regulatory information
US8181111B1 (en) 2007-12-31 2012-05-15 Synchronoss Technologies, Inc. System and method for providing social context to digital activity
US8756101B2 (en) 2008-01-25 2014-06-17 Tunein, Inc. User and stream demographics metadata guide based content services
US8595373B2 (en) * 2008-01-25 2013-11-26 TuneIn, Inc. Guide based content services
WO2009095086A1 (en) 2008-01-29 2009-08-06 Nokia Siemens Networks Oy Method and device for providing content information and system comprising such device
US8181197B2 (en) 2008-02-06 2012-05-15 Google Inc. System and method for voting on popular video intervals
EP2091047B1 (en) * 2008-02-14 2012-11-14 ORT Medienverbund GmbH Method for processing a video
US20090210778A1 (en) * 2008-02-19 2009-08-20 Kulas Charles J Video linking to electronic text messaging
US8112702B2 (en) 2008-02-19 2012-02-07 Google Inc. Annotating video intervals
US8793256B2 (en) 2008-03-26 2014-07-29 Tout Industries, Inc. Method and apparatus for selecting related content for display in conjunction with a media
US8843552B2 (en) * 2008-04-21 2014-09-23 Syngrafii Inc. System, method and computer program for conducting transactions remotely
US20090300517A1 (en) * 2008-05-31 2009-12-03 International Business Machines Corporation Providing user control of historical messages in electronic mail chain to be included in forwarded or replied electronic mail message
US8566353B2 (en) * 2008-06-03 2013-10-22 Google Inc. Web-based system for collaborative generation of interactive videos
US20090307199A1 (en) * 2008-06-10 2009-12-10 Goodwin James P Method and apparatus for generating voice annotations for playlists of digital media
US8892553B2 (en) * 2008-06-18 2014-11-18 Microsoft Corporation Auto-generation of events with annotation and indexing
US20100235379A1 (en) * 2008-06-19 2010-09-16 Milan Blair Reichbach Web-based multimedia annotation system
GB2461771A (en) * 2008-07-11 2010-01-20 Icyte Pty Ltd Annotation of electronic documents with preservation of document as originally annotated
US20100017694A1 (en) * 2008-07-18 2010-01-21 Electronic Data Systems Corporation Apparatus, and associated method, for creating and annotating content
US8990848B2 (en) 2008-07-22 2015-03-24 At&T Intellectual Property I, L.P. System and method for temporally adaptive media playback
US7996422B2 (en) 2008-07-22 2011-08-09 At&T Intellectual Property I, L.P. System and method for adaptive media playback based on destination
US10127231B2 (en) * 2008-07-22 2018-11-13 At&T Intellectual Property I, L.P. System and method for rich media annotation
US8484297B2 (en) * 2008-07-31 2013-07-09 Palo Alto Research Center Incorporated Method for collaboratively tagging and highlighting electronic documents
US20100036856A1 (en) 2008-08-05 2010-02-11 International Business Machines Corporation Method and system of tagging email and providing tag clouds
US8548503B2 (en) 2008-08-28 2013-10-01 Aol Inc. Methods and system for providing location-based communication services
US8510664B2 (en) * 2008-09-06 2013-08-13 Steven L. Rueben Method and system for displaying email thread information
US8751559B2 (en) * 2008-09-16 2014-06-10 Microsoft Corporation Balanced routing of questions to experts
JP5051080B2 (en) * 2008-09-25 2012-10-17 富士通株式会社 Information display device, information display method, and program
US20100153848A1 (en) * 2008-10-09 2010-06-17 Pinaki Saha Integrated branding, social bookmarking, and aggregation system for media content
US8548946B2 (en) * 2008-10-14 2013-10-01 Microsoft Corporation Content package for electronic distribution
US8589502B2 (en) * 2008-12-31 2013-11-19 International Business Machines Corporation System and method for allowing access to content
DE202010018601U1 (en) 2009-02-18 2018-04-30 Google LLC (n.d.Ges.d. Staates Delaware) Automatically collecting information, such as gathering information using a document recognizing device
US9195739B2 (en) * 2009-02-20 2015-11-24 Microsoft Technology Licensing, Llc Identifying a discussion topic based on user interest information
US8447066B2 (en) 2009-03-12 2013-05-21 Google Inc. Performing actions based on capturing information from rendered documents, such as documents under copyright
CN102349087B (en) 2009-03-12 2015-05-06 谷歌公司 Automatically providing content associated with captured information, such as information captured in real-time
US8826117B1 (en) 2009-03-25 2014-09-02 Google Inc. Web-based system for video editing
US8973153B2 (en) * 2009-03-30 2015-03-03 International Business Machines Corporation Creating audio-based annotations for audiobooks
US8132200B1 (en) 2009-03-30 2012-03-06 Google Inc. Intra-video ratings
US8554848B2 (en) * 2009-04-16 2013-10-08 At&T Intellectual Property I, L.P. Collective asynchronous media review
US20100325557A1 (en) * 2009-06-17 2010-12-23 Agostino Sibillo Annotation of aggregated content, systems and methods
US8276077B2 (en) 2009-07-10 2012-09-25 The Mcgraw-Hill Companies, Inc. Method and apparatus for automatic annotation of recorded presentations
US20110010397A1 (en) * 2009-07-13 2011-01-13 Prateek Kathpal Managing annotations decoupled from local or remote sources
CN101997845A (en) * 2009-08-12 2011-03-30 英业达股份有限公司 Release system and method of extended data
US20110066945A1 (en) * 2009-09-14 2011-03-17 Yuk-Shan Lee Video Greeting Cards
US20110087764A1 (en) * 2009-10-14 2011-04-14 Dror Yaffe Engine for generating and managing communications concerning on-line collaboration among a plurality of users in consideration with a computerized application
US9124642B2 (en) * 2009-10-16 2015-09-01 Qualcomm Incorporated Adaptively streaming multimedia
KR20110047768A (en) 2009-10-30 2011-05-09 삼성전자주식회사 Apparatus and method for displaying multimedia contents
US8438131B2 (en) * 2009-11-06 2013-05-07 Altus365, Inc. Synchronization of media resources in a media archive
US8255006B1 (en) 2009-11-10 2012-08-28 Fusionone, Inc. Event dependent notification system and method
US8373741B2 (en) * 2009-11-20 2013-02-12 At&T Intellectual Property I, Lp Apparatus and method for collaborative network in an enterprise setting
US20110125560A1 (en) * 2009-11-25 2011-05-26 Altus Learning Systems, Inc. Augmenting a synchronized media archive with additional media resources
US9081799B2 (en) 2009-12-04 2015-07-14 Google Inc. Using gestalt information to identify locations in printed information
US9323784B2 (en) 2009-12-09 2016-04-26 Google Inc. Image search using text-based elements within the contents of images
US9760868B2 (en) * 2009-12-15 2017-09-12 International Business Machines Corporation Electronic document annotation
US9836482B2 (en) * 2009-12-29 2017-12-05 Google Inc. Query categorization based on image results
US20110258526A1 (en) * 2010-04-20 2011-10-20 International Business Machines Corporation Web content annotation management web browser plug-in
US8521822B2 (en) * 2010-07-23 2013-08-27 Blackberry Limited Recipient change notification
US9699503B2 (en) 2010-09-07 2017-07-04 Opentv, Inc. Smart playlist
US10210160B2 (en) 2010-09-07 2019-02-19 Opentv, Inc. Collecting data from different sources
US8949871B2 (en) 2010-09-08 2015-02-03 Opentv, Inc. Smart media selection based on viewer user presence
KR20120026762A (en) * 2010-09-10 2012-03-20 삼성전자주식회사 User terminal apparatus, server and display method, information providing method thereof
US8626358B2 (en) * 2010-09-29 2014-01-07 Honeywell International Inc. Automatic presentation of a shortcut prompt to view a downlink request message responsive to a confirm-response message
US8943428B2 (en) 2010-11-01 2015-01-27 Synchronoss Technologies, Inc. System for and method of field mapping
US9189818B2 (en) 2010-12-10 2015-11-17 Quib, Inc. Association of comments with screen locations during media content playback
US8918447B2 (en) * 2010-12-10 2014-12-23 Sap Se Methods, apparatus, systems and computer readable mediums for use in sharing information between entities
US20120179755A1 (en) * 2010-12-27 2012-07-12 Fishkin James S Deliberative Polling Incorporating Ratings By A Random Sample
US20130334300A1 (en) 2011-01-03 2013-12-19 Curt Evans Text-synchronized media utilization and manipulation based on an embedded barcode
JP5742378B2 (en) * 2011-03-30 2015-07-01 ソニー株式会社 Information processing apparatus, playlist generation method, and playlist generation program
US9317861B2 (en) * 2011-03-30 2016-04-19 Information Resources, Inc. View-independent annotation of commercial data
WO2013016596A2 (en) * 2011-07-28 2013-01-31 Scrawl, Inc. System for annotating documents served by a document system without functional dependence on the document system
US20130031457A1 (en) * 2011-07-28 2013-01-31 Peter Griffiths System for Creating and Editing Temporal Annotations of Documents
US20130031455A1 (en) 2011-07-28 2013-01-31 Peter Griffiths System for Linking to Documents with Associated Annotations
US10079039B2 (en) * 2011-09-26 2018-09-18 The University Of North Carolina At Charlotte Multi-modal collaborative web-based video annotation system
US9483454B2 (en) * 2011-10-07 2016-11-01 D2L Corporation Systems and methods for context specific annotation of electronic files
US9077779B2 (en) * 2011-10-28 2015-07-07 Cinemo Gmbh Client device, method and computer program for playing media content
EP2605469A1 (en) * 2011-12-13 2013-06-19 Thomson Licensing Method and apparatus to control a multipath adaptive streaming session
US9245020B2 (en) * 2011-12-14 2016-01-26 Microsoft Technology Licensing, Llc Collaborative media sharing
US8805418B2 (en) 2011-12-23 2014-08-12 United Video Properties, Inc. Methods and systems for performing actions based on location-based rules
US9628296B2 (en) 2011-12-28 2017-04-18 Evernote Corporation Fast mobile mail with context indicators
WO2013128061A1 (en) * 2012-02-27 2013-09-06 Nokia Corporation Media tagging
US9846696B2 (en) 2012-02-29 2017-12-19 Telefonaktiebolaget Lm Ericsson (Publ) Apparatus and methods for indexing multimedia content
US9225936B2 (en) * 2012-05-16 2015-12-29 International Business Machines Corporation Automated collaborative annotation of converged web conference objects
US8917908B2 (en) * 2012-07-12 2014-12-23 Palo Alto Research Center Incorporated Distributed object tracking for augmented reality application
US8522130B1 (en) * 2012-07-12 2013-08-27 Chegg, Inc. Creating notes in a multilayered HTML document
US9633015B2 (en) 2012-07-26 2017-04-25 Telefonaktiebolaget Lm Ericsson (Publ) Apparatus and methods for user generated content indexing
US20140164901A1 (en) * 2012-07-26 2014-06-12 Tagaboom, Inc. Method and apparatus for annotating and sharing a digital object with multiple other digital objects
US20140143218A1 (en) * 2012-11-20 2014-05-22 Apple Inc. Method for Crowd Sourced Multimedia Captioning for Video Content
US9678617B2 (en) 2013-01-14 2017-06-13 Patrick Soon-Shiong Shared real-time content editing activated by an image
US8935734B2 (en) 2013-02-01 2015-01-13 Ebay Inc. Methods, systems and apparatus for configuring a system of content access devices
US9002837B2 (en) * 2013-03-15 2015-04-07 Ipar, Llc Systems and methods for providing expert thread search results
US10341275B2 (en) * 2013-04-03 2019-07-02 Dropbox, Inc. Shared content item commenting
US10445367B2 (en) 2013-05-14 2019-10-15 Telefonaktiebolaget Lm Ericsson (Publ) Search engine for textual content and non-textual content
US10891428B2 (en) * 2013-07-25 2021-01-12 Autodesk, Inc. Adapting video annotations to playback speed
EP3039811B1 (en) 2013-08-29 2021-05-05 Telefonaktiebolaget LM Ericsson (publ) Method, content owner device, computer program, and computer program product for distributing content items to authorized users
KR102147935B1 (en) * 2013-08-29 2020-08-25 삼성전자주식회사 Method for processing data and an electronic device thereof
US10311038B2 (en) 2013-08-29 2019-06-04 Telefonaktiebolaget Lm Ericsson (Publ) Methods, computer program, computer program product and indexing systems for indexing or updating index
TWI651640B (en) * 2013-10-16 2019-02-21 3M Innovative Properties Company Organizing digital notes on a user interface
US9898451B2 (en) 2013-11-26 2018-02-20 Adobe Systems Incorporated Content adaptation based on selected reviewer comment
US20150161087A1 (en) 2013-12-09 2015-06-11 Justin Khoo System and method for dynamic imagery link synchronization and simulating rendering and behavior of content across a multi-client platform
WO2015171835A1 (en) 2014-05-06 2015-11-12 Tivo Inc. Cloud-based media content management
US10019517B2 (en) 2014-05-06 2018-07-10 Tivo Solutions Inc. Managing media content upload groups
US9553843B1 (en) 2014-10-08 2017-01-24 Google Inc. Service directory profile for a fabric network
JP6083712B2 (en) 2014-10-15 2017-02-22 International Business Machines Corporation Apparatus and method for supporting message sharing
US10140379B2 (en) 2014-10-27 2018-11-27 Chegg, Inc. Automated lecture deconstruction
US9753921B1 (en) * 2015-03-05 2017-09-05 Dropbox, Inc. Comment management in shared documents
US20160294891A1 (en) 2015-03-31 2016-10-06 Facebook, Inc. Multi-user media presentation system
US10606941B2 (en) * 2015-08-10 2020-03-31 Open Text Holdings, Inc. Annotating documents on a mobile device
US10185468B2 (en) 2015-09-23 2019-01-22 Microsoft Technology Licensing, Llc Animation editor
US10545624B2 (en) 2016-03-21 2020-01-28 Microsoft Technology Licensing, Llc User interfaces for personalized content recommendation
US10282402B2 (en) 2017-01-06 2019-05-07 Justin Khoo System and method of proofing email content
US20190066051A1 (en) * 2017-08-24 2019-02-28 Moxtra Inc. Message thread workflow
WO2019060898A1 (en) * 2017-09-25 2019-03-28 Dash Radio Inc. Method and system for selecting different versions of electronic media compositions in real time
US10901687B2 (en) * 2018-02-27 2021-01-26 Dish Network L.L.C. Apparatus, systems and methods for presenting content reviews in a virtual world
GB201804383D0 (en) 2018-03-19 2018-05-02 Microsoft Technology Licensing Llc Multi-endpoint mixed reality meetings
US11102316B1 (en) 2018-03-21 2021-08-24 Justin Khoo System and method for tracking interactions in an email
US10853514B2 (en) 2018-05-10 2020-12-01 Dell Products, L.P. System and method to manage versioning and modifications of content in a centralized content handling system
US11538045B2 (en) 2018-09-28 2022-12-27 Dish Network L.L.C. Apparatus, systems and methods for determining a commentary rating
US10681402B2 (en) 2018-10-09 2020-06-09 International Business Machines Corporation Providing relevant and authentic channel content to users based on user persona and interest
US11527329B2 (en) 2020-07-28 2022-12-13 Xifin, Inc. Automatically determining a medical recommendation for a patient based on multiple medical images from multiple different medical imaging modalities

Family Cites Families (38)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4649499A (en) 1984-03-07 1987-03-10 Hewlett-Packard Company Touchscreen two-dimensional emulation of three-dimensional objects
AU2868092A (en) 1991-09-30 1993-05-03 Riverrun Technology Method and apparatus for managing information
US5524193A (en) 1991-10-15 1996-06-04 And Communications Interactive multimedia annotation method and apparatus
US5333266A (en) 1992-03-27 1994-07-26 International Business Machines Corporation Method and apparatus for message handling in computer systems
DE69426714T2 (en) * 1993-06-30 2001-08-02 Canon Kk Document processing method and device
EP0650126A1 (en) 1993-10-21 1995-04-26 BULL HN INFORMATION SYSTEMS ITALIA S.p.A. Annotation data processing system with hypermedia processable and active annotations
US5583980A (en) 1993-12-22 1996-12-10 Knowledge Media Inc. Time-synchronized annotation method
US5699089A (en) 1994-03-03 1997-12-16 Applied Voice Technology Central control for sequential-playback objects
US5600775A (en) 1994-08-26 1997-02-04 Emotion, Inc. Method and apparatus for annotating full motion video and other indexed data structures
US5633916A (en) 1994-12-30 1997-05-27 Unisys Corporation Universal messaging service using single voice grade telephone line within a client/server architecture
US5729730A (en) 1995-03-28 1998-03-17 Dex Information Systems, Inc. Method and apparatus for improved information storage and retrieval system
GB2301260A (en) 1995-05-26 1996-11-27 Ibm Voice mail system
US5572643A (en) 1995-10-19 1996-11-05 Judson; David H. Web browser with dynamic display of information objects during linking
US5838313A (en) * 1995-11-20 1998-11-17 Siemens Corporate Research, Inc. Multimedia-based reporting system with recording and playback of dynamic annotation
US6081829A (en) 1996-01-31 2000-06-27 Silicon Graphics, Inc. General purpose web annotations without modifying browser
US5903892A (en) * 1996-05-24 1999-05-11 Magnifi, Inc. Indexing of media content on a network
US5923848A (en) * 1996-05-31 1999-07-13 Microsoft Corporation System and method for resolving names in an electronic messaging environment
AU725370C (en) 1996-06-18 2003-01-02 Cranberry Properties, Llc Integrated voice, facsimile and electronic mail messaging system
JPH1021261A (en) 1996-07-05 1998-01-23 Hitachi Ltd Method and system for multimedia data base retrieval
US5969716A (en) 1996-08-06 1999-10-19 Interval Research Corporation Time-based media processing system
US5893110A (en) 1996-08-16 1999-04-06 Silicon Graphics, Inc. Browser driven user interface to a media asset database
US5732216A (en) 1996-10-02 1998-03-24 Internet Angles, Inc. Audio message exchange system
US5809250A (en) 1996-10-23 1998-09-15 Intel Corporation Methods for creating and sharing replayable modules representative of Web browsing session
US6006241A (en) 1997-03-14 1999-12-21 Microsoft Corporation Production of a video stream with synchronized annotations over a computer network
US5991365A (en) * 1997-03-12 1999-11-23 Siemens Corporate Research, Inc. Remote phone-based access to a universal multimedia mailbox
US6173317B1 (en) 1997-03-14 2001-01-09 Microsoft Corporation Streaming and displaying a video stream with synchronized annotations over a computer network
US6449653B2 (en) * 1997-03-25 2002-09-10 Microsoft Corporation Interleaved multiple multimedia stream for synchronized transmission over a computer network
US6009462A (en) * 1997-06-16 1999-12-28 Digital Equipment Corporation Replacing large bit component of electronic mail (e-mail) message with hot-link in distributed computer system
US6259445B1 (en) * 1997-07-07 2001-07-10 Informix, Inc. Computer-based documentation and instruction
US6360234B2 (en) 1997-08-14 2002-03-19 Virage, Inc. Video cataloger system with synchronized encoders
US6173287B1 (en) 1998-03-11 2001-01-09 Digital Equipment Corporation Technique for ranking multimedia annotations of interest
US6105055A (en) 1998-03-13 2000-08-15 Siemens Corporate Research, Inc. Method and apparatus for asynchronous multimedia collaboration
US6584479B2 (en) 1998-06-17 2003-06-24 Xerox Corporation Overlay presentation of textual and graphical annotations
US6144375A (en) 1998-08-14 2000-11-07 Praja Inc. Multi-perspective viewer for content-based interactivity
US6366296B1 (en) 1998-09-11 2002-04-02 Xerox Corporation Media browser using multimodal analysis
US7051275B2 (en) 1998-09-15 2006-05-23 Microsoft Corporation Annotations for multiple versions of media content
US6317141B1 (en) 1998-12-31 2001-11-13 Flashpoint Technology, Inc. Method and apparatus for editing heterogeneous media objects in a digital imaging device
US6452615B1 (en) 1999-03-24 2002-09-17 Fuji Xerox Co., Ltd. System and apparatus for notetaking with digital video and ink

Also Published As

Publication number Publication date
AU5926099A (en) 2000-04-03
WO2000016541A1 (en) 2000-03-23
US20030196164A1 (en) 2003-10-16
US20010042098A1 (en) 2001-11-15
AU5926499A (en) 2000-04-03
WO2000016221A1 (en) 2000-03-23
US7162690B2 (en) 2007-01-09
US20060143560A1 (en) 2006-06-29
US6484156B1 (en) 2002-11-19
US7051275B2 (en) 2006-05-23
US6917965B2 (en) 2005-07-12

Similar Documents

Publication Publication Date Title
US7111009B1 (en) Interactive playlist generation using annotations
US6484156B1 (en) Accessing annotations across multiple target media streams
US7506262B2 (en) User interface for creating viewing and temporally positioning annotations for media content
US6546405B2 (en) Annotating temporally-dimensioned multimedia content
US7562302B1 (en) System and method for automatic generation of visual representations and links in a hierarchical messaging system
US6557042B1 (en) Multimedia summary generation employing user feedback
EP1068579B1 (en) System and method for providing interactive components in motion video
US7660416B1 (en) System and method for media content collaboration throughout a media production process
US8091026B2 (en) Methods and apparatuses for processing digital objects
US9092173B1 (en) Reviewing and editing word processing documents
US7506246B2 (en) Printing a custom online book and creating groups of annotations made by various users using annotation identifiers before the printing
Chiu et al. LiteMinutes: an Internet-based system for multimedia meeting minutes
US20030124502A1 (en) Computer method and apparatus to digitize and simulate the classroom lecturing
US20020026521A1 (en) System and method for managing and distributing associated assets in various formats
US20040268224A1 (en) Authoring system for combining temporal and nontemporal digital media
US20070250899A1 (en) Nondestructive self-publishing video editing system
CN101657814A (en) Systems and methods for specifying frame-accurate images for media asset management
JP2002057981A (en) Interface to access data stream, generating method for retrieval for access to data stream, data stream access method and device to access video from note
JP2003150542A (en) Method for sharing annotation information to be added to digital content, program and computer system
JP2007036830A (en) Moving picture management system, moving picture managing method, client, and program
US9286309B2 (en) Representation of last viewed or last modified portion of a document
US8418051B1 (en) Reviewing and editing word processing documents
US8296647B1 (en) Reviewing and editing word processing documents
CN107066437B (en) Method and device for labeling digital works
Mu et al. Enriched video semantic metadata: Authorization, integration, and presentation

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AL AM AT AU AZ BA BB BG BR BY CA CH CN CR CU CZ DE DK DM EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT UA UG UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
AK Designated states

Kind code of ref document: C2

Designated state(s): AE AL AM AT AU AZ BA BB BG BR BY CA CH CN CR CU CZ DE DK DM EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT UA UG UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: C2

Designated state(s): GH GM KE LS MW SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

COP Corrected version of pamphlet

Free format text: PAGES 1/9-9/9, DRAWINGS, REPLACED BY NEW PAGES 1/9-9/9; DUE TO LATE TRANSMITTAL BY THE RECEIVING OFFICE

REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

122 Ep: pct application non-entry in european phase